WO2017185575A1 - Touch screen track recognition method and apparatus - Google Patents
Touch screen track recognition method and apparatus
- Publication number
- WO2017185575A1 (PCT/CN2016/097316)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch screen
- gesture
- track
- touch
- pen
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- the present application relates to the field of touch screen operation technologies, and in particular, to a touch screen track recognition method and apparatus.
- the operation result of the touch screen pen is consistent with the result of the gesture operation, but both are subject to many limitations when the touch screen is in its working state. Because touch screen pen and gesture operations are easily confused, their actual operational attributes are difficult to distinguish, which makes operation cumbersome and indirect and results in the technical problem of a poor user experience.
- a method and device for recognizing a touch screen track provided by an embodiment of the present invention resolve the many limitations of touch screen pen and gesture operation in the working state of the touch screen. For example, confusion between touch screen pen and gesture operations makes actual operational attributes difficult to distinguish, rendering operation cumbersome and indirect and resulting in the technical problem of a poor user experience.
- determining the type of the touch operation instruction: if it is a touch screen pen trigger instruction, corresponding operation processing is performed according to the captured touch screen pen track; if it is a gesture trigger instruction, corresponding operation processing is performed according to the captured gesture track.
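The type-then-dispatch flow described above can be sketched in Python. Everything here is an illustrative assumption, not the patent's implementation: the instruction structure, the source strings, and the placeholder processing routines are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class TouchInstruction:
    source: str   # "pen" or "finger", as reported by the touch screen (assumed encoding)
    track: list   # captured sequence of (x, y) points

def process_pen_track(track):
    # Stand-in for touch screen pen operation processing (ink drawing, eraser, ...).
    return f"pen: {len(track)} points"

def process_gesture_track(track):
    # Stand-in for gesture operation processing (scroll, zoom, select, ...).
    return f"gesture: {len(track)} points"

def handle_touch_instruction(instr: TouchInstruction):
    # Determine the type of the touch operation instruction and dispatch
    # to the corresponding processing routine.
    if instr.source == "pen":
        return process_pen_track(instr.track)
    if instr.source == "finger":
        return process_gesture_track(instr.track)
    raise ValueError(f"unknown input source: {instr.source}")
```

In a real system the two processing routines would render ink or drive the UI; here they only report which branch was taken.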
- monitoring the touch operation instruction triggered on the touch screen specifically includes:
- the corresponding touch operation instruction is determined after the trigger mode automatically matches the input source.
- determining the corresponding touch operation instruction after the triggering mode automatically matches the input source includes:
- the trigger mode is automatically matched to the encapsulated API corresponding to the input source, and the corresponding touch operation instruction is determined according to the encapsulated API.
- performing corresponding operation processing according to the captured gesture trajectory specifically includes:
- the gesture operation processing includes: zooming the interface, and/or scrolling the interface, and/or selecting text content, and/or filling a gesture-swiped area, and/or cutting/copying a gesture-swiped area.
- performing corresponding operation processing according to the captured touch screen pen track specifically includes:
- the touch screen pen operation processing includes: ink drawing processing, and/or laser pointer processing, and/or regular image/object drawing processing, and/or eraser processing.
- the touch screen track identification method further includes:
- when the type of the touch operation instruction is determined to be both a touch screen pen trigger instruction and a gesture trigger instruction, the corresponding operations are processed in order according to a preset priority
- a monitoring unit for monitoring a touch operation instruction triggered by the touch screen
- a determining unit configured to determine the type of the touch operation instruction: if it is a touch screen pen trigger instruction, the touch screen pen processing unit is triggered; if it is a gesture trigger instruction, the gesture processing unit is triggered;
- the touch screen pen processing unit is configured to perform corresponding operation processing according to the captured touch screen pen track;
- the gesture processing unit is configured to perform corresponding operation processing according to the captured gesture track.
- the monitoring unit specifically includes:
- the determining subunit is configured to determine the corresponding touch operation instruction after the trigger mode automatically matches the input source.
- the determining subunit is specifically configured to automatically match the trigger mode to the encapsulated API corresponding to the input source, and determine the corresponding touch operation instruction according to the encapsulated API.
- the gesture processing unit performs corresponding gesture operation processing according to the gesture track captured by the single-finger operation or the multi-finger operation;
- the gesture operation processing includes: zooming the interface, and/or scrolling the interface, and/or selecting text content, and/or filling a gesture-swiped area, and/or cutting/copying a gesture-swiped area.
- the touch screen pen processing unit is specifically configured to perform corresponding touch screen pen operation processing according to the captured touch screen pen track;
- the touch screen pen operation processing includes: ink drawing processing, and/or laser pointer processing, and/or regular image/object drawing processing, and/or eraser processing.
- the touch screen track identification device further includes:
- a processing unit configured to: when the determining unit determines that the type of the touch operation instruction is both the touch screen pen trigger instruction and the gesture trigger instruction, perform the corresponding operations in order according to the preset priority; or make a special-case determination based on the preset priority and, in the special case, readjust the priority order; or simultaneously perform the operations corresponding to the touch screen pen trigger instruction and the gesture trigger instruction.
- a processor, a memory, a communication interface, and a bus
- the processor, the memory, and the communication interface are connected by the bus and complete communication with each other;
- the memory stores executable program code
- the processor runs a program corresponding to the executable program code by reading executable program code stored in the memory for performing a touch screen track recognition method as described herein.
- a storage medium is provided in an embodiment of the present application, wherein the storage medium is configured to store executable program code, and the executable program code is configured to execute a touch screen track identification method according to the present application at runtime.
- An application program provided by an embodiment of the present application is configured to perform a touch screen track identification method according to the present application at runtime.
- the embodiments of the present application have the following advantages:
- a method and device for recognizing a touch screen track include: monitoring a touch operation instruction triggered on the touch screen, and then determining the type of the touch operation instruction; if it is a touch screen pen trigger instruction, corresponding operation processing is performed according to the captured touch screen pen track, and if it is a gesture trigger instruction, corresponding operation processing is performed according to the captured gesture track.
- this operation processing resolves the many limitations of touch screen pen and gesture operation in the current touch screen working state: confusion between touch screen pen and gesture operations makes their actual operational attributes difficult to distinguish, rendering operation cumbersome and indirect and resulting in the technical problem of a poor user experience.
- FIG. 1 is a schematic flowchart diagram of an embodiment of a method for recognizing a touch screen track according to an embodiment of the present disclosure
- FIG. 2 is a schematic flowchart diagram of another embodiment of a method for recognizing a touch screen track according to an embodiment of the present disclosure
- FIG. 3 is a schematic structural diagram of an embodiment of a touch screen track recognizing device according to an embodiment of the present disclosure
- FIG. 4 is a schematic structural diagram of another embodiment of a touch screen track recognizing device according to an embodiment of the present disclosure
- FIG. 5 is a schematic diagram of an application example of FIG. 2.
- an embodiment of a method for recognizing a touch screen track provided by an embodiment of the present application includes:
- step 102: determine the type of the touch operation instruction; if it is a touch screen pen trigger instruction, step 103 is performed; if it is a gesture trigger instruction, step 104 is performed;
- the type of the touch operation instruction is a touch screen pen trigger instruction
- corresponding operation processing is performed according to the captured touch screen pen track.
- the corresponding operation processing is performed according to the captured gesture trajectory.
- this operation processing resolves the many limitations of touch screen pen and gesture operation in the current touch screen working state: confusion between touch screen pen and gesture operations makes their actual operational attributes difficult to distinguish, rendering operation cumbersome and indirect and resulting in the technical problem of a poor user experience.
- as shown in FIG. 2, another embodiment of the touch screen track identification method provided by the embodiment of the present application includes:
- the trigger mode returned through the touch screen is first acquired.
- the corresponding touch operation command needs to be determined after the trigger mode automatically matches the input source.
- determining the corresponding touch operation instruction after the trigger mode automatically matches the input source may specifically be: matching the trigger mode to the encapsulated API corresponding to the input source, and determining the corresponding touch operation instruction according to that encapsulated API.
- for example, the iOS 9 system provides a touch attribute type whose value may indicate a finger or a touch screen pen. Different input sources correspond to different system-encapsulated gesture APIs; for instance, when a gesture event arrives, the UITouch gesture source encapsulates the type attribute. This mechanism is well known in the art and is not described in detail here.
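A rough analogue of this input-source matching, written as Python rather than the iOS API it paraphrases: the `touch_type` strings mirror the finger/stylus distinction that a UITouch-style `type` attribute exposes, and the instruction names are illustrative assumptions.

```python
def match_input_source(touch_type: str) -> str:
    # Map the trigger mode returned by the touch screen to a touch
    # operation instruction, analogous to branching on the touch's type.
    mapping = {
        "direct": "gesture_trigger_instruction",  # finger contact
        "stylus": "pen_trigger_instruction",      # touch screen pen contact
    }
    if touch_type not in mapping:
        raise ValueError(f"unsupported input source: {touch_type}")
    return mapping[touch_type]
```

On a real platform the mapping would be driven by the system-encapsulated event object rather than a bare string.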
- step 203: determine the type of the touch operation instruction; if it is a touch screen pen trigger instruction, step 204 is performed; if it is a gesture trigger instruction, step 205 is performed; if the touch screen pen trigger instruction and the gesture trigger instruction occur simultaneously, step 206 is performed;
- after the corresponding touch operation instruction is determined by automatically matching the trigger mode to the input source, the type of the touch operation instruction needs to be determined: a touch screen pen trigger instruction leads to step 204, a gesture trigger instruction leads to step 205, and simultaneous touch screen pen and gesture trigger instructions lead to step 206.
- the type of the touch operation instruction is a touch screen pen trigger instruction
- corresponding operation processing is performed according to the captured touch screen pen track.
- touch screen pen operation processing includes: ink drawing processing, and/or laser pen processing, and/or regular image/object drawing processing, and/or eraser processing.
- the corresponding operation processing is performed according to the captured gesture trajectory.
- the corresponding operation processing according to the captured gesture trajectory may specifically be gesture operation processing performed according to a gesture trajectory captured from a single-finger or multi-finger operation.
- the gesture operation processing includes: zooming the interface, and/or scrolling the interface, and/or selecting text content, and/or filling a gesture-swiped area, and/or cutting/copying a gesture-swiped area; the actual effect of the gesture operation processing is not limited here.
- the corresponding operations are processed in order according to the preset priority; or a special-case determination is made based on the preset priority and, in the special case, the priority order is readjusted; or the operations corresponding to the touch screen pen trigger instruction and the gesture trigger instruction are performed simultaneously.
- the preset priority may be first-come-first-served: if the hand touches first, the hand is responded to first; if the pen touches first, the pen is responded to first. A special-case judgment is added on top of this: for example, when the hand's contact area is too large, the hand is not responded to.
- the priority decision can also be made through recognition. For example, if the contact area of the hand is too large or matches certain characteristics, it can be determined that the user is holding the pen with the palm or fingers resting on the screen while drawing; in that case the hand's contact behavior is automatically ignored.
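The special-case judgment above (treat an oversized hand contact as a resting palm while the pen is in use) can be sketched as a simple predicate; the area threshold is an arbitrary illustrative value, not one given by the patent.

```python
PALM_AREA_THRESHOLD = 400.0  # mm^2 -- arbitrary illustrative threshold

def should_ignore_hand(contact_area: float, pen_active: bool) -> bool:
    # While the pen is in use, a hand contact whose area exceeds the
    # threshold is treated as a resting palm and its events are ignored.
    return pen_active and contact_area > PALM_AREA_THRESHOLD
```

A production implementation would combine contact area with other characteristics (shape, position relative to the pen tip) rather than a single threshold.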
- the subsequent processing is not necessarily mutually exclusive. If the two actions do not conflict, the two actions of the hand and the pen can be performed at the same time.
- the action of the finger can also be extended to any other action.
- a single-finger swipe can be defined as an erase behavior.
- the pen's behavior is not necessarily limited to drawing handwriting: regular shapes such as rectangles and circles can be defined as rules, and the pen can even be used as an eraser.
- the foregoing preset priority may be that the touch screen pen operation corresponding to the touch screen pen trigger instruction is performed first, followed by the gesture operation corresponding to the gesture trigger instruction, or the reverse; details are not repeated here.
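One minimal sketch of processing simultaneous trigger instructions in a preset priority order; the instruction names and the default ordering (pen before gesture) are assumptions chosen to match the example above, and either ordering can be passed in.

```python
def order_by_priority(instructions, priority=("pen_trigger", "gesture_trigger")):
    # Process simultaneous trigger instructions in the preset priority
    # order; instructions not named in the priority list sort last.
    rank = {name: i for i, name in enumerate(priority)}
    return sorted(instructions, key=lambda ins: rank.get(ins, len(priority)))
```

Swapping the `priority` tuple reverses the processing order, matching the "or the reverse" alternative in the text.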
- the foregoing gesture track may be a multi-finger, single-finger operation, and is not specifically limited herein.
- by distinguishing touch screen pen and gesture operations, when the touch screen pen is used, the pen operation is responded to and the result is a handwriting effect; when a finger is used, the gesture operation is responded to, for example scrolling the interface, zooming the interface, selecting text, and so on.
- this operation processing resolves the many limitations of touch screen pen and gesture operation in the current touch screen working state: confusion between touch screen pen and gesture operations makes their actual operational attributes difficult to distinguish, rendering operation cumbersome and indirect and resulting in the technical problem of a poor user experience.
- the interface can be operated by gestures while the touch screen pen is in use, for example scrolling and zooming the interface, so that the user can scroll while annotating, helping the user complete the annotation process conveniently and quickly.
- the drawing behavior is clearly distinguished from other behaviors, which offers more possibilities in the actual design, such as drawing ink with the touch screen pen while scrolling the page content with a single finger.
- an embodiment of a touch screen track identification apparatus provided in an embodiment of the present application includes:
- the monitoring unit 301 is configured to monitor a touch operation instruction triggered by the touch screen
- the determining unit 302 is configured to determine the type of the touch operation command, if it is a touch screen pen triggering instruction, trigger the touch screen pen processing unit 303, if it is a gesture triggering instruction, trigger the gesture processing unit 304;
- the touch screen pen processing unit 303 is configured to perform corresponding operation processing according to the captured touch screen pen track;
- the gesture processing unit 304 is configured to perform corresponding operation processing according to the captured gesture trajectory.
- the determining unit 302 determines the type of the touch operation command. If the touch screen pen triggers the command, the touch screen pen processing unit 303 performs corresponding operation processing according to the captured touch screen pen track.
- the gesture processing unit 304 performs the corresponding operation processing according to the captured gesture trajectory. This resolves the many limitations of touch screen pen and gesture operation in the current touch screen working state: confusion between touch screen pen and gesture operations makes their actual operational attributes difficult to distinguish, rendering operation cumbersome and indirect and resulting in the technical problem of a poor user experience.
- as shown in FIG. 4, another embodiment of the touch screen track recognizing device provided in the embodiment of the present application includes:
- a monitoring unit 401 configured to monitor a touch operation instruction triggered on the touch screen;
- the monitoring unit 401 specifically includes:
- the determining sub-unit 4012 is configured to determine the corresponding touch operation instruction after the trigger mode automatically matches the input source; specifically, the determining sub-unit 4012 automatically matches the trigger mode to the encapsulated API corresponding to the input source and determines the corresponding touch operation instruction according to that encapsulated API.
- the determining unit 402 is configured to determine the type of the touch operation instruction, if it is a touch screen pen triggering instruction, trigger the touch screen pen processing unit 403, if it is a gesture triggering instruction, trigger the gesture processing unit 404;
- the touch screen pen processing unit 403 is configured to perform corresponding operation processing according to the captured touch screen pen track, and the touch screen pen processing unit is specifically configured to perform corresponding touch screen pen operation processing according to the captured touch screen pen track;
- the touch screen pen operation processing includes: ink drawing processing, and/or laser pointer processing, and/or regular image/object drawing processing, and/or eraser processing.
- the gesture processing unit 404 is configured to perform corresponding operation processing according to the captured gesture trajectory, and the gesture processing unit 404 is specifically configured to perform corresponding gesture operation processing according to the gesture trajectory captured by the single-finger operation or the multi-finger operation;
- the gesture operation processing includes: zooming the interface, and/or scrolling the interface, and/or selecting text content, and/or filling a gesture-swiped area, and/or cutting/copying a gesture-swiped area.
- the processing unit 405 is configured to: when the determining unit 402 determines that the type of the touch operation instruction is both a touch screen pen trigger instruction and a gesture trigger instruction, perform the corresponding operations in order according to the preset priority; or make a special-case determination based on the preset priority and, in the special case, readjust the priority order; or simultaneously perform the operations corresponding to the touch screen pen trigger instruction and the gesture trigger instruction.
- the determining unit 402 determines the type of the touch operation command. If the touch screen pen triggers the command, the touch screen pen processing unit 403 performs corresponding operation processing according to the captured touch screen pen track.
- the gesture processing unit 404 performs the corresponding operation processing according to the captured gesture trajectory. This resolves the many limitations of touch screen pen and gesture operation in the current touch screen working state: confusion between touch screen pen and gesture operations makes their actual operational attributes difficult to distinguish, rendering operation cumbersome and indirect and resulting in the technical problem of a poor user experience.
- the interface can be operated by gestures while the touch screen pen is in use, for example scrolling and zooming the interface, so that the user can scroll while annotating, helping the user complete the annotation process conveniently and quickly.
- an electronic device which may include:
- a processor, a memory, a communication interface, and a bus
- the processor, the memory, and the communication interface are connected by the bus and complete communication with each other;
- the memory stores executable program code
- the processor runs a program corresponding to the executable program code by reading executable program code stored in the memory for performing a touch screen track recognition method provided by the present application at runtime.
- this operation processing resolves the many limitations of touch screen pen and gesture operation in the current touch screen working state: confusion between touch screen pen and gesture operations makes their actual operational attributes difficult to distinguish, rendering operation cumbersome and indirect and resulting in the technical problem of a poor user experience.
- the embodiment of the present application further provides a storage medium, where the storage medium is used to store executable program code, and the executable program code is used to execute, at runtime, a touch screen track recognition method provided by the present application.
- this operation processing resolves the current limitations of touch screen pen and gesture operation in the touch screen working state: confusion between touch screen pen and gesture operations makes their actual operational attributes difficult to distinguish, rendering operation cumbersome and indirect and resulting in a poor user experience.
- the embodiment of the present application further provides an application, where the application is used to execute a touch screen track recognition method provided by the present application at runtime.
- this operation processing resolves the many limitations of touch screen pen and gesture operation in the current touch screen working state: confusion between touch screen pen and gesture operations makes their actual operational attributes difficult to distinguish, rendering operation cumbersome and indirect and resulting in the technical problem of a poor user experience.
- the description is relatively simple, and the relevant parts can be referred to the description of the method embodiment.
- the steps may be completed by a program instructing related hardware, and the program may be stored in a computer readable storage medium, such as a ROM/RAM, a magnetic disk, an optical disk, or the like.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims (15)
- A touch screen track recognition method, characterized by comprising: monitoring a touch operation instruction triggered on a touch screen; and determining the type of the touch operation instruction, wherein if it is a touch screen pen trigger instruction, corresponding operation processing is performed according to the captured touch screen pen track, and if it is a gesture trigger instruction, corresponding operation processing is performed according to the captured gesture track.
- The touch screen track recognition method according to claim 1, characterized in that monitoring the touch operation instruction triggered on the touch screen specifically comprises: acquiring, in the ink state, the trigger mode returned through the touch screen; and determining the corresponding touch operation instruction after automatically matching the trigger mode to the input source.
- The touch screen track recognition method according to claim 2, characterized in that determining the corresponding touch operation instruction after automatically matching the trigger mode to the input source specifically comprises: automatically matching the trigger mode to the encapsulated API corresponding to the input source, and determining the corresponding touch operation instruction according to the encapsulated API.
- The touch screen track recognition method according to claim 1, characterized in that performing corresponding operation processing according to the captured gesture track specifically comprises: performing corresponding gesture operation processing according to a gesture track captured through a single-finger or multi-finger operation; wherein the gesture operation processing comprises: zooming the interface, and/or scrolling the interface, and/or selecting text content, and/or filling a gesture-swiped area, and/or cutting/copying a gesture-swiped area.
- The touch screen track recognition method according to claim 1, characterized in that performing corresponding operation processing according to the captured touch screen pen track specifically comprises: performing corresponding touch screen pen operation processing according to the captured touch screen pen track; wherein the touch screen pen operation processing comprises: ink drawing processing, and/or laser pointer processing, and/or regular image/object drawing processing, and/or eraser processing.
- The touch screen track recognition method according to any one of claims 1 to 5, characterized in that the touch screen track recognition method further comprises: when the type of the touch operation instruction is determined to be both the touch screen pen trigger instruction and the gesture trigger instruction, performing the corresponding operations in order according to a preset priority; or making a special-case determination based on the preset priority and, in the special case, readjusting the priority order; or simultaneously performing the operations corresponding to the touch screen pen trigger instruction and the gesture trigger instruction.
- A touch screen track recognition device, characterized by comprising: a monitoring unit, configured to monitor a touch operation instruction triggered on a touch screen; a determining unit, configured to determine the type of the touch operation instruction, trigger a touch screen pen processing unit if it is a touch screen pen trigger instruction, and trigger a gesture processing unit if it is a gesture trigger instruction; the touch screen pen processing unit, configured to perform corresponding operation processing according to the captured touch screen pen track; and the gesture processing unit, configured to perform corresponding operation processing according to the captured gesture track.
- The touch screen track recognition device according to claim 7, characterized in that the monitoring unit specifically comprises: a returning subunit, configured to acquire, in the ink state, the trigger mode returned through the touch screen; and a determining subunit, configured to determine the corresponding touch operation instruction after automatically matching the trigger mode to the input source.
- The touch screen track recognition device according to claim 8, characterized in that the determining subunit is specifically configured to automatically match the trigger mode to the encapsulated API corresponding to the input source, and determine the corresponding touch operation instruction according to the encapsulated API.
- The touch screen track recognition device according to claim 7, characterized in that the gesture processing unit is specifically configured to perform corresponding gesture operation processing according to a gesture track captured through a single-finger or multi-finger operation; wherein the gesture operation processing comprises: zooming the interface, and/or scrolling the interface, and/or selecting text content, and/or filling a gesture-swiped area, and/or cutting/copying a gesture-swiped area.
- The touch screen track recognition device according to claim 7, characterized in that the touch screen pen processing unit is specifically configured to perform corresponding touch screen pen operation processing according to the captured touch screen pen track; wherein the touch screen pen operation processing comprises: ink drawing processing, and/or laser pointer processing, and/or regular image/object drawing processing, and/or eraser processing.
- The touch screen track recognition device according to any one of claims 7 to 11, characterized in that the touch screen track recognition device further comprises: a simultaneous processing unit, configured to, when the determining unit determines that the type of the touch operation instruction is both the touch screen pen trigger instruction and the gesture trigger instruction, perform the corresponding operations in order according to a preset priority, or make a special-case determination based on the preset priority and, in the special case, readjust the priority order, or simultaneously perform the operations corresponding to the touch screen pen trigger instruction and the gesture trigger instruction.
- An electronic device, characterized by comprising: a processor, a memory, a communication interface, and a bus; wherein the processor, the memory, and the communication interface are connected by the bus and communicate with each other; the memory stores executable program code; and the processor runs a program corresponding to the executable program code by reading the executable program code stored in the memory, so as to perform the touch screen track recognition method according to any one of claims 1 to 6.
- A storage medium, characterized in that the storage medium is configured to store executable program code, and the executable program code is configured to perform, at runtime, the touch screen track recognition method according to any one of claims 1 to 6.
- An application program, characterized in that the application program is configured to perform, at runtime, the touch screen track recognition method according to any one of claims 1 to 6.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/096,656 US11042290B2 (en) | 2016-04-28 | 2016-08-30 | Touch screen track recognition method and apparatus |
JP2018556431A JP6847127B2 (ja) | 2016-04-28 | 2016-08-30 | タッチスクリーントラック認識方法及び装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610281833.9 | 2016-04-28 | ||
CN201610281833.9A CN107329602A (zh) | 2016-04-28 | 2016-04-28 | 一种触摸屏轨迹识别方法及装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017185575A1 true WO2017185575A1 (zh) | 2017-11-02 |
Family
ID=60161829
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/097316 WO2017185575A1 (zh) | 2016-04-28 | 2016-08-30 | 一种触摸屏轨迹识别方法及装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US11042290B2 (zh) |
JP (1) | JP6847127B2 (zh) |
CN (1) | CN107329602A (zh) |
WO (1) | WO2017185575A1 (zh) |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013169843A1 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for manipulating framed graphical objects |
WO2013169842A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for selecting object within a group of objects |
JP6031186B2 (ja) | 2012-05-09 | 2016-11-24 | アップル インコーポレイテッド | ユーザインタフェースオブジェクトを選択するためのデバイス、方法及びグラフィカルユーザインタフェース |
WO2013169865A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input |
JP6082458B2 (ja) | 2012-05-09 | 2017-02-15 | アップル インコーポレイテッド | ユーザインタフェース内で実行される動作の触知フィードバックを提供するデバイス、方法、及びグラフィカルユーザインタフェース |
WO2013169849A2 (en) | 2012-05-09 | 2013-11-14 | Industries Llc Yknots | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
AU2013259630B2 (en) | 2012-05-09 | 2016-07-07 | Apple Inc. | Device, method, and graphical user interface for transitioning between display states in response to gesture |
WO2013169854A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object |
EP2847657B1 (en) | 2012-05-09 | 2016-08-10 | Apple Inc. | Device, method, and graphical user interface for displaying additional information in response to a user contact |
CN107832003B (zh) * | 2012-12-29 | 2021-01-22 | 苹果公司 | 用于放大内容的方法和设备、电子设备和介质 |
US9632664B2 (en) | 2015-03-08 | 2017-04-25 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10095396B2 (en) | 2015-03-08 | 2018-10-09 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US9639184B2 (en) | 2015-03-19 | 2017-05-02 | Apple Inc. | Touch input cursor manipulation |
US20170045981A1 (en) | 2015-08-10 | 2017-02-16 | Apple Inc. | Devices and Methods for Processing Touch Inputs Based on Their Intensities |
US9830048B2 (en) | 2015-06-07 | 2017-11-28 | Apple Inc. | Devices and methods for processing touch inputs with instructions in a web page |
US9860451B2 (en) | 2015-06-07 | 2018-01-02 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US9880735B2 (en) | 2015-08-10 | 2018-01-30 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10235035B2 (en) | 2015-08-10 | 2019-03-19 | Apple Inc. | Devices, methods, and graphical user interfaces for content navigation and manipulation |
US10719230B2 (en) * | 2018-09-27 | 2020-07-21 | Atlassian Pty Ltd | Recognition and processing of gestures in a graphical user interface using machine learning |
KR20230022766A (ko) * | 2021-08-09 | 2023-02-16 | Samsung Electronics Co., Ltd. | Electronic device for processing stylus pen input and method for operating the same |
US11922008B2 (en) | 2021-08-09 | 2024-03-05 | Samsung Electronics Co., Ltd. | Electronic device processing input of stylus pen and method for operating the same |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101882042A (zh) * | 2010-06-08 | 2010-11-10 | 苏州瀚瑞微电子有限公司 | Palm discrimination method for a capacitive touch screen |
CN102063236A (zh) * | 2010-12-29 | 2011-05-18 | 福州锐达数码科技有限公司 | Method for implementing touch recognition on a pressure-sensitive electronic whiteboard |
CN102707861A (zh) * | 2011-03-28 | 2012-10-03 | Lenovo (Beijing) Co., Ltd. | Electronic device and display method thereof |
CN103207718A (zh) * | 2013-03-18 | 2013-07-17 | 敦泰科技有限公司 | Mutual-capacitance touch screen and touch sensing method thereof |
US20140028605A1 (en) * | 2012-07-26 | 2014-01-30 | Texas Instruments Incorporated | Touch Profiling on Capacitive-Touch Screens |
CN104915136A (zh) * | 2014-03-14 | 2015-09-16 | LG Electronics Inc. | Mobile terminal and control method thereof |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4857385B2 (ja) * | 2010-01-12 | 2012-01-18 | Panasonic Corporation | Electronic pen system |
WO2012099584A1 (en) * | 2011-01-19 | 2012-07-26 | Hewlett-Packard Development Company, L.P. | Method and system for multimodal and gestural control |
KR102063952B1 (ko) * | 2012-10-10 | 2020-01-08 | Samsung Electronics Co., Ltd. | Multi-display apparatus and multi-display method |
TWI514229B (zh) * | 2013-11-22 | 2015-12-21 | Elan Microelectronics Corp | Graphic editing method and electronic device |
JP6235349B2 (ja) | 2014-01-16 | 2017-11-22 | Sharp Corporation | Display device with touch operation function |
- 2016
- 2016-04-28 CN CN201610281833.9A patent/CN107329602A/zh active Pending
- 2016-08-30 US US16/096,656 patent/US11042290B2/en active Active
- 2016-08-30 WO PCT/CN2016/097316 patent/WO2017185575A1/zh active Application Filing
- 2016-08-30 JP JP2018556431A patent/JP6847127B2/ja active Active
Also Published As
Publication number | Publication date |
---|---|
JP2019516189A (ja) | 2019-06-13 |
US11042290B2 (en) | 2021-06-22 |
CN107329602A (zh) | 2017-11-07 |
US20200210059A1 (en) | 2020-07-02 |
JP6847127B2 (ja) | 2021-03-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017185575A1 (zh) | Touch screen track recognition method and apparatus | |
US10048748B2 (en) | Audio-visual interaction with user devices | |
US9665276B2 (en) | Character deletion during keyboard gesture | |
CN110058782B (zh) | Touch operation method based on an interactive electronic whiteboard and system thereof |
JP6427559B6 (ja) | Permanent synchronization system for handwriting input |
US20220214784A1 (en) | Systems and methods for a touchscreen user interface for a collaborative editing tool | |
US9423908B2 (en) | Distinguishing between touch gestures and handwriting | |
US20140189482A1 (en) | Method for manipulating tables on an interactive input system and interactive input system executing the method | |
US20150077358A1 (en) | Electronic device and method of controlling the same | |
US10031667B2 (en) | Terminal device, display control method, and non-transitory computer-readable recording medium | |
US20130044061A1 (en) | Method and apparatus for providing a no-tap zone for touch screen displays | |
US20160170632A1 (en) | Interacting With Application Beneath Transparent Layer | |
US10162501B2 (en) | Terminal device, display control method, and non-transitory computer-readable recording medium | |
EP2899623A2 (en) | Information processing apparatus, information processing method, and program | |
WO2020118491A1 (zh) | Interaction method based on fingerprint recognition, electronic device, and related apparatus |
US10613645B2 (en) | Mechanism for pen interoperability with pressure sensor design | |
US10222866B2 (en) | Information processing method and electronic device | |
TWI547863B (zh) | Handwriting input recognition method, system, and electronic device |
US9430702B2 (en) | Character input apparatus and method based on handwriting | |
WO2020124422A1 (zh) | Control method for a handwriting system, and handwriting system |
EP3101522A1 (en) | Information processing device, information processing method, and program | |
US10001839B2 (en) | Gesture recognition of ink strokes | |
WO2020062121A1 (zh) | Page operation method, electronic device, and computer-readable storage medium |
US20130222278A1 (en) | Electronic device and method for setting editing tools of the electronic device | |
US10474886B2 (en) | Motion input system, motion input method and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---
| ENP | Entry into the national phase | Ref document number: 2018556431; Country of ref document: JP; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16900084; Country of ref document: EP; Kind code of ref document: A1 |
| 32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 140219) |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 16900084; Country of ref document: EP; Kind code of ref document: A1 |