WO2017185575A1 - Touch screen track recognition method and apparatus - Google Patents

Touch screen track recognition method and apparatus

Info

Publication number
WO2017185575A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch screen
gesture
track
touch
pen
Prior art date
Application number
PCT/CN2016/097316
Other languages
English (en)
French (fr)
Inventor
胡娟
黄兰花
Original Assignee
北京金山办公软件有限公司
珠海金山办公软件有限公司
广州金山移动科技有限公司
Priority date
Filing date
Publication date
Application filed by 北京金山办公软件有限公司, 珠海金山办公软件有限公司, 广州金山移动科技有限公司
Priority to US16/096,656 (US11042290B2)
Priority to JP2018556431A (JP6847127B2)
Publication of WO2017185575A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • the present application relates to the field of touch screen operation technologies, and in particular to a touch screen track recognition method and apparatus.
  • normally the result of a touch screen pen operation is consistent with the result of a gesture operation, but operating a touch screen through both a touch screen pen and gestures has many limitations: confusing pen operations with gesture operations makes it difficult to distinguish the actual operation attributes, so operation becomes cumbersome and indirect, resulting in the technical problem of a poor user experience.
  • a touch screen track recognition method and apparatus provided by embodiments of the present invention solve these limitations of touch screen pen and gesture operation in the working state of the touch screen, namely that confusing pen and gesture operations makes actual operation attributes hard to distinguish, renders operation cumbersome and indirect, and results in the technical problem of a poor user experience.
  • determining the type of the touch operation instruction: if it is a touch screen pen trigger instruction, performing corresponding operation processing according to the captured touch screen pen track; if it is a gesture trigger instruction, performing corresponding operation processing according to the captured gesture track.
  • the monitoring of a touch operation instruction triggered on the touch screen specifically includes:
  • acquiring, when in the ink state, the trigger mode returned by the touch screen, and determining the corresponding touch operation instruction after the trigger mode is automatically matched to the input source.
  • determining the corresponding touch operation instruction after the trigger mode is automatically matched to the input source includes:
  • automatically matching the trigger mode to the encapsulated API corresponding to the input source, and determining the corresponding touch operation instruction according to the encapsulated API.
  • performing corresponding operation processing according to the captured gesture trajectory specifically includes:
  • the gesture operation processing includes: zooming the interface, and/or scrolling the interface, and/or selecting text content, and/or filling the area swept by the gesture, and/or cutting/copying the area swept by the gesture.
  • performing corresponding operation processing according to the captured touch screen pen track specifically includes:
  • the touch screen pen operation processing includes: ink drawing processing, and/or laser pen processing, and/or regular shape/object drawing processing, and/or eraser processing.
  • the touch screen track identification method further includes:
  • when the type of the touch operation instruction is determined to be simultaneously the touch screen pen trigger instruction and the gesture trigger instruction, corresponding priority-ordered operation processing is performed according to a preset priority.
  • a monitoring unit for monitoring a touch operation instruction triggered on the touch screen;
  • a determining unit configured to determine the type of the touch operation instruction, to trigger the touch screen pen processing unit if it is a touch screen pen trigger instruction, and to trigger the gesture processing unit if it is a gesture trigger instruction;
  • the touch screen pen processing unit is configured to perform corresponding operation processing according to the captured touch screen pen track;
  • the gesture processing unit is configured to perform corresponding operation processing according to the captured gesture track.
  • the monitoring unit specifically includes:
  • the determining subunit is configured to determine the corresponding touch operation instruction after the trigger mode is automatically matched to the input source.
  • the determining subunit is specifically configured to automatically match the trigger mode to the encapsulated API corresponding to the input source, and to determine the corresponding touch operation instruction according to the encapsulated API.
  • the gesture processing unit performs corresponding gesture operation processing according to the gesture track captured by the single-finger operation or the multi-finger operation;
  • the gesture operation processing includes: zooming the interface, and/or scrolling the interface, and/or selecting text content, and/or filling the area swept by the gesture, and/or cutting/copying the area swept by the gesture.
  • the touch screen pen processing unit is specifically configured to perform corresponding touch screen pen operation processing according to the captured touch screen pen track;
  • the touch screen pen operation processing includes: ink drawing processing, and/or laser pen processing, and/or regular shape/object drawing processing, and/or eraser processing.
  • the touch screen track identification device further includes:
  • a processing unit configured to: when the determining unit determines that the type of the touch operation instruction is simultaneously the touch screen pen trigger instruction and the gesture trigger instruction, perform corresponding priority-ordered operation processing according to a preset priority, or make a special-case determination based on the preset priority and, in the special case, readjust the priority order, or simultaneously perform the operations corresponding to the touch screen pen trigger instruction and the gesture trigger instruction.
  • a processor, a memory, a communication interface, and a bus
  • the processor, the memory, and the communication interface are connected by the bus and complete communication with each other;
  • the memory stores executable program code
  • the processor runs a program corresponding to the executable program code by reading executable program code stored in the memory for performing a touch screen track recognition method as described herein.
  • a storage medium is provided in an embodiment of the present application, wherein the storage medium is configured to store executable program code, and the executable program code is configured to execute a touch screen track identification method according to the present application at runtime.
  • An application program provided by an embodiment of the present application is configured to perform a touch screen track identification method according to the present application at runtime.
  • the embodiments of the present application have the following advantages:
  • a touch screen track recognition method and apparatus, where the method includes: monitoring a touch operation instruction triggered on the touch screen, and then determining the type of the touch operation instruction; if it is a touch screen pen trigger instruction, corresponding operation processing is performed according to the captured touch screen pen track, and if it is a gesture trigger instruction, corresponding operation processing is performed according to the captured gesture track. In this embodiment, by determining the type of the touch operation instruction and processing the captured touch screen pen track or gesture track accordingly, the method resolves the many limitations of touch screen pen and gesture operation in the current touch screen working state, such as the technical problem that confusing pen and gesture operations make actual operation attributes hard to distinguish, rendering operation cumbersome and indirect and degrading the user experience.
  • FIG. 1 is a schematic flowchart diagram of an embodiment of a method for recognizing a touch screen track according to an embodiment of the present disclosure
  • FIG. 2 is a schematic flowchart diagram of another embodiment of a method for recognizing a touch screen track according to an embodiment of the present disclosure
  • FIG. 3 is a schematic structural diagram of an embodiment of a touch screen track recognizing device according to an embodiment of the present disclosure
  • FIG. 4 is a schematic structural diagram of another embodiment of a touch screen track recognizing device according to an embodiment of the present disclosure
  • FIG. 5 is a schematic diagram of an application example of FIG. 2.
  • an embodiment of a method for recognizing a touch screen track provided by an embodiment of the present application includes:
  • step 102: determine the type of the touch operation instruction; if it is a touch screen pen trigger instruction, step 103 is performed, and if it is a gesture trigger instruction, step 104 is performed.
  • when the type of the touch operation instruction is determined to be a touch screen pen trigger instruction, corresponding operation processing is performed according to the captured touch screen pen track.
  • when the type of the touch operation instruction is determined to be a gesture trigger instruction, corresponding operation processing is performed according to the captured gesture track.
  • referring to FIG. 2, another embodiment of the touch screen track recognition method provided by an embodiment of the present application includes:
  • the trigger mode returned through the touch screen is first acquired.
  • the corresponding touch operation command needs to be determined after the trigger mode automatically matches the input source.
  • the foregoing determination of the corresponding touch operation instruction after the trigger mode is automatically matched to the input source may be: automatically matching the trigger mode to the encapsulated API corresponding to the input source, and determining the corresponding touch operation instruction according to that API.
  • for example, the iOS 9 system has a type attribute that distinguishes whether the input comes from a finger or a touch screen pen. Different input sources yield different system-encapsulated gesture APIs; for example, when a gesture arrives, the UITouch gesture source encapsulates the type. This is well known in the art and is not described in detail here.
  • step 203: determine the type of the touch operation instruction; if it is a touch screen pen trigger instruction, step 204 is performed; if it is a gesture trigger instruction, step 205 is performed; if it is simultaneously a touch screen pen trigger instruction and a gesture trigger instruction, step 206 is performed.
  • after the corresponding touch operation instruction is determined by automatically matching the trigger mode to the input source, the type of the instruction must be determined: if it is a touch screen pen trigger instruction, step 204 is performed; if it is a gesture trigger instruction, step 205 is performed; if it is simultaneously both, step 206 is performed.
  • when the type of the touch operation instruction is determined to be a touch screen pen trigger instruction, corresponding operation processing is performed according to the captured touch screen pen track.
  • the touch screen pen operation processing includes: ink drawing processing, and/or laser pen processing, and/or regular shape/object drawing processing, and/or eraser processing.
  • when the type of the touch operation instruction is determined to be a gesture trigger instruction, corresponding operation processing is performed according to the captured gesture track.
  • the corresponding operation processing according to the captured gesture track may specifically be: performing corresponding gesture operation processing according to a gesture track captured through a single-finger or multi-finger operation.
  • the gesture operation processing includes: zooming the interface, and/or scrolling the interface, and/or selecting text content, and/or filling the area swept by the gesture, and/or cutting/copying the area swept by the gesture; in practice, the effect of the gesture operation processing is not limited.
  • corresponding priority-ordered operation processing is performed according to the preset priority; or a special-case determination is made based on the preset priority and, in a special case, the priority order is readjusted; or the operations corresponding to the touch screen pen trigger instruction and the gesture trigger instruction are performed simultaneously.
  • the readjustment may occur within the same scene; for example, the usual preset priority is first come, first served: if the hand touches first, the hand is responded to first, and if the pen touches first, the pen is responded to first, but a special judgment is added, for example, when the contact area of the hand is too large, the hand is not responded to.
  • the priority decision can be made through some recognition; for example, if the contact area of the hand is too large or matches certain characteristics, it can be determined that the user is holding the pen with the palm or fingers resting on the screen while drawing, in which case the hand's contact behavior is automatically ignored.
  • the subsequent processing is not necessarily mutually exclusive: if the two actions do not conflict, the hand action and the pen action can be performed at the same time.
  • the finger's action can also be extended to any other action.
  • for example, a single-finger slide can be defined as an erase behavior, so that after an ink stroke is drawn with the pen, sliding a finger over it erases the ink within the finger's track.
  • the pen's behavior is not necessarily limited to drawing handwriting; it can be defined to draw regular shapes such as rectangles and circles, or the pen can even be used as an eraser.
  • the foregoing preset priority may be that the touch screen pen operation corresponding to the touch screen pen trigger instruction is performed first and then the gesture operation corresponding to the gesture trigger instruction, or the reverse; details are not repeated here.
  • the foregoing gesture track may come from a multi-finger or single-finger operation, which is not specifically limited here.
  • within the same interface, touch screen pen operations and gesture operations are distinguished: when the touch screen pen is used, the pen operation is responded to and the result is a drawn-handwriting effect, and when a finger is used, the gesture operation is responded to, such as scrolling the interface, zooming the interface, selecting text, and so on.
  • the interface can still be operated by gestures while the touch screen pen is in use, such as scrolling and zooming the interface, so that the user can annotate while browsing, helping the user complete the annotation process conveniently and quickly.
  • the drawing behavior is clearly distinguished from other behaviors, which provides more possibilities in actual design; for example, while the user draws ink with the touch screen pen, the page content can also be scrolled with a single-finger swipe.
  • an embodiment of a touch screen track identification apparatus provided in an embodiment of the present application includes:
  • the monitoring unit 301 is configured to monitor a touch operation instruction triggered on the touch screen;
  • the determining unit 302 is configured to determine the type of the touch operation command, if it is a touch screen pen triggering instruction, trigger the touch screen pen processing unit 303, if it is a gesture triggering instruction, trigger the gesture processing unit 304;
  • the touch screen pen processing unit 303 is configured to perform corresponding operation processing according to the captured touch screen pen track;
  • the gesture processing unit 304 is configured to perform corresponding operation processing according to the captured gesture trajectory.
  • the determining unit 302 determines the type of the touch operation instruction; if it is a touch screen pen trigger instruction, the touch screen pen processing unit 303 performs corresponding operation processing according to the captured touch screen pen track;
  • and if it is a gesture trigger instruction, the gesture processing unit 304 performs corresponding operation processing according to the captured gesture track, which resolves the many limitations of touch screen pen and gesture operation in the current touch screen working state, such as the technical problem that confusing pen and gesture operations make actual operation attributes hard to distinguish, rendering operation cumbersome and indirect and degrading the user experience.
  • referring to FIG. 4, another embodiment of the touch screen track recognition apparatus provided in an embodiment of the present application includes:
  • a monitoring unit 401 configured to monitor a touch operation instruction triggered on the touch screen;
  • the monitoring unit 401 specifically includes:
  • the determining subunit 4012 is configured to determine the corresponding touch operation instruction after the trigger mode is automatically matched to the input source; specifically, it automatically matches the trigger mode to the encapsulated API corresponding to the input source and determines the corresponding touch operation instruction according to that API.
  • the determining unit 402 is configured to determine the type of the touch operation instruction, if it is a touch screen pen triggering instruction, trigger the touch screen pen processing unit 403, if it is a gesture triggering instruction, trigger the gesture processing unit 404;
  • the touch screen pen processing unit 403 is configured to perform corresponding operation processing according to the captured touch screen pen track, and the touch screen pen processing unit is specifically configured to perform corresponding touch screen pen operation processing according to the captured touch screen pen track;
  • the touch screen pen operation processing includes: ink drawing processing, and/or laser pen processing, and/or regular shape/object drawing processing, and/or eraser processing.
  • the gesture processing unit 404 is configured to perform corresponding operation processing according to the captured gesture trajectory, and the gesture processing unit 404 is specifically configured to perform corresponding gesture operation processing according to the gesture trajectory captured by the single-finger operation or the multi-finger operation;
  • the gesture operation processing includes: zooming the interface, and/or scrolling the interface, and/or selecting text content, and/or filling the area swept by the gesture, and/or cutting/copying the area swept by the gesture.
  • the processing unit 405 is configured to: when the determining unit 402 determines that the type of the touch operation instruction is simultaneously the touch screen pen trigger instruction and the gesture trigger instruction, perform corresponding priority-ordered operation processing according to the preset priority, or make a special-case determination based on the preset priority and, in a special case, readjust the priority order, or simultaneously perform the operations corresponding to the touch screen pen trigger instruction and the gesture trigger instruction.
  • the determining unit 402 determines the type of the touch operation instruction; if it is a touch screen pen trigger instruction, the touch screen pen processing unit 403 performs corresponding operation processing according to the captured touch screen pen track;
  • and if it is a gesture trigger instruction, the gesture processing unit 404 performs corresponding operation processing according to the captured gesture track, which resolves the many limitations of touch screen pen and gesture operation in the current touch screen working state, such as the technical problem that confusing pen and gesture operations make actual operation attributes hard to distinguish, rendering operation cumbersome and indirect and degrading the user experience.
  • the interface can still be operated by gestures while the touch screen pen is in use, such as scrolling and zooming the interface, so that the user can annotate while browsing, helping the user complete the annotation process conveniently and quickly.
  • an electronic device which may include:
  • a processor, a memory, a communication interface, and a bus
  • the processor, the memory, and the communication interface are connected by the bus and complete communication with each other;
  • the memory stores executable program code
  • the processor runs a program corresponding to the executable program code by reading executable program code stored in the memory for performing a touch screen track recognition method provided by the present application at runtime.
  • the embodiment of the present application further provides a storage medium, where the storage medium is used to store executable program code, and the executable program code is used to execute, at runtime, a touch screen track recognition method provided by the present application.
  • this resolves the current limitations of touch screen pen and gesture operation in the touch screen working state, such as the technical problem that confusing pen and gesture operations make actual operation attributes hard to distinguish, rendering operation cumbersome and indirect and degrading the user experience.
  • the embodiment of the present application further provides an application, where the application is used to execute a touch screen track recognition method provided by the present application at runtime.
  • the description is relatively brief; for relevant parts, reference may be made to the description of the method embodiments.
  • the steps may be implemented by a program instructing the relevant hardware, and the program may be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A touch screen track recognition method and apparatus resolve the many limitations of operating through a touch screen pen and gestures in the current touch screen working state, such as the technical problem that confusing pen and gesture operations make actual operation attributes hard to distinguish, rendering operation cumbersome and indirect and degrading the user experience. The touch screen track recognition method includes: monitoring a touch operation instruction triggered on the touch screen (101); determining the type of the touch operation instruction (102); if it is a touch screen pen trigger instruction, performing corresponding operation processing according to the captured touch screen pen track (103); and if it is a gesture trigger instruction, performing corresponding operation processing according to the captured gesture track (104).

Description

Touch screen track recognition method and apparatus
This application claims priority to Chinese Patent Application No. 201610281833.9, filed with the Chinese Patent Office on April 28, 2016 and entitled "Touch screen track recognition method and apparatus", the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of touch screen operation technologies, and in particular to a touch screen track recognition method and apparatus.
Background
In recent years, mobile devices have become increasingly popular, and the demand for writing and drawing on them keeps growing. The touch screen pen has been widely adopted because it meets this demand well.
Normally the result of a touch screen pen operation is consistent with the result of a gesture operation, but operating a touch screen through both a touch screen pen and gestures has many limitations: confusing pen operations with gesture operations makes it difficult to distinguish the actual operation attributes, so operation becomes cumbersome and indirect, leading to the technical problem of a poor user experience.
Summary
The touch screen track recognition method and apparatus provided by embodiments of the present application resolve the many limitations of operating through a touch screen pen and gestures in the current touch screen working state, such as the technical problem that confusing pen and gesture operations make actual operation attributes hard to distinguish, rendering operation cumbersome and indirect and degrading the user experience.
A touch screen track recognition method provided by an embodiment of the present application includes:
monitoring a touch operation instruction triggered on the touch screen;
determining the type of the touch operation instruction; if it is a touch screen pen trigger instruction, performing corresponding operation processing according to the captured touch screen pen track, and if it is a gesture trigger instruction, performing corresponding operation processing according to the captured gesture track.
Optionally, the monitoring of a touch operation instruction triggered on the touch screen specifically includes:
acquiring, when in the ink state, the trigger mode returned by the touch screen;
determining the corresponding touch operation instruction after the trigger mode is automatically matched to the input source.
Optionally, determining the corresponding touch operation instruction after the trigger mode is automatically matched to the input source specifically includes:
automatically matching the trigger mode to the encapsulated API corresponding to the input source, and determining the corresponding touch operation instruction according to the encapsulated API.
Optionally, performing corresponding operation processing according to the captured gesture track specifically includes:
performing corresponding gesture operation processing according to a gesture track captured through a single-finger or multi-finger operation;
where the gesture operation processing includes: zooming the interface, and/or scrolling the interface, and/or selecting text content, and/or filling the area swept by the gesture, and/or cutting/copying the area swept by the gesture.
Optionally, performing corresponding operation processing according to the captured touch screen pen track specifically includes:
performing corresponding touch screen pen operation processing according to the captured touch screen pen track;
where the touch screen pen operation processing includes: ink drawing processing, and/or laser pen processing, and/or regular shape/object drawing processing, and/or eraser processing.
Optionally, the touch screen track recognition method further includes:
when the type of the touch operation instruction is determined to be simultaneously the touch screen pen trigger instruction and the gesture trigger instruction, performing corresponding priority-ordered operation processing according to a preset priority;
or making a special-case determination based on the preset priority and, in the special case, readjusting the priority order;
or simultaneously performing the operations corresponding to the touch screen pen trigger instruction and the gesture trigger instruction.
A touch screen track recognition apparatus provided by an embodiment of the present application includes:
a monitoring unit configured to monitor a touch operation instruction triggered on the touch screen;
a determining unit configured to determine the type of the touch operation instruction and, if it is a touch screen pen trigger instruction, trigger the touch screen pen processing unit, or, if it is a gesture trigger instruction, trigger the gesture processing unit;
the touch screen pen processing unit, configured to perform corresponding operation processing according to the captured touch screen pen track;
the gesture processing unit, configured to perform corresponding operation processing according to the captured gesture track.
Optionally, the monitoring unit specifically includes:
a returning subunit configured to acquire, when in the ink state, the trigger mode returned by the touch screen;
a determining subunit configured to determine the corresponding touch operation instruction after the trigger mode is automatically matched to the input source.
Optionally, the determining subunit is specifically configured to automatically match the trigger mode to the encapsulated API corresponding to the input source and to determine the corresponding touch operation instruction according to the encapsulated API.
Optionally, the gesture processing unit specifically performs corresponding gesture operation processing according to a gesture track captured through a single-finger or multi-finger operation;
where the gesture operation processing includes: zooming the interface, and/or scrolling the interface, and/or selecting text content, and/or filling the area swept by the gesture, and/or cutting/copying the area swept by the gesture.
Optionally, the touch screen pen processing unit is specifically configured to perform corresponding touch screen pen operation processing according to the captured touch screen pen track;
where the touch screen pen operation processing includes: ink drawing processing, and/or laser pen processing, and/or regular shape/object drawing processing, and/or eraser processing.
Optionally, the touch screen track recognition apparatus further includes:
a simultaneous processing unit configured to: when the determining unit determines that the type of the touch operation instruction is simultaneously the touch screen pen trigger instruction and the gesture trigger instruction, perform corresponding priority-ordered operation processing according to a preset priority, or make a special-case determination based on the preset priority and, in the special case, readjust the priority order, or simultaneously perform the operations corresponding to the touch screen pen trigger instruction and the gesture trigger instruction.
An electronic device provided by an embodiment of the present application includes:
a processor, a memory, a communication interface, and a bus;
the processor, the memory, and the communication interface are connected by the bus and communicate with each other;
the memory stores executable program code;
the processor runs the program corresponding to the executable program code by reading the executable program code stored in the memory, so as to perform the touch screen track recognition method described in the present application.
An embodiment of the present application provides a storage medium, where the storage medium is used to store executable program code, and the executable program code is used to execute, at runtime, the touch screen track recognition method described in the present application.
An embodiment of the present application provides an application program, where the application program is used to execute, at runtime, the touch screen track recognition method described in the present application.
It can be seen from the above technical solutions that the embodiments of the present application have the following advantages:
In the touch screen track recognition method and apparatus provided by the embodiments of the present application, the method includes: monitoring a touch operation instruction triggered on the touch screen, and then determining the type of the touch operation instruction; if it is a touch screen pen trigger instruction, corresponding operation processing is performed according to the captured touch screen pen track, and if it is a gesture trigger instruction, corresponding operation processing is performed according to the captured gesture track. In this embodiment, by determining the type of the touch operation instruction and processing the captured pen track or gesture track accordingly, the method resolves the many limitations of operating through a touch screen pen and gestures in the current touch screen working state, such as the technical problem that confusing pen and gesture operations make actual operation attributes hard to distinguish, rendering operation cumbersome and indirect and degrading the user experience.
Brief Description of the Drawings
To describe the technical solutions of the embodiments of the present application and of the prior art more clearly, the drawings required in the embodiments and the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and those of ordinary skill in the art may obtain other drawings from them without creative effort.
FIG. 1 is a schematic flowchart of an embodiment of a touch screen track recognition method according to an embodiment of the present application;
FIG. 2 is a schematic flowchart of another embodiment of a touch screen track recognition method according to an embodiment of the present application;
FIG. 3 is a schematic structural diagram of an embodiment of a touch screen track recognition apparatus according to an embodiment of the present application;
FIG. 4 is a schematic structural diagram of another embodiment of a touch screen track recognition apparatus according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an application example of FIG. 2.
Detailed Description
To help those skilled in the art better understand the technical solutions in the embodiments of the present application, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the protection scope of the present application.
Referring to FIG. 1, an embodiment of the touch screen track recognition method provided by an embodiment of the present application includes:
101. Monitoring a touch operation instruction triggered on the touch screen.
In this embodiment, when a touch screen pen touch or a gesture touch needs to be identified in the ink state, a touch operation instruction triggered on the touch screen must first be monitored.
102. Determining the type of the touch operation instruction; if it is a touch screen pen trigger instruction, performing step 103, and if it is a gesture trigger instruction, performing step 104.
After a touch operation instruction triggered on the touch screen is detected, the type of the instruction is determined: if it is a touch screen pen trigger instruction, step 103 is performed, and if it is a gesture trigger instruction, step 104 is performed.
103. Performing corresponding operation processing according to the captured touch screen pen track.
When the type of the touch operation instruction is determined to be a touch screen pen trigger instruction, corresponding operation processing is performed according to the captured touch screen pen track.
104. Performing corresponding operation processing according to the captured gesture track.
When the type of the touch operation instruction is determined to be a gesture trigger instruction, corresponding operation processing is performed according to the captured gesture track.
In this embodiment, by determining the type of the touch operation instruction and performing corresponding operation processing according to the captured touch screen pen track for a pen trigger instruction, or according to the captured gesture track for a gesture trigger instruction, the method resolves the many limitations of operating through a touch screen pen and gestures in the current touch screen working state, such as the technical problem that confusing pen and gesture operations make actual operation attributes hard to distinguish, rendering operation cumbersome and indirect and degrading the user experience.
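The two-branch dispatch of steps 102-104 can be sketched as follows. This is an illustrative Python sketch, not the patented implementation; the `TouchEvent` type and the `"pen"`/`"finger"` source labels are assumptions introduced for the example.

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    source: str   # input source reported by the touch screen: "pen" or "finger"
    track: list   # captured trajectory as (x, y) points

def handle(event: TouchEvent) -> str:
    """Dispatch to pen or gesture processing based on the instruction type."""
    if event.source == "pen":
        # touch screen pen trigger instruction -> process the pen track (step 103)
        return f"pen: ink drawn through {len(event.track)} points"
    if event.source == "finger":
        # gesture trigger instruction -> process the gesture track (step 104)
        return f"gesture: {len(event.track)}-point swipe processed"
    return "ignored: unknown input source"
```

The key design point is that the same captured track is interpreted differently depending only on the reported input source, which is what lets ink drawing and interface gestures coexist without a mode switch.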
The above is a detailed description of the touch screen track recognition method; additional processing is described in detail below. Referring to FIG. 2, another embodiment of the touch screen track recognition method provided by an embodiment of the present application includes:
201. Acquiring the trigger mode returned by the touch screen.
In this embodiment, when a touch screen pen touch or a gesture touch needs to be identified, the trigger mode returned by the touch screen is first acquired.
202. Determining the corresponding touch operation instruction after the trigger mode is automatically matched to the input source.
After the trigger mode returned by the touch screen is acquired, the corresponding touch operation instruction is determined by automatically matching the trigger mode to the input source.
The foregoing determination may specifically be: automatically matching the trigger mode to the encapsulated API corresponding to the input source, and determining the corresponding touch operation instruction according to that API. For example, the iOS 9 system has a type attribute that distinguishes whether the input comes from a finger or a touch screen pen. Different input sources yield different system-encapsulated gesture APIs; for example, when a gesture arrives, the UITouch gesture source encapsulates the type. This is well known in the art and is not described in detail here.
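A minimal sketch of the matching just described, loosely modeled on the iOS 9 `UITouch` type attribute (stylus vs. direct finger contact). The registry, keys, and handler names here are hypothetical, not a real system API.

```python
# Hypothetical registry mapping an input-source type to the wrapped handler API,
# in the spirit of iOS 9's UITouch.type distinguishing stylus from finger input.
API_REGISTRY = {
    "stylus": "penTrackHandler",   # touch screen pen trigger instruction
    "direct": "gestureHandler",    # finger contact -> gesture trigger instruction
}

def match_input_source(raw_event: dict) -> str:
    """Match the trigger mode to the encapsulated API for its input source."""
    touch_type = raw_event.get("type", "direct")   # default to finger if unreported
    return API_REGISTRY.get(touch_type, "gestureHandler")
```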
203. Determining the type of the touch operation instruction; if it is a touch screen pen trigger instruction, performing step 204; if it is a gesture trigger instruction, performing step 205; if it is simultaneously a touch screen pen trigger instruction and a gesture trigger instruction, performing step 206.
After the corresponding touch operation instruction is determined by automatically matching the trigger mode to the input source, the type of the instruction must be determined: if it is a touch screen pen trigger instruction, step 204 is performed; if it is a gesture trigger instruction, step 205 is performed; if it is simultaneously both, step 206 is performed.
204. Performing corresponding operation processing according to the captured touch screen pen track.
When the type of the touch operation instruction is determined to be a touch screen pen trigger instruction, corresponding operation processing is performed according to the captured touch screen pen track.
It should be noted that the touch screen pen operation processing includes: ink drawing processing, and/or laser pen processing, and/or regular shape/object drawing processing, and/or eraser processing.
205、根据捕捉到的手势轨迹进行对应的操作处理;
当判断触碰操作指令的类型为手势触发指令,则根据捕捉到的手势轨迹进行对应的操作处理。
需要说明的是,根据捕捉到的手势轨迹进行对应的操作处理具体可以为,根据通过单指操作或多指操作捕捉到的手势轨迹进行对应的手势操作处理。其中,手势操作处理包括:缩放界面和/或滚动界面、和/或选择文本内容、和/或填充手势划过区域、和/或剪切/复制手势划过区域,实际上该手势操作处理的效果不限制。
206. Perform operation processing in the corresponding priority order according to a preset priority, or determine whether a special case applies based on the preset priority and, if it is a special case, readjust the priority order, or simultaneously perform the operations corresponding to the touch screen pen trigger instruction and the gesture trigger instruction.
When the type of the touch operation instruction is judged to be simultaneously a touch screen pen trigger instruction and a gesture trigger instruction, operation processing is performed in the corresponding priority order according to the preset priority; or a special case is determined based on the preset priority and, if it is a special case, the priority order is readjusted; or the operations corresponding to the touch screen pen trigger instruction and the gesture trigger instruction are performed simultaneously.
The aforementioned readjustment may occur within the same scene. For example, the usual preset priority is that whichever input touches the screen first is responded to first: if the hand touches first, the hand is responded to first, and if the pen touches first, the pen is responded to first. A special judgment may then be added, for example, not responding to the hand when its contact area is too large.
It should be noted that when the pen and the hand touch the screen at the same time, priority decisions can be made through certain recognition steps. For example, if the contact area of the hand is too large or matches certain characteristics, it can be determined that the user is holding the pen with the palm or fingers resting on the screen while drawing; in this case, the hand's contact is automatically ignored. When the pen and the hand touch the screen at the same time, the subsequent processing is not necessarily mutually exclusive: if the two actions do not conflict, the hand action and the pen action can be executed simultaneously. The finger action can also be extended to any other action; for example, a single-finger slide can be defined as an erasing action, so that after drawing an ink stroke with the pen, sliding a finger over the ink erases the ink content within the finger's track. Further, the pen's behavior is not necessarily limited to drawing strokes; for example, the pen can be defined to draw regular shapes such as rectangles and circles, or even be used as an eraser.
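The simultaneous-contact handling of step 206 can be sketched as follows. The function names, event fields, and contact-area threshold are assumptions for illustration; the patent leaves the concrete recognition criteria open.

```python
# Illustrative sketch of step 206 (assumed names and threshold): when
# the pen and the hand touch the screen simultaneously, apply the
# preset first-contact priority, with one special-case readjustment:
# if the hand's contact area is too large, the user is presumed to be
# resting a palm on the screen while writing with the pen, and the
# hand's contact is ignored entirely.

PALM_AREA_THRESHOLD = 400.0  # hypothetical contact-area cutoff

def resolve_simultaneous(pen_event, hand_event):
    # Each event is a dict with contact time "t" and contact "area".
    # Special case: a very large hand contact indicates a resting
    # palm, so the hand is not responded to at all.
    if hand_event["area"] > PALM_AREA_THRESHOLD:
        return ["pen"]
    # Preset priority: whichever input touched the screen first is
    # responded to first; both actions may then be executed if they
    # do not conflict with each other.
    ordered = sorted([("pen", pen_event), ("hand", hand_event)],
                     key=lambda item: item[1]["t"])
    return [name for name, _ in ordered]
```

The palm-rejection branch implements the readjustment described above: the preset first-contact rule is overridden only when the special case (an oversized hand contact) is detected.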
It should be noted that the aforementioned preset priority may be to first perform the touch screen pen operation corresponding to the touch screen pen trigger instruction and then perform the gesture operation corresponding to the gesture trigger instruction, or to process them in the reverse order, which is not described in further detail here. The aforementioned gesture track may result from a multi-finger or single-finger operation, which is not specifically limited here.
As shown in Fig. 5, within the same interface, touch screen pen operations and gesture operations are distinguished: when the touch screen pen is used, the pen operation is responded to, and the result is a drawn stroke; when a finger is used, the gesture operation is responded to, for example scrolling the interface, zooming the interface, or selecting text.
In this embodiment, the type of the touch operation instruction is judged; if it is a touch screen pen trigger instruction, corresponding operation processing is performed according to the captured touch screen pen track, and if it is a gesture trigger instruction, corresponding operation processing is performed according to the captured gesture track. This solves the technical problem of poor user experience caused by the many limitations of operating with a touch screen pen and gestures in the working state of a touch screen: the confusing mixture of pen and gesture operations makes it difficult to distinguish the actual intent of an operation, rendering operation cumbersome and indirect. At the same time, by distinguishing touch screen pen operations from gesture operations and giving different operation feedback for the pen and for gestures, the user can complete operation tasks more conveniently and quickly without frequently switching states; distinguishing the pen from gestures separates the two behavioral purposes of writing ink and operating the interface. The effect achieved is that, while annotating with the touch screen pen, the user can still operate the interface with gestures, such as scrolling and zooming, so that the user can browse and annotate at the same time, completing the annotation process conveniently and quickly.
In practical applications, after entering a particular ink state, the user can draw or write directly with the touch screen pen, but an ordinary finger slide also draws ink. Existing technical solutions typically support complex gestures in this case, for example sliding the page with two fingers (a gesture that users find hard to discover and learn), but this approach has major limitations: the most direct single-finger slide cannot scroll the page, and a single finger cannot select document elements.
Further, by distinguishing touch screen pen tracks from gesture tracks, drawing behavior is clearly separated from other behaviors, which provides more possibilities in practical design; for example, while drawing ink with the touch screen pen, the user can also scroll the page content with a single-finger slide.
Referring to Fig. 3, one embodiment of the touch screen track recognition apparatus provided by an embodiment of the present application includes:
a monitoring unit 301, configured to monitor a touch operation instruction triggered on a touch screen;
a judging unit 302, configured to judge the type of the touch operation instruction; if the type is a touch screen pen trigger instruction, a touch screen pen processing unit 303 is triggered, and if the type is a gesture trigger instruction, a gesture processing unit 304 is triggered;
the touch screen pen processing unit 303, configured to perform corresponding operation processing according to the captured touch screen pen track; and
the gesture processing unit 304, configured to perform corresponding operation processing according to the captured gesture track.
In this embodiment, the judging unit 302 judges the type of the touch operation instruction; if it is a touch screen pen trigger instruction, the touch screen pen processing unit 303 performs corresponding operation processing according to the captured touch screen pen track, and if it is a gesture trigger instruction, the gesture processing unit 304 performs corresponding operation processing according to the captured gesture track. This solves the technical problem of poor user experience caused by the many limitations of operating with a touch screen pen and gestures in the working state of a touch screen: the confusing mixture of pen and gesture operations makes it difficult to distinguish the actual intent of an operation, rendering operation cumbersome and indirect.
The units of the touch screen track recognition apparatus are described in detail above; additional units are described in detail below. Referring to Fig. 4, another embodiment of the touch screen track recognition apparatus provided by an embodiment of the present application includes:
a monitoring unit 401, configured to monitor a touch operation instruction triggered on a touch screen;
the monitoring unit 401 specifically including:
a returning subunit 4011, configured to obtain the trigger mode returned by the touch screen; and
a determining subunit 4012, configured to automatically match the trigger mode with an input source and then determine the corresponding touch operation instruction; the determining subunit 4012 is specifically configured to automatically match the trigger mode with an encapsulated API corresponding to the input source, and to determine the corresponding touch operation instruction according to the encapsulated API;
a judging unit 402, configured to judge the type of the touch operation instruction; if the type is a touch screen pen trigger instruction, a touch screen pen processing unit 403 is triggered, and if the type is a gesture trigger instruction, a gesture processing unit 404 is triggered;
the touch screen pen processing unit 403, configured to perform corresponding operation processing according to the captured touch screen pen track; the touch screen pen processing unit is specifically configured to perform corresponding touch screen pen operation processing according to the captured touch screen pen track;
wherein the touch screen pen operation processing includes: ink drawing processing, and/or laser pointer processing, and/or regular image/object drawing processing, and/or eraser processing;
the gesture processing unit 404, configured to perform corresponding operation processing according to the captured gesture track; the gesture processing unit 404 is specifically configured to perform corresponding gesture operation processing according to a gesture track captured through a single-finger operation or a multi-finger operation;
wherein the gesture operation processing includes: zooming the interface, and/or scrolling the interface, and/or selecting text content, and/or filling the area swept by the gesture, and/or cutting/copying the area swept by the gesture; and
a simultaneous processing unit 405, configured to, when the judging unit 402 judges the type of the touch operation instruction to be simultaneously a touch screen pen trigger instruction and a gesture trigger instruction, perform operation processing in the corresponding priority order according to a preset priority, or determine a special case based on the preset priority and, in the special case, readjust the priority order, or simultaneously perform the operations corresponding to the touch screen pen trigger instruction and the gesture trigger instruction.
In this embodiment, the judging unit 402 judges the type of the touch operation instruction; if it is a touch screen pen trigger instruction, the touch screen pen processing unit 403 performs corresponding operation processing according to the captured touch screen pen track, and if it is a gesture trigger instruction, the gesture processing unit 404 performs corresponding operation processing according to the captured gesture track. This solves the technical problem of poor user experience caused by the many limitations of operating with a touch screen pen and gestures in the working state of a touch screen: the confusing mixture of pen and gesture operations makes it difficult to distinguish the actual intent of an operation, rendering operation cumbersome and indirect. At the same time, by distinguishing touch screen pen operations from gesture operations and giving different operation feedback for the pen and for gestures, the user can complete operation tasks more conveniently and quickly without frequently switching states; distinguishing the pen from gestures separates the two behavioral purposes of writing ink and operating the interface. The effect achieved is that, while annotating with the touch screen pen, the user can still operate the interface with gestures, such as scrolling and zooming, so that the user can browse and annotate at the same time, completing the annotation process conveniently and quickly.
Correspondingly, an embodiment of the present application further provides an electronic device, which may include:
a processor, a memory, a communication interface, and a bus;
wherein the processor, the memory, and the communication interface are connected by the bus and communicate with one another;
the memory stores executable program code; and
the processor runs a program corresponding to the executable program code by reading the executable program code stored in the memory, so as to perform, at runtime, the touch screen track recognition method provided by the present application.
In this embodiment, the type of the touch operation instruction is judged; if it is a touch screen pen trigger instruction, corresponding operation processing is performed according to the captured touch screen pen track, and if it is a gesture trigger instruction, corresponding operation processing is performed according to the captured gesture track. This solves the technical problem of poor user experience caused by the many limitations of operating with a touch screen pen and gestures in the working state of a touch screen: the confusing mixture of pen and gesture operations makes it difficult to distinguish the actual intent of an operation, rendering operation cumbersome and indirect.
Correspondingly, an embodiment of the present application further provides a storage medium, wherein the storage medium is configured to store executable program code, and the executable program code, when executed, performs the touch screen track recognition method provided by the present application.
In this embodiment, the type of the touch operation instruction is judged; if it is a touch screen pen trigger instruction, corresponding operation processing is performed according to the captured touch screen pen track, and if it is a gesture trigger instruction, corresponding operation processing is performed according to the captured gesture track. This solves the technical problem of poor user experience caused by the many limitations of operating with a touch screen pen and gestures in the working state of a touch screen: the confusing mixture of pen and gesture operations makes it difficult to distinguish the actual intent of an operation, rendering operation cumbersome and indirect.
Correspondingly, an embodiment of the present application further provides an application program, wherein the application program, when executed, performs the touch screen track recognition method provided by the present application.
In this embodiment, the type of the touch operation instruction is judged; if it is a touch screen pen trigger instruction, corresponding operation processing is performed according to the captured touch screen pen track, and if it is a gesture trigger instruction, corresponding operation processing is performed according to the captured gesture track. This solves the technical problem of poor user experience caused by the many limitations of operating with a touch screen pen and gestures in the working state of a touch screen: the confusing mixture of pen and gesture operations makes it difficult to distinguish the actual intent of an operation, rendering operation cumbersome and indirect.
Since the apparatus/electronic device/storage medium/application program embodiments are substantially similar to the method embodiments, they are described relatively simply; for relevant points, reference may be made to the partial description of the method embodiments.
It should be noted that, herein, relational terms such as first and second are used merely to distinguish one entity or operation from another and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the existence of additional identical elements in the process, method, article, or device that includes the element.
The embodiments in this specification are described in a related manner; for identical or similar parts among the embodiments, reference may be made to one another, and each embodiment focuses on its differences from the other embodiments. In particular, since the apparatus embodiments are substantially similar to the method embodiments, they are described relatively simply; for relevant points, reference may be made to the partial description of the method embodiments.
Those of ordinary skill in the art will understand that all or part of the steps in the above method embodiments may be implemented by a program instructing related hardware; the program may be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk, or an optical disc.
The above descriptions are merely preferred embodiments of the present application and are not intended to limit the protection scope of the present application. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present application shall fall within the protection scope of the present application.

Claims (15)

  1. A touch screen track recognition method, characterized in that the method comprises:
    monitoring a touch operation instruction triggered on a touch screen; and
    judging the type of the touch operation instruction; if the type is a touch screen pen trigger instruction, performing corresponding operation processing according to a captured touch screen pen track, and if the type is a gesture trigger instruction, performing corresponding operation processing according to a captured gesture track.
  2. The touch screen track recognition method according to claim 1, characterized in that monitoring the touch operation instruction triggered on the touch screen specifically comprises:
    obtaining, in an ink state, a trigger mode returned by the touch screen; and
    automatically matching the trigger mode with an input source and then determining the corresponding touch operation instruction.
  3. The touch screen track recognition method according to claim 2, characterized in that automatically matching the trigger mode with the input source and then determining the corresponding touch operation instruction specifically comprises:
    automatically matching the trigger mode with an encapsulated API corresponding to the input source, and determining the corresponding touch operation instruction according to the encapsulated API.
  4. The touch screen track recognition method according to claim 1, characterized in that performing corresponding operation processing according to the captured gesture track specifically comprises:
    performing corresponding gesture operation processing according to a gesture track captured through a single-finger operation or a multi-finger operation;
    wherein the gesture operation processing comprises: zooming an interface, and/or scrolling the interface, and/or selecting text content, and/or filling an area swept by the gesture, and/or cutting/copying the area swept by the gesture.
  5. The touch screen track recognition method according to claim 1, characterized in that performing corresponding operation processing according to the captured touch screen pen track specifically comprises:
    performing corresponding touch screen pen operation processing according to the captured touch screen pen track;
    wherein the touch screen pen operation processing comprises: ink drawing processing, and/or laser pointer processing, and/or regular image/object drawing processing, and/or eraser processing.
  6. The touch screen track recognition method according to any one of claims 1 to 5, characterized in that the touch screen track recognition method further comprises:
    when the type of the touch operation instruction is judged to be simultaneously the touch screen pen trigger instruction and the gesture trigger instruction, performing operation processing in a corresponding priority order according to a preset priority;
    or determining, based on the preset priority, whether a special case applies, and if it is the special case, readjusting the priority order;
    or simultaneously performing the operations corresponding to the touch screen pen trigger instruction and the gesture trigger instruction.
  7. A touch screen track recognition apparatus, characterized in that the apparatus comprises:
    a monitoring unit, configured to monitor a touch operation instruction triggered on a touch screen;
    a judging unit, configured to judge the type of the touch operation instruction; if the type is a touch screen pen trigger instruction, a touch screen pen processing unit is triggered, and if the type is a gesture trigger instruction, a gesture processing unit is triggered;
    the touch screen pen processing unit, configured to perform corresponding operation processing according to a captured touch screen pen track; and
    the gesture processing unit, configured to perform corresponding operation processing according to a captured gesture track.
  8. The touch screen track recognition apparatus according to claim 7, characterized in that the monitoring unit specifically comprises:
    a returning subunit, configured to obtain, in an ink state, a trigger mode returned by the touch screen; and
    a determining subunit, configured to automatically match the trigger mode with an input source and then determine the corresponding touch operation instruction.
  9. The touch screen track recognition apparatus according to claim 8, characterized in that the determining subunit is specifically configured to automatically match the trigger mode with an encapsulated API corresponding to the input source, and to determine the corresponding touch operation instruction according to the encapsulated API.
  10. The touch screen track recognition apparatus according to claim 7, characterized in that the gesture processing unit is specifically configured to perform corresponding gesture operation processing according to a gesture track captured through a single-finger operation or a multi-finger operation;
    wherein the gesture operation processing comprises: zooming an interface, and/or scrolling the interface, and/or selecting text content, and/or filling an area swept by the gesture, and/or cutting/copying the area swept by the gesture.
  11. The touch screen track recognition apparatus according to claim 7, characterized in that the touch screen pen processing unit is specifically configured to perform corresponding touch screen pen operation processing according to the captured touch screen pen track;
    wherein the touch screen pen operation processing comprises: ink drawing processing, and/or laser pointer processing, and/or regular image/object drawing processing, and/or eraser processing.
  12. The touch screen track recognition apparatus according to any one of claims 7 to 11, characterized in that the touch screen track recognition apparatus further comprises:
    a simultaneous processing unit, configured to, when the judging unit judges the type of the touch operation instruction to be simultaneously the touch screen pen trigger instruction and the gesture trigger instruction, perform operation processing in a corresponding priority order according to a preset priority, or determine, based on the preset priority, whether a special case applies and, if it is the special case, readjust the priority order, or simultaneously perform the operations corresponding to the touch screen pen trigger instruction and the gesture trigger instruction.
  13. An electronic device, characterized in that the electronic device comprises:
    a processor, a memory, a communication interface, and a bus;
    wherein the processor, the memory, and the communication interface are connected by the bus and communicate with one another;
    the memory stores executable program code; and
    the processor runs a program corresponding to the executable program code by reading the executable program code stored in the memory, so as to perform the touch screen track recognition method according to any one of claims 1 to 6.
  14. A storage medium, characterized in that the storage medium is configured to store executable program code, and the executable program code, when executed, performs the touch screen track recognition method according to any one of claims 1 to 6.
  15. An application program, characterized in that the application program, when executed, performs the touch screen track recognition method according to any one of claims 1 to 6.
PCT/CN2016/097316 2016-04-28 2016-08-30 Touch screen track recognition method and apparatus WO2017185575A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US16/096,656 US11042290B2 (en) 2016-04-28 2016-08-30 Touch screen track recognition method and apparatus
JP2018556431A JP6847127B2 (ja) 2016-04-28 2016-08-30 タッチスクリーントラック認識方法及び装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610281833.9 2016-04-28
CN201610281833.9A CN107329602A (zh) Touch screen track recognition method and apparatus

Publications (1)

Publication Number Publication Date
WO2017185575A1 true WO2017185575A1 (zh) 2017-11-02

Family

ID=60161829

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/097316 WO2017185575A1 (zh) 2016-04-28 2016-08-30 一种触摸屏轨迹识别方法及装置

Country Status (4)

Country Link
US (1) US11042290B2 (zh)
JP (1) JP6847127B2 (zh)
CN (1) CN107329602A (zh)
WO (1) WO2017185575A1 (zh)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013169843A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for manipulating framed graphical objects
WO2013169842A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for selecting object within a group of objects
JP6031186B2 (ja) 2012-05-09 2016-11-24 Apple Inc. Device, method and graphical user interface for selecting user interface objects
WO2013169865A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
JP6082458B2 (ja) 2012-05-09 2017-02-15 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface
WO2013169849A2 (en) 2012-05-09 2013-11-14 Industries Llc Yknots Device, method, and graphical user interface for displaying user interface objects corresponding to an application
AU2013259630B2 (en) 2012-05-09 2016-07-07 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to gesture
WO2013169854A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
EP2847657B1 (en) 2012-05-09 2016-08-10 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
CN107832003B (zh) * 2012-12-29 2021-01-22 Apple Inc. Method and device for magnifying content, electronic device, and medium
US9632664B2 (en) 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US9639184B2 (en) 2015-03-19 2017-05-02 Apple Inc. Touch input cursor manipulation
US20170045981A1 (en) 2015-08-10 2017-02-16 Apple Inc. Devices and Methods for Processing Touch Inputs Based on Their Intensities
US9830048B2 (en) 2015-06-07 2017-11-28 Apple Inc. Devices and methods for processing touch inputs with instructions in a web page
US9860451B2 (en) 2015-06-07 2018-01-02 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9880735B2 (en) 2015-08-10 2018-01-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10235035B2 (en) 2015-08-10 2019-03-19 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US10719230B2 (en) * 2018-09-27 2020-07-21 Atlassian Pty Ltd Recognition and processing of gestures in a graphical user interface using machine learning
KR20230022766A (ko) * 2021-08-09 2023-02-16 Samsung Electronics Co., Ltd. Electronic device processing input of a stylus pen and operating method thereof
US11922008B2 (en) 2021-08-09 2024-03-05 Samsung Electronics Co., Ltd. Electronic device processing input of stylus pen and method for operating the same

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101882042A (zh) * 2010-06-08 2010-11-10 苏州瀚瑞微电子有限公司 Palm discrimination method for a capacitive touch screen
CN102063236A (zh) * 2010-12-29 2011-05-18 福州锐达数码科技有限公司 Method for implementing touch recognition on a pressure-sensitive electronic whiteboard
CN102707861A (zh) * 2011-03-28 2012-10-03 Lenovo (Beijing) Co., Ltd. Electronic device and display method thereof
CN103207718A (zh) * 2013-03-18 2013-07-17 敦泰科技有限公司 Mutual-capacitance touch screen and touch sensing method thereof
US20140028605A1 (en) * 2012-07-26 2014-01-30 Texas Instruments Incorporated Touch Profiling on Capacitive-Touch Screens
CN104915136A (zh) * 2014-03-14 2015-09-16 LG Electronics Inc. Mobile terminal and control method thereof

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4857385B2 (ja) * 2010-01-12 2012-01-18 Panasonic Corporation Electronic pen system
WO2012099584A1 (en) * 2011-01-19 2012-07-26 Hewlett-Packard Development Company, L.P. Method and system for multimodal and gestural control
KR102063952B1 (ko) * 2012-10-10 2020-01-08 Samsung Electronics Co., Ltd. Multi-display apparatus and multi-display method
TWI514229B (zh) * 2013-11-22 2015-12-21 Elan Microelectronics Corp Graphic editing method and electronic device
JP6235349B2 (ja) 2014-01-16 2017-11-22 Sharp Corporation Display device with touch operation function


Also Published As

Publication number Publication date
JP2019516189A (ja) 2019-06-13
US11042290B2 (en) 2021-06-22
CN107329602A (zh) 2017-11-07
US20200210059A1 (en) 2020-07-02
JP6847127B2 (ja) 2021-03-24

Similar Documents

Publication Publication Date Title
WO2017185575A1 (zh) Touch screen track recognition method and apparatus
US10048748B2 (en) Audio-visual interaction with user devices
US9665276B2 (en) Character deletion during keyboard gesture
CN110058782B (zh) Touch operation method based on an interactive electronic whiteboard and system thereof
JP6427559B6 (ja) Permanent synchronization system for handwriting input
US20220214784A1 (en) Systems and methods for a touchscreen user interface for a collaborative editing tool
US9423908B2 (en) Distinguishing between touch gestures and handwriting
US20140189482A1 (en) Method for manipulating tables on an interactive input system and interactive input system executing the method
US20150077358A1 (en) Electronic device and method of controlling the same
US10031667B2 (en) Terminal device, display control method, and non-transitory computer-readable recording medium
US20130044061A1 (en) Method and apparatus for providing a no-tap zone for touch screen displays
US20160170632A1 (en) Interacting With Application Beneath Transparent Layer
US10162501B2 (en) Terminal device, display control method, and non-transitory computer-readable recording medium
EP2899623A2 (en) Information processing apparatus, information processing method, and program
WO2020118491A1 (zh) Fingerprint-recognition-based interaction method, electronic device and related apparatus
US10613645B2 (en) Mechanism for pen interoperability with pressure sensor design
US10222866B2 (en) Information processing method and electronic device
TWI547863B (zh) Handwriting input recognition method, system and electronic device
US9430702B2 (en) Character input apparatus and method based on handwriting
WO2020124422A1 (zh) Handwriting system control method and handwriting system
EP3101522A1 (en) Information processing device, information processing method, and program
US10001839B2 (en) Gesture recognition of ink strokes
WO2020062121A1 (zh) Page operation method, electronic device and computer-readable storage medium
US20130222278A1 (en) Electronic device and method for setting editing tools of the electronic device
US10474886B2 (en) Motion input system, motion input method and program

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2018556431

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16900084

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 140219)

122 Ep: pct application non-entry in european phase

Ref document number: 16900084

Country of ref document: EP

Kind code of ref document: A1