CN113031817B - Multi-touch gesture recognition method and false touch prevention method - Google Patents


Info

Publication number
CN113031817B
CN113031817B (application CN202110311452.1A)
Authority
CN
China
Prior art keywords
touch
state
gesture recognition
group
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110311452.1A
Other languages
Chinese (zh)
Other versions
CN113031817A (en)
Inventor
李广垒
方田
陈祖涛
Current Assignee
Anhui Baoxin Information Technology Co ltd
Original Assignee
Anhui Baoxin Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Anhui Baoxin Information Technology Co ltd filed Critical Anhui Baoxin Information Technology Co ltd
Priority to CN202110311452.1A priority Critical patent/CN113031817B/en
Publication of CN113031817A publication Critical patent/CN113031817A/en
Application granted granted Critical
Publication of CN113031817B publication Critical patent/CN113031817B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06F3/0414 Digitisers, e.g. for touch screens or touch pads, using force sensing means to determine a position
    • G06F3/04162 Control or interface arrangements for digitisers, for exchanging data with external devices, e.g. smart pens, via the digitiser sensing hardware
    • G06F3/04166 Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
    • G06F3/04186 Touch location disambiguation
    • G06F3/0485 Scrolling or panning
    • G06F3/0486 Drag-and-drop
    • G06F3/04883 Input of commands through traced gestures on a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
    • G06F2203/04104 Multi-touch detection in digitiser, i.e. simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
    • G06F2203/04105 Pressure sensors for measuring the pressure or force exerted on the touch surface without providing the touch position
    • G06F2203/04106 Multi-sensing digitiser, i.e. digitiser using at least two different sensing technologies simultaneously or alternatively
    • G06F2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • G06F2203/04808 Several contacts: gestures triggering a specific function when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a multi-touch gesture recognition method and a false-touch prevention method. The gesture recognition method comprises the following steps: defining a canvas touch management mode and dividing canvas touches into gesture touches and drawing touches; defining the state transition process of a finite state machine FSM in canvas mode; judging the number and type of touches from the touch area and pressure during touch; adding an ID field to each touch point and tracking changes in its touch state; eliminating touch jitter with a movement detection threshold according to the tracked changes in touch state; and grouping touches by the assigned IDs to realize multi-touch gesture recognition. The invention simultaneously supports multiple single-finger drawing operations and erasing operations with three or more fingers, overcoming the limitation that the system can recognize only a single touch operation; moreover, through touch grouping, several users can draw with one finger and erase with three fingers at the same time.

Description

Multi-touch gesture recognition method and false touch prevention method
Technical Field
The invention relates to the field of human-computer interaction, and in particular to a multi-touch gesture recognition method and a false-touch prevention method.
Background
With the advance of the wave of intelligent technology, many industries have gradually entered an era of digital remote work, and drawing software similar to an electronic whiteboard on large display-screen devices is widely applied in related fields, especially in education and teaching, live-broadcast interaction, and teleconferencing. The most common human-computer interaction mode is still the keyboard, mouse, and display, to which a multi-finger touch interaction mode has been added.
However, multi-finger touch interaction in the prior art suffers from many problems. For example, the platform vendors of the various operating systems differ in their development interfaces, so software developers find it difficult to achieve compatibility and uniformity when switching between platforms and operating systems, especially when developing multi-finger touch interaction on domestic operating systems, and this hinders the development of unified software. In addition, existing technical solutions only address multi-finger gesture recognition in the design and implementation of gesture recognition itself; they cannot reliably distinguish a stylus from a finger. Multiple contact points (fingers and stylus) may exist while the system is working, and if these cannot be distinguished, false touches will result.
Disclosure of Invention
In view of the problems in the prior art, the invention provides a multi-touch gesture recognition method and a false-touch prevention method.
In one aspect, the invention provides a multi-touch gesture recognition method comprising steps S1 to S6 as follows:
Step S1: defining a canvas touch management mode and dividing canvas touches into gesture touches and drawing touches.
Further, because the software canvas may be unbounded, the drawing touch state never needs any area outside the current screen, so a canvas coordinate system is adopted. In this mode, a touch operation by a stylus or mouse is defined as a single-point touch operation, while a finger touch operation is classified as single-touch or multi-touch according to the number of touch points. Single touch supports operations such as drawing and note-taking; multi-touch supports operations such as erasing.
Further, when a larger canvas area is needed, the gesture touch state is selected and the canvas is panned and zoomed; accordingly, a screen coordinate system is adopted.
Step S2: defining the state transition process of the finite state machine FSM in canvas mode.
Further, the transitions are as follows: while the program is running, the user may choose a state. If the user does not choose one, the working state of the FSM is initialized to the drawing state after the program starts; the user may then switch, via state transition, from any current working state to a specified one: the selection state, the editing state, or the drawing state. Meanwhile, an external operation such as inserting an object forces a switch to the editing state. When an object is selected in the selection state, the FSM switches automatically to the editing state; in the editing state, a single-point touch outside the BindingBox automatically cancels the editing state and switches to the state of that single-point touch: the drawing state or the eraser state.
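The state transitions above can be sketched as a small finite state machine. The class and state names below are illustrative stand-ins, not identifiers from the patent:

```python
from enum import Enum, auto

class WorkState(Enum):
    DRAWING = auto()    # initial working state if the user chooses nothing
    SELECTION = auto()
    EDITING = auto()
    ERASER = auto()

class CanvasFSM:
    """Illustrative finite state machine for the canvas working modes."""
    def __init__(self):
        self.state = WorkState.DRAWING  # FSM initialized to the drawing state

    def user_select(self, state):
        # the user may switch any current working state to a specified one
        if state in (WorkState.SELECTION, WorkState.EDITING, WorkState.DRAWING):
            self.state = state

    def insert_object(self):
        # an external operation such as inserting an object forces editing
        self.state = WorkState.EDITING

    def object_selected(self):
        # selecting an object in the selection state switches to editing
        if self.state == WorkState.SELECTION:
            self.state = WorkState.EDITING

    def touch_outside_binding_box(self, eraser=False):
        # a single-point touch outside the BindingBox cancels editing and
        # switches to the state of that touch: drawing or eraser
        if self.state == WorkState.EDITING:
            self.state = WorkState.ERASER if eraser else WorkState.DRAWING

fsm = CanvasFSM()
fsm.user_select(WorkState.SELECTION)
fsm.object_selected()            # selection with an object -> EDITING
fsm.touch_outside_binding_box()  # touch outside the BindingBox -> DRAWING
```

In this sketch, `touch_outside_binding_box` stands in for the single-point touch outside the BindingBox; a real implementation would derive the target state from the touch's own type.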
Step S3: judging the number and type of touches from the touch area and pressure during touch.
Further, this is divided into:
3.1: when the touch area and the pressure are both within their thresholds, the touch is recognized as a single-point finger touch;
3.2: when the pressure exceeds its threshold while the touch area remains within its threshold, the touch is judged to be a single-point stylus touch;
3.3: when the touch area exceeds its threshold, the points of higher pressure within the touch area are selected and the number of touch points is determined by computing the pressure at the touch center points; the touches are then treated as a gesture touch, or as a multi-touch under drawing touch.
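A minimal sketch of this area-and-pressure classification. The numeric thresholds are hypothetical placeholders, since the patent does not specify values, and the peak-counting helper is a simplified stand-in for the pressure computation at the touch center points:

```python
# Hypothetical thresholds; the patent does not give numeric values.
AREA_THRESHOLD = 1.5      # cm^2 (placeholder)
PRESSURE_THRESHOLD = 0.6  # normalized pressure (placeholder)

def classify_touch(area, pressure):
    """Classify one raw contact by touch area and pressure (steps 3.1-3.3)."""
    if area <= AREA_THRESHOLD and pressure <= PRESSURE_THRESHOLD:
        return "finger"   # 3.1: single-point finger touch
    if area <= AREA_THRESHOLD:
        return "stylus"   # 3.2: high pressure within a small area -> stylus
    return "multi"        # 3.3: large area -> gesture or multi-touch

def count_touch_points(pressure_profile, peak_threshold=0.5):
    """Rough stand-in for 'computing the pressure at the touch center
    points': count local pressure maxima above peak_threshold in a
    one-dimensional pressure profile across the contact area."""
    peaks = 0
    for i in range(1, len(pressure_profile) - 1):
        left, mid, right = (pressure_profile[i - 1],
                            pressure_profile[i],
                            pressure_profile[i + 1])
        if mid >= peak_threshold and mid > left and mid > right:
            peaks += 1
    return peaks

print(classify_touch(1.0, 0.3))                       # a light, small contact
print(count_touch_points([0.1, 0.6, 0.2, 0.7, 0.1]))  # two pressure peaks
```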
Step S4: adding an ID field to each touch point and tracking changes in its touch state.
Further, because a computer processes input serially, even when several fingers appear to touch the screen simultaneously the computer records the touch events one by one in time order and treats each as a separate single-point touch. Adding an ID field to each touch point is therefore necessary to facilitate later grouping.
Further, the touch state is divided into:
4.1: the Enter state, hovering over the touch graphical interface without pressing or moving;
4.2: the Press state, in which the touch makes contact with the screen within a graphical interface supporting touch processing;
4.3: the Move state, moving within the graphical interface supporting touch processing;
4.4: the Release state, in which, after the touch is released, the mouse or hidden touch cursor remains within the graphical interface supporting touch processing;
4.5: the Leave state, leaving the graphical interface supporting touch processing.
Step S5: eliminating touch jitter with a movement detection threshold according to the tracked changes in touch state.
Further, the specific process is:
S51: the software is configured so that the touch screen periodically scans the screen at a fixed frequency and detects the touch state, preferably 5 times per second;
S52: when the user touches the screen with a finger or stylus, the system continuously receives the touch position and touch state; the state of the first contact with the screen is Press, and the system's default state thereafter is Move;
S53: the touch positions produced by each touch point in successive detection scans are compared; when the later position has not moved beyond the movement detection threshold relative to the earlier position, the touch point is considered not to have moved, and the reported Move state is forcibly changed to the Press state.
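Steps S51 to S53 amount to a debounce filter over successive scans. A sketch, with a hypothetical threshold value since the patent specifies none:

```python
import math

MOVE_THRESHOLD = 4.0  # pixels; hypothetical movement detection threshold
SCAN_HZ = 5           # screen scanned 5 times per second, per S51

def debounce(prev_pos, new_pos, new_state):
    """S53: if a scan reports Move but the point stayed within the movement
    detection threshold of its previous position, force the state back to
    Press, so jitter is not mistaken for a movement operation."""
    dx = new_pos[0] - prev_pos[0]
    dy = new_pos[1] - prev_pos[1]
    if new_state == "Move" and math.hypot(dx, dy) <= MOVE_THRESHOLD:
        return "Press"  # jitter: no real movement occurred
    return new_state

print(debounce((100, 100), (101, 102), "Move"))  # within threshold: jitter
print(debounce((100, 100), (120, 100), "Move"))  # beyond threshold: real move
```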
Step S6: grouping touches by the assigned IDs to realize multi-touch gesture recognition.
Further, the touch grouping steps are:
S61: touch points that enter one after another and produce no movement after jitter elimination are placed in one group;
S62: if the first touch exceeds the movement detection threshold, it is placed in its own group;
S63: if a later-entering touch exceeds the movement detection threshold, it is grouped with the previous group of touches that have not moved.
Still further, the gesture recognition process is:
S64: if a touch group contains only one touch point, the state machine FSM operates in one of the operations under drawing touch;
S65: if a touch group contains exactly two touch points and no other touch points existed beforehand, the two touch points are treated as a drag-and-zoom operation under gesture touch; if a touch group contains exactly two touch points but other touch points existed beforehand, the zoom-drag operation is mutually exclusive with the other operating modes, so this group of operations is ignored;
S66: if a touch group contains three or more touch points, it is treated as a gesture-driven erase operation.
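The grouping rules S61 to S63 and the group-to-gesture mapping S64 to S66 can be sketched as follows. Representing each touch as an `(id, moved)` pair, where `moved` says whether the point exceeded the movement detection threshold after jitter elimination, is an illustrative simplification:

```python
def group_touches(touches):
    """S61-S63: `touches` is a list of (id, moved) pairs in arrival order.
    Stationary touches that arrive one after another share a group (S61);
    a touch that moves seals its group, whether it is the first touch (S62)
    or joins the stationary touches that preceded it (S63)."""
    groups = []
    current = []
    for tid, moved in touches:
        current.append(tid)
        if moved:
            groups.append(current)  # the moving touch finishes this group
            current = []
    if current:
        groups.append(current)      # trailing stationary touches
    return groups

def recognize(group, other_points_existed):
    """S64-S66: map one touch group to an operation."""
    if len(group) == 1:
        return "draw"       # S64: one point -> an operation under drawing touch
    if len(group) == 2:
        # S65: two points are drag-and-zoom only if nothing else came first;
        # otherwise the mutually exclusive zoom-drag group is ignored
        return "ignored" if other_points_existed else "drag_zoom"
    return "erase"          # S66: three or more points -> erase gesture

# the moving touch 2 joins the stationary touches 0 and 1: one group of three
groups = group_touches([(0, False), (1, False), (2, True)])
print(groups, recognize(groups[0], other_points_existed=False))
```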
In another aspect, the invention also protects a multi-touch false-touch prevention method: the gesture recognition method described above is used to recognize multi-touch operations, and false-touch prevention is realized on the basis of accurate recognition.
Further, once any touch point in a group moves, not only is the grouping determined, but even if the group's touch points no longer exceed the movement detection threshold afterwards, the state reported by the detection scan is forcibly modified to the Press state; the gesture is thereby determined, or the working state of the finite state machine FSM is locked, and it does not change until all fingers leave. Once the FSM is locked, any touch that arrives later is also mapped to the operating state in effect when the FSM was locked, and the FSM does not allow the state to change until all fingers leave, thereby realizing multi-touch false-touch prevention.
In another aspect, the invention also protects a multi-touch gesture recognition and false-touch prevention system under a universal platform, based on the gesture recognition method and the false-touch prevention method designed above. First, a universal platform supporting all operating systems is built; a multi-touch gesture recognition and false-touch prevention system under that universal platform is then developed on it.
Specifically, general-purpose software supporting Android, iOS, Windows and domestic operating systems is developed on Microsoft's cross-platform framework Xamarin. .NET Core provides a unified encapsulation of these platforms: the encapsulation supports the capabilities of each platform while discarding the peculiarities unique to each, and .NET Core has the advantage of building modern, scalable, high-performance cross-platform software applications. Where multi-platform universality must be supported, the functions unique to each platform are developed with the extended Xamarin.Essentials, which re-normalizes the low-level interfaces whose code differs across platforms, thereby improving development efficiency.
The beneficial effects of the invention are: 1. multiple single-finger drawing operations and erasing operations with three or more fingers are supported simultaneously, overcoming the limitation that the system can recognize only a single touch operation; 2. through touch grouping, several users can draw with one finger and erase with three fingers at the same time; 3. by making drag-and-zoom an exclusive mode, mutually exclusive with the other working modes, the user experience is not degraded; 4. by building a universal platform supporting each operating system, accurate gesture recognition is achieved without being limited to one operating system.
Drawings
FIG. 1 is a flow chart of the multi-touch gesture recognition method;
FIG. 2 is a state transition flow diagram of the finite state machine FSM.
Detailed Description
The invention will be described in further detail with reference to the drawings and specific embodiments. The embodiments of the invention are presented for purposes of illustration and description and are not intended to be exhaustive or to limit the invention to the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, and to enable others of ordinary skill in the art to understand the invention in its various embodiments with the various modifications suited to the particular use contemplated.
Example 1
As shown in FIG. 1, the multi-touch gesture recognition method includes the following steps S1 to S6:
S1: defining a canvas touch management mode and dividing canvas touches into gesture touches and drawing touches.
In particular, because the software canvas may be unbounded, the drawing touch state never needs any area outside the current screen, so a canvas coordinate system is adopted. In this mode, a touch operation by a stylus or mouse is defined as a single-point touch operation, while a finger touch operation is classified as single-touch or multi-touch according to the number of touch points. Single touch supports operations such as drawing and note-taking; multi-touch supports operations such as erasing.
When a larger canvas area is needed, the gesture touch state is selected and the canvas is panned and zoomed; accordingly, a screen coordinate system is adopted.
S2: defining the state transition process of the finite state machine FSM in canvas mode.
As shown in FIG. 2, the specific transitions are: while the program is running, the user may choose a state. If the user does not choose one, the working state of the FSM is initialized to the drawing state after the program starts; the user may then switch, via state transition, from any current working state to a specified one: the selection state, the editing state, or the drawing state. Meanwhile, an external operation such as inserting an object forces a switch to the editing state. When an object is selected in the selection state, the FSM switches automatically to the editing state; in the editing state, a single-point touch outside the BindingBox automatically cancels the editing state and switches to the state of that single-point touch: the drawing state or the eraser state.
S3: judging the number and type of touches from the touch area and pressure during touch.
This is specifically divided into:
S31: when the touch area and the pressure are both within their thresholds, the touch is recognized as a single-point finger touch;
S32: when the pressure exceeds its threshold while the touch area remains within its threshold, the touch is judged to be a single-point stylus touch;
S33: when the touch area exceeds its threshold, the points of higher pressure within the touch area are selected and the number of touch points is determined by computing the pressure at the touch center points; the touches are then treated as a gesture touch, or as a multi-touch under drawing touch.
S4: adding an ID field to each touch point and tracking changes in its touch state.
Specifically, because a computer processes input serially, even when several fingers appear to touch the screen simultaneously the computer records the touch events one by one in time order and treats each as a separate single-point touch. Adding an ID field to each touch point is therefore necessary to facilitate later grouping.
Specifically, the touch state is divided into:
S41: the Enter state, hovering over the touch graphical interface without pressing or moving;
S42: the Press state, in which the touch makes contact with the screen within a graphical interface supporting touch processing;
S43: the Move state, moving within the graphical interface supporting touch processing;
S44: the Release state, in which, after the touch is released, the mouse or hidden touch cursor remains within the graphical interface supporting touch processing;
S45: the Leave state, leaving the graphical interface supporting touch processing.
S5: eliminating touch jitter with a movement detection threshold according to the tracked changes in touch state.
S51: the software is configured so that the touch screen periodically scans the screen at a fixed frequency and detects the touch state, preferably 5 times per second;
S52: when the user touches the screen with a finger or stylus, the system continuously receives the touch position and touch state; the state of the first contact with the screen is Press, and the system's default state thereafter is Move;
S53: the touch positions produced by each touch point in successive detection scans are compared; when the later position has not moved beyond the movement detection threshold relative to the earlier position, the touch point is considered not to have moved, and the reported Move state is forcibly changed to the Press state.
S6: grouping touches by the assigned IDs to realize multi-touch gesture recognition.
The specific touch grouping steps are:
S61: touch points that enter one after another and produce no movement after jitter elimination are placed in one group;
S62: if the first touch exceeds the movement detection threshold, it is placed in its own group;
S63: if a later-entering touch exceeds the movement detection threshold, it is grouped with the previous group of touches that have not moved.
The specific gesture recognition process is:
S64: if a touch group contains only one touch point, the state machine FSM operates in one of the operations under drawing touch;
S65: if a touch group contains exactly two touch points and no other touch points existed beforehand, the two touch points are treated as a drag-and-zoom operation under gesture touch; if a touch group contains exactly two touch points but other touch points existed beforehand, the zoom-drag operation is mutually exclusive with the other operating modes, so this group of operations is ignored;
S66: if a touch group contains three or more touch points, it is treated as a gesture-driven erase operation.
Example 2
The multi-touch false-touch prevention method first recognizes multi-touch operations using the gesture recognition method described in Embodiment 1, then realizes false-touch prevention on the basis of accurate recognition. The specific process is as follows:
Once any touch point in a group moves, not only is the grouping determined, but even if the group's touch points no longer exceed the movement detection threshold afterwards, the state reported by the detection scan is forcibly modified to the Press state; the gesture is thereby determined, or the working state of the FSM is locked, and it does not change until all fingers leave. Once the FSM is locked, any touch that arrives later is also mapped to the operating state in effect when the FSM was locked, and the FSM does not allow the state to change until all fingers leave, thereby realizing multi-touch false-touch prevention.
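This locking rule can be sketched as follows; the `TouchLock` class and the operation names are illustrative, not taken from the patent:

```python
class TouchLock:
    """Sketch of the false-touch prevention rule: once a group's gesture is
    determined, the FSM's working state is locked, every later touch is
    mapped to the locked operation, and the lock is released only when all
    fingers have left the screen."""
    def __init__(self):
        self.locked_op = None   # operation in effect while the FSM is locked
        self.active = set()     # IDs of touches currently on the screen

    def touch_down(self, tid, op):
        """Register a touch; the first determined gesture locks the FSM,
        and later touches inherit the locked operation."""
        self.active.add(tid)
        if self.locked_op is None:
            self.locked_op = op
        return self.locked_op

    def touch_up(self, tid):
        """Remove a touch; when all fingers have left, release the lock."""
        self.active.discard(tid)
        if not self.active:
            self.locked_op = None

lock = TouchLock()
lock.touch_down(0, "erase")       # erase gesture determined: FSM locked
op = lock.touch_down(1, "draw")   # a stray later touch cannot change the state
lock.touch_up(0)
lock.touch_up(1)                  # all fingers gone: lock released
```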
Example 3
A multi-touch gesture recognition and false-touch prevention system under a universal platform is based on the gesture recognition method designed in Embodiment 1 above and the false-touch prevention method designed in Embodiment 2 above. First, a universal platform supporting all operating systems is built; a multi-touch gesture recognition and false-touch prevention system under that universal platform is then developed on it.
Specifically, general-purpose software supporting Android, iOS, Windows and domestic operating systems is developed on Microsoft's cross-platform framework Xamarin. .NET Core provides a unified encapsulation of these platforms: the encapsulation supports the capabilities of each platform while discarding the peculiarities unique to each, and .NET Core has the advantage of building modern, scalable, high-performance cross-platform software applications. Where multi-platform universality must be supported, the functions unique to each platform are developed with the extended Xamarin.Essentials, which re-normalizes the low-level interfaces whose code differs across platforms, thereby improving development efficiency.
It will be apparent that the described embodiments are only some, but not all, embodiments of the invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort fall within the scope of the present invention.

Claims (8)

1. A multi-touch gesture recognition method, characterized in that accurate recognition of user gestures is realized by designing a finite state machine and touch grouping, the gesture recognition method comprising the following steps:
S1, defining a canvas touch management mode, and dividing canvas touches into gesture touches and drawing touches;
S2, defining the state transition process in a finite state machine FSM in canvas mode;
S3, judging the number of touches and the touch type from the touch area and pressure during touch;
S4, adding an ID field to each touch point, and tracking changes of the touch state;
S5, eliminating touch jitter through a movement detection threshold, according to the tracked changes of the touch state;
S6, performing touch grouping in combination with the assigned IDs, so as to realize multi-touch gesture recognition,
wherein the touch grouping is as follows:
S61, touch points that enter in succession and produce no movement after jitter elimination are divided into one group;
S62, if the first-entering touch exceeds the movement detection threshold, the touch is divided into a group;
S63, if a later-entering touch exceeds the movement detection threshold, the touch is classified into one group with the preceding group of touches that have not moved;
and the gesture recognition process comprises the following steps:
S64, if a group in the touch grouping has only one touch point, the finite state machine FSM performs an operation under drawing touch;
S65, if a group in the touch grouping has only two touch points and no other touch points existed previously, the two touch points are recognized as a drag-zoom operation under gesture touch; if a group in the touch grouping has only two touch points but other touch points existed previously, the zoom-drag operation is mutually exclusive with the other operation modes, so this group of operations is ignored;
S66, if a group in the touch grouping has three or more touch points, the operation is determined to be an erasing operation realized by one gesture.
2. The multi-touch gesture recognition method according to claim 1, wherein the drawing-touch state in step S1 uses a canvas coordinate system, and the gesture-touch state uses a screen coordinate system.
3. The multi-touch gesture recognition method according to claim 1, wherein in step S2, while the program is running, the user can select the state; if the user does not select a state, the working state of the FSM is initialized to the drawing state upon entering the program; the user can then reselect through state transitions, switching the current working state to a specified one of the selection state, the editing state, and the drawing state; meanwhile, the system can be forcibly switched to the editing state through external operations such as inserting an object; when operating in the selection state and an object is selected, it automatically switches to the editing state; when operating in the editing state, a single touch on a portion other than the BindingBox automatically cancels the editing state and simultaneously switches to single-touch operation.
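The transitions in claim 3 can be sketched as a small state machine. The names are hypothetical, and the claim leaves the post-editing target state as "single-touch operation", which this sketch assumes to be the drawing state:

```python
# Illustrative sketch of the state transitions in claim 3 (names hypothetical).
DRAW, SELECT, EDIT = "drawing", "selection", "editing"

class CanvasFSM:
    def __init__(self):
        self.state = DRAW            # default working state on program entry

    def user_select(self, target):
        # The user may switch the current working state to any of the three.
        assert target in (DRAW, SELECT, EDIT)
        self.state = target

    def on_object_selected(self):
        # Selecting an object while in the selection state auto-switches to editing.
        if self.state == SELECT:
            self.state = EDIT

    def on_insert_object(self):
        # External operations such as inserting an object force the editing state.
        self.state = EDIT

    def on_touch_outside_binding_box(self):
        # A single touch outside the BindingBox cancels editing (assumed -> drawing).
        if self.state == EDIT:
            self.state = DRAW
```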
4. The multi-touch gesture recognition method according to claim 1, wherein in step S5, the process of eliminating touch jitter through the movement detection threshold is:
S51, configuring the software so that the touch screen periodically scans the screen at a fixed frequency and detects the touch state;
S52, when the user touches the screen with a finger or a stylus, the system continuously receives the position and state of the touch; the state of the first touch on the screen is Press, and the default state thereafter is Move;
S53, comparing the touch position information generated by each touch point across touch detection scans; when a later position has not moved beyond the movement detection threshold relative to the earlier position, the corresponding touch point is considered not to have performed a movement operation, and the reported Move state is forcibly changed to a Press state.
5. The method according to claim 4, wherein in step S51, the screen is scanned 5 times per second.
6. A multi-touch false-touch prevention method, characterized in that the gesture recognition method of claim 1 is used to recognize multi-touch operations, and false-touch prevention is realized in the accurate recognition process.
7. The multi-touch false-touch prevention method according to claim 6, wherein if a touch point in the same group moves, not only is the grouping determined, but even if subsequent touch points in the group do not exceed the movement detection threshold, the state reported by the detection scan is forcibly modified to the Press state, so as to determine the gesture and lock the working state of the finite state machine FSM, with no change made before all fingers leave; once the finite state machine FSM is locked, touch actions that arrive later also take on the working state in effect when the FSM was locked, and the finite state machine FSM does not allow the state to change unless all fingers leave, thereby realizing multi-touch false-touch prevention.
8. A multi-touch gesture recognition and false-touch prevention system under a general platform, characterized in that it is based on the gesture recognition method according to any one of claims 1-5 and the false-touch prevention method according to any one of claims 6-7; first, general software supporting Android/iOS/Windows and domestic operating systems is developed based on Microsoft's general platform framework Xamarin; then, each platform is uniformly packaged based on .NET Core to build a general platform, and the multi-touch gesture recognition and false-touch prevention system under the general platform is developed based on that platform.
CN202110311452.1A 2021-03-19 2021-03-19 Multi-touch gesture recognition method and false touch prevention method Active CN113031817B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110311452.1A CN113031817B (en) 2021-03-19 2021-03-19 Multi-touch gesture recognition method and false touch prevention method


Publications (2)

Publication Number Publication Date
CN113031817A CN113031817A (en) 2021-06-25
CN113031817B true CN113031817B (en) 2024-01-19

Family

ID=76473154

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110311452.1A Active CN113031817B (en) 2021-03-19 2021-03-19 Multi-touch gesture recognition method and false touch prevention method

Country Status (1)

Country Link
CN (1) CN113031817B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114559886B (en) * 2022-02-26 2024-03-29 东莞市台铃车业有限公司 Anti-false touch control method, system, device, electronic equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102880344A (en) * 2012-09-13 2013-01-16 广东威创视讯科技股份有限公司 Multi-touch-point identification method
CN105511794A (en) * 2015-12-14 2016-04-20 中国电子科技集团公司第十五研究所 Plotting system supporting multi-point touch gesture operation and method of system
CN106293434A (en) * 2015-05-28 2017-01-04 惠州市德赛西威汽车电子股份有限公司 The multi-point gesture identification method of vehicular touch screen terminal and device
KR101777961B1 (en) * 2016-04-18 2017-09-13 주식회사 한글과컴퓨터 Method and system for recognizing multi touch gesture
CN107562366A (en) * 2017-09-28 2018-01-09 珠海普林芯驰科技有限公司 Gesture identification method, computer installation and computer-readable recording medium


Also Published As

Publication number Publication date
CN113031817A (en) 2021-06-25

Similar Documents

Publication Publication Date Title
US9996176B2 (en) Multi-touch uses, gestures, and implementation
CN110058782B (en) Touch operation method and system based on interactive electronic whiteboard
CN102096548B (en) Touch-sensitive display is adopted to copy the method and system of object
US8810509B2 (en) Interfacing with a computing application using a multi-digit sensor
JP4800060B2 (en) Method for operating graphical user interface and graphical user interface device
US20050052427A1 (en) Hand gesture interaction with touch surface
US20090090567A1 (en) Gesture determination apparatus and method
CN111475097B (en) Handwriting selection method and device, computer equipment and storage medium
US20140189482A1 (en) Method for manipulating tables on an interactive input system and interactive input system executing the method
CN102768595B (en) A kind of method and device identifying touch control operation instruction on touch-screen
CN103186345A (en) Text segment selecting method and field selecting method, device and terminal
US9773329B2 (en) Interaction with a graph for device control
US8462113B2 (en) Method for executing mouse function of electronic device and electronic device thereof
US20150363037A1 (en) Control method of touch panel
CN104077066A (en) Portable device and operation method
CN103389876A (en) Function switching method based on touch display equipment and touch display equipment
US20140298275A1 (en) Method for recognizing input gestures
CN113031817B (en) Multi-touch gesture recognition method and false touch prevention method
CN103150111A (en) Symbol inputting method, device and terminal
CN112698735A (en) Information input method and device and electronic equipment
CN109739422B (en) Window control method, device and equipment
KR102491207B1 (en) Apparatus and method for multi-touch recognition
CN104407763A (en) Content input method and system
US10133346B2 (en) Gaze based prediction device and method
JP2014048894A (en) Display control device and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant