CN113093972A - Gesture recognition method and system, storage medium and touch screen device

Gesture recognition method and system, storage medium and touch screen device

Info

Publication number
CN113093972A
CN113093972A (application number CN201911338584.2A)
Authority
CN
China
Prior art keywords
gesture
contact
touch screen
angle change
gesture operation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911338584.2A
Other languages
Chinese (zh)
Inventor
闫俊超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chipone Technology Beijing Co Ltd
Original Assignee
Chipone Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chipone Technology Beijing Co Ltd filed Critical Chipone Technology Beijing Co Ltd
Priority to CN201911338584.2A priority Critical patent/CN113093972A/en
Publication of CN113093972A publication Critical patent/CN113093972A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Character Discrimination (AREA)

Abstract

The invention provides a gesture recognition method, a gesture recognition system, a storage medium and a touch screen device. The method is applied to the touch screen device and comprises the following steps: when the touch screen device receives a gesture operation, acquiring contact coordinate information in each frame of gesture data; for gesture data sampled every preset number of frames, calculating an angle change value of the current-frame contact relative to the previous-frame contact, and determining the direction of the contact connecting line of the current frame and the previous frame according to the angle change value; and taking all direction changes corresponding to the gesture operation as a direction code feature, and judging the character corresponding to the gesture operation according to the direction code feature. The gesture recognition method, system, storage medium and touch screen device accurately recognize the gesture according to the angle change characteristics in the gesture, and have strong practicability.

Description

Gesture recognition method and system, storage medium and touch screen device
Technical Field
The invention relates to the technical field of gesture recognition, in particular to a gesture recognition method, a gesture recognition system, a storage medium and touch screen equipment.
Background
In the prior art, gesture recognition algorithms based on sensor data are mainly realized by matching direction codes. The direction code is calculated by determining the direction of the scribe line between the current contact and a historical contact, where the direction is divided into four directions: up, down, left and right. During gesture recognition, the current action is judged according to the change of the scribing direction and the scribing angle, and this serves as an important basis for gesture character recognition. Specifically, the existing gesture recognition algorithm mainly includes the following steps:
(1) Judging the scribing direction (up, down, left or right) according to the coordinate positions of the current contact and the historical contact.
(2) Acquiring all scribing directions collected during the gesture, and taking all of them as the direction code feature of the current gesture.
(3) Completing gesture recognition by matching the direction code feature against a preset direction library.
However, the gesture recognition algorithm has the following disadvantages:
1) The direction division is not fine enough: only the four directions up, down, left and right can be distinguished, so the analysis of direction changes is coarse. As shown in fig. 1, contacts whose scribing angle variation Δθ is less than 45° all belong to the same direction.
2) Because the direction division is not fine, the direction code spaces of some characters overlap. As shown in fig. 2, for the lower-case letter l, the upper-case letter L and the less-than sign <, the direction codes are all recognized as first down and then right, so their direction code spaces coincide.
Disclosure of Invention
In view of the foregoing disadvantages of the prior art, an object of the present invention is to provide a gesture recognition method, a system, a storage medium, and a touch screen device, which implement accurate recognition of a gesture according to an angle variation characteristic in the gesture, and have strong practicability.
In order to achieve the above and other related objects, the present invention provides a gesture recognition method applied to a touch screen device, including the following steps: when the touch screen device receives a gesture operation, acquiring contact coordinate information in each frame of gesture data; for gesture data sampled every preset number of frames, calculating an angle change value of the current-frame contact relative to the previous-frame contact, and determining the direction of the contact connecting line of the current frame and the previous frame according to the angle change value; and taking all direction changes corresponding to the gesture operation as the direction code feature, and judging the character corresponding to the gesture operation according to the direction code feature.
In an embodiment of the present invention, the point at which the gesture operation presses down on the touch screen device is taken as the starting point of the gesture operation, and the point at which the gesture operation is lifted from the touch screen device is taken as the end point of the gesture operation.
In an embodiment of the present invention, calculating the angle change value of the touch point of the current frame compared to the touch point of the previous frame includes the following steps:
calculating a characteristic distance (Δx, Δy) between the current-frame contact and the previous-frame contact, wherein Δx = x_i – x_(i-1), Δy = y_i – y_(i-1), (x_i, y_i) is the coordinate information of the current-frame contact, and (x_(i-1), y_(i-1)) is the coordinate information of the previous-frame contact;
calculating the angle change value Δθ = arctan(Δy / Δx).
In an embodiment of the present invention, determining the direction of the contact connecting line between the current frame and the previous frame according to the angle variation value includes the following steps:
dividing the angle change range into 8 intervals, wherein each interval corresponds to one direction;
and taking the direction corresponding to the interval to which the angle change value belongs as the direction of the contact connection line.
In an embodiment of the present invention, the eight intervals are [0, π/8), [π/8, π/4), [π/4, 3π/8), [3π/8, π/2), [π/2, 5π/8), [5π/8, 3π/4), [3π/4, 7π/8) and [7π/8, π], and the corresponding directions are respectively up, up-left, left, down-left, down, down-right, right and up-right.
In an embodiment of the present invention, when all direction changes corresponding to the gesture operation are taken as the direction code feature, the direction code feature is not updated when the directions of two consecutive contact connecting lines are the same; when the directions of two consecutive contact connecting lines differ, a new direction is added to the direction code feature.
In an embodiment of the present invention, determining the character corresponding to the gesture operation according to the direction code feature includes the following steps:
constructing a direction code feature library of characters;
and searching for the direction code feature in the direction code feature library, and judging that the character corresponding to the matched direction code feature is the character corresponding to the gesture operation.
Correspondingly, the invention provides a gesture recognition system which is applied to touch screen equipment and comprises an acquisition module, a calculation module and a recognition module;
the acquisition module is used for acquiring contact coordinate information in each frame of gesture data when the touch screen equipment receives gesture operation;
the calculation module is used for calculating, for gesture data sampled every preset number of frames, an angle change value of the current-frame contact relative to the previous-frame contact, and determining the direction of the contact connecting line of the current frame and the previous frame according to the angle change value;
the recognition module is used for taking all direction changes corresponding to the gesture operation as direction code features, and judging characters corresponding to the gesture operation according to the direction code features.
The present invention provides a storage medium having stored thereon a computer program which, when executed by a processor, implements the gesture recognition method described above.
Finally, the present invention provides a touch screen device comprising: a processor and a memory;
the memory is used for storing a computer program;
the processor is configured to execute the computer program stored in the memory, so as to enable the touch screen device to execute the gesture recognition method.
As described above, the gesture recognition method, system, storage medium, and touch screen device of the present invention have the following beneficial effects:
(1) the gesture can be accurately recognized according to the angle change characteristics in the gesture;
(2) the scribing direction is refined from the four directions of up, down, left and right into eight directions (up, down, left, right, up-left, down-left, up-right and down-right), which improves the accuracy of gesture recognition; moreover, because the number of feature vectors increases, well-performing algorithms such as similarity matching can be adopted, giving higher tolerance to stroke errors or character overlap;
(3) the problem of misrecognition caused by overlapping direction code spaces, which arises when slight angle changes cannot be distinguished, is solved.
Drawings
FIG. 1 is a diagram illustrating direction code identification in one embodiment of the prior art;
FIG. 2 is a diagram illustrating direction code identification in another embodiment of the prior art;
FIG. 3 is a flowchart illustrating a gesture recognition method according to an embodiment of the present invention;
FIG. 4 is a schematic diagram illustrating angle variation values according to an embodiment of the present invention;
FIG. 5 is a flowchart illustrating a gesture recognition method according to another embodiment of the present invention;
FIG. 6 is a schematic diagram illustrating a gesture recognition system according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of an apparatus according to an embodiment of the present invention.
Description of the element reference numerals
61 acquisition Module
62 calculation module
63 identification module
71 processor
72 memory
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention.
It should be noted that the drawings provided in the present embodiment are only for illustrating the basic idea of the present invention, and the components related to the present invention are only shown in the drawings rather than drawn according to the number, shape and size of the components in actual implementation, and the type, quantity and proportion of the components in actual implementation may be changed freely, and the layout of the components may be more complicated.
The gesture recognition method, system, storage medium and touch screen device of the present invention accurately recognize gestures according to the angle change characteristics collected during the gesture, effectively solve the misrecognition caused by overlapping direction code spaces when slight angle changes cannot be distinguished, and have strong practicability.
The gesture recognition method is applied to touch screen equipment. The touch screen device is a device including a touch screen. The touch screen is used as an input device, can simply, conveniently and naturally realize human-computer interaction, and is mainly applied to inquiry of public information, industrial control, military command, electronic games, multimedia teaching and the like. In an embodiment of the present invention, the Touch screen device includes one or more combinations of a smart phone, a tablet computer, an industrial personal computer, and a Touch and Display Driver Integration (TDDI) device.
As shown in fig. 3, in an embodiment, the gesture recognition method of the present invention includes the following steps:
and step S1, when the touch screen device receives gesture operation, acquiring touch point coordinate information in each frame of gesture data.
Specifically, the user touches the touch screen device with a specific gesture to input a corresponding character. A complete gesture operation contains a starting point and an end point. In the invention, the contact collected when the gesture operation presses down on the touch screen device is taken as the starting point of the gesture operation, and the contact collected just before the gesture operation is lifted from the touch screen device is taken as the end point of the gesture operation. During a complete gesture operation, gesture data are acquired on the touch screen device frame by frame, and the coordinate information (x, y) of the touch point in each frame of gesture data is collected in a Cartesian coordinate system.
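As a non-limiting illustrative sketch of this acquisition step, the per-frame contact coordinates between the press-down point and the lift-off point can be buffered as follows; the class and method names are assumptions introduced only for illustration and do not appear in the original disclosure.

from typing import List, Tuple

class GestureCapture:
    """Buffers one contact coordinate (x, y) per frame between press-down and lift-off."""

    def __init__(self) -> None:
        self.points: List[Tuple[float, float]] = []
        self.active = False

    def on_press(self, x: float, y: float) -> None:
        # The contact sampled at press-down is the starting point of the gesture operation.
        self.points = [(x, y)]
        self.active = True

    def on_frame(self, x: float, y: float) -> None:
        # One contact coordinate is collected per frame of gesture data in Cartesian coordinates.
        if self.active:
            self.points.append((x, y))

    def on_release(self) -> List[Tuple[float, float]]:
        # The last contact collected before lift-off is the end point of the gesture operation.
        self.active = False
        return self.points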
Step S2: for gesture data sampled every preset number of frames, calculating an angle change value of the current-frame contact relative to the previous-frame contact, and determining the direction of the contact connecting line of the current frame and the previous frame according to the angle change value.
In the invention, gesture recognition can be performed according to the angle change of the contact connecting line in every frame of gesture data. However, because the angle change of the contact connecting line between adjacent frames is usually small, the invention can instead evaluate the angle change only for gesture data sampled every preset number of frames; this does not affect the accuracy of gesture recognition and effectively reduces the computational complexity.
In an embodiment of the present invention, calculating the angle change value of the touch point of the current frame compared to the touch point of the previous frame includes the following steps:
a) calculating the characteristic distance (Δ x, Δ y) between the current frame contact and the previous frame contact.
Wherein Δx = x_i – x_(i-1), Δy = y_i – y_(i-1), (x_i, y_i) is the coordinate information of the current-frame contact, and (x_(i-1), y_(i-1)) is the coordinate information of the previous-frame contact, where i is a natural number greater than or equal to 1.
b) As shown in fig. 4, the angle change value Δθ of the current-frame contact relative to the previous-frame contact is calculated.
As can be seen from the figure, Δθ = arctan(Δy / Δx).
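A minimal sketch of this calculation, assuming Cartesian contact coordinates; atan2 is used here as one possible realization of the arctangent above so that the signs of Δx and Δy are preserved (the function name is illustrative only).

import math
from typing import Tuple

def angle_change(prev: Tuple[float, float], curr: Tuple[float, float]) -> float:
    """Angle of the line connecting the previous-frame contact to the current-frame contact."""
    dx = curr[0] - prev[0]   # characteristic distance along x: x_i - x_(i-1)
    dy = curr[1] - prev[1]   # characteristic distance along y: y_i - y_(i-1)
    return math.atan2(dy, dx)  # result in (-pi, pi]; reduces to arctan(dy/dx) when dx > 0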
in an embodiment of the present invention, determining the direction of the contact connecting line between the current frame and the previous frame according to the angle variation value includes the following steps:
A) the angle variation range is divided into 8 intervals, and each interval corresponds to one direction.
Specifically, the angle change range of the angle change value is divided into 8 intervals corresponding to 8 different directions, i.e., up, down, left, right, up-left, down-left, up-right and down-right. That is, the number of directions identified by the present invention is extended to 8.
In an embodiment of the present invention, the eight intervals are [0, π/8), [π/8, π/4), [π/4, 3π/8), [3π/8, π/2), [π/2, 5π/8), [5π/8, 3π/4), [3π/4, 7π/8) and [7π/8, π], and the corresponding directions are respectively up, up-left, left, down-left, down, down-right, right and up-right. It should be noted that the above correspondence between intervals and directions is only an example; in actual use, the correspondence between angle intervals and directions can be determined according to the specific application scenario.
B) And taking the direction corresponding to the interval to which the angle change value belongs as the direction of the contact connection line.
Specifically, the angle change interval to which the angle change value belongs is determined, and the direction corresponding to that interval is taken as the direction of the contact connecting line between the current frame and the previous frame.
It should be noted that, for the gesture data of the first frame and the gesture data of the last frame, no processing is required, that is, the direction of the contact connecting line with the previous frame does not need to be calculated.
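One way to realize the interval-to-direction mapping is sketched below. As an assumption for illustration, the full circle (-π, π] returned by atan2 is divided into eight sectors of π/4, one per direction; the boundaries can be adapted to the example intervals listed above, and on touch screens whose y axis grows downward the up/down labels may need to be swapped.

import math
from typing import List

# Eight directions in counter-clockwise order starting from "right"; the labels are illustrative.
DIRECTIONS: List[str] = ["right", "up-right", "up", "up-left",
                         "left", "down-left", "down", "down-right"]

def direction_of(theta: float) -> str:
    """Map an angle in (-pi, pi] to one of the eight directions (sectors of pi/4)."""
    # Shift by pi/8 so that each sector is centred on its direction, then take the sector index.
    sector = int(((theta + math.pi / 8) % (2 * math.pi)) // (math.pi / 4))
    return DIRECTIONS[sector]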
Step S3: taking all direction changes corresponding to the gesture operation as the direction code feature, and judging the character corresponding to the gesture operation according to the direction code feature.
Specifically, all direction changes obtained during the gesture operation are used as the direction code feature of the gesture operation. When the directions of two consecutive contact connecting lines are the same, the direction code feature is not updated; when the directions of two consecutive contact connecting lines differ, a new direction is added to the direction code feature. For example, when the contact connecting line directions obtained in sequence are up, up, right, right, down-left, ..., the resulting direction code feature is up, right, down-left, ...
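A sketch of accumulating the direction code feature as described above, in which consecutive identical directions are collapsed (the function name is an illustrative assumption):

from typing import List

def build_direction_code(directions: List[str]) -> List[str]:
    """Append a direction to the feature only when it differs from the previous one."""
    code: List[str] = []
    for d in directions:
        if not code or code[-1] != d:
            code.append(d)
    return code

# Example: build_direction_code(["up", "up", "right", "right", "down-left"])
# returns ["up", "right", "down-left"], matching the example above.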
In an embodiment of the present invention, determining the character corresponding to the gesture operation according to the direction code feature includes the following steps:
31) Constructing a direction code feature library of characters.
Specifically, existing characters are tested, their direction code features are analyzed, and the results are stored as a direction code feature library, denoted A.
32) Searching for the direction code feature in the direction code feature library, and judging that the character corresponding to the matched direction code feature is the character corresponding to the gesture operation.
Specifically, let the direction code feature be x, and search for the direction code feature x in the direction code feature library A. If the direction code feature of some character in the library A is x (namely x ∈ A), the recognition succeeds, and the character corresponding to the gesture operation is judged to be that character; if no character in the library A has the direction code feature x (namely x ∉ A), the recognition fails, and the character corresponding to the gesture operation is judged to be an invalid character. For example, if the direction code feature x is (right, down-right) and the character corresponding to (right, down-right) in the direction code feature library is "lower", the matching operation in the direction code feature library determines that the character corresponding to the gesture operation is "lower".
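The lookup against the direction code feature library A may be sketched as an exact match over a dictionary keyed by direction tuples; the library contents below reuse the (right, down-right) → "lower" example from this description and are otherwise hypothetical.

from typing import Dict, List, Optional, Tuple

# Hypothetical direction code feature library A: direction code feature -> character.
FEATURE_LIBRARY: Dict[Tuple[str, ...], str] = {
    ("right", "down-right"): "lower",
}

def recognize(code: List[str], library: Dict[Tuple[str, ...], str]) -> Optional[str]:
    """Return the matched character, or None when the gesture is an invalid character."""
    return library.get(tuple(code))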
The gesture recognition method of the present invention is further described below by specific embodiments.
As shown in fig. 5, when the touch screen device is touched by a gesture, the touch point coordinate data in each frame of gesture data is collected. The touch point at the moment the gesture presses down is taken as the starting point (x_1, y_1), and the touch point at the moment the gesture is lifted is taken as the end point (x_n, y_n). For the current-frame contact coordinates (x_i, y_i), the characteristic distance from the previous-frame contact (x_(i-1), y_(i-1)) is first calculated, the angle change θ_1 between the current-frame contact and the previous-frame contact is then calculated, and finally the direction of the line connecting the current-frame contact and the previous-frame contact is determined according to the angle change θ_1. Processing proceeds frame by frame, the direction changes of the scribing process are recorded in sequence and stored as a direction code feature containing the direction combination. Finally, the direction code feature is recognized against a pre-stored direction code feature library to obtain the character corresponding to the gesture operation.
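Pulling the flow of fig. 5 together, a minimal end-to-end sketch is given below; it reuses the illustrative helpers angle_change, direction_of, build_direction_code and recognize sketched earlier in this description, and the frame-interval sampling parameter is an assumption.

from typing import Dict, List, Optional, Tuple

def recognize_gesture(points: List[Tuple[float, float]],
                      library: Dict[Tuple[str, ...], str],
                      frame_interval: int = 1) -> Optional[str]:
    """points holds one contact (x, y) per frame from press-down to lift-off."""
    # Optionally evaluate only every preset number of frames to reduce computation.
    sampled = points[::frame_interval] if frame_interval > 1 else points
    # Direction of the contact connecting line for each pair of consecutive (sampled) frames.
    directions = [direction_of(angle_change(prev, curr))
                  for prev, curr in zip(sampled, sampled[1:])]
    # Collapse repeated directions into the direction code feature and match it in the library.
    return recognize(build_direction_code(directions), library)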
As shown in fig. 6, in an embodiment, the gesture recognition system of the present invention is applied to a touch screen device, and includes an obtaining module 61, a calculating module 62, and a recognition module 63.
The obtaining module 61 is configured to obtain the contact coordinate information in each frame of gesture data when the touch screen device receives a gesture operation.
Specifically, the user touches the touch screen device with a specific gesture to input a corresponding character. A complete gesture operation contains a starting point and an end point. In the invention, the contact collected when the gesture operation presses down on the touch screen device is taken as the starting point of the gesture operation, and the contact collected just before the gesture operation is lifted from the touch screen device is taken as the end point of the gesture operation. During a complete gesture operation, gesture data are acquired on the touch screen device frame by frame, and the coordinate information (x, y) of the touch point in each frame of gesture data is collected in a Cartesian coordinate system.
The calculating module 62 is connected to the obtaining module 61, and configured to calculate, for gesture data of every preset number of frames, an angle change value of a current frame contact compared with a previous frame contact, and determine, according to the angle change value, a direction of a connection line between the current frame contact and the previous frame contact.
In the invention, gesture recognition can be performed according to the angle change of the contact connecting line in every frame of gesture data. However, because the angle change of the contact connecting line between adjacent frames is usually small, the invention can instead evaluate the angle change only for gesture data sampled every preset number of frames; this does not affect the accuracy of gesture recognition and effectively reduces the computational complexity.
In an embodiment of the present invention, calculating the angle change value of the touch point of the current frame compared to the touch point of the previous frame includes the following steps:
a) calculating the characteristic distance (Δ x, Δ y) between the current frame contact and the previous frame contact.
Wherein Δx = x_i – x_(i-1), Δy = y_i – y_(i-1), (x_i, y_i) is the coordinate information of the current-frame contact, and (x_(i-1), y_(i-1)) is the coordinate information of the previous-frame contact, where i is a natural number greater than or equal to 1.
b) As shown in fig. 4, the angle change value Δθ of the current-frame contact relative to the previous-frame contact is calculated.
As can be seen from the figure, Δθ = arctan(Δy / Δx).
in an embodiment of the present invention, determining the direction of the contact connecting line between the current frame and the previous frame according to the angle variation value includes the following steps:
A) the angle variation range is divided into 8 intervals, and each interval corresponds to one direction.
Specifically, the angle change range of the angle change value is divided into 8 intervals corresponding to 8 different directions, i.e., up, down, left, right, up-left, down-left, up-right and down-right. That is, the number of directions identified by the present invention is extended to 8.
In an embodiment of the present invention, the eight intervals are [0, π/8), [π/8, π/4), [π/4, 3π/8), [3π/8, π/2), [π/2, 5π/8), [5π/8, 3π/4), [3π/4, 7π/8) and [7π/8, π], and the corresponding directions are respectively up, up-left, left, down-left, down, down-right, right and up-right. It should be noted that the above correspondence between intervals and directions is only an example; in actual use, the correspondence between angle intervals and directions can be determined according to the specific application scenario.
B) And taking the direction corresponding to the interval to which the angle change value belongs as the direction of the contact connection line.
Specifically, the angle change interval to which the angle change value belongs is determined, and the direction corresponding to that interval is taken as the direction of the contact connecting line between the current frame and the previous frame.
It should be noted that, for the gesture data of the first frame and the gesture data of the last frame, no processing is required, that is, the direction of the contact connecting line with the previous frame does not need to be calculated.
The recognition module 63 is connected to the calculation module 62, and configured to use all direction changes corresponding to the gesture operation as direction code features, and determine a character corresponding to the gesture operation according to the direction code features.
Specifically, all direction changes obtained during the gesture operation are used as the direction code feature of the gesture operation. When the directions of two consecutive contact connecting lines are the same, the direction code feature is not updated; when the directions of two consecutive contact connecting lines differ, a new direction is added to the direction code feature. For example, when the contact connecting line directions obtained in sequence are up, up, right, right, down-left, ..., the resulting direction code feature is up, right, down-left, ...
In an embodiment of the present invention, determining the character corresponding to the gesture operation according to the direction code feature includes the following steps:
31) Constructing a direction code feature library of characters.
Specifically, existing characters are tested, their direction code features are analyzed, and the results are stored as a direction code feature library, denoted A.
32) Searching for the direction code feature in the direction code feature library, and judging that the character corresponding to the matched direction code feature is the character corresponding to the gesture operation.
Specifically, let the direction code feature be x, and search for the direction code feature x in the direction code feature library A. If the direction code feature of some character in the library A is x (namely x ∈ A), the recognition succeeds, and the character corresponding to the gesture operation is judged to be that character; if no character in the library A has the direction code feature x (namely x ∉ A), the recognition fails, and the character corresponding to the gesture operation is judged to be an invalid character. For example, if the direction code feature x is (right, down-right) and the character corresponding to (right, down-right) in the direction code feature library is "lower", the matching operation in the direction code feature library determines that the character corresponding to the gesture operation is "lower".
It should be noted that the division of the modules of the above apparatus is only a logical division; in actual implementation, the modules may be wholly or partially integrated into one physical entity or physically separated. The modules may all be implemented in the form of software invoked by a processing element, or entirely in hardware; alternatively, some modules may be implemented as software invoked by the processing element and others in hardware. For example, the x module may be a separately arranged processing element, or may be integrated in a chip of the apparatus, or may be stored in the memory of the apparatus in the form of program code and invoked by a processing element of the apparatus to execute the function of the x module. The other modules are implemented similarly. In addition, all or some of the modules may be integrated together or implemented independently. The processing element described here may be an integrated circuit with signal processing capability. In implementation, each step of the above method, or each of the above modules, may be implemented by an integrated logic circuit of hardware in the processor element or by instructions in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above method, such as one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs). For another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or another processor capable of calling program code. For another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
The storage medium of the present invention stores thereon a computer program that realizes the above-described gesture recognition method when executed by a processor. Preferably, the storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic disk, U-disk, memory card, or optical disk.
As shown in fig. 7, in an embodiment, the touch screen device of the present invention includes: a processor 71 and a memory 72.
The memory 72 is used for storing computer programs.
The memory 72 includes: various media that can store program codes, such as ROM, RAM, magnetic disk, U-disk, memory card, or optical disk.
The processor 71 is connected to the memory 72, and is configured to execute the computer program stored in the memory 72, so as to enable the touch screen device to execute the above gesture recognition method.
Preferably, the processor 71 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
In an embodiment of the present invention, the touch screen device includes one or more combinations of a smart phone, a tablet computer, an industrial personal computer, and a TDDI device.
In summary, the gesture recognition method, system, storage medium and touch screen device of the present invention accurately recognize gestures according to the angle change characteristics in the gesture; the scribing direction is refined from the four directions of up, down, left and right into eight directions (up, down, left, right, up-left, down-left, up-right and down-right), which improves the accuracy of gesture recognition; moreover, because the number of feature vectors increases, well-performing algorithms such as similarity matching can be adopted, giving higher tolerance to stroke errors or character overlap; and the problem of misrecognition caused by overlapping direction code spaces, which arises when slight angle changes cannot be distinguished, is solved. Therefore, the invention effectively overcomes various defects in the prior art and has high industrial utilization value.
The foregoing embodiments merely illustrate the principles and effects of the present invention and are not intended to limit the invention. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by those of ordinary skill in the art without departing from the spirit and technical concept disclosed by the present invention shall still be covered by the claims of the present invention.

Claims (10)

1. A gesture recognition method applied to a touch screen device, characterized by comprising the following steps:
when the touch screen equipment receives gesture operation, acquiring contact coordinate information in each frame of gesture data;
for gesture data sampled every preset number of frames, calculating an angle change value of the current-frame contact relative to the previous-frame contact, and determining the direction of the contact connecting line of the current frame and the previous frame according to the angle change value;
and taking all direction changes corresponding to the gesture operation as direction code characteristics, and judging characters corresponding to the gesture operation according to the direction code characteristics.
2. The gesture recognition method according to claim 1, characterized in that: the point at which the gesture operation presses down on the touch screen device is taken as the starting point of the gesture operation; the point at which the gesture operation is lifted from the touch screen device is taken as the end point of the gesture operation.
3. The gesture recognition method according to claim 1, characterized in that: the calculation of the angle change value of the contact point of the current frame compared with the contact point of the previous frame comprises the following steps:
calculating a characteristic distance (Δx, Δy) between the current-frame contact and the previous-frame contact, wherein Δx = x_i – x_(i-1), Δy = y_i – y_(i-1), (x_i, y_i) is the coordinate information of the current-frame contact, and (x_(i-1), y_(i-1)) is the coordinate information of the previous-frame contact;
calculating the angle change value Δθ = arctan(Δy / Δx).
4. The gesture recognition method according to claim 1, characterized in that: determining the direction of a contact connecting line of the current frame and the previous frame according to the angle change value comprises the following steps:
dividing the angle change range into 8 intervals, wherein each interval corresponds to one direction;
and taking the direction corresponding to the interval to which the angle change value belongs as the direction of the contact connection line.
5. The gesture recognition method according to claim 4, characterized in that: the eight intervals are respectively [0, π/8), [π/8, π/4), [π/4, 3π/8), [3π/8, π/2), [π/2, 5π/8), [5π/8, 3π/4), [3π/4, 7π/8) and [7π/8, π], and the corresponding directions are respectively up, up-left, left, down-left, down, down-right, right and up-right.
6. The gesture recognition method according to claim 1, characterized in that: when all direction changes corresponding to the gesture operation are taken as the direction code feature, the direction code feature is not updated when the directions of two consecutive contact connecting lines are the same; when the directions of two consecutive contact connecting lines differ, a new direction is added to the direction code feature.
7. The gesture recognition method according to claim 1, characterized in that: the step of judging the character corresponding to the gesture operation according to the direction code characteristics comprises the following steps:
constructing a direction code feature library of characters;
and searching for the direction code feature in the direction code feature library, and judging that the character corresponding to the matched direction code feature is the character corresponding to the gesture operation.
8. A gesture recognition system applied to a touch screen device, characterized by comprising an acquisition module, a calculation module and a recognition module; the acquisition module is used for acquiring contact coordinate information in each frame of gesture data when the touch screen device receives a gesture operation;
the calculation module is used for calculating an angle change value of a current frame contact compared with a previous frame contact for each frame of gesture data, and determining the direction of a contact connecting line of the current frame and the previous frame according to the angle change value;
the recognition module is used for taking all direction changes corresponding to the gesture operation as direction code features, and judging characters corresponding to the gesture operation according to the direction code features.
9. A storage medium having stored thereon a computer program, characterized in that the computer program, when executed by a processor, implements the gesture recognition method of any one of claims 1 to 7.
10. A touch screen device, comprising: a processor and a memory;
the memory is used for storing a computer program;
the processor is configured to execute the memory-stored computer program to cause the touch screen device to perform the gesture recognition method of any of claims 1 to 7.
CN201911338584.2A 2019-12-23 2019-12-23 Gesture recognition method and system, storage medium and touch screen device Pending CN113093972A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911338584.2A CN113093972A (en) 2019-12-23 2019-12-23 Gesture recognition method and system, storage medium and touch screen device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911338584.2A CN113093972A (en) 2019-12-23 2019-12-23 Gesture recognition method and system, storage medium and touch screen device

Publications (1)

Publication Number Publication Date
CN113093972A true CN113093972A (en) 2021-07-09

Family

ID=76663637

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911338584.2A Pending CN113093972A (en) 2019-12-23 2019-12-23 Gesture recognition method and system, storage medium and touch screen device

Country Status (1)

Country Link
CN (1) CN113093972A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103677642A (en) * 2013-12-19 2014-03-26 深圳市汇顶科技股份有限公司 Touch screen terminal and method and system for identifying hand gestures of touch screen terminal
CN103995665A (en) * 2014-04-14 2014-08-20 深圳市汇顶科技股份有限公司 Mobile terminal and method and system for getting access to application programs in ready mode
CN104331151A (en) * 2014-10-11 2015-02-04 中国传媒大学 Optical flow-based gesture motion direction recognition method
US20160054815A1 (en) * 2009-12-31 2016-02-25 Lenovo (Beijing) Limited Method and mobile terminal for processing touch input
CN107463331A (en) * 2017-08-15 2017-12-12 上海闻泰电子科技有限公司 Gesture path analogy method, device and electronic equipment


Similar Documents

Publication Publication Date Title
US10679146B2 (en) Touch classification
CN109829368B (en) Palm feature recognition method and device, computer equipment and storage medium
CN106575170B (en) Method for executing touch action in touch sensitive device
CN109886127B (en) Fingerprint identification method and terminal equipment
CN103870071B (en) One kind touches source discrimination and system
CN111507146B (en) Fingerprint identification device and method
CN114402369A (en) Human body posture recognition method and device, storage medium and electronic equipment
CN112336342A (en) Hand key point detection method and device and terminal equipment
KR101559502B1 (en) Method and recording medium for contactless input interface with real-time hand pose recognition
CN109375833B (en) Touch instruction generation method and device
CN111492407B (en) System and method for map beautification
CN109271069B (en) Secondary area searching method based on capacitive touch, touch device and mobile terminal
CN103455262A (en) Pen-based interaction method and system based on mobile computing platform
JP2021064423A (en) Feature value generation device, system, feature value generation method, and program
CN106156774B (en) Image processing method and image processing system
CN112966719A (en) Method and device for recognizing meter panel reading and terminal equipment
CN111488897B (en) Method and device for detecting and identifying touch object
CN113093972A (en) Gesture recognition method and system, storage medium and touch screen device
CN113296616B (en) Pen point selection method and device and intelligent terminal
CN113743371A (en) Fingerprint identification method and fingerprint identification device
US20170185831A1 (en) Method and device for distinguishing finger and wrist
CN110737364B (en) Control method for touch writing acceleration under android system
CN111459395A (en) Gesture recognition method and system, storage medium and man-machine interaction device
TWI531985B (en) Palm biometric method
CN112947836A (en) Gesture recognition method and system based on inflection point characteristics, storage medium and touch screen device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210709