CN112947836A - Gesture recognition method and system based on inflection point characteristics, storage medium and touch screen device

Info

Publication number
CN112947836A
Authority
CN
China
Prior art keywords
inflection point
inflection
gesture
touch screen
gesture operation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911267117.5A
Other languages
Chinese (zh)
Inventor
闫俊超
姜鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chipone Technology Beijing Co Ltd
Original Assignee
Chipone Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chipone Technology Beijing Co Ltd
Priority to CN201911267117.5A
Publication of CN112947836A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Abstract

The invention provides a gesture recognition method and system based on inflection point features, a storage medium, and a touch screen device. Applied to a touch screen device, the method comprises the following steps: when the touch screen device receives a gesture operation, acquiring touch point coordinate information for each frame of gesture data; for each frame of gesture data, calculating the angle between the line connecting the touch points of the current frame and the previous frame and the line connecting the touch points of the current frame and the next frame, and judging that the touch point on the current frame is an inflection point when the angle is greater than a preset threshold; and obtaining inflection point feature values from all inflection points included in the gesture operation, and judging the character corresponding to the gesture operation from the inflection point feature values. The gesture recognition method and system based on inflection point features, the storage medium, and the touch screen device achieve accurate gesture recognition from the inflection point features of a gesture and are highly practical.

Description

Gesture recognition method and system based on inflection point characteristics, storage medium and touch screen device
Technical Field
The invention relates to the technical field of gesture recognition, and in particular to a gesture recognition method and system based on inflection point features, a storage medium, and a touch screen device.
Background
In the prior art, gesture recognition algorithms for touch devices based on sensor data are designed solely around direction codes: a direction library for certain letters or characters is preset from empirical data. During gesture recognition, the touch data are first parsed, the direction of each stroke segment is determined, and a group of direction-code data is obtained; if this group of direction codes exactly matches a group of direction codes in the preset direction library of some character, the current gesture is recognized as that character. Specifically, the gesture recognition algorithm includes the following two stages:
(1) Parsing touch data into direction-code vectors
The coordinate positions of the acquired touch data are analyzed. For stroke gestures other than point gestures (such as double-tap wake), the stroke trajectory of the entire process from press-down to lift-off is recorded, and the trajectory is segmented and classified into different direction-code combinations.
(2) Completing character recognition by matching the direction-code vector against the preset direction library
A direction library for each character is obtained in advance from empirical values and serves as the recognition basis for that character; if the parsed direction-code combination exactly matches a direction-code combination in the library, the current gesture is judged to be that character.
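For illustration only, the prior-art direction-code scheme described above might look like the following sketch, which assumes an 8-direction code, a toy preset library, and exact matching; the code values, function names, and library entries are invented for this example and are not taken from the patent.

```python
import math

def direction_code(p0, p1):
    """Map the segment p0 -> p1 to one of 8 direction codes (0 = +x, 45-degree steps)."""
    angle = math.degrees(math.atan2(p1[1] - p0[1], p1[0] - p0[0])) % 360
    return int((angle + 22.5) // 45) % 8

def parse_direction_codes(points):
    """Collapse a stroke into its sequence of distinct direction codes."""
    codes = []
    for p0, p1 in zip(points, points[1:]):
        c = direction_code(p0, p1)
        if not codes or codes[-1] != c:
            codes.append(c)
    return codes

# Toy direction library: each character maps to the code sequences accepted for it.
DIRECTION_LIBRARY = {
    "V": [[1, 7]],    # down-right then up-right (y axis pointing down)
    "I": [[2], [6]],  # straight down or straight up
}

def recognize_by_direction_codes(points):
    codes = parse_direction_codes(points)
    for char, variants in DIRECTION_LIBRARY.items():
        if codes in variants:  # exact match required
            return char
    return None
```

Because the match must be exact, any extra turn at the start or end of the stroke changes the code sequence, which is what leads to the failure cases discussed next.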
However, as shown in FIG. 1, the character V can be drawn in several different ways; and as shown in FIG. 2, the direction-code spaces of the characters I and V overlap. Therefore, when different characters produce the same direction-code combination, this gesture recognition method can produce false recognitions.
Disclosure of Invention
In view of the above disadvantages of the prior art, an object of the present invention is to provide a gesture recognition method and system based on inflection point features, a storage medium, and a touch screen device, which achieve accurate recognition of gestures from the inflection point features in the gestures and are highly practical.
In order to achieve the above and other related objects, the present invention provides a gesture recognition method based on inflection point features, applied to a touch screen device and comprising the following steps: when the touch screen device receives a gesture operation, acquiring touch point coordinate information in each frame of gesture data; for each frame of gesture data, calculating the angle between the line connecting the touch points of the current frame and the previous frame and the line connecting the touch points of the current frame and the next frame, and judging that the touch point on the current frame is an inflection point when the angle is greater than a preset threshold; and obtaining inflection point feature values from all inflection points included in the gesture operation, and judging the character corresponding to the gesture operation from the inflection point feature values.
In an embodiment of the present invention, the point at which the gesture operation presses down on the touch screen device is the starting point of the gesture operation; the point at which the gesture operation lifts off the touch screen device is the end point of the gesture operation; and the inflection points are located between the starting point and the end point.
In an embodiment of the present invention, the inflection point feature values include the number of inflection points, the inflection point spacing, and the inflection point angle; the inflection point spacing comprises the abscissa spacing, the ordinate spacing, and the Euclidean distance between two adjacent inflection points; and the inflection point angle is the angle between the lines connecting the current inflection point with its two adjacent inflection points.
In an embodiment of the present invention, the inflection point feature value is input into a classification model, and the classification model determines a character corresponding to the gesture operation.
Correspondingly, the invention provides a gesture recognition system based on inflection point characteristics, which is applied to touch screen equipment and comprises an acquisition module, a calculation module and a judgment module;
the acquisition module is used for acquiring contact coordinate information in each frame of gesture data when the touch screen equipment receives gesture operation;
the calculation module is used for calculating the angle between a contact connecting line between the current frame and the previous frame and a contact connecting line between the current frame and the next frame for each frame of gesture data, and judging that the contact on the current frame is an inflection point when the angle is greater than a preset threshold value;
the judgment module is used for acquiring inflection point characteristic values according to all inflection points included in the gesture operation and judging characters corresponding to the gesture operation according to the inflection point characteristic values.
In an embodiment of the present invention, for the obtaining module, the point at which the gesture operation presses down on the touch screen device is the starting point of the gesture operation; the point at which the gesture operation lifts off the touch screen device is the end point of the gesture operation; and the inflection points are located between the starting point and the end point.
In an embodiment of the present invention, the inflection point feature values include the number of inflection points, the inflection point spacing, and the inflection point angle; the inflection point spacing comprises the abscissa spacing, the ordinate spacing, and the Euclidean distance between two adjacent inflection points; and the inflection point angle is the angle between the lines connecting the current inflection point with its two adjacent inflection points.
In an embodiment of the present invention, the determination module inputs the inflection point feature value into a classification model, and the classification model determines a character corresponding to the gesture operation.
The present invention provides a storage medium having stored thereon a computer program which, when executed by a processor, implements the above-described gesture recognition method based on inflection point features.
Finally, the present invention provides a touch screen device comprising: a processor and a memory;
the memory is used for storing a computer program;
the processor is configured to execute the computer program stored in the memory to enable the touch screen device to execute the above gesture recognition method based on the inflection point feature.
As described above, the gesture recognition method, system, storage medium and touch screen device based on the inflection point feature of the present invention have the following beneficial effects:
(1) the gesture can be accurately identified according to the inflection point characteristics in the gesture;
(2) the problem of redundant strokes caused by stroke starts, stroke ends, special writing habits, and the like in practical application scenarios is solved, which effectively improves character recognition accuracy;
(3) prior-art gesture recognition algorithms are complex, require strong hardware performance, and cannot be used in chips with low processing performance; the present gesture recognition algorithm requires no extra resource consumption, is suited to the actual hardware environments and application scenarios of touch devices, and can meet the real-time requirements of those scenarios.
Drawings
FIG. 1 is a schematic diagram of prior-art drawing manners of a character in one embodiment;
FIG. 2 is a schematic diagram of prior-art drawing manners of characters in another embodiment;
FIG. 3 is a flowchart illustrating a method for gesture recognition based on inflection point feature according to an embodiment of the present invention;
FIG. 4 is a schematic diagram illustrating an embodiment of a gesture recognition system based on inflection point feature of the present invention;
FIG. 5 is a schematic diagram of the touch screen device according to an embodiment of the present invention.
Description of the element reference numerals
41 acquisition module
42 calculation module
43 judging module
51 processor
52 memory
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention.
It should be noted that the drawings provided in the present embodiment are only for illustrating the basic idea of the present invention, and the components related to the present invention are only shown in the drawings rather than drawn according to the number, shape and size of the components in actual implementation, and the type, quantity and proportion of the components in actual implementation may be changed freely, and the layout of the components may be more complicated.
According to the gesture recognition method and system, the storage medium, and the touch screen device based on inflection point features of the present invention, inflection points in a gesture are recognized, the gesture is accurately recognized from its inflection point features, and false recognition caused by redundant strokes and the like is effectively avoided; at the same time, hardware power consumption is reduced, and the method is highly practical.
The gesture recognition method based on inflection point features is applied to a touch screen device. The touch screen device refers to a device incorporating a touch screen. As an input device, a touch screen enables simple, convenient, and natural human-computer interaction, and is mainly used in public information query, industrial control, military command, electronic games, multimedia teaching, and the like. In an embodiment of the present invention, the touch screen device includes one or a combination of a smart phone, a tablet computer, an industrial personal computer, and a Touch and Display Driver Integration (TDDI) device.
As shown in fig. 3, in an embodiment, the method for gesture recognition based on inflection point feature of the present invention includes the following steps:
Step S1: when the touch screen device receives a gesture operation, acquire touch point coordinate information in each frame of gesture data.
Specifically, the user touches the touch screen device with a specific gesture corresponding to a character. A complete gesture operation contains a starting point and an end point. In the present invention, the touch point collected when the gesture operation presses down on the touch screen device is taken as the starting point of the gesture operation, and the touch point collected immediately before the gesture operation lifts off the touch screen device is taken as the end point of the gesture operation. During the execution of a complete gesture operation, gesture data are acquired from the touch screen device frame by frame, and the coordinate information (x, y) of the touch point in each frame of gesture data is collected in a rectangular coordinate system.
Preferably, the touch points used for the connecting lines may be measured touch points or touch points obtained through interpolation.
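Purely as an illustration, the per-frame touch points might be gathered into a simple trajectory and, where the reported frames are sparse, densified by linear interpolation between measured touch points; the frame layout (a dict with "x" and "y" keys) and the step count are assumptions made for this sketch, not the patent's implementation.

```python
def collect_trajectory(frames):
    """Gather the (x, y) touch point reported in each frame, from the press-down
    point (starting point) to the last point before lift-off (end point)."""
    return [(frame["x"], frame["y"]) for frame in frames]

def densify(points, steps=2):
    """Insert linearly interpolated touch points between each pair of measured
    touch points, so connecting lines can also use interpolated contacts."""
    if len(points) < 2:
        return list(points)
    dense = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        for k in range(steps):
            t = k / steps
            dense.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    dense.append(points[-1])
    return dense
```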
Step S2: for each frame of gesture data, calculate the angle between the line connecting the touch points of the current frame and the previous frame and the line connecting the touch points of the current frame and the next frame, and judge that the touch point on the current frame is an inflection point when the angle is greater than a preset threshold.
Specifically, for each frame of gesture data, first obtain the line between the touch point (xᵢ, yᵢ) on the current frame and the touch point (xᵢ₋₁, yᵢ₋₁) on the previous frame, and the line between the touch point (xᵢ, yᵢ) on the current frame and the touch point (xᵢ₊₁, yᵢ₊₁) on the next frame. In the rectangular coordinate system, the angle between the two lines is θ. Whether the touch point on the current frame is an inflection point is judged according to the magnitude of the angle θ. That is, an inflection point lies between the starting point and the end point.
In an embodiment of the invention, the preset threshold is obtained from empirical values. Specifically, the preset threshold is obtained through repeated testing on the character types in an early stage, and it can later be adjusted according to the usage scenario. When θ is not greater than the preset threshold, the touch point on the current frame is judged to belong to a redundant stroke and is discarded; when θ is greater than the preset threshold, the touch point on the current frame is judged to be an inflection point, which can be used for subsequent character recognition.
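A minimal sketch of the inflection point test of steps S1 and S2, assuming the angle θ is taken as the turn angle between the two connecting lines and using 30° only as a placeholder threshold (the patent leaves the threshold to empirical tuning); function and variable names are invented for this example.

```python
import math

# Placeholder threshold in degrees; the patent obtains it empirically per character set.
INFLECTION_THRESHOLD_DEG = 30.0

def turn_angle(prev_pt, cur_pt, next_pt):
    """Angle between the segments prev->cur and cur->next, i.e. how far the
    stroke turns at cur (0 degrees means a straight continuation)."""
    v1 = (cur_pt[0] - prev_pt[0], cur_pt[1] - prev_pt[1])
    v2 = (next_pt[0] - cur_pt[0], next_pt[1] - cur_pt[1])
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    if n1 == 0 or n2 == 0:
        return 0.0  # repeated touch point: no measurable direction change
    cos_a = (v1[0] * v2[0] + v1[1] * v2[1]) / (n1 * n2)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))

def find_inflection_points(points, threshold=INFLECTION_THRESHOLD_DEG):
    """Step S2: keep every interior touch point whose turn angle exceeds the
    threshold, together with that angle; other points are treated as redundant."""
    inflections = []
    for i in range(1, len(points) - 1):
        theta = turn_angle(points[i - 1], points[i], points[i + 1])
        if theta > threshold:
            inflections.append((points[i], theta))
    return inflections
```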
Step S3: obtain inflection point feature values from all inflection points included in the gesture operation, and judge the character corresponding to the gesture operation from the inflection point feature values.
Specifically, the inflection point feature values are calculated over all the acquired inflection points. In an embodiment of the present invention, the inflection point feature values include the number of inflection points, the inflection point spacing, and the inflection point angle. The inflection point spacing between two adjacent inflection points comprises the abscissa spacing Δx = xᵢ - xᵢ₋₁, the ordinate spacing Δy = yᵢ - yᵢ₋₁, and the Euclidean distance ρᵢ = √[(xᵢ - xᵢ₋₁)² + (yᵢ - yᵢ₋₁)²], where (xᵢ, yᵢ) and (xᵢ₋₁, yᵢ₋₁) are the coordinates of the two adjacent inflection points. The inflection point angle refers to the angle between the line connecting the touch point of the frame where the inflection point lies with that of the previous frame and the line connecting it with that of the next frame.
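The sketch below derives these feature values from the (point, angle) pairs returned by the detection sketch above; the dictionary layout is an assumption made for illustration, not the patent's data format.

```python
import math

def inflection_features(inflections):
    """Step S3: compute the number of inflection points, the spacings between
    adjacent inflection points (Δx, Δy, Euclidean ρ), and the inflection angles."""
    count = len(inflections)
    angles = [theta for _, theta in inflections]  # angle at each inflection point
    spacings = []
    for ((x0, y0), _), ((x1, y1), _) in zip(inflections, inflections[1:]):
        dx, dy = x1 - x0, y1 - y0                      # abscissa and ordinate spacing
        spacings.append((dx, dy, math.hypot(dx, dy)))  # plus Euclidean distance
    return {"count": count, "spacings": spacings, "angles": angles}
```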
In the present invention, classification is an important data mining method. The idea of classification is to learn a classification function or construct a classification model (i.e., a classifier) from existing data; the function or model maps data records to one of a given set of categories and can therefore be used for prediction. Specifically, the empirical features of each specific character are first obtained through testing and stored as that character's feature set. The features include the character's number of inflection points, inflection point spacing, inflection point angle, and the like. For a character to be recognized, its inflection point features, namely the number of inflection points, the inflection point spacing, the inflection point angle, and the like, are first extracted and then matched against the feature set; if the feature set is satisfied, the character is recognized, otherwise recognition fails. Preferably, this classification model can be used alone or in combination with other classification models to realize gesture recognition.
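A minimal sketch of this feature-set matching, assuming each character's feature set is stored as a required inflection count and an allowed angle range obtained from testing; the character entries and numeric ranges are invented placeholders.

```python
# Hypothetical per-character feature sets obtained empirically.
CHARACTER_FEATURE_SETS = {
    "V": {"count": 1, "angle_range": (60.0, 160.0)},
    "N": {"count": 2, "angle_range": (60.0, 160.0)},
}

def classify(features):
    """Match the extracted inflection features against each character's feature set."""
    for char, spec in CHARACTER_FEATURE_SETS.items():
        if features["count"] != spec["count"]:
            continue
        lo, hi = spec["angle_range"]
        if all(lo <= a <= hi for a in features["angles"]):
            return char
    return None  # recognition fails
```

In use, the three sketches chain together: `classify(inflection_features(find_inflection_points(points)))` returns the recognized character or `None`.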
As shown in fig. 4, in an embodiment, the gesture recognition system based on inflection point feature of the present invention is applied to a touch screen device, and includes an obtaining module 41, a calculating module 42, and a determining module 43.
The obtaining module 41 is configured to obtain the contact coordinate information in each frame of gesture data when the touch screen device receives a gesture operation.
Specifically, the user touches the touch screen device with a specific gesture corresponding to a character. A complete gesture operation contains a starting point and an end point. In the present invention, the touch point collected when the gesture operation presses down on the touch screen device is taken as the starting point of the gesture operation, and the touch point collected immediately before the gesture operation lifts off the touch screen device is taken as the end point of the gesture operation. During the execution of a complete gesture operation, gesture data are acquired from the touch screen device frame by frame, and the coordinate information (x, y) of the touch point in each frame of gesture data is collected in a rectangular coordinate system.
Preferably, the touch points used for the connecting lines may be measured touch points or touch points obtained through interpolation.
The calculation module 42 is connected to the obtaining module 41, and is configured to calculate, for each frame of gesture data, the angle between the line connecting the touch points of the current frame and the previous frame and the line connecting the touch points of the current frame and the next frame, and to judge that the touch point on the current frame is an inflection point when the angle is greater than a preset threshold.
Specifically, for each frame of gesture data, first obtain the line between the touch point (xᵢ, yᵢ) on the current frame and the touch point (xᵢ₋₁, yᵢ₋₁) on the previous frame, and the line between the touch point (xᵢ, yᵢ) on the current frame and the touch point (xᵢ₊₁, yᵢ₊₁) on the next frame. In the rectangular coordinate system, the angle between the two lines is θ. Whether the touch point on the current frame is an inflection point is judged according to the magnitude of the angle θ. That is, an inflection point lies between the starting point and the end point.
In an embodiment of the invention, the preset threshold is obtained from empirical values. Specifically, the preset threshold is obtained through repeated testing on the character types in an early stage, and it can later be adjusted according to the usage scenario. When θ is not greater than the preset threshold, the touch point on the current frame is judged to belong to a redundant stroke and is discarded; when θ is greater than the preset threshold, the touch point on the current frame is judged to be an inflection point, which can be used for subsequent character recognition.
The determining module 43 is connected to the calculating module 42, and configured to obtain inflection point feature values according to all inflection points included in the gesture operation, and determine a character corresponding to the gesture operation according to the inflection point feature values.
Specifically, the inflection point feature values are calculated over all the acquired inflection points. In an embodiment of the present invention, the inflection point feature values include the number of inflection points, the inflection point spacing, and the inflection point angle. The inflection point spacing between two adjacent inflection points comprises the abscissa spacing Δx = xᵢ - xᵢ₋₁, the ordinate spacing Δy = yᵢ - yᵢ₋₁, and the Euclidean distance ρᵢ = √[(xᵢ - xᵢ₋₁)² + (yᵢ - yᵢ₋₁)²], where (xᵢ, yᵢ) and (xᵢ₋₁, yᵢ₋₁) are the coordinates of the two adjacent inflection points. The inflection point angle refers to the angle between the line connecting the touch point of the frame where the inflection point lies with that of the previous frame and the line connecting it with that of the next frame.
In the present invention, classification is an important data mining method. The idea of classification is to learn a classification function or construct a classification model (i.e., a classifier) from existing data; the function or model maps data records to one of a given set of categories and can therefore be used for prediction. Specifically, the empirical features of each specific character are first obtained through testing and stored as that character's feature set. The features include the character's number of inflection points, inflection point spacing, inflection point angle, and the like. For a character to be recognized, its inflection point features, namely the number of inflection points, the inflection point spacing, the inflection point angle, and the like, are first extracted and then matched against the feature set; if the feature set is satisfied, the character is recognized, otherwise recognition fails. Preferably, this classification model can be used alone or in combination with other classification models to realize gesture recognition.
It should be noted that the division of the modules of the above apparatus is only a logical division; in an actual implementation they may be wholly or partially integrated into one physical entity or may be physically separate. These modules may all be implemented as software called by a processing element, entirely as hardware, or partly as software called by a processing element and partly as hardware. For example, a given module may be a separately arranged processing element, may be integrated into a chip of the apparatus, or may be stored in the memory of the apparatus in the form of program code that is called and executed by a processing element of the apparatus. The other modules are implemented similarly. In addition, all or some of the modules may be integrated together or implemented independently. The processing element described here may be an integrated circuit with signal processing capability. During implementation, each step of the above method, or each of the above modules, may be completed by an integrated logic circuit of hardware in the processor element or by instructions in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above method, such as one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs). For another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or another processor capable of calling program code. For another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SOC).
The storage medium of the present invention stores thereon a computer program that, when executed by a processor, implements the above-described gesture recognition method based on inflection point features. Preferably, the storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic disk, U-disk, memory card, or optical disk.
As shown in fig. 5, in an embodiment, the touch screen device of the present invention includes: a processor 51 and a memory 52.
The memory 52 is used for storing computer programs.
The memory 52 includes: various media that can store program codes, such as ROM, RAM, magnetic disk, U-disk, memory card, or optical disk.
The processor 51 is connected to the memory 52, and is configured to execute the computer program stored in the memory 52, so as to enable the touch screen device to execute the above gesture recognition method based on the inflection point feature.
Preferably, the processor 51 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
In an embodiment of the present invention, the touch screen device includes one or more combinations of a smart phone, a tablet computer, an industrial personal computer, and a TDDI device.
In summary, the gesture recognition method and system, storage medium, and touch screen device based on inflection point features of the present invention achieve accurate recognition of a gesture from the inflection point features in the gesture; they solve the problem of redundant strokes caused by stroke starts, stroke ends, special writing habits, and the like in practical application scenarios, effectively improving character recognition accuracy; and whereas prior-art gesture recognition algorithms are complex, require strong hardware performance, and cannot be used in chips with low processing performance, the present gesture recognition algorithm requires no extra resource consumption, is suited to the actual hardware environment and application scenarios, and can meet the real-time requirements of those scenarios. Therefore, the invention effectively overcomes various defects in the prior art and has high industrial utilization value.
The foregoing embodiments merely illustrate the principles and effects of the present invention and are not intended to limit it. Anyone skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by those of ordinary skill in the art without departing from the spirit and technical ideas disclosed by the present invention shall still be covered by the claims of the present invention.

Claims (10)

1. A gesture recognition method based on inflection point features, applied to a touch screen device, characterized in that the method comprises the following steps:
when the touch screen equipment receives gesture operation, acquiring contact coordinate information in each frame of gesture data;
for each frame of gesture data, calculating an angle between a contact connecting line between a current frame and a previous frame and a contact connecting line between the current frame and a next frame, and judging that a contact on the current frame is an inflection point when the angle is greater than a preset threshold value;
and acquiring inflection point characteristic values according to all inflection points included in the gesture operation, and judging characters corresponding to the gesture operation according to the inflection point characteristic values.
2. The method of gesture recognition based on inflection point features of claim 1, wherein: the point at which the gesture operation presses down on the touch screen device serves as the starting point of the gesture operation; the point at which the gesture operation lifts off the touch screen device serves as the end point of the gesture operation; and the inflection point is located between the starting point and the end point.
3. The method of gesture recognition based on inflection point features of claim 1, wherein: the inflection point feature values comprise the number of inflection points, the inflection point spacing, and the inflection point angle; the inflection point spacing comprises the abscissa spacing, the ordinate spacing, and the Euclidean distance between two adjacent inflection points; and the inflection point angle is the angle between the lines connecting the current inflection point with its two adjacent inflection points.
4. The method of gesture recognition based on inflection point features of claim 1, wherein: inputting the inflection point characteristic value into a classification model, and judging characters corresponding to the gesture operation by the classification model.
5. A gesture recognition system based on inflection point features, applied to a touch screen device, characterized in that: it comprises an acquisition module, a calculation module, and a judgment module;
the acquisition module is used for acquiring contact coordinate information in each frame of gesture data when the touch screen equipment receives gesture operation;
the calculation module is used for calculating the angle between a contact connecting line between the current frame and the previous frame and a contact connecting line between the current frame and the next frame for each frame of gesture data, and judging that the contact on the current frame is an inflection point when the angle is greater than a preset threshold value;
the judgment module is used for acquiring inflection point characteristic values according to all inflection points included in the gesture operation and judging characters corresponding to the gesture operation according to the inflection point characteristic values.
6. The inflection-feature-based gesture recognition system of claim 5, wherein: in the acquisition module, the point at which the gesture operation presses down on the touch screen device serves as the starting point of the gesture operation; the point at which the gesture operation lifts off the touch screen device serves as the end point of the gesture operation; and the inflection point is located between the starting point and the end point.
7. The inflection-feature-based gesture recognition system of claim 5, wherein: the inflection point feature values comprise the number of inflection points, the inflection point spacing, and the inflection point angle; the inflection point spacing comprises the abscissa spacing, the ordinate spacing, and the Euclidean distance between two adjacent inflection points; and the inflection point angle is the angle between the lines connecting the current inflection point with its two adjacent inflection points.
8. The inflection-feature-based gesture recognition system of claim 5, wherein: and the judging module inputs the inflection point characteristic value into a classification model, and the classification model judges the character corresponding to the gesture operation.
9. A storage medium having stored thereon a computer program, wherein the computer program, when executed by a processor, implements the inflection point feature-based gesture recognition method of any one of claims 1 to 4.
10. A touch screen device, comprising: a processor and a memory;
the memory is used for storing a computer program;
the processor is configured to execute the memory-stored computer program to cause the touch screen device to perform the inflection point feature-based gesture recognition method of any one of claims 1 to 4.
CN201911267117.5A 2019-12-11 2019-12-11 Gesture recognition method and system based on inflection point characteristics, storage medium and touch screen device Pending CN112947836A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911267117.5A CN112947836A (en) 2019-12-11 2019-12-11 Gesture recognition method and system based on inflection point characteristics, storage medium and touch screen device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911267117.5A CN112947836A (en) 2019-12-11 2019-12-11 Gesture recognition method and system based on inflection point characteristics, storage medium and touch screen device

Publications (1)

Publication Number Publication Date
CN112947836A true CN112947836A (en) 2021-06-11

Family

ID=76233939

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911267117.5A Pending CN112947836A (en) 2019-12-11 2019-12-11 Gesture recognition method and system based on inflection point characteristics, storage medium and touch screen device

Country Status (1)

Country Link
CN (1) CN112947836A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103186338A (en) * 2011-12-31 2013-07-03 联想(北京)有限公司 Method for setting clock and electronic equipment
CN103677642A (en) * 2013-12-19 2014-03-26 深圳市汇顶科技股份有限公司 Touch screen terminal and method and system for identifying hand gestures of touch screen terminal
CN103713730A (en) * 2012-09-29 2014-04-09 炬才微电子(深圳)有限公司 Mid-air gesture recognition method and device applied to intelligent terminal
CN104503591A (en) * 2015-01-19 2015-04-08 王建勤 Information input method based on broken line gesture
CN104933408A (en) * 2015-06-09 2015-09-23 深圳先进技术研究院 Hand gesture recognition method and system
CN104991687A (en) * 2015-06-09 2015-10-21 惠州Tcl移动通信有限公司 Method and system for acquiring curve operating track of touch-screen device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination