CN111459395A - Gesture recognition method and system, storage medium and man-machine interaction device - Google Patents

Info

Publication number
CN111459395A
CN111459395A (application CN202010237880.XA)
Authority
CN
China
Prior art keywords
character
direction code
similarity
characters
recognized
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010237880.XA
Other languages
Chinese (zh)
Inventor
闫俊超
Current Assignee
Chipone Technology Beijing Co Ltd
Original Assignee
Chipone Technology Beijing Co Ltd
Priority date
Filing date
Publication date
Application filed by Chipone Technology Beijing Co Ltd filed Critical Chipone Technology Beijing Co Ltd
Priority to CN202010237880.XA priority Critical patent/CN111459395A/en
Publication of CN111459395A publication Critical patent/CN111459395A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Character Discrimination (AREA)

Abstract

The invention provides a gesture recognition method and system, a storage medium, and a man-machine interaction device. The method comprises the following steps: acquiring a direction code vector and a character feature set for each character, and constructing a direction code feature library of the characters; when the man-machine interaction device receives a gesture operation, acquiring the direction code vector and character feature set of the character to be recognized corresponding to the gesture operation; calculating the similarity between the character to be recognized and the characters in the direction code feature library based on the direction code vectors, and calculating the distance between the character to be recognized and the characters in the direction code feature library based on the character feature sets; if the similarity corresponding to a certain character in the direction code feature library is greater than a first similarity threshold and the corresponding distance is greater than a second similarity threshold, the character corresponding to the gesture operation is recognized as that character. The gesture recognition method, system, storage medium and man-machine interaction device achieve accurate gesture recognition and effectively reduce the false recognition rate.

Description

Gesture recognition method and system, storage medium and man-machine interaction device
Technical Field
The invention relates to the technical field of gesture recognition, in particular to a gesture recognition method, a gesture recognition system, a storage medium and a man-machine interaction device.
Background
In the prior art, gesture recognition algorithms for human-computer interaction devices generally adopt methods such as stroke matching, and one of the specific algorithms is as follows:
(1) dividing the touch area into a coordinate system with four directions: up, down, left and right (or north, south, east and west); judging the stroke type of each fixed-length segment of the touch track, classifying it as up, down, left or right, and representing it with one of the numbers 1, 2, 3 or 4;
(2) sequentially recording strokes of the gesture operation from the beginning of pressing to the moment of lifting to obtain a string of digital combinations representing direction combinations;
(3) acquiring and processing empirical data to obtain a direction library of characters, wherein one character may correspond to one or more digital combinations;
(4) in the actual gesture recognition, comparing the acquired gesture direction codes with the digital combination in the direction library; and when the gesture direction code is matched with one of the number combinations of a certain character, judging the character as the recognition result of the gesture.
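The four-step prior-art procedure above can be sketched as follows. The segment length, the direction numbering, and the contents of the example direction library are illustrative assumptions, not values from the patent:

```python
# Sketch of the prior-art stroke-matching algorithm described above.
# The direction library entries and the fixed segment length are invented
# for illustration; screen coordinates are assumed to grow downward in y.

def encode_strokes(points, seg_len=20.0):
    """Classify fixed-length track segments as 1=up, 2=down, 3=left, 4=right."""
    codes = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if (dx * dx + dy * dy) ** 0.5 < seg_len:
            continue  # segment too short to classify as a stroke
        if abs(dy) >= abs(dx):
            codes.append(1 if dy < 0 else 2)
        else:
            codes.append(4 if dx > 0 else 3)
    return codes

# Hypothetical direction library: each character maps to one or more
# number combinations, as in step (3) above.
DIRECTION_LIBRARY = {"L": [[2, 4]], "7": [[4, 2], [4, 2, 3]]}

def match(points):
    """Step (4): compare the gesture's direction codes against the library."""
    codes = encode_strokes(points)
    for char, variants in DIRECTION_LIBRARY.items():
        if codes in variants:
            return char
    return None  # no number combination matched

print(match([(0, 0), (0, 40), (40, 40)]))  # down then right -> "L"
```

A too-small or too-large library changes the behavior of `match` exactly as the problem list below describes: missing variants cause misses, and overly broad variants cause false matches.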
However, the above gesture recognition algorithm has the following problems:
1. the construction of a direction library for character matching depends on empirical data obtained through testing, which has inherent limitations; moreover, the empirical data suffers from insufficient volume and from inaccuracies caused by individual writing habits, so it is not representative;
2. owing to data deviation, if the direction library is too large in actual use, some similar characters may be misrecognized as the target character, increasing the false recognition rate; if the direction library is too small, the missed recognition rate increases;
3. matching relies on direction codes alone, so the classifier is one-dimensional and robustness is poor.
Disclosure of Invention
In view of the above disadvantages of the prior art, an object of the present invention is to provide a gesture recognition method, a system, a storage medium, and a human-computer interaction device, which implement accurate gesture recognition by using similarity matching mechanisms such as direction code similarity, character contour distance, inflection point characteristics, and angle variation, and effectively reduce the false recognition rate.
In order to achieve the above and other related objects, the present invention provides a gesture recognition method applied to a human-computer interaction device, comprising the following steps: acquiring a direction code vector and a character feature set of a character, and constructing a direction code feature library of the character; when the man-machine interaction equipment receives a gesture operation, acquiring a direction code vector and a character feature set of a character to be recognized corresponding to the gesture operation; calculating the similarity between the character to be recognized and the characters in the direction code feature library based on the direction code vector, and calculating the distance between the character to be recognized and the characters in the direction code feature library based on the character feature set; and if the similarity corresponding to a certain character in the direction code feature library is greater than a first similarity threshold and the corresponding distance is greater than a second similarity threshold, identifying the character corresponding to the gesture operation as the character.
In an embodiment of the present invention, obtaining the direction code vector of the character includes the following steps:
when a character is input on the human-computer interaction equipment through a gesture operation, acquiring the coordinate information of the touch point on the current frame at intervals of a preset number of frames;
and acquiring the direction code vector of the character according to the acquired touch point coordinate information.
In an embodiment of the present invention, the character feature set includes one or more combinations of the number of inflection points, the relative distance between contours, and the angle variation value between strokes/inflection points.
In one embodiment of the present invention, the similarity is calculated according to Similarityc(D, D') = cos(D − (D + D')/2, D' − (D + D')/2), wherein D represents the direction code vector of the character to be recognized, and D' represents the direction code vector of the character in the direction code feature library.
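As a hedged illustration of the similarity computation, the sketch below implements plain cosine similarity between two direction code vectors; the patent's exact centering variant is not fully legible in the source, so treat this as a stand-in rather than the claimed formula. The example vectors are invented:

```python
import math

def cosine_similarity(d, d_prime):
    """Standard cosine similarity between two direction-code vectors.
    A stand-in for the patent's similarity; the exact centering term
    of the claimed formula may differ."""
    dot = sum(a * b for a, b in zip(d, d_prime))
    norm = math.sqrt(sum(a * a for a in d)) * math.sqrt(sum(b * b for b in d_prime))
    return dot / norm if norm else 0.0

D = (1, 2, 3, 4, 5, 6, 7, 8)   # hypothetical stored direction code vector
Dp = (1, 2, 3, 4, 5, 6, 7, 7)  # hypothetical direction code vector to recognize
print(round(cosine_similarity(D, Dp), 4))
```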
In an embodiment of the present invention, the distance is calculated according to Distance = √((F1 − μ)^T Σ⁻¹ (F1 − μ)), wherein F1 = (F, F')^T, F represents the character feature set of the character in the direction code feature library, F' represents the character feature set of the character to be recognized, μ represents the mean value of F1, and Σ denotes the covariance matrix of F1.
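A minimal sketch of the Mahalanobis-style distance above, under the simplifying assumption that the covariance matrix Σ is diagonal (so inversion is trivial); the feature vector, mean, and variances are invented for illustration:

```python
import math

def mahalanobis_diagonal(f1, mu, sigma_diag):
    """Mahalanobis distance sqrt((f1 - mu)^T Sigma^-1 (f1 - mu)) for a
    diagonal covariance matrix, given as its diagonal entries.
    A full implementation would invert a general covariance matrix."""
    return math.sqrt(sum((x - m) ** 2 / s for x, m, s in zip(f1, mu, sigma_diag)))

# Hypothetical feature vector and statistics, e.g.
# (inflection count, contour distance, angle change).
f1 = (3.0, 1.5, 30.0)
mu = (2.0, 1.0, 20.0)            # mean of F1
sigma_diag = (1.0, 0.25, 100.0)  # diagonal of the covariance matrix

print(round(mahalanobis_diagonal(f1, mu, sigma_diag), 4))  # sqrt(3) ~ 1.7321
```

Dividing each squared deviation by its variance is what makes the distance scale-free across heterogeneous features such as counts and angles.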
In an embodiment of the present invention, if there is no character in the direction code feature library whose similarity is greater than the first similarity threshold and whose corresponding distance is greater than the second similarity threshold, the character corresponding to the gesture operation is identified as an invalid character.
In an embodiment of the present invention, for the characters in the direction code feature library, the similarity is calculated first, and the distance is calculated only when the similarity is greater than the first similarity threshold.
Correspondingly, the invention provides a gesture recognition system, which is applied to human-computer interaction equipment and comprises a construction module, an acquisition module, a calculation module and a recognition module;
the construction module is used for acquiring a direction code vector and a character feature set of the character and constructing a direction code feature library of the character;
the acquisition module is used for acquiring a direction code vector and a character feature set of a character to be recognized corresponding to a gesture operation when the human-computer interaction device receives the gesture operation;
the calculation module is used for calculating the similarity between the character to be recognized and the characters in the direction code feature library based on the direction code vector, and calculating the distance between the character to be recognized and the characters in the direction code feature library based on the character feature set;
the recognition module is used for recognizing the character corresponding to the gesture operation as the character when the similarity corresponding to the certain character in the direction code feature library is larger than a first similarity threshold and the corresponding distance is larger than a second similarity threshold.
The present invention provides a storage medium having stored thereon a computer program which, when executed by a processor, implements the gesture recognition method described above.
Finally, the invention provides a human-computer interaction device comprising: a processor and a memory;
the memory is used for storing a computer program;
the processor is used for executing the computer program stored in the memory so as to enable the human-computer interaction device to execute the gesture recognition method.
As described above, the gesture recognition method, system, storage medium, and human-computer interaction device of the present invention have the following advantages:
(1) the gesture is accurately recognized by adopting similarity matching mechanisms such as direction code similarity calculation, character outline distance, inflection point characteristics, angle change and the like;
(2) the multi-dimensional classifier design is adopted, so that the error recognition rate of gesture recognition is effectively reduced;
(3) the gesture recognition method is suitable for the requirements of human-computer interaction equipment with high response speed and low processing performance on gesture recognition.
Drawings
FIG. 1 is a flow chart illustrating a gesture recognition method according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating a gesture recognition method according to another embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating a gesture recognition system according to an embodiment of the present invention;
FIG. 4 is a diagram of a human-computer interaction device according to an embodiment of the present invention.
Description of the element reference numerals
31 building block
32 acquisition module
33 calculation module
34 identification module
41 processor
42 memory
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention.
It should be noted that the drawings provided in the present embodiment are only for illustrating the basic idea of the present invention, and the components related to the present invention are only shown in the drawings rather than drawn according to the number, shape and size of the components in actual implementation, and the type, quantity and proportion of the components in actual implementation may be changed freely, and the layout of the components may be more complicated.
The gesture recognition method, the system, the storage medium and the man-machine interaction equipment perform gesture recognition from the angles of similarity such as direction code similarity, character outline distance, inflection point characteristics, angle change and the like, effectively improve the gesture recognition precision, reduce the false recognition rate and have strong practicability.
Specifically, the gesture recognition method is applied to the human-computer interaction equipment. The human-computer interaction device is used as an input device, can simply, conveniently and naturally realize human-computer interaction, and is mainly applied to inquiry of public information, industrial control, military command, electronic games, multimedia teaching and the like. In an embodiment of the present invention, the human-computer interaction device includes one or more combinations of a smart phone, a tablet computer, an industrial personal computer, and a Touch and Display Driver Integration (TDDI) device.
As shown in fig. 1, in an embodiment, the gesture recognition method of the present invention includes the following steps:
and S1, acquiring the direction code vector and the character feature set of the character, and constructing a direction code feature library of the character.
Specifically, firstly, a direction code feature library of the character is constructed, and the direction code feature library comprises a direction code vector and a character feature set of the character. That is, for each character, its direction code vector and character feature set are collected first.
It should be noted that the characters refer to font-like units or symbols, including letters, numbers, operation symbols, punctuation marks and other symbols, and some functional symbols.
In an embodiment of the present invention, obtaining the direction code vector of the character includes the following steps:
11) when characters are input on the human-computer interaction equipment through gesture operation, acquiring the coordinate information of a contact point on the current frame at intervals of a preset number of frames.
Specifically, when characters are input on the human-computer interaction equipment through gestures, the coordinates of the touch point on the current frame are acquired at intervals of a preset number of frames N, thereby constructing a touch point set A = {(x1, y1), (x1+N, y1+N), ..., (xm, ym), ..., (xk−N, yk−N), (xk, yk)}. The setting of N may be character-dependent in order to obtain certain specific location information. m, k and N are natural numbers.
12) And acquiring the direction code vectors of the characters according to the contact coordinate information of the preset number.
Specifically, a direction code vector D = (x1, x2, x3, x4, ..., xn) of the character is obtained according to the touch point set A. To further improve the accuracy of gesture recognition, x1, x2, x3, x4, ..., xn preferably characterize the direction with an 8-dimensional direction code; for example, (1, 2, 3, 4, 5, 6, 7, 8) may be used to represent the eight different directions.
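The sampling-and-quantization step above can be sketched as follows. Mapping the movement angle between sampled touch points onto one of eight 45-degree sectors is one common way to obtain an 8-dimensional direction code; the sector numbering and sampling interval here are assumptions:

```python
import math

def direction_codes(points, n=2):
    """Quantize the movement between every n-th touch point into one of
    eight directions, numbered 1..8 (numbering is illustrative).
    Assumes a mathematical y-axis (y grows upward)."""
    sampled = points[::n]
    codes = []
    for (x0, y0), (x1, y1) in zip(sampled, sampled[1:]):
        angle = math.atan2(y1 - y0, x1 - x0) % (2 * math.pi)
        # Shift by half a sector so each code covers a 45-degree wedge
        # centered on its axis direction.
        codes.append(int(((angle + math.pi / 8) % (2 * math.pi)) // (math.pi / 4)) + 1)
    return codes

# Rightward then upward movement, sampled every 2 points.
print(direction_codes([(0, 0), (5, 0), (10, 0), (10, 5), (10, 10)]))
```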
In an embodiment of the invention, the character feature set includes one or more combinations of a number of inflection points Δ I, a relative distance Δ R between contours, and an angle variation value Δ a between specific strokes/inflection points of a character. Preferably, the character feature set may be expressed as F ═ (Δ I, Δ R, Δ a).
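One plausible way to derive such a feature set F = (ΔI, ΔR, ΔA) is sketched below, under the assumption that an "inflection point" is any sampled point where the movement direction turns by more than a threshold angle; the threshold and the contour measure (bounding-box diagonal) are invented for illustration:

```python
import math

def turn_angles(points):
    """Unsigned direction change (radians) at each interior touch point."""
    angles = []
    for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        d = abs(a2 - a1) % (2 * math.pi)
        angles.append(min(d, 2 * math.pi - d))
    return angles

def feature_set(points, turn_threshold=math.pi / 4):
    """Hypothetical F = (inflection count, contour diagonal, total angle change)."""
    angles = turn_angles(points)
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    contour = math.hypot(max(xs) - min(xs), max(ys) - min(ys))
    return (sum(a > turn_threshold for a in angles), contour, sum(angles))

# An "L"-shaped track: one sharp 90-degree turn.
print(feature_set([(0, 0), (0, 10), (0, 20), (10, 20), (20, 20)]))
```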
In an embodiment of the present invention, in the direction code feature library, the direction code vectors may be separately stored as a matching library, and the character feature set may be separately stored as a feature set database.
And step S2, when the human-computer interaction equipment receives gesture operation, acquiring a direction code vector and a character feature set of the character to be recognized corresponding to the gesture operation.
Specifically, for the current gesture operation, the direction code vector and character feature set of the character to be recognized are extracted first. The extraction algorithm is consistent with the one used when constructing the direction code feature library. The direction code vector of the character to be recognized is represented as D' = (y1, y2, y3, y4, ..., yn), and the character feature set is represented as F' = (ΔI', ΔR', ΔA').
Step S3, calculating the similarity between the character to be recognized and the character in the direction code feature library based on the direction code vector, and calculating the distance between the character to be recognized and the character in the direction code feature library based on the character feature set.
In an embodiment of the present invention, the similarity is a cosine similarity and the distance is a Mahalanobis distance. Specifically, the cosine similarity is calculated according to Similarityc(D, D') = cos(D − (D + D')/2, D' − (D + D')/2), and the Mahalanobis distance is calculated according to Distance = √((F1 − μ)^T Σ⁻¹ (F1 − μ)), wherein D represents the direction code vector of the character to be recognized, D' represents the direction code vector of the character in the direction code feature library, F1 = (F, F')^T, F represents the character feature set of the character in the direction code feature library, F' represents the character feature set of the character to be recognized, μ represents the mean value of F1, and Σ denotes the covariance matrix of F1.
Step S4, if the similarity corresponding to a certain character in the direction code feature library is greater than the first similarity threshold and the corresponding distance is greater than the second similarity threshold, identifying the character corresponding to the gesture operation as the character.
Specifically, traversing characters in the direction code feature library, and judging whether the cosine similarity and the mahalanobis distance corresponding to the characters are respectively greater than a first similarity threshold and a second similarity threshold; determining that the character is the character corresponding to the gesture operation only when the cosine similarity and the Mahalanobis distance are respectively greater than the first similarity threshold and the second similarity threshold; and if the characters in the direction code feature library are traversed and no character with the cosine similarity and the Mahalanobis distance respectively larger than the first similarity threshold and the second similarity threshold exists, identifying the character corresponding to the gesture operation as an invalid character.
In an embodiment of the present invention, the first similarity threshold and the second similarity threshold are set according to an actually used character set. Preferably, the first similarity threshold and the second similarity threshold have the same value.
In order to further reduce computational complexity and system power consumption, as shown in fig. 2, in an embodiment of the present invention the cosine similarity is calculated first for each character in the direction code feature library, and the Mahalanobis distance is calculated only when the cosine similarity is greater than the first similarity threshold. That is, for each character in the direction code feature library, the corresponding cosine similarity is computed first. If the cosine similarity is not greater than the first similarity threshold, the character is judged not to be the character corresponding to the gesture operation, and the next character is matched; if the cosine similarity is greater than the first similarity threshold, the character may be the character corresponding to the gesture operation, and the Mahalanobis distance is then calculated for accurate identification. If the Mahalanobis distance is greater than the second similarity threshold, the character is the character corresponding to the gesture operation; otherwise, the character is judged not to be the character corresponding to the gesture operation, and the next character is matched, until all characters have been traversed. Direction code matching is thus a prerequisite of the gesture recognition method, and further recognition is performed by combining features such as character contour distance, inflection point features and angle change, which effectively improves the accuracy of gesture recognition and reduces the false recognition rate.
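The two-stage traversal above can be sketched as follows; the library contents, thresholds, and the two scoring callbacks are stand-ins for the patent's actual similarity and distance computations (note the patent's convention that a distance *greater* than the second threshold indicates a match):

```python
def recognize(gesture_sim, gesture_dist, library, t1, t2):
    """Two-stage matching: compute the cheap similarity first, and the
    costlier distance only for characters passing the first threshold.
    gesture_sim / gesture_dist are stand-in scoring callbacks."""
    for char in library:
        if gesture_sim(char) <= t1:
            continue  # fails the cheap test: skip the distance entirely
        if gesture_dist(char) > t2:
            return char  # both criteria met: recognized
    return None  # traversal exhausted: invalid character

# Hypothetical scores for a three-character library.
sims = {"A": 0.95, "B": 0.40, "C": 0.91}
dists = {"A": 0.2, "B": 0.9, "C": 0.8}

# "A" passes the similarity test but fails the distance test; "B" is
# skipped without a distance computation; "C" passes both.
print(recognize(sims.get, dists.get, ["A", "B", "C"], t1=0.9, t2=0.5))
```

Ordering the tests this way keeps the per-gesture cost low on devices with limited processing performance, since the distance (which needs the covariance matrix) is evaluated for only a few candidates.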
As shown in fig. 3, in an embodiment, the gesture recognition system of the present invention is applied to a human-computer interaction device, and includes a construction module 31, an acquisition module 32, a calculation module 33, and a recognition module 34.
The building module 31 is configured to obtain a direction code vector and a character feature set of a character, and build a direction code feature library of the character.
Specifically, firstly, a direction code feature library of the character is constructed, and the direction code feature library comprises a direction code vector and a character feature set of the character. That is, for each character, its direction code vector and character feature set are collected first.
It should be noted that the characters refer to font-like units or symbols, including letters, numbers, operation symbols, punctuation marks and other symbols, and some functional symbols.
In an embodiment of the present invention, obtaining the direction code vector of the character includes the following steps:
11) when characters are input on the human-computer interaction equipment through gesture operation, acquiring the coordinate information of a contact point on the current frame at intervals of a preset number of frames.
Specifically, when characters are input on the human-computer interaction device through gestures, the coordinates of the touch point on the current frame are acquired at intervals of a preset number of frames N, thereby constructing a touch point set A = {(x1, y1), (x1+N, y1+N), ..., (xm, ym), ..., (xk−N, yk−N), (xk, yk)}. The setting of N may be character-dependent in order to obtain certain specific location information. m, k and N are natural numbers.
12) And acquiring the direction code vectors of the characters according to the contact coordinate information of the preset number.
Specifically, a direction code vector D = (x1, x2, x3, x4, ..., xn) of the character is obtained according to the touch point set A. To further improve the accuracy of gesture recognition, x1, x2, x3, x4, ..., xn preferably characterize the direction with an 8-dimensional direction code; for example, (1, 2, 3, 4, 5, 6, 7, 8) may be used to represent the eight different directions.
In an embodiment of the invention, the character feature set includes one or more combinations of a number of inflection points Δ I, a relative distance Δ R between contours, and an angle variation value Δ a between specific strokes/inflection points of a character. Preferably, the character feature set may be expressed as F ═ (Δ I, Δ R, Δ a).
In an embodiment of the present invention, in the direction code feature library, the direction code vectors may be separately stored as a matching library, and the character feature set may be separately stored as a feature set database.
The obtaining module 32 is configured to obtain, when the human-computer interaction device receives a gesture operation, a direction code vector and a character feature set of a character to be recognized corresponding to the gesture operation.
Specifically, for the current gesture operation, the direction code vector and character feature set of the character to be recognized are extracted first. The extraction algorithm is consistent with the one used when constructing the direction code feature library. The direction code vector of the character to be recognized is represented as D' = (y1, y2, y3, y4, ..., yn), and the character feature set is represented as F' = (ΔI', ΔR', ΔA').
The calculating module 33 is connected to the constructing module 31 and the obtaining module 32, and is configured to calculate cosine similarity between the character to be recognized and the characters in the direction code feature library based on the direction code vector, and calculate mahalanobis distance between the character to be recognized and the characters in the direction code feature library based on the character feature set.
In an embodiment of the present invention, the similarity is a cosine similarity and the distance is a Mahalanobis distance. Specifically, the cosine similarity is calculated according to Similarityc(D, D') = cos(D − (D + D')/2, D' − (D + D')/2), and the Mahalanobis distance is calculated according to Distance = √((F1 − μ)^T Σ⁻¹ (F1 − μ)), wherein D represents the direction code vector of the character to be recognized, D' represents the direction code vector of the character in the direction code feature library, F1 = (F, F')^T, F represents the character feature set of the character in the direction code feature library, F' represents the character feature set of the character to be recognized, μ represents the mean value of F1, and Σ denotes the covariance matrix of F1.
The recognition module 34 is connected to the calculation module 33, and configured to recognize a character corresponding to the gesture operation as the character when the similarity corresponding to a certain character in the direction code feature library is greater than a first similarity threshold and the corresponding distance is greater than a second similarity threshold.
Specifically, traversing characters in the direction code feature library, and judging whether the cosine similarity and the mahalanobis distance corresponding to the characters are respectively greater than a first similarity threshold and a second similarity threshold; determining that the character is the character corresponding to the gesture operation only when the cosine similarity and the Mahalanobis distance are respectively greater than the first similarity threshold and the second similarity threshold; and if the characters in the direction code feature library are traversed and no character with the cosine similarity and the Mahalanobis distance respectively larger than the first similarity threshold and the second similarity threshold exists, identifying the character corresponding to the gesture operation as an invalid character.
In an embodiment of the present invention, the first similarity threshold and the second similarity threshold are set according to an actually used character set. Preferably, the first similarity threshold and the second similarity threshold have the same value.
It should be noted that the division of the modules of the above apparatus is only a logical division, and the actual implementation may be wholly or partially integrated into one physical entity, or may be physically separated. And these modules can be realized in the form of software called by processing element; or may be implemented entirely in hardware; and part of the modules can be realized in the form of calling software by the processing element, and part of the modules can be realized in the form of hardware. For example, the x module may be a processing element that is set up separately, or may be implemented by being integrated in a chip of the apparatus, or may be stored in a memory of the apparatus in the form of program code, and the function of the x module may be called and executed by a processing element of the apparatus. Other modules are implemented similarly. In addition, all or part of the modules can be integrated together or can be independently realized. The processing element described herein may be an integrated circuit having signal processing capabilities. In implementation, each step of the above method or each module above may be implemented by an integrated logic circuit of hardware in a processor element or an instruction in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above methods, such as one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs). For another example, when one of the above modules is implemented by a processing element scheduling program code, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or another processor capable of calling program code. For another example, these modules may be integrated together and implemented in the form of a system-on-chip (SoC).
The storage medium of the present invention stores thereon a computer program that realizes the above-described gesture recognition method when executed by a processor. Preferably, the storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic disk, U-disk, memory card, or optical disk.
As shown in fig. 4, in an embodiment, the human-computer interaction device of the present invention includes: a processor 41 and a memory 42.
The memory 42 is used for storing computer programs.
The memory 42 includes various media that can store program code, such as a ROM, a RAM, a magnetic disk, a USB flash disk, a memory card, or an optical disk.
The processor 41 is connected to the memory 42, and is configured to execute the computer program stored in the memory 42, so as to enable the human-computer interaction device to execute the gesture recognition method described above.
Preferably, the processor 41 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
In an embodiment of the present invention, the human-computer interaction device includes one of, or a combination of, a smartphone, a tablet computer, an industrial personal computer, and a TDDI (touch and display driver integration) device.
In summary, the gesture recognition method, system, storage medium, and human-computer interaction device of the present invention employ similarity-matching mechanisms such as direction-code similarity calculation, character outline distance, inflection-point features, and angle change to recognize gestures accurately; the multi-dimensional classifier design effectively reduces the false recognition rate of gesture recognition; and the method meets the gesture recognition needs of human-computer interaction devices that require high response speed but have limited processing performance. The present invention therefore effectively overcomes various shortcomings of the prior art and has high industrial utilization value.
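The two-stage matching summarized above can be sketched roughly as follows. This is a minimal illustration, not the patent's reference implementation: plain cosine similarity stands in for the claimed direction-code similarity, the feature distance is left as a pluggable callable, and all names and thresholds are assumptions.

```python
import numpy as np

def match_character(code_vec, feat_vec, library, sim_thresh, dist_thresh, feature_distance):
    """Two-stage matcher sketch (cf. claims 1 and 7): cheap direction-code
    similarity first; the costlier feature distance is computed only for
    candidates that pass the first threshold.

    library: iterable of (char, template_code_vec, template_feat_vec).
    Returns the recognized character, or None for an invalid gesture.
    """
    a = np.asarray(code_vec, dtype=float)
    for char, tmpl_code, tmpl_feat in library:
        b = np.asarray(tmpl_code, dtype=float)
        sim = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
        if sim <= sim_thresh:
            continue  # claim 7: skip the distance computation entirely
        # claim 1 as worded: a match also requires the distance to exceed
        # the second threshold
        if feature_distance(feat_vec, tmpl_feat) > dist_thresh:
            return char
    return None  # claim 6: no character passed both thresholds -> invalid
```

With a library of one template and a stub distance, an identical direction-code vector matches and an orthogonal one is rejected at the first stage.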
The foregoing embodiments merely illustrate the principles and effects of the present invention and are not intended to limit it. Anyone skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by persons of ordinary skill in the art without departing from the spirit and technical ideas disclosed by the present invention shall still be covered by the claims of the present invention.

Claims (10)

1. A gesture recognition method, applied to a human-computer interaction device, characterized in that the method comprises the following steps:
acquiring a direction code vector and a character feature set for each character, and constructing a direction code feature library of the characters;
when the human-computer interaction device receives a gesture operation, acquiring the direction code vector and the character feature set of the character to be recognized corresponding to the gesture operation;
calculating the similarity between the character to be recognized and the characters in the direction code feature library based on the direction code vectors, and calculating the distance between the character to be recognized and the characters in the direction code feature library based on the character feature sets; and
if the similarity corresponding to a character in the direction code feature library is greater than a first similarity threshold and the corresponding distance is greater than a second similarity threshold, recognizing the character corresponding to the gesture operation as that character.
2. The gesture recognition method according to claim 1, characterized in that acquiring the direction code vector of a character comprises:
when the character is input on the human-computer interaction device through a gesture operation, acquiring the touch-point coordinate information of the current frame once every preset number of frames; and
acquiring the direction code vector of the character according to the acquired touch-point coordinate information.
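Claim 2 samples touch-point coordinates every preset number of frames but does not fix a concrete encoding. The sketch below is an illustrative assumption that quantizes the movement between successive sampled points into eight Freeman-style direction codes; the function name and sampling convention are not the patent's.

```python
import math

def direction_codes(points, step=2):
    """Quantize movement between sampled touch points into 8 direction codes.

    points: list of (x, y) touch coordinates; every `step`-th point is kept,
    mimicking sampling once every preset number of frames.
    Returns integers 0-7 (0 = +x direction, counting counter-clockwise).
    Illustrative sketch; the patent does not specify this encoding.
    """
    sampled = points[::step]
    codes = []
    for (x0, y0), (x1, y1) in zip(sampled, sampled[1:]):
        angle = math.atan2(y1 - y0, x1 - x0)            # range -pi .. pi
        codes.append(int(round(angle / (math.pi / 4))) % 8)
    return codes
```

For a stroke that moves right and then up, the code vector is a run of 0s followed by 2s.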
3. The gesture recognition method according to claim 1, characterized in that the character feature set comprises one or more of: the number of inflection points of the character, the relative distance of the character outline, and the angle change between specific strokes/inflection points.
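As a rough, hypothetical illustration of such a feature set: inflection points can be counted as large turns in the sampled trajectory, and a bounding-box aspect ratio can stand in for the outline's relative distance. The function name, turn threshold, and aspect-ratio stand-in are all assumptions, not the patent's definitions.

```python
import math

def character_features(points, turn_threshold=math.pi / 3):
    """Toy feature set for a stroke: (inflection count, outline aspect ratio).

    Counts an inflection wherever the heading between consecutive segments
    turns by more than `turn_threshold`. Hypothetical illustration only.
    """
    angles = [math.atan2(y1 - y0, x1 - x0)
              for (x0, y0), (x1, y1) in zip(points, points[1:])]
    inflections = sum(
        1 for a0, a1 in zip(angles, angles[1:])
        # wrap the heading difference into (-pi, pi] before thresholding
        if abs((a1 - a0 + math.pi) % (2 * math.pi) - math.pi) > turn_threshold
    )
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    aspect = (max(xs) - min(xs)) / max(max(ys) - min(ys), 1e-9)
    return inflections, aspect
```

An "L"-shaped stroke (down, then right) yields one inflection and an aspect ratio of 1 when its width equals its height.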
4. The gesture recognition method according to claim 1, characterized in that the similarity is calculated according to Similarity_c(D, D') = cos(D − (D + D')/2, D' − (D + D')/2), wherein cos(·, ·) denotes the cosine of the angle between two vectors, D represents the direction code vector of the character to be recognized, and D' represents the direction code vector of a character in the direction code feature library.
5. The gesture recognition method according to claim 1, characterized in that the distance is calculated according to

distance(F₁) = √((F₁ − μ)ᵀ Σ⁻¹ (F₁ − μ)),

wherein F₁ = (F, F')ᵀ, F represents the character feature set of a character in the direction code feature library, F' represents the character feature set of the character to be recognized, μ represents the mean of F₁, and Σ represents the covariance matrix of F₁.
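Given that claim 5 defines the distance through the mean μ and covariance matrix Σ of the stacked feature vector F₁, a natural reading is the Mahalanobis distance; that interpretation is an assumption (the source shows the formula only as an image), and it can be computed as follows with illustrative names:

```python
import numpy as np

def mahalanobis_distance(f1, mu, sigma):
    """Mahalanobis distance of the stacked feature vector F1 = (F, F')^T
    from the mean mu under covariance matrix sigma (claim 5, as read here)."""
    diff = np.asarray(f1, dtype=float) - np.asarray(mu, dtype=float)
    return float(np.sqrt(diff @ np.linalg.inv(sigma) @ diff))
```

With an identity covariance the result reduces to the ordinary Euclidean distance; a non-identity covariance rescales each feature by its spread.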
6. The gesture recognition method according to claim 1, characterized in that if no character in the direction code feature library has a similarity greater than the first similarity threshold and a corresponding distance greater than the second similarity threshold, the character corresponding to the gesture operation is recognized as an invalid character.
7. The gesture recognition method according to claim 1, characterized in that the similarity of the characters in the direction code feature library is calculated first, and the distance is calculated only when the similarity is greater than the first similarity threshold.
8. A gesture recognition system, applied to a human-computer interaction device, characterized in that the system comprises a construction module, an acquisition module, a calculation module, and a recognition module;
the construction module is used for acquiring a direction code vector and a character feature set for each character and constructing a direction code feature library of the characters;
the acquisition module is used for acquiring the direction code vector and the character feature set of the character to be recognized corresponding to a gesture operation when the human-computer interaction device receives the gesture operation;
the calculation module is used for calculating the similarity between the character to be recognized and the characters in the direction code feature library based on the direction code vectors, and calculating the distance between the character to be recognized and the characters in the direction code feature library based on the character feature sets;
the recognition module is used for recognizing the character corresponding to the gesture operation as a given character when the similarity corresponding to that character in the direction code feature library is greater than a first similarity threshold and the corresponding distance is greater than a second similarity threshold.
9. A storage medium having stored thereon a computer program, characterized in that the computer program, when executed by a processor, implements the gesture recognition method of any one of claims 1 to 7.
10. A human-computer interaction device, comprising: a processor and a memory;
the memory is used for storing a computer program;
the processor is used for executing the computer program stored in the memory, so that the human-computer interaction device performs the gesture recognition method of any one of claims 1 to 7.
CN202010237880.XA 2020-03-30 2020-03-30 Gesture recognition method and system, storage medium and man-machine interaction device Pending CN111459395A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010237880.XA CN111459395A (en) 2020-03-30 2020-03-30 Gesture recognition method and system, storage medium and man-machine interaction device


Publications (1)

Publication Number Publication Date
CN111459395A true CN111459395A (en) 2020-07-28

Family

ID=71683421

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010237880.XA Pending CN111459395A (en) 2020-03-30 2020-03-30 Gesture recognition method and system, storage medium and man-machine interaction device

Country Status (1)

Country Link
CN (1) CN111459395A (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1851730A (en) * 2006-05-25 2006-10-25 Wudi Technology (Xi'an) Co., Ltd. Word recognition method and its system
US20090285490A1 (en) * 2008-05-13 2009-11-19 Fujitsu Limited Dictionary creating apparatus, recognizing apparatus, and recognizing method
CN103677642A (en) * 2013-12-19 2014-03-26 深圳市汇顶科技股份有限公司 Touch screen terminal and method and system for identifying hand gestures of touch screen terminal
CN103995665A (en) * 2014-04-14 2014-08-20 深圳市汇顶科技股份有限公司 Mobile terminal and method and system for getting access to application programs in ready mode
CN109992106A (en) * 2019-01-10 2019-07-09 Beijing University of Technology Gesture track recognition method, electronic equipment and storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Tang Jingren: "Continuous Character Trajectory Gesture Recognition and Its Application", China Masters' Theses Full-text Database, Information Science and Technology Series *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113064545A (en) * 2021-05-18 2021-07-02 清华大学 Gesture recognition method and system
CN113064545B (en) * 2021-05-18 2022-04-29 清华大学 Gesture recognition method and system

Similar Documents

Publication Publication Date Title
CN106682598B (en) Multi-pose face feature point detection method based on cascade regression
Zeng et al. Hand gesture recognition using leap motion via deterministic learning
US10679146B2 (en) Touch classification
Paisitkriangkrai et al. Pedestrian detection with spatially pooled features and structured ensemble learning
Liu et al. Kinect-based hand gesture recognition using trajectory information, hand motion dynamics and neural networks
Zeng et al. Curvature bag of words model for shape recognition
US11301674B2 (en) Stroke attribute matrices
CN110785753B (en) Method, apparatus and storage medium for searching image
GB2462903A (en) Single Stroke Character Recognition
Ghosh et al. Language-invariant novel feature descriptors for handwritten numeral recognition
Nasri et al. A novel approach for dynamic hand gesture recognition using contour-based similarity images
Mohammadi et al. Air-writing recognition system for Persian numbers with a novel classifier
CN114937285B (en) Dynamic gesture recognition method, device, equipment and storage medium
Verma et al. Grassmann manifold based dynamic hand gesture recognition using depth data
Yang et al. Parsing 3D motion trajectory for gesture recognition
Mohammadi et al. Real-time Kinect-based air-writing system with a novel analytical classifier
Golovanov et al. Combining hand detection and gesture recognition algorithms for minimizing computational cost
Park et al. Unified convolutional neural network for direct facial keypoints detection
Gheitasi et al. Estimation of hand skeletal postures by using deep convolutional neural networks
CN112749576B (en) Image recognition method and device, computing equipment and computer storage medium
Singh et al. A Temporal Convolutional Network for modeling raw 3D sequences and air-writing recognition
CN113780140A (en) Gesture image segmentation and recognition method and device based on deep learning
CN111459395A (en) Gesture recognition method and system, storage medium and man-machine interaction device
Li et al. A novel art gesture recognition model based on two channel region-based convolution neural network for explainable human-computer interaction understanding
CN113468972B (en) Handwriting track segmentation method for handwriting recognition of complex scene and computer product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200728