CN110309726A - A micro-gesture recognition method - Google Patents

A micro-gesture recognition method

Info

Publication number
CN110309726A
CN110309726A
Authority
CN
China
Prior art keywords
gesture
data
carried out
curve
identification
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910496492.0A
Other languages
Chinese (zh)
Other versions
CN110309726B (en)
Inventor
孙元功
郭小沛
冯志全
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Jinan
Original Assignee
University of Jinan
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Jinan filed Critical University of Jinan
Priority to CN201910496492.0A priority Critical patent/CN110309726B/en
Publication of CN110309726A publication Critical patent/CN110309726A/en
Application granted Critical
Publication of CN110309726B publication Critical patent/CN110309726B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention provides a micro-gesture recognition method that uses a data glove as the input device; the data glove can obtain the rotation-angle data of m joints. The method comprises the following steps: a. first carry out data preprocessing, fitting the acquired data to curves by the least-squares method; b. carry out gesture modeling, so that every gesture is represented by a discrete subsequence SLZ of m elements; c. carry out gesture recognition, matching and identifying gestures by an edit distance based on template matching. The present invention shifts recognition from the traditional large-amplitude movements of the palm and arm toward fine finger movements, providing a more natural and harmonious solution for human-computer interaction in general environments.

Description

A micro-gesture recognition method
Technical field
The present invention relates to the technical field of image recognition, in particular to gesture recognition, and more particularly to a micro-gesture recognition method.
Background technique
Gesture control based on conventional human-computer interaction suffers from poor interactivity and low accuracy, and easily increases the user's operational load; gesture interaction based on touch technology has therefore become a current research hotspot. With the popularity of mobile devices such as smartphones, watches and tablet computers, finger-touch input has become the main input mode of many mobile computing devices [1,2,3].
Whether vision-based or based on wearable devices, most current research and applications require the user to perform complex, large-amplitude hand and arm actions so that the machine can "see" and "understand" the user's operational intent. Although this frees the user from the constraints of mouse and keyboard, it also increases the interaction load to some extent: prolonged mid-air operation easily causes fatigue, which in turn makes recognition accuracy hard to maintain [4]. Finger-movement interaction based on wearable devices is now becoming mainstream: it not only avoids the influence of illumination and occlusion on gesture recognition, but its small-scale finger-movement recognition mode also offers a more natural interaction style for human-machine coordination.
Aiming at the unnaturalness of traditional interaction means, the present invention proposes a micro-gesture recognition model. Small finger-joint movement data are acquired through a wearable device, the data glove, and treated as deterministic time series; the trend features of the sequences are extracted for gesture recognition. Finally, gestures are clustered according to the features that distinguish micro-gestures, ordinary gestures and noise gestures, so that the user can interact with the machine more accurately and sensitively in a natural posture.
Summary of the invention
Addressing the technical issues mentioned above, and in order to enhance the naturalness and accuracy of interaction while improving the gesture recognition rate, the present invention refers to the small-amplitude gestures acting in the effective interaction space as "micro-gestures". Human-computer interaction is carried out with small but rich finger movements, including but not limited to two-dimensional contact gestures. This mode is more accurate and natural than traditional interaction, and the invention accordingly provides a simpler micro-gesture recognition method.
The present invention is achieved through the following technical solution. A micro-gesture recognition method is provided that uses a data glove as the input device; the data glove can obtain the rotation-angle data of m joints. The method comprises the following steps:
A. First carry out data preprocessing, fitting the acquired data to curves by the least-squares method. The polynomial of the fitted curve is
f(t) = a_0 + a_1 t + a_2 t^2 + … + a_n t^n,
where n is the polynomial order; through experimental analysis the order is chosen so that the fitting effect is best, i.e. the sum of squared differences between curve values and true values is smallest, without over-fitting;
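As a minimal sketch of step a, the least-squares fit can be done with NumPy's polyfit, which minimizes exactly the squared-error criterion described above. The function name, the joint-angle array and the order value are illustrative, not taken from the patent's data:

```python
import numpy as np

def fit_joint_curve(angles, order):
    """Least-squares polynomial fit of one joint's angle sequence.

    angles : 1-D array of per-frame rotation angles for a single joint
    order  : polynomial order n (the patent reports n = 8 worked well)
    Returns the fitted (smoothed) values and the sum of squared residuals.
    """
    t = np.arange(len(angles))
    coeffs = np.polyfit(t, angles, order)   # minimizes the squared error
    fitted = np.polyval(coeffs, t)
    sse = float(np.sum((fitted - angles) ** 2))
    return fitted, sse
```

Sweeping the order and comparing the returned error reproduces the order-selection experiment: the error shrinks as the order grows, until over-fitting sets in.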
B. Carry out gesture modeling. Since the acquired gesture data describe the small motion changes of the hand joints, every gesture can be regarded as m deterministic time series, the length of each group of data being the number n of curve samples obtained in step a, denoted TI = {(t_1, Y_1), (t_2, Y_2), …, (t_n, Y_n)}, where t_i is the i-th frame and Y_i the value at that moment. To describe the variation trend of every group of data quantitatively, every gesture comprises m groups of data, namely m interval sequences, denoted s_1, s_2, …, s_m. The variation trend of a gesture within interval s_i (1 ≤ i ≤ m) is denoted sc_i, with sc_i ∈ {sc_up, sc_dw, sc_st, sc_pk, sc_th}, quantized respectively as 1, 2, 3, 4, 5, where sc_up denotes an upward trend, sc_dw a downward trend, sc_st a flat trend, sc_pk a peak, and sc_th a valley. For the m groups of data of a gesture, the values of the start point, middle point and end point of the data on each curve segment are first obtained, denoted V_1, V_2 and V_3 respectively, together with the maximum V_max and minimum V_min of that group, and the trend of every curve segment is judged according to Table 1 [24]. Combining the segment trends yields the trend symbol queue of the gesture, denoted SL(T) = {(s_1, sc_1), (s_2, sc_2), …, (s_i, sc_i)}, where sc_i ∈ {sc_up, sc_dw, sc_st, sc_pk, sc_th} (1 ≤ i ≤ m). Since the five states are quantized as the integers 1 to 5, every gesture can be represented by a discrete subsequence SLZ of m elements;
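The trend quantization of step B can be sketched in Python. The exact decision rules of Table 1 [24] are not reproduced in the text, so the comparisons below (start point V1, end point V3 and the segment extrema) are a plausible reconstruction rather than the patent's actual table:

```python
def segment_trend(values, eps=1e-6):
    """Map one curve segment to a trend code 1..5.

    1 = sc_up (rising), 2 = sc_dw (falling), 3 = sc_st (flat),
    4 = sc_pk (interior peak), 5 = sc_th (interior valley).
    Uses the start point V1, end point V3 and the extrema Vmax/Vmin,
    as in step B; the exact thresholds are assumptions.
    """
    v1, v3 = values[0], values[-1]
    vmax, vmin = max(values), min(values)
    if vmax > max(v1, v3) + eps:       # maximum strictly inside the segment
        return 4
    if vmin < min(v1, v3) - eps:       # minimum strictly inside the segment
        return 5
    if v3 > v1 + eps:                  # net rise from start to end
        return 1
    if v3 < v1 - eps:                  # net fall from start to end
        return 2
    return 3                           # essentially flat

def gesture_slz(segments):
    """Trend subsequence SLZ: one code per interval s_1..s_m."""
    return [segment_trend(s) for s in segments]
```

Applied to the m = 15 joint curves of one gesture, gesture_slz returns the m-element discrete subsequence SLZ described above.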
C. Carry out gesture recognition: gestures are matched and identified by an edit distance based on template matching. The edit distance between two strings is the minimum number of edit operations needed to change one into the other, and can therefore serve for similarity comparison; the permitted edit operations are substituting one character for another, inserting a character, and deleting a character. In general, the smaller the edit distance, the greater the similarity of the two strings. The dynamic-programming equation of the edit distance can be expressed as
Dis(i, j) = max(i, j) if min(i, j) = 0,
Dis(i, j) = min(Dis(i−1, j) + 1, Dis(i, j−1) + 1, Dis(i−1, j−1) + cost) otherwise,
where cost = 0 if the i-th element of gesture 1 equals the j-th element of gesture 2, and cost = 1 otherwise. The edit distance of gesture 1 and gesture 2 is expressed as Dis(1, 2); the similarity of the two gestures is then expressed by ρ (ρ ∈ [0, 1]). If ρ > ε, gesture 1 and gesture 2 are regarded as the same micro-gesture, and vice versa, where ε is the similarity threshold. ρ is shown in formula (1):
ρ = 1 − Dis(1, 2) / max(X, Y),
where X and Y are the sequence lengths of gesture 1 and gesture 2 respectively. If two groups of data are recognized as the same gesture, their similarity should approach or equal 1.
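The matching step can be sketched as follows. The Levenshtein recurrence is the standard one; the normalization ρ = 1 − Dis(1, 2)/max(X, Y) is inferred from the surrounding definitions (ρ ∈ [0, 1], equal to 1 for identical sequences), and the threshold value ε = 0.8 is an assumption, since no numeric ε is given in the text:

```python
def edit_distance(a, b):
    """Minimum number of insertions, deletions and substitutions
    turning trend subsequence a into b (classic Levenshtein DP)."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i                          # delete all of a's prefix
    for j in range(n + 1):
        d[0][j] = j                          # insert all of b's prefix
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,           # deletion
                          d[i][j - 1] + 1,           # insertion
                          d[i - 1][j - 1] + cost)    # substitution
    return d[m][n]

def similarity(a, b):
    """rho = 1 - Dis(1,2) / max(X, Y); identical sequences give 1.0."""
    if not a and not b:
        return 1.0
    return 1.0 - edit_distance(a, b) / max(len(a), len(b))

def same_micro_gesture(a, b, eps=0.8):
    """Template-matching decision: same gesture iff rho > eps (eps assumed)."""
    return similarity(a, b) > eps
```

Because the trend codes abstract away absolute joint angles, this comparison depends only on the shape of the curves, which is the translation-invariance property claimed for the method.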
With the above scheme, step a of the present invention proceeds from the observation that, ideally, if finger or hand motion were tracked with sufficient accuracy, the motion trajectory would be a curve continuous in both time and space; a complete gesture therefore has temporal and spatial features. The present invention regards a micro-gesture as the combined motion of m joints of the human hand; each group of movement can be described as a time series, so every gesture can be represented as the combination of m time series, which reflects the variation characteristics of the gesture in temporal order. During gesture collection, differences between users cause slight variations and noise to remain among gestures of the same kind. For better recognition, the present invention therefore preprocesses the data: the hand-joint angle-change data obtained by the data glove are curve-fitted by the least-squares method, which resolves the confusion between hand jitter and micro-gestures and reduces the influence of data jumps on the recognition result.
Preferably, the number m of joints for which the data glove can obtain rotation-angle data is 15.
In conclusion, the present invention captures the movement data of the hand joints with a data glove: the data are accurate and unaffected by illumination and occlusion, and effective pattern classification is carried out without a large number of training samples. Micro-gesture recognition is treated as a measurement of time-series similarity: after hand jitter is removed by curve smoothing, the trend-similarity feature of the series is used for gesture recognition and matching. This feature concerns only the shape of the curve and is unaffected by noise or specific values; it is translation-invariant, and the time complexity is low. Experiments achieved a 97% recognition rate on 10 kinds of touch-based micro-gestures. The invention shifts recognition from the traditional large-amplitude movements of the palm and arm toward fine finger movements, providing a more natural and harmonious solution for human-computer interaction in general environments.
Detailed description of the invention
Fig. 1 is the algorithm framework diagram of the micro-gesture recognition method of the present invention;
Fig. 2 is the algorithm flow diagram of the micro-gesture recognition method of the present invention;
Fig. 3 is a schematic diagram of the hand-joint layout corresponding to the data glove used in the present invention;
Fig. 4 is a schematic diagram of the fitted curve with order 2 in the present invention;
Fig. 5 is a schematic diagram of the fitted curve with order 5 in the present invention;
Fig. 6 is a schematic diagram of the fitted curve with order 8 in the present invention;
Fig. 7 is a schematic diagram of the fitted curve with order 15 in the present invention;
Fig. 8 is an example chart of a curve trend in the present invention;
Fig. 9 is the curve trend chart of gesture No. 2 in the present invention;
Fig. 10 is a schematic diagram of the edit-distance algorithm in the present invention;
Fig. 11 is a schematic diagram of the discrete subsequence of m elements representing each gesture of the present invention.
Specific embodiment
To clearly illustrate the technical features of the present solution, the scheme is further described below through specific embodiments with reference to the accompanying drawings.
As shown in Fig. 1 and Fig. 2, a micro-gesture recognition method uses a data glove as the input device. The data glove can obtain the rotation-angle data of 15 joints, laid out as shown in Fig. 3; the correspondence between serial numbers and joints is: 1-3, the three joints of the thumb from bottom to top; 4-6, the three joints of the index finger from bottom to top; 7-9, the three joints of the middle finger from bottom to top; 10-12, the three joints of the ring finger from bottom to top; 13-15, the three joints of the little finger from bottom to top. The present invention comprises the following steps:
A. First carry out data preprocessing, fitting the acquired data to curves by the least-squares method. The polynomial of the fitted curve is
f(t) = a_0 + a_1 t + a_2 t^2 + … + a_n t^n,
where n is the polynomial order; through experimental analysis the order is chosen so that the fitting effect is best, i.e. the sum of squared differences between curve values and true values is smallest, without over-fitting;
Each joint corresponds to one group of data (the quantity equals the number of captured frames). Plotting the data in a rectangular coordinate system shows that they exhibit a certain "trend", and the degree of the fitting polynomial is determined according to this trend; the fit returns the polynomial coefficients. First the complete movement data contained in one acquisition of all gestures are curve-fitted; each joint's data form one curve, representing the angle-variation trend of the corresponding joint from the start to the end of the movement, fitted with a least-squares polynomial. As the polynomial order increases, the fitted curve becomes smoother and closer to the true values, but over-fitting appears once the order exceeds the required maximum. Figs. 4 to 7 show the fitted curves of joint No. 12 for the "three-finger release" gesture with orders 2, 5, 8 and 15 respectively; these orders were sampled during parameter tuning. Experiments show that the polynomial order used in this embodiment is 8, at which the fitting effect is best: the sum of squared differences between curve values and true values is smallest, and there is no over-fitting;
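The order-tuning procedure described above can be sketched as a simple sweep over candidate orders; the stopping tolerance and the synthetic test signal are assumptions for illustration, not values from the patent:

```python
import numpy as np

def choose_order(angles, max_order=15, tol=0.02):
    """Pick the smallest polynomial order whose squared-error sum stops
    improving by more than a relative tolerance `tol` - a simple stand-in
    for the manual tuning described in the text (order 8 was chosen there).
    Returns (chosen order, its sum of squared errors)."""
    t = np.arange(len(angles))
    prev_sse = None
    for order in range(1, max_order + 1):
        coeffs = np.polyfit(t, angles, order)
        sse = float(np.sum((np.polyval(coeffs, t) - angles) ** 2))
        # stop once the improvement over the previous order is negligible
        if prev_sse is not None and prev_sse - sse <= tol * max(prev_sse, 1e-12):
            return order - 1, prev_sse
        prev_sse = sse
    return max_order, prev_sse
```

On data that are genuinely polynomial of low degree, the sweep stops right after that degree, mirroring the "smallest error without over-fitting" criterion.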
B. Carry out gesture modeling. Since the acquired gesture data describe the small motion changes of the hand joints, every gesture can be regarded as m deterministic time series, the length of each group of data being the number n of curve samples obtained in step a, denoted TI = {(t_1, Y_1), (t_2, Y_2), …, (t_n, Y_n)}, where t_i is the i-th frame and Y_i the value at that moment. To describe the variation trend of every group of data quantitatively, every gesture comprises m groups of data, namely m interval sequences, denoted s_1, s_2, …, s_m. The variation trend of a gesture within interval s_i (1 ≤ i ≤ m) is denoted sc_i, with sc_i ∈ {sc_up, sc_dw, sc_st, sc_pk, sc_th}, quantized respectively as 1, 2, 3, 4, 5, where sc_up denotes an upward trend, sc_dw a downward trend, sc_st a flat trend, sc_pk a peak, and sc_th a valley. For the m groups of data of a gesture, the values of the start point, middle point and end point of the data on each curve segment are first obtained, denoted V_1, V_2 and V_3 respectively, together with the maximum V_max and minimum V_min of that group, and the trend of every curve segment is judged according to Table 1 [24]. Combining the segment trends yields the trend symbol queue of the gesture, denoted SL(T) = {(s_1, sc_1), (s_2, sc_2), …, (s_i, sc_i)}, where sc_i ∈ {sc_up, sc_dw, sc_st, sc_pk, sc_th} (1 ≤ i ≤ m). Since the five states are quantized as the integers 1 to 5, every gesture can be represented by a discrete subsequence SLZ of m elements, referring specifically to Fig. 11;
The algorithm can then be described as follows:
As shown in Fig. 8, the curve is divided into 11 segments, each corresponding to one of the five different trends, and is expressed as sequence A according to Table 1; thus SLZ(A) = {1, 4, 5, 1, 2, 1, 2, 5, 1, 2, 3}, and the sequence SLZ(A) is called the trend subsequence of this gesture. In general, every sample of a gesture can be represented by such a trend subsequence; the trend subsequence of the gesture G shown in Fig. 4 is then Ga = {a_1, a_2, …, a_15}, where a_i is the trend symbol of the i-th curve segment of the current gesture. Fig. 9 shows the curve trends of gesture No. 2 and their trend-symbol representation;
C. Carry out gesture recognition: gestures are matched and identified by the edit distance based on template matching, whose dynamic-programming equation may be expressed as
Dis(i, j) = max(i, j) if min(i, j) = 0,
Dis(i, j) = min(Dis(i−1, j) + 1, Dis(i, j−1) + 1, Dis(i−1, j−1) + cost) otherwise,
where cost = 0 if the i-th element of gesture 1 equals the j-th element of gesture 2, and cost = 1 otherwise.
The algorithm can also be represented as in Fig. 10.
The edit distance of gesture 1 and gesture 2 is expressed as Dis(1, 2); the similarity of the two gestures is then expressed by ρ (ρ ∈ [0, 1]). If ρ > ε, gesture 1 and gesture 2 are regarded as the same micro-gesture, and vice versa, where ε is the similarity threshold. ρ is shown in formula (3): ρ = 1 − Dis(1, 2) / max(X, Y).
Here X and Y are the sequence lengths of gesture 1 and gesture 2 respectively; the similarity of the two sequences shown in Fig. 2 is accordingly 0.2. Conversely, if two groups of data are recognized as the same gesture, their similarity should approach or equal 1.
Finally, it should also be noted that the examples and explanations above are not limited to the above embodiments; technical features of the present invention that are not described can be implemented by or with the prior art and are not repeated here. The above embodiments and drawings merely illustrate the technical solution of the present invention and are not a limitation of it; the invention has been described in detail with reference to preferred embodiments. Those of ordinary skill in the art should understand that variations, modifications, additions or substitutions made within the essential scope of the present invention, without departing from its spirit, shall also belong to the claims of the invention. It should also be noted that other gesture recognition methods are referred to in the present invention; the documents cited for these methods are listed below:
[1]Heo S,Gu J,Lee G.Expanding touch input vocabulary by using consecutive distant taps[C]//Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.ACM,2014:2597-2606.
[2]Walter R,Bailly G,Müller J.StrikeAPose:revealing mid-air gestures on public displays[C]//Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.ACM,2013:841-850.
[3]Chen X A,Schwarz J,Harrison C,et al.Air+touch:interweaving touch & in-air gestures[C]//Proceedings of the 27th annual ACM symposium on User interface software and technology.ACM,2014:519-525.
[4]Liu Jie,Huang Jin,Tian Feng,et al.Hybrid gesture interaction model in the continuous interaction space[J].Journal of Software,2017,28(8):2080-2095.
[5]Hinckley K,Song H.Sensor synaesthesia:touch in motion,and motion in touch[C]//Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.ACM,2011:801-810.
[6]Lai J,Zhang D.ExtendedThumb:a motion-based virtual thumb for improving one-handed target acquisition on touch-screen mobile devices[C]// Proceedings of the extended abstracts of the 32nd annual ACM conference on Human factors in computing systems.ACM,2014:1825-1830.
[7]Oakley I,Lee D.Interaction on the edge:offset sensing for small devices[C]//Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.ACM,2014:169-178.
[8]Perrin S,Cassinelli A,Ishikawa M.Gesture recognition using laser- based tracking system[C]//Sixth IEEE International Conference on Automatic Face and Gesture Recognition,2004.Proceedings.IEEE,2004:541-546.
[9]Perrin S,Cassinelli A,Ishikawa M.Gesture recognition using laser- based tracking system[C]//Sixth IEEE International Conference on Automatic Face and Gesture Recognition,2004.Proceedings.IEEE,2004:541-546.
[10]Turk M.Gesture recognition in handbook of virtual environment technology[J]. Stanney(Ed.),2001:223-238.
[11]Vafadar M,Behrad A.Human hand gesture recognition using motion orientation histogram for interaction of handicapped persons with computer [C]//International Conference on Image and Signal Processing.Springer,Berlin, Heidelberg,2008: 378-385.
[12]Chen X A,Schwarz J,Harrison C,et al.Air+touch:interweaving touch& in-air gestures[C]//Proceedings of the 27th annual ACM symposium on User interface software and technology.ACM,2014:519-525.
[13]Niikura T,Hirobe Y,Cassinelli A,et al.In-air typing interface for mobile devices with vibration feedback[C]//ACM SIGGRAPH 2010 Emerging Technologies.ACM, 2010:15.
[14]Ketabdar H,Yüksel K A,Roshandel M.MagiTact:interaction with mobile devices based on compass(magnetic)sensor[C]//Proceedings of the 15th international conference on Intelligent user interfaces.ACM,2010:413-414.
[15]Marquardt N,Jota R,Greenberg S,et al.The continuous interaction space:interaction techniques unifying touch and gesture on and above a digital surface[C]//IFIP Conference on Human-Computer Interaction.Springer,Berlin,Heidelberg,2011:461-476.
[16]Kristensson P O,Denby L C.Continuous recognition and visualization of pen strokes and touch-screen gestures[C]//Proceedings of the Eighth Eurographics Symposium on Sketch-Based Interfaces and Modeling.ACM, 2011:95-102.
[17]Kratz S,Rohs M.Protractor3D:a closed-form solution to rotation- invariant 3D gestures[C]//Proceedings of the 16th international conference on Intelligent user interfaces.ACM,2011:371-374.
[18]LaViola Jr J J.An introduction to 3D gestural interfaces[C]//ACM SIGGRAPH 2014 Courses.ACM,2014:25.
[19]Chung H,Yang H D.Conditional random field-based gesture recognition with depth information[J].Optical Engineering,2013,52(1):017201.
[20]Pang H,Ding Y.Dynamic hand gesture recognition using kinematic features based on hidden markov model[C]//Proceedings of the 2nd International Conference on Green Communications and Networks 2012(GCN 2012):Volume 5.Springer,Berlin,Heidelberg,2013:255-262.
[21]Loclair C,Gustafson S,Baudisch P.PinchWatch:a wearable device for one-handed microinteractions[C]//Proc.MobileHCI.2010,10.
[22]Ashbrook D,Baudisch P,White S.Nenya:subtle and eyes-free mobile input with a magnetically-tracked finger ring[C]//Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.ACM,2011:2043-2046.
[23]Jia Xiaoyong,Xu Chuansheng,Bai Xin.The foundation of the least-squares method and its way of thinking[J].Journal of Northwest University,2006,36(3):507-511.
[24]Xiao Rui,Liu Guohua.Trend-based time-series similarity measurement and clustering research[J].Application Research of Computers,2014,31(9):2600-2605.
[25]LaViola Jr J J.An introduction to 3D gestural interfaces[C]//ACM SIGGRAPH 2014 Courses.ACM,2014:25:1-42.

Claims (2)

1. A micro-gesture recognition method, using a data glove as the input device, the data glove being able to obtain the rotation-angle data of m joints, characterized by comprising the following steps:
A. First carry out data preprocessing, fitting the acquired data to curves by the least-squares method. The polynomial of the fitted curve is
f(t) = a_0 + a_1 t + a_2 t^2 + … + a_n t^n,
where n is the polynomial order; through experimental analysis the order is chosen so that the fitting effect is best, i.e. the sum of squared differences between curve values and true values is smallest, without over-fitting;
B. Carry out gesture modeling. Since the acquired gesture data describe the small motion changes of the hand joints, every gesture can be regarded as m deterministic time series, the length of each group of data being the number n of curve samples obtained in step a, denoted TI = {(t_1, Y_1), (t_2, Y_2), …, (t_n, Y_n)}, where t_i is the i-th frame and Y_i the value at that moment. To describe the variation trend of every group of data quantitatively, every gesture comprises m groups of data, namely m interval sequences, denoted s_1, s_2, …, s_m. The variation trend of a gesture within interval s_i (1 ≤ i ≤ m) is denoted sc_i, with sc_i ∈ {sc_up, sc_dw, sc_st, sc_pk, sc_th}, quantized respectively as 1, 2, 3, 4, 5, where sc_up denotes an upward trend, sc_dw a downward trend, sc_st a flat trend, sc_pk a peak, and sc_th a valley. For the m groups of data of a gesture, the values of the start point, middle point and end point of the data on each curve segment are first obtained, denoted V_1, V_2 and V_3 respectively, together with the maximum V_max and minimum V_min of that group, and the trend of every curve segment is judged. Combining the segment trends yields the trend symbol queue of the gesture, denoted SL(T) = {(s_1, sc_1), (s_2, sc_2), …, (s_i, sc_i)}, where sc_i ∈ {sc_up, sc_dw, sc_st, sc_pk, sc_th} (1 ≤ i ≤ m). Since the five states are quantized as the integers 1 to 5, every gesture can be represented by a discrete subsequence SLZ of m elements;
C. Carry out gesture recognition: the matching and identification of gestures are performed by an edit distance based on template matching, whose dynamic-programming equation may be expressed as
Dis(i, j) = max(i, j) if min(i, j) = 0,
Dis(i, j) = min(Dis(i−1, j) + 1, Dis(i, j−1) + 1, Dis(i−1, j−1) + cost) otherwise,
where cost = 0 if the i-th element of gesture 1 equals the j-th element of gesture 2, and cost = 1 otherwise. The edit distance of gesture 1 and gesture 2 is expressed as Dis(1, 2); the similarity of the two gestures is then expressed by ρ (ρ ∈ [0, 1]). If ρ > ε, gesture 1 and gesture 2 are regarded as the same micro-gesture, and vice versa, where ε is the similarity threshold. ρ is shown in formula (3):
ρ = 1 − Dis(1, 2) / max(X, Y),
where X and Y are the sequence lengths of gesture 1 and gesture 2 respectively. If two groups of data are recognized as the same gesture, their similarity should approach or equal 1.
2. The micro-gesture recognition method according to claim 1, characterized in that the number m of joints for which the data glove can obtain rotation-angle data is 15.
CN201910496492.0A 2019-06-10 2019-06-10 Micro-gesture recognition method Expired - Fee Related CN110309726B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910496492.0A CN110309726B (en) 2019-06-10 2019-06-10 Micro-gesture recognition method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910496492.0A CN110309726B (en) 2019-06-10 2019-06-10 Micro-gesture recognition method

Publications (2)

Publication Number Publication Date
CN110309726A true CN110309726A (en) 2019-10-08
CN110309726B CN110309726B (en) 2022-09-13

Family

ID=68075877

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910496492.0A Expired - Fee Related CN110309726B (en) 2019-06-10 2019-06-10 Micro-gesture recognition method

Country Status (1)

Country Link
CN (1) CN110309726B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111382699A (en) * 2020-03-09 2020-07-07 金陵科技学院 Dynamic gesture recognition method based on particle swarm optimization LSTM algorithm
CN111475030A (en) * 2020-05-25 2020-07-31 北京理工大学 Micro-gesture recognition method using near-infrared sensor
CN115454240A * 2022-09-05 2022-12-09 无锡雪浪数制科技有限公司 Metaverse virtual reality interaction experience system and method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8971572B1 (en) * 2011-08-12 2015-03-03 The Research Foundation For The State University Of New York Hand pointing estimation for human computer interaction
US20160078289A1 (en) * 2014-09-16 2016-03-17 Foundation for Research and Technology - Hellas (FORTH) (acting through its Institute of Computer Gesture Recognition Apparatuses, Methods and Systems for Human-Machine Interaction
CN109032337A * 2018-06-28 2018-12-18 University of Jinan KEM gesture recognition algorithm based on a data glove
CN109271840A * 2018-07-25 2019-01-25 Xidian University Video gesture classification method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8971572B1 (en) * 2011-08-12 2015-03-03 The Research Foundation For The State University Of New York Hand pointing estimation for human computer interaction
US20160078289A1 (en) * 2014-09-16 2016-03-17 Foundation for Research and Technology - Hellas (FORTH) (acting through its Institute of Computer Gesture Recognition Apparatuses, Methods and Systems for Human-Machine Interaction
CN109032337A * 2018-06-28 2018-12-18 University of Jinan KEM gesture recognition algorithm based on a data glove
CN109271840A * 2018-07-25 2019-01-25 Xidian University Video gesture classification method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
XIAOPEI GUO: "A Novel Method for Data Glove-Based Dynamic Gesture Recognition", 2017 International Conference on Virtual Reality and Visualization (ICVRV) *
Yan Limin et al.: "Design and application of a tilt-based finger-gesture recognition system", Optoelectronic Technology *
Liu Jie et al.: "3D gesture recognition algorithm based on template matching", Journal of Computer-Aided Design & Computer Graphics *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111382699A (en) * 2020-03-09 2020-07-07 金陵科技学院 Dynamic gesture recognition method based on particle swarm optimization LSTM algorithm
CN111475030A (en) * 2020-05-25 2020-07-31 北京理工大学 Micro-gesture recognition method using near-infrared sensor
CN115454240A * 2022-09-05 2022-12-09 无锡雪浪数制科技有限公司 Metaverse virtual reality interaction experience system and method
CN115454240B * 2022-09-05 2024-02-13 无锡雪浪数制科技有限公司 Metaverse virtual reality interaction experience system and method

Also Published As

Publication number Publication date
CN110309726B (en) 2022-09-13

Similar Documents

Publication Publication Date Title
Dong et al. Dynamic hand gesture recognition based on signals from specialized data glove and deep learning algorithms
Huang et al. Improved Viola-Jones face detection algorithm based on HoloLens
CN106598227B (en) Gesture identification method based on Leap Motion and Kinect
Xia et al. Vision-based hand gesture recognition for human-robot collaboration: a survey
Zhu et al. Vision based hand gesture recognition using 3D shape context
TWI382352B (en) Video based handwritten character input device and method thereof
CN110309726A (en) A kind of micro- gesture identification method
EP2605844A2 (en) Method circuit and system for human to machine interfacing by hand gestures
Magrofuoco et al. Two-dimensional stroke gesture recognition: A survey
CN109145802B (en) Kinect-based multi-person gesture man-machine interaction method and device
Shrivastava et al. Control of A Virtual System with Hand Gestures
WO2017114002A1 (en) Device and method for inputting one-dimensional handwritten text
Linqin et al. Dynamic hand gesture recognition using RGB-D data for natural human-computer interaction
Caramiaux et al. Beyond recognition: using gesture variation for continuous interaction
CN108628455B (en) Virtual sand painting drawing method based on touch screen gesture recognition
CN109543644A (en) A kind of recognition methods of multi-modal gesture
CN113342208B (en) Railway line selection method based on multi-point touch equipment, terminal and storage medium
KR20190027287A (en) The method of mimesis for keyboard and mouse function using finger movement and mouth shape
CN111860086A (en) Gesture recognition method, device and system based on deep neural network
Zhong et al. Rapid 3D conceptual design based on hand gesture
CN112596659B (en) Drawing method and device based on intelligent voice and image processing
Jiang et al. A brief analysis of gesture recognition in VR
Xu et al. A novel multimedia human-computer interaction (HCI) system based on kinect and depth image understanding
Wang et al. SPFEMD: super-pixel based finger earth mover’s distance for hand gesture recognition
Lu et al. Dynamic hand gesture recognition using HMM-BPNN model

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20220913