CN107169411B - Real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW - Google Patents

Real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW Download PDF

Info

Publication number
CN107169411B
Authority
CN
China
Prior art keywords
gestures
gesture
frame images
key frame
dtw
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710224005.6A
Other languages
Chinese (zh)
Other versions
CN107169411A (en)
Inventor
程春玲
刘杨俊武
周剑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Posts and Telecommunications
Original Assignee
Nanjing University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Posts and Telecommunications
Priority to CN201710224005.6A
Publication of CN107169411A
Application granted
Publication of CN107169411B

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW. In the dynamic gesture feature-extraction stage, the motion direction of the dynamic gesture is introduced into the selection of key gesture frames, and the key-frame selection threshold is adjusted dynamically according to the variation trend of the gesture; on this basis, the fingertip features of each key gesture frame are extracted by exploiting the local-extremum property of fingertips combined with convex-hull filtering. In the dynamic gesture recognition stage, a conversion method is proposed for computing the DTW lower-bound distance between two-dimensional gesture sequences of unequal length, and a method is provided for setting the pairing range of the gesture data during the DTW computation. The invention jointly considers the trajectory and structural features of dynamic gestures, and shortens the computation time of the dynamic gesture recognition process through key-frame selection, the constructed DTW lower-bound distance, and the pairing range set for the gesture data.

Description

Real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW
Technical field
The present invention relates to a real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW, and belongs to the field of human-computer interaction.
Background art
As human-computer interaction technology matures, gesture recognition has been widely applied in sign language recognition, intelligent transportation, gaming and entertainment, and many other areas. According to the gesture-acquisition device, current gesture recognition techniques can be divided into data-glove-based and vision-based approaches. Vision-based gesture recognition requires no bulky glove hardware and supports more flexible, natural human-computer interaction, and has therefore become a current research focus. The technique is broadly divided into three stages: gesture segmentation, gesture feature extraction, and gesture recognition. Feature extraction expresses the gesture and provides the basis for recognition, while recognition distinguishes gestures and is the key path to realizing gesture-based interaction. The feature-extraction and recognition algorithms are therefore the key factors determining the accuracy and real-time performance of gesture recognition.
Dynamic gesture feature extraction removes redundant data from the gesture images to obtain a series of numerically expressed features that describe the gesture's essential attributes, and expresses the gesture through those features. Ganapathyraju et al. proposed a dynamic gesture feature-extraction algorithm based on convexity defects: for each defect of the convex hull, the presence of a fingertip is judged from the correlation among four features — the start point, the center point, the farthest point, and the distance from the farthest point to the hull. However, that algorithm only counts fingertips and cannot obtain their locations, is vulnerable to noisy data, and has poor accuracy on similar gestures. Li Bonan et al. proposed an improved k-curvature feature-extraction algorithm that filters out the fingertip candidates of high curvature, partitions them into subsets with a clustering algorithm, and takes the central point of each subset as the fingertip location; but it depends on a manually set curvature threshold, and clustering the candidate fingertip points consumes considerable computing time. Pathak et al. proposed a key-frame-based feature-extraction algorithm that compares the centroid distance between consecutive gesture frames and performs feature extraction on the frames whose distance exceeds a threshold; the key-frame extraction threshold, however, relies on prior knowledge, and because different dynamic gestures move at different rates, a suitable threshold is hard to determine. Ding Hongli et al. proposed a key-frame extraction algorithm that determines the threshold from the mean and variance of the pixel difference between consecutive frames over the entire video; but during gesture recognition the length of a dynamic-gesture video is unknown, so that algorithm cannot be used to select key gesture frames for dynamic gestures.
Dynamic gesture recognition judges the interaction semantics of a gesture by computing, in some manner, the correlation between its motion trajectory and predefined templates. The DTW-based dynamic gesture recognition algorithm, a nonlinear time-warping template-matching method, bends the time axis of the input gesture sequence to reach maximum overlap with the gesture template, eliminating timing differences between gestures and improving recognition accuracy; but its computational complexity and computation load are high, its recognition time is affected by the number of template matches and the gesture sequence length, and real-time recognition is hard to achieve. To reduce the number of DTW distance computations, Zheng Xu proposed a method for computing a DTW lower-bound distance: a wavelet-entropy-based piecewise aggregate approximation of time series reduces sequences of different lengths to equal-length sequences, and an improved lower-bound distance function then prunes the DTW computations between sequences of low similarity; that algorithm, however, cannot be used to compute the DTW lower-bound distance between gesture sequences. He et al. set a fixed boundary width for the gesture data during the DTW computation to reduce its cost; but the width depends on human prior knowledge — if it is too small, the DTW distance may deviate too much and cause misrecognition, while if it is too large the time saved is negligible.
In conclusion, in existing dynamic gesture recognition techniques, feature extraction must obtain information expressing the motion trajectory and contour structure of the gesture from every frame of the gesture video, which is computationally expensive and lengthens feature extraction. Meanwhile, in the recognition stage, the DTW-based algorithm has high computational complexity, and its recognition time is affected by the number of template matches and the gesture sequence length, reducing the real-time performance of gesture recognition.
Summary of the invention
The technical problem to be solved by the invention is to provide a real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW, which extracts the trajectory and structural features of a dynamic gesture from its kinetic characteristics and the local-extremum property of gesture fingertips, and computes the DTW distance between the gesture to be recognized and the gesture templates by constructing a DTW lower-bound distance and setting a boundary width for the gesture data, shortening gesture recognition time while obtaining the recognition result.
To solve the above technical problem, the present invention adopts the following technical scheme: the present invention devises a real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW, comprising the following steps:
Step 1) chooses key gesture frames for the dynamic gesture image sequence to be recognized according to the kinetic characteristics of the dynamic gesture;
In step 1), the key gesture frames are chosen for the sequence to be recognized according to the following steps;
Step 101) Let the dynamic gesture image sequence be I_Input = {I_1, I_2, ..., I_n}, where n is the sequence length and I_t, t ∈ [1, n], denotes the t-th gesture frame; g′_t denotes the centroid of the t-th frame and (x′_t, y′_t) its coordinates; δ_t denotes the key-frame selection threshold of the t-th frame; and dist(t_1, t_2) denotes the centroid distance between the t_1-th and t_2-th gesture frames;
Step 102) For the dynamic gesture image sequence, calculate the sine and cosine of the centroid motion angle θ_t between consecutive gesture frames from the centroid displacement and the centroid distance dist(t, t−1);
Step 103) For the dynamic gesture image sequence, calculate the relative motion direction dir_t between consecutive gesture frames;
Step 104) For gesture frame I_t, judge whether dir_t ≠ dir_{t−1}; if so, take I_t as a key gesture frame; otherwise judge whether dist(t, t−1) > δ_{t−1}, and if so take I_t as a key gesture frame; otherwise I_t is not taken as a key gesture frame;
Step 105) Update the key-frame selection threshold δ_t as the average of all inter-frame centroid distances preceding the current frame;
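For illustration, the following is a minimal Python sketch of the key-frame selection logic of steps 101) to 105). The four-way direction quantization, the initial threshold value, and the helper names are assumptions of this sketch (the patent's dir_t formula and threshold initialization are not reproduced in this text); the threshold update follows the running-mean rule described in the technical-effects discussion below.

```python
import math

def select_key_frames(centroids, init_threshold=2.0):
    """Sketch of steps 101)-105): a frame becomes a key frame when the
    motion direction changes or the centroid displacement exceeds a
    dynamically updated threshold (running mean of displacements)."""
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    def direction(a, b):
        # Hypothetical 4-way quantization of the motion angle theta_t;
        # the patent's exact dir_t formula is not reproduced here.
        dx, dy = b[0] - a[0], b[1] - a[1]
        if abs(dx) >= abs(dy):
            return 0 if dx >= 0 else 2
        return 1 if dy >= 0 else 3

    keys = [0]                      # the first frame is always kept
    threshold = init_threshold      # assumed initial value
    dists, prev_dir = [], None
    for t in range(1, len(centroids)):
        d = dist(centroids[t - 1], centroids[t])
        cur_dir = direction(centroids[t - 1], centroids[t])
        if (prev_dir is not None and cur_dir != prev_dir) or d > threshold:
            keys.append(t)
        dists.append(d)
        threshold = sum(dists) / len(dists)  # running-mean update (step 105)
        prev_dir = cur_dir
    return keys
```

Called on the centroid list of the worked example later in the description, the function returns the 0-based indices of the selected key frames; the exact output depends on the assumed initial threshold.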
Step 2) obtains the fingertip feature points of each key gesture frame and, combined with the centroid of the key frame, constructs the gesture feature vector of that frame; the dynamic gesture feature vector corresponding to the gesture to be recognized is then further constructed;
Step 3) computes, for the dynamic gesture feature vector, the DTW lower-bound distance between each dynamic-gesture template feature vector in the gesture template library and the feature vector to be recognized, and obtains the template feature vectors whose DTW lower-bound distance meets the preset requirement;
In step 3), the following steps are executed for each template feature vector in the library to compute its DTW lower-bound distance to the feature vector to be recognized and to judge that distance, thereby obtaining the template feature vectors corresponding to DTW lower-bound distances meeting the preset requirement;
Step 301) Let the dynamic gesture feature vector to be recognized be V_I = {v_1, ..., v_n}, where v_μ = (g′_μ, f_μ), μ ∈ [1, n], and n is the length of V_I; G_I is the set of all centroid coordinates in V_I, X_I its projection on the horizontal axis and Y_I its projection on the vertical axis; C_I is the set of all fingertip features in V_I. Let a dynamic-gesture template feature vector in the library be V_T = {v_1, ..., v_m}, where v_k = (g_k, f_k), k ∈ [1, m], and m is the length of V_T; G_T is the set of all centroid coordinates in V_T, X_T its horizontal projection and Y_T its vertical projection; C_T is the set of all fingertip features in V_T. Let max be the maximum of n and m, and min their minimum;
Step 302) relocates G_I and G_T to obtain gesture trajectory sequences G_I′ and G_T′ with the same starting point;
Step 303) uses interpolation to prepend max+1−n and max+1−m copies of the starting-point coordinate to G_I′ and G_T′ respectively, obtaining gesture sequences G_I+ = {G_I*, G_I′} and G_T+ = {G_T*, G_T′} of length max+1, where G_I* and G_T* denote the prepended coordinates;
Step 304) calculates the variation of G_I+ in the horizontal and vertical directions:
diff_x = max(X_I+) − min(X_I+), diff_y = max(Y_I+) − min(Y_I+)
where max(·) denotes the maximum of a sequence and min(·) its minimum;
Step 305) If diff_x ≥ diff_y, calculate the DTW lower-bound distance LB_D(X_I+, X_T+) of X_I+ and X_T+ using the LB_Keogh algorithm; otherwise calculate the DTW lower-bound distance LB_D(Y_I+, Y_T+) of Y_I+ and Y_T+;
Step 306) judges whether LB_D(X_I+, X_T+) or LB_D(Y_I+, Y_T+) is greater than the current preset minimum DTW distance; if so, the dynamic-gesture template feature vector does not meet the preset requirement; otherwise it meets the preset requirement;
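The LB_Keogh bound referenced in step 305) is the standard envelope-based DTW lower bound of Keogh and Ratanamahatana (cited at the end of the description). A compact sketch for two equal-length one-dimensional projection sequences, assuming a squared-Euclidean DTW cost and a Sakoe-Chiba warping radius r:

```python
import numpy as np

def lb_keogh(q, c, r):
    """LB_Keogh lower bound on DTW(q, c) for equal-length 1-D sequences.
    Points of q falling outside the envelope [lower, upper] built from a
    radius-r window around c contribute to the bound."""
    q, c = np.asarray(q, float), np.asarray(c, float)
    lb = 0.0
    for i in range(len(q)):
        lo, hi = max(0, i - r), min(len(c), i + r + 1)
        upper, lower = c[lo:hi].max(), c[lo:hi].min()
        if q[i] > upper:
            lb += (q[i] - upper) ** 2
        elif q[i] < lower:
            lb += (q[i] - lower) ** 2
    return lb ** 0.5  # assumes a DTW distance defined as a root of summed squares
```

Because the lower bound never exceeds the true DTW distance, a template whose bound already exceeds the current minimum DTW distance can be discarded without computing the full DTW, which is exactly the pruning of step 306).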
Step 4), for each dynamic-gesture template feature vector corresponding to a DTW lower-bound distance meeting the preset requirement, separately calculates the DTW distance between that template feature vector and the dynamic gesture feature vector to be recognized, and based on the DTW distance realizes recognition of the dynamic gesture to be recognized.
As a preferred technical solution of the present invention, step 2) comprises the following steps:
Step 201) Let cs_k denote the gesture contour point set of the k-th key gesture frame, N_k the number of contour points in the k-th key frame, c_k,λ the λ-th contour point of the k-th key frame with coordinates c_k,λ = (x_k,λ, y_k,λ), λ ∈ [1, N_k], and g_k the centroid of the k-th key gesture frame;
Step 202) For each key gesture frame, calculate the distance dist(c_k,λ, g_k) between each contour point c_k,λ of the frame and the centroid g_k of the corresponding key frame;
Step 203) For each key gesture frame, and for each contour point c_k,λ of that frame, first obtain the contour points c_k,λ′ among all contour points of the frame that satisfy |c_k,λ − c_k,λ′| < ε; then judge whether every such c_k,λ′ satisfies dist(c_k,λ′, g_k) ≤ dist(c_k,λ, g_k); if so, add c_k,λ to the candidate fingertip point set of the key frame, and otherwise leave c_k,λ unprocessed; completing this operation for every contour point of the frame yields the candidate fingertip point set of the frame, and hence the candidate fingertip point set of every key gesture frame; here ε > 0 is a preset range threshold;
Step 204) For each key gesture frame, calculate the convex hull curve hull_k corresponding to the frame, thereby obtaining the convex hull curve hull_k of every key gesture frame;
Step 205) For each key gesture frame, and for each candidate fingertip point in the frame's candidate set, add the point to the fingertip feature point set of the frame if it belongs to the frame's convex hull curve; completing this operation for every candidate point yields the fingertip feature point set of the frame, and hence the fingertip feature point set of every key gesture frame;
Step 206) For each key gesture frame, obtain the number of points in the frame's fingertip feature point set and combine it with the centroid of the frame to construct the gesture feature vector of that key frame; the dynamic gesture feature vector corresponding to the gesture to be recognized is then further constructed.
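A minimal OpenCV-based sketch of the fingertip extraction of steps 201) to 206). Two simplifications are assumptions of this sketch: the ε-neighbourhood of step 203) is taken as an index window along the contour rather than the coordinate-space condition |c_k,λ − c_k,λ′| < ε, and hull membership in step 205) is tested against the hull vertices returned by cv2.convexHull.

```python
import numpy as np
import cv2

def fingertip_points(contour, centroid, eps=7):
    """Candidate fingertips (step 203): contour points whose distance to
    the centroid is a local maximum within an eps-point window along the
    contour; kept only if they are convex-hull vertices (step 205)."""
    pts = contour.reshape(-1, 2)
    d = np.linalg.norm(pts - np.asarray(centroid, float), axis=1)
    n = len(pts)
    candidates = [i for i in range(n)
                  if all(d[(i + k) % n] <= d[i]
                         for k in range(-eps, eps + 1) if k)]
    hull = {tuple(p) for p in cv2.convexHull(contour).reshape(-1, 2)}
    return [tuple(pts[i]) for i in candidates if tuple(pts[i]) in hull]
```

The gesture feature of step 206) would then be the pair (centroid, number of returned fingertip points) for each key frame.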
As a preferred technical solution of the present invention, in step 4), for each dynamic-gesture template feature vector corresponding to a DTW lower-bound distance meeting the preset requirement, the following steps are executed on the basis of the definition below, and recognition of the dynamic gesture to be recognized is realized based on the DTW distance;
Define the gesture direction sequence FG = {subFG_1, ..., subFG_α}, subFG = {dir_ω, len_ω, band_ω}, ω ∈ {1, ..., α}, where α is the number of subsequences, dir_ω is the motion direction of the ω-th subsequence, len_ω is the number of consecutive trajectory data sharing the same motion direction, and band_ω is the boundary width of the subsequence;
Step 401) calculates the coordinates of the vector between adjacent gesture trajectory data g′_μ and g′_{μ−1} in G_I′, then uses the Freeman-4 chain code to calculate the direction code value c′_μ of that vector;
Finally, all consecutive gesture data in G_I′ with identical code values are merged to obtain several gesture direction subsequences subFG_I′, which are combined into the gesture direction sequence FG_I′ of G_I′; the gesture direction sequence FG_T′ of G_T′ is constructed in the same way;
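A sketch of the chain coding and run-length merging of step 401). The code assignment (0: +x, 1: +y, 2: −x, 3: −y) and the tie-breaking rule are assumptions of this sketch, since the patent's c′_μ formula is not reproduced in this text:

```python
def freeman4(points):
    """Quantize each displacement between consecutive trajectory points
    into a Freeman-4 direction code: 0 -> +x, 1 -> +y, 2 -> -x, 3 -> -y.
    Ties are broken toward the y axis (an assumption)."""
    codes = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if abs(dy) >= abs(dx):
            codes.append(1 if dy >= 0 else 3)
        else:
            codes.append(0 if dx >= 0 else 2)
    return codes

def direction_runs(codes):
    """Merge consecutive identical codes into (dir, len) pairs, yielding
    the direction subsequences that make up the sequence FG of step 401)."""
    runs = []
    for c in codes:
        if runs and runs[-1][0] == c:
            runs[-1][1] += 1
        else:
            runs.append([c, 1])
    return [tuple(r) for r in runs]
```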
Step 402) compares the subsequences with the same index in FG_I′ and FG_T′; if the direction code values of the subsequences are identical, select the larger of the two subsequence lengths as the candidate boundary width cband, then compare cband with the permitted boundary-width upper bound iband and select the smaller of the two as the boundary width of all gesture data corresponding to the subsequence; otherwise the boundary width of all gesture data corresponding to the subsequence is set to iband;
Step 403) finds, within the boundary-width range of each gesture datum, the optimal warping path based on the idea of dynamic programming, obtaining the DTW distance DTW(G_I′, G_T′) of G_I′ and G_T′;
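A sketch of the boundary-constrained dynamic-programming search of step 403); band[i] is the per-datum boundary width produced by step 402), and the (n, 2)-array trajectory interface is an assumption of this sketch:

```python
import numpy as np

def banded_dtw(a, b, band):
    """DTW between trajectories a (n x 2) and b (m x 2) in which the i-th
    datum of a may only pair with indices j satisfying |i - j| <= band[i]
    (the boundary width chosen per subsequence in step 402)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        w = int(band[i - 1])
        for j in range(max(1, i - w), min(m, i + w) + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

Restricting j to the band reduces the filled cells of the cost matrix from n·m to roughly the sum of the band widths, which is the source of the claimed reduction in DTW computation.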
Step 404) sorts all computed DTW distances in descending order; if there exists a DTW distance whose absolute difference from the minimum DTW distance is less than a threshold, jump to step 405); otherwise take the dynamic gesture template with the minimum DTW distance as the recognition result;
Step 405) calculates the structural distance between C_I and C_T: following the same optimal warping path as the DTW computation, the fingertip features f_I and f_T at each path node are XORed and the results accumulated, obtaining the structural distance CD(C_I, C_T); finally, the dynamic gesture template with the smallest structural distance is selected as the recognition result.
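Read against the worked example later in the description, the accumulated XOR of step 405) amounts to counting the aligned pairs whose fingertip numbers differ (this reading reproduces CD(C_I, C_1) = 6 and CD(C_I, C_2) = 0 there). A sketch under that assumption:

```python
def structural_distance(f_a, f_b, path):
    """Step 405) under the reading above: accumulate, over the nodes
    (i, j) of the optimal warping path, a mismatch indicator between
    the fingertip numbers f_a[i] and f_b[j]."""
    return sum(1 for i, j in path if f_a[i] != f_b[j])

# e.g. with f_a = [5] * 6, f_b = [1] * 6 and a diagonal 6-node path,
# structural_distance returns 6, matching CD(C_I, C_1) = 6.
```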
Compared with the prior art, the real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW of the present invention, adopting the above technical scheme, has the following technical effects:
(1) The method designed by the present invention considers the motion direction and motion rate of the dynamic gesture: by comparing the motion direction between consecutive gesture frames, frames at which the direction changes are also taken as key frames for feature extraction; meanwhile, the key-frame selection threshold is dynamically adjusted by computing the average of all centroid distances before the current frame, reducing the loss of key gesture information and improving the accuracy and real-time performance of feature extraction;
(2) Considering that gesture fingertip points are local extrema, the method constructs a distance function combined with convex-hull filtering to find the local extremum points on the gesture contour curve and determine the fingertip features present in the gesture, reducing the time complexity of fingertip feature extraction and shortening extraction time while guaranteeing high accuracy;
(3) The method provides a way of converting two-dimensional gesture sequences of unequal length into equal-length one-dimensional gesture sequences, so that computing the DTW lower-bound distance can prune the DTW distance computations between gesture sequences of low similarity, shortening recognition time;
(4) According to the relationship between the motion directions of the dynamic gesture sequences, the method uses the Freeman chain code to determine the pairing range of each gesture datum in the sequence, reducing the DTW computation and shortening recognition time.
Detailed description of the invention
Fig. 1 is a flow diagram of the real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW designed by the present invention;
Fig. 2 is a schematic diagram of fingertip feature extraction in one embodiment of the invention.
Specific embodiment
Specific embodiments of the present invention will be described in further detail with reference to the accompanying drawings of the specification.
As shown in Fig. 1, the present invention devises a real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW; in practical application it specifically comprises the following steps.
Step 1) chooses key gesture frames for the dynamic gesture image sequence to be recognized, according to the kinetic characteristics of the dynamic gesture, using the following steps.
Step 101) Let the dynamic gesture image sequence be I_Input = {I_1, I_2, ..., I_n}, where n is the sequence length and I_t, t ∈ [1, n], denotes the t-th gesture frame; g′_t denotes the centroid of the t-th frame and (x′_t, y′_t) its coordinates; δ_t denotes the key-frame selection threshold of the t-th frame; and dist(t_1, t_2) denotes the centroid distance between the t_1-th and t_2-th gesture frames.
Step 102) For the dynamic gesture image sequence, calculate the sine and cosine of the centroid motion angle θ_t between consecutive gesture frames from the centroid displacement and the centroid distance dist(t, t−1).
Step 103) For the dynamic gesture image sequence, calculate the relative motion direction dir_t between consecutive gesture frames.
Step 104) For gesture frame I_t, judge whether dir_t ≠ dir_{t−1}; if so, take I_t as a key gesture frame; otherwise judge whether dist(t, t−1) > δ_{t−1}, and if so take I_t as a key gesture frame; otherwise I_t is not taken as a key gesture frame.
Step 105) Update the key-frame selection threshold δ_t as the average of all inter-frame centroid distances preceding the current frame.
Step 2) obtains the fingertip feature points of each key gesture frame and, combined with the centroid of the key frame, constructs the gesture feature vector of that frame; the dynamic gesture feature vector corresponding to the gesture to be recognized is then further constructed.
Step 2) specifically comprises the following steps.
Step 201) Let cs_k denote the gesture contour point set of the k-th key gesture frame, N_k the number of contour points in the k-th key frame, c_k,λ the λ-th contour point of the k-th key frame with coordinates c_k,λ = (x_k,λ, y_k,λ), λ ∈ [1, N_k], and g_k the centroid of the k-th key gesture frame.
Step 202) For each key gesture frame, calculate the distance dist(c_k,λ, g_k) between each contour point c_k,λ of the frame and the centroid g_k of the corresponding key frame.
Step 203) For each key gesture frame, and for each contour point c_k,λ of that frame, first obtain the contour points c_k,λ′ among all contour points of the frame that satisfy |c_k,λ − c_k,λ′| < ε; then judge whether every such c_k,λ′ satisfies dist(c_k,λ′, g_k) ≤ dist(c_k,λ, g_k); if so, add c_k,λ to the candidate fingertip point set of the key frame, and otherwise leave c_k,λ unprocessed; completing this operation for every contour point of the frame yields the candidate fingertip point set of the frame, and hence the candidate fingertip point set of every key gesture frame. Here ε > 0 is a preset range threshold.
Step 204) For each key gesture frame, calculate the convex hull curve hull_k corresponding to the frame, thereby obtaining the convex hull curve hull_k of every key gesture frame.
Step 205) For each key gesture frame, and for each candidate fingertip point in the frame's candidate set, add the point to the fingertip feature point set of the frame if it belongs to the frame's convex hull curve; completing this operation for every candidate point yields the fingertip feature point set of the frame, and hence the fingertip feature point set of every key gesture frame.
Step 206) For each key gesture frame, obtain the number of points in the frame's fingertip feature point set and combine it with the centroid of the frame to construct the gesture feature vector of that key frame; the dynamic gesture feature vector corresponding to the gesture to be recognized is then further constructed.
Step 3) executes, for each dynamic-gesture template feature vector in the gesture template library, the following steps: separately compute the DTW lower-bound distance between the template feature vector and the dynamic gesture feature vector to be recognized, and judge the lower-bound distance, thereby obtaining the template feature vectors corresponding to DTW lower-bound distances meeting the preset requirement.
Step 301) Let the dynamic gesture feature vector to be recognized be V_I = {v_1, ..., v_n}, where v_μ = (g′_μ, f_μ), μ ∈ [1, n], and n is the length of V_I; G_I is the set of all centroid coordinates in V_I, X_I its horizontal projection and Y_I its vertical projection; C_I is the set of all fingertip features in V_I. Let a dynamic-gesture template feature vector in the library be V_T = {v_1, ..., v_m}, where v_k = (g_k, f_k), k ∈ [1, m], and m is the length of V_T; G_T is the set of all centroid coordinates in V_T, X_T its horizontal projection and Y_T its vertical projection; C_T is the set of all fingertip features in V_T. Let max be the maximum of n and m, and min their minimum.
Step 302) relocates G_I and G_T to obtain gesture trajectory sequences G_I′ and G_T′ with the same starting point.
Step 303) uses interpolation to prepend max+1−n and max+1−m copies of the starting-point coordinate to G_I′ and G_T′ respectively, obtaining gesture sequences G_I+ = {G_I*, G_I′} and G_T+ = {G_T*, G_T′} of length max+1, where G_I* and G_T* denote the prepended coordinates.
Step 304) calculates the variation of G_I+ in the horizontal and vertical directions:
diff_x = max(X_I+) − min(X_I+), diff_y = max(Y_I+) − min(Y_I+)
where max(·) denotes the maximum of a sequence and min(·) its minimum.
Step 305) If diff_x ≥ diff_y, calculate the DTW lower-bound distance LB_D(X_I+, X_T+) of X_I+ and X_T+ using the LB_Keogh algorithm; otherwise calculate the DTW lower-bound distance LB_D(Y_I+, Y_T+) of Y_I+ and Y_T+.
Step 306) judges whether LB_D(X_I+, X_T+) or LB_D(Y_I+, Y_T+) is greater than the current preset minimum DTW distance; if so, the dynamic-gesture template feature vector does not meet the preset requirement; otherwise it meets the preset requirement.
Step 4) executes, for each dynamic-gesture template feature vector corresponding to a DTW lower-bound distance meeting the preset requirement, the following steps on the basis of the definition below, and based on the DTW distance realizes recognition of the dynamic gesture to be recognized.
Define the gesture direction sequence FG = {subFG_1, ..., subFG_α}, subFG = {dir_ω, len_ω, band_ω}, ω ∈ {1, ..., α}, where α is the number of subsequences, dir_ω is the motion direction of the ω-th subsequence, len_ω is the number of consecutive trajectory data sharing the same motion direction, and band_ω is the boundary width of the subsequence.
Step 401) calculates the coordinates of the vector between adjacent gesture trajectory data g′_μ and g′_{μ−1} in G_I′, then uses the Freeman-4 chain code to calculate the direction code value c′_μ of that vector.
Finally, all consecutive gesture data in G_I′ with identical code values are merged to obtain several gesture direction subsequences subFG_I′, which are combined into the gesture direction sequence FG_I′ of G_I′; the gesture direction sequence FG_T′ of G_T′ is constructed in the same way.
Step 402) compares the subsequences with the same index in FG_I′ and FG_T′; if the direction code values of the subsequences are identical, select the larger of the two subsequence lengths as the candidate boundary width cband, then compare cband with the permitted boundary-width upper bound iband and select the smaller of the two as the boundary width of all gesture data corresponding to the subsequence; otherwise the boundary width of all gesture data corresponding to the subsequence is set to iband.
Step 403) finds, within the boundary-width range of each gesture datum, the optimal warping path based on the idea of dynamic programming, obtaining the DTW distance DTW(G_I′, G_T′) of G_I′ and G_T′.
Step 404) sorts all computed DTW distances in descending order; if there exists a DTW distance whose absolute difference from the minimum DTW distance is less than a threshold, jump to step 405); otherwise take the dynamic gesture template with the minimum DTW distance as the recognition result.
Step 405) calculates the structural distance between C_I and C_T: following the same optimal warping path as the DTW computation, the fingertip features f_I and f_T at each path node are XORed and the results accumulated, obtaining the structural distance CD(C_I, C_T); finally, the dynamic gesture template with the smallest structural distance is selected as the recognition result.
The LB_Keogh algorithm and the Freeman chain code involved in the technical solution of the present invention are prior art; for details see [Keogh E., Ratanamahatana C. A. Exact indexing of dynamic time warping. Knowledge and Information Systems, 2005, 7(3): 358-386] and [Freeman H. Computer processing of line-drawing images. ACM Computing Surveys (CSUR), 1974, 6(1): 57-97].
The real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW designed by the above technical scheme considers the motion direction and motion rate of the dynamic gesture: by comparing the motion direction between consecutive gesture frames, frames at which the direction changes are also taken as key frames for feature extraction, and the key-frame selection threshold is adjusted dynamically by computing the average of all centroid distances before the current frame, reducing the loss of key gesture information and improving the accuracy and real-time performance of feature extraction. Considering that gesture fingertip points are local extrema, the method constructs a distance function combined with convex-hull filtering to find the local extremum points on the gesture contour curve and determine the fingertip features present in the gesture, reducing the time complexity of fingertip extraction and shortening extraction time while guaranteeing high accuracy. It further provides a way of converting two-dimensional gesture sequences of unequal length into equal-length one-dimensional gesture sequences, so that computing the DTW lower-bound distance can prune the DTW distance computations between gesture sequences of low similarity, shortening recognition time. Moreover, according to the relationship between the motion directions of the dynamic gesture sequences, the Freeman chain code is used to determine the pairing range of each gesture datum in the sequence, reducing the DTW computation and further shortening recognition time.
Applying the real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW designed by the present invention to a specific embodiment, feature extraction is first performed on the dynamic gesture to be recognized, followed by gesture recognition. Suppose the dynamic gesture to be recognized is a gesture video of length 10 whose per-frame centroid coordinates are (20,20), (20,22), (20,23), (20,27), (20,30), (20,32), (22,32), (24,32), (25,32), (28,32). Starting from the first gesture frame, the motion direction and centroid distance between consecutive frames are computed and the key-frame selection threshold is continuously adjusted; the key gesture frames finally obtained are frames 1, 2, 4, 5, 7 and 10. Fingertip feature extraction is then performed on each key frame; the extracted fingertip features are shown in Fig. 2, where A, B, C, D, E, F, H, I are points on the gesture contour and G is the gesture centroid. The distance from each contour point to the centroid is computed first; since the distances of A, B, C, D, E, F, H to the centroid are greater than those of the contour points within their ε-neighbourhoods, these points are added to the candidate fingertip point set. The convex hull curve of the gesture is then computed; because candidate points A, B, C, D, E lie on the hull curve, they are taken as the fingertip feature points of the gesture. Finally, the feature vector of the gesture to be recognized is constructed as:
V_I = {{(20,20),5}, {(20,22),5}, {(20,27),5}, {(20,30),5}, {(22,32),5}, {(28,32),5}}.
Dynamic gesture recognition is then performed on the extracted feature vector. Suppose the gesture template library contains three dynamic gesture types: a single finger moving first down then right; five fingers moving first down then right; and five fingers moving upward. The feature vector of each gesture type is as follows:
V_1 = {{(45,28),1}, {(45,30),1}, {(45,35),1}, {(45,38),1}, {(47,40),1}, {(53,40),1}};
V_2 = {{(45,28),5}, {(45,30),5}, {(45,35),5}, {(45,38),5}, {(47,40),5}, {(53,40),5}};
V_3 = {{(45,28),5}, {(45,26),5}, {(45,24),5}, {(45,21),5}, {(47,15),5}, {(53,13),5}}.
The dynamic gesture recognition process is then as follows:
(1) V_I is first relocated to obtain a gesture sequence with the same starting coordinates as V_1: V_I = {{(45,28),5}, {(45,30),5}, {(45,35),5}, {(45,38),5}, {(47,40),5}, {(53,40),5}}.
(2) Compute the DTW distance between V_I and V_1. Using the Freeman-4 chain code, the gesture direction sequences of V_I and V_1 are computed as FG_I = {{1,4,0},{0,2,0}} and FG_1 = {{1,4,0},{0,2,0}}. Comparing the subsequences with the same index in FG_I and FG_1, the candidate boundary widths of the two subsequences of FG_I are cband_1 = 4 and cband_2 = 2. Since the permitted boundary-width upper bound is iband = 3, we have cband_1 > iband and cband_2 < iband; therefore, the boundary width of all gesture data corresponding to the first subsequence of FG_I is 3, and that of the second subsequence is 2. Finally, the DTW distance is computed within these boundary widths, giving DTW(V_I, V_1) = 0. The current minimum DTW distance is therefore set to min_dtw = 0, with recognition result V_1.
(3) Compute the DTW lower-bound distance between V_I and V_2. V_I is first relocated, by the same method as in step (1), to obtain a gesture sequence with the same starting coordinates as V_2. Since the variation of V_I in the vertical direction is greater than that in the horizontal direction, the DTW lower-bound distance of the vertical projection sets of V_I and V_2 is computed: LB_D(Y_I, Y_2) = 0.
(4) Compute the DTW distance between V_I and V_2. Because LB_D(Y_I, Y_2) = min_dtw, the DTW distance DTW(V_I, V_2) = 0 is computed by the same method as in step (2). Since |DTW(V_I, V_2) − min_dtw| = 0, the structural distances of V_I to V_1 and V_2 are computed: CD(C_I, C_1) = 6 and CD(C_I, C_2) = 0. Because CD(C_I, C_1) > CD(C_I, C_2), the minimum DTW distance is updated to min_dtw = 0 with recognition result V_2.
(5) Compute the DTW lower-bound distance between V_I and V_3. V_I is first relocated, by the same method as in step (1), to obtain a gesture sequence with the same starting coordinates as V_3; the DTW lower-bound distance of the vertical projection sets of V_I and V_3 is then computed: LB_D(Y_I, Y_3) = 72. Because LB_D(Y_I, Y_3) > min_dtw, the DTW distance computation between V_I and V_3 is skipped.
(6) The final minimum DTW distance is min_dtw = 0, and the recognition result is the dynamic gesture type corresponding to V_2, i.e., five fingers moving first down then right.
This example is relatively simple; in practice, if the dynamic gesture sequences involved in the DTW lower-bound computation differ in length, the interpolation method must first be used to convert them into sequences of the same length before the DTW lower-bound distance is computed.
Embodiments of the present invention have been explained in detail above with reference to the accompanying drawings, but the present invention is not limited to the above embodiments; various changes can be made within the knowledge of a person skilled in the art without departing from the purpose of the present invention.

Claims (3)

1. A real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW, characterized by comprising the following steps:
Step 1) chooses key gesture frames for the dynamic gesture image sequence to be recognized according to the kinetic characteristics of the dynamic gesture; in step 1), the key gesture frames are chosen for the sequence according to the following steps: Step 101) let the dynamic gesture image sequence be I_Input = {I_1, I_2, ..., I_n}, where n is the sequence length and I_t, t ∈ [1, n], denotes the t-th gesture frame; g′_t denotes the centroid of the t-th frame and (x′_t, y′_t) its coordinates; δ_t denotes the key-frame selection threshold of the t-th frame; and dist(t_1, t_2) denotes the centroid distance between the t_1-th and t_2-th gesture frames;
Step 102) for the dynamic gesture image sequence, calculate the sine and cosine of the centroid motion angle θ_t between consecutive gesture frames from the centroid displacement and the centroid distance dist(t, t−1);
Step 103) for the dynamic gesture image sequence, calculate the relative motion direction dir_t between consecutive gesture frames;
Step 104) for gesture frame I_t, judge whether dir_t ≠ dir_{t−1}; if so, take I_t as a key gesture frame; otherwise judge whether dist(t, t−1) > δ_{t−1}, and if so take I_t as a key gesture frame; otherwise I_t is not taken as a key gesture frame;
Step 105) update the key-frame selection threshold δ_t as the average of all inter-frame centroid distances preceding the current frame;
Step 2) obtains the fingertip feature points of each key gesture frame and, combined with the centroid of the key frame, constructs the gesture feature vector of that frame; the dynamic gesture feature vector corresponding to the gesture to be recognized is then further constructed; Step 3) computes, for the dynamic gesture feature vector, the DTW lower-bound distance between each dynamic-gesture template feature vector in the gesture template library and the feature vector to be recognized, and obtains the template feature vectors corresponding to DTW lower-bound distances meeting the preset requirement;
in step 3), the following steps are executed for each template feature vector in the library to compute its DTW lower-bound distance to the feature vector to be recognized and to judge that distance, thereby obtaining the template feature vectors meeting the preset requirement;
Step 301) let the dynamic gesture feature vector to be recognized be V_I = {v_1, ..., v_n}, where v_μ = (g′_μ, f_μ), μ ∈ [1, n], and n is the length of V_I; G_I is the set of all centroid coordinates in V_I, X_I its horizontal projection and Y_I its vertical projection; C_I is the set of all fingertip features in V_I; let a dynamic-gesture template feature vector in the library be V_T = {v_1, ..., v_m}, where v_k = (g_k, f_k), k ∈ [1, m], and m is the length of V_T; G_T is the set of all centroid coordinates in V_T, X_T its horizontal projection and Y_T its vertical projection; C_T is the set of all fingertip features in V_T; max is the maximum of n and m, and min their minimum;
Step 302) relocate G_I and G_T to obtain gesture trajectory sequences G_I′ and G_T′ with the same starting point;
Step 303) use interpolation to prepend max+1−n and max+1−m copies of the starting-point coordinate to G_I′ and G_T′ respectively, obtaining gesture sequences G_I+ = {G_I*, G_I′} and G_T+ = {G_T*, G_T′} of length max+1; Step 304) calculate the variation of G_I+ in the horizontal and vertical directions:
diff_x = max(X_I+) − min(X_I+), diff_y = max(Y_I+) − min(Y_I+)
where max(·) denotes the maximum of a sequence and min(·) its minimum;
Step 305) if diff_x ≥ diff_y, calculate the DTW lower-bound distance LB_D(X_I+, X_T+) of X_I+ and X_T+ using the LB_Keogh algorithm; otherwise calculate the DTW lower-bound distance LB_D(Y_I+, Y_T+) of Y_I+ and Y_T+;
Step 306) judge whether LB_D(X_I+, X_T+) or LB_D(Y_I+, Y_T+) is greater than the current preset minimum DTW distance; if so, the dynamic-gesture template feature vector does not meet the preset requirement; otherwise it meets the preset requirement; Step 4) computes, for each dynamic-gesture template feature vector corresponding to a DTW lower-bound distance meeting the preset requirement, the DTW distance between the template feature vector and the dynamic gesture feature vector to be recognized, and based on the DTW distance realizes recognition of the dynamic gesture to be recognized.
2. The real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW according to claim 1, characterized in that step 2) comprises the following steps:
Step 201) let cs_k denote the gesture contour point set of the k-th key gesture frame, N_k the number of contour points in the k-th key frame, c_k,λ the λ-th contour point of the k-th key frame with coordinates c_k,λ = (x_k,λ, y_k,λ), λ ∈ [1, N_k], and g_k the centroid of the k-th key gesture frame;
Step 202) for each key gesture frame, calculate the distance dist(c_k,λ, g_k) between each contour point c_k,λ of the frame and the centroid g_k of the corresponding key frame;
Step 203) for each key gesture frame, and for each contour point c_k,λ of that frame, first obtain the contour points c_k,λ′ among all contour points of the frame that satisfy |c_k,λ − c_k,λ′| < ε; then judge whether every such c_k,λ′ satisfies dist(c_k,λ′, g_k) ≤ dist(c_k,λ, g_k); if so, add c_k,λ to the candidate fingertip point set of the key frame, and otherwise leave c_k,λ unprocessed; completing this operation for every contour point of the frame yields the candidate fingertip point set of the frame, and hence the candidate fingertip point set of every key gesture frame; ε > 0 is a preset range threshold;
Step 204) for each key gesture frame, calculate the convex hull curve hull_k corresponding to the frame, thereby obtaining the convex hull curve hull_k of every key gesture frame;
Step 205) for each key gesture frame, and for each candidate fingertip point in the frame's candidate set, add the point to the fingertip feature point set of the frame if it belongs to the frame's convex hull curve; completing this operation for every candidate point yields the fingertip feature point set of the frame, and hence the fingertip feature point set of every key gesture frame;
Step 206) for each key gesture frame, obtain the number of points in the frame's fingertip feature point set and combine it with the centroid of the frame to construct the gesture feature vector of that key frame; the dynamic gesture feature vector corresponding to the gesture to be recognized is then further constructed.
3. The real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW according to claim 1, characterized in that in step 4), for each dynamic-gesture template feature vector corresponding to a DTW lower-bound distance meeting the preset requirement, the following steps are executed on the basis of the definition below, and recognition of the dynamic gesture to be recognized is realized based on the DTW distance;
define the gesture direction sequence FG = {subFG_1, ..., subFG_α}, subFG = {dir_ω, len_ω, band_ω}, ω ∈ {1, ..., α}, where α is the number of subsequences, dir_ω is the motion direction of the ω-th subsequence, len_ω is the number of consecutive trajectory data sharing the same motion direction, and band_ω is the boundary width of the subsequence;
Step 401) calculates the coordinates of the vector between adjacent gesture trajectory data g′_μ and g′_{μ−1} in G_I′, then uses the Freeman-4 chain code to calculate the direction code value c′_μ of that vector;
finally, all consecutive gesture data in G_I′ with identical code values are merged to obtain several gesture direction subsequences subFG_I′, which are combined into the gesture direction sequence FG_I′ of G_I′; the gesture direction sequence FG_T′ of G_T′ is constructed in the same way;
Step 402) compares the subsequences with the same index in FG_I′ and FG_T′; if the direction code values of the subsequences are identical, select the larger of the two subsequence lengths as the candidate boundary width cband, then compare cband with the permitted boundary-width upper bound iband and select the smaller of the two as the boundary width of all gesture data corresponding to the subsequence; otherwise the boundary width of all gesture data corresponding to the subsequence is set to iband;
Step 403) finds, within the boundary-width range of each gesture datum, the optimal warping path based on the idea of dynamic programming, obtaining the DTW distance DTW(G_I′, G_T′) of G_I′ and G_T′;
Step 404) sorts all computed DTW distances in descending order; if there exists a DTW distance whose absolute difference from the minimum DTW distance is less than a threshold, jump to step 405); otherwise take the dynamic gesture template with the minimum DTW distance as the recognition result;
Step 405) calculates the structural distance between C_I and C_T: following the same optimal warping path as the DTW computation, the fingertip features f_I and f_T at each path node are XORed and the results accumulated, obtaining the structural distance CD(C_I, C_T); finally, the dynamic gesture template with the smallest structural distance is selected as the recognition result.
CN201710224005.6A 2017-04-07 2017-04-07 Real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW Active CN107169411B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710224005.6A CN107169411B (en) Real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710224005.6A CN107169411B (en) Real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW

Publications (2)

Publication Number Publication Date
CN107169411A CN107169411A (en) 2017-09-15
CN107169411B (en) 2019-10-29

Family

ID=59849679

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710224005.6A Active CN107169411B (en) Real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW

Country Status (1)

Country Link
CN (1) CN107169411B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SG11201909139TA (en) 2017-12-22 2019-10-30 Beijing Sensetime Technology Development Co Ltd Methods and apparatuses for recognizing dynamic gesture, and control methods and apparatuses using gesture interaction
CN109144260B (en) * 2018-08-24 2020-08-18 上海商汤智能科技有限公司 Dynamic motion detection method, dynamic motion control method and device
US11720814B2 (en) * 2017-12-29 2023-08-08 Samsung Electronics Co., Ltd. Method and system for classifying time-series data
CN108470077B (en) * 2018-05-28 2023-07-28 广东工业大学 Video key frame extraction method, system and device and storage medium
CN110059580B (en) * 2019-03-27 2023-01-31 长春理工大学 Dynamic gesture recognition enhancing method based on leap motion
WO2020258106A1 (en) * 2019-06-26 2020-12-30 Oppo广东移动通信有限公司 Gesture recognition method and device, and positioning and tracking method and device
CN110717385A (en) * 2019-08-30 2020-01-21 西安文理学院 Dynamic gesture recognition method
CN110895684B (en) * 2019-10-15 2023-06-27 西安理工大学 Gesture motion recognition method based on Kinect
CN111311588B (en) * 2020-02-28 2024-01-05 浙江商汤科技开发有限公司 Repositioning method and device, electronic equipment and storage medium
CN111860274B (en) * 2020-07-14 2023-04-07 清华大学 Traffic police command gesture recognition method based on head orientation and upper half skeleton characteristics
CN113642413A (en) * 2021-07-16 2021-11-12 新线科技有限公司 Control method, apparatus, device and medium
CN117789302B (en) * 2023-12-29 2024-10-01 点昀技术(深圳)有限公司 Gesture recognition method and gesture recognition model training method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104123007A (en) * 2014-07-29 2014-10-29 电子科技大学 Multidimensional weighted 3D recognition method for dynamic gestures
CN104834894A (en) * 2015-04-01 2015-08-12 济南大学 Gesture recognition method combining binary coding and Hausdorff-like distance
CN106354252A (en) * 2016-08-18 2017-01-25 电子科技大学 Continuous character gesture track recognizing method based on STDW
CN106528586A (en) * 2016-05-13 2017-03-22 上海理工大学 Human behavior video identification method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104123007A (en) * 2014-07-29 2014-10-29 电子科技大学 Multidimensional weighted 3D recognition method for dynamic gestures
CN104834894A (en) * 2015-04-01 2015-08-12 济南大学 Gesture recognition method combining binary coding and Hausdorff-like distance
CN106528586A (en) * 2016-05-13 2017-03-22 上海理工大学 Human behavior video identification method
CN106354252A (en) * 2016-08-18 2017-01-25 电子科技大学 Continuous character gesture track recognizing method based on STDW

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
A new approach for dynamic gesture recognition using skeleton trajectory representation and histograms of cumulative magnitudes; Edwin Escobedo et al.; 2016 29th SIBGRAPI Conference on Graphics, Patterns and Images; 2017-01-16; pp. 209-216 *
Dynamic hand gesture recognition using motion trajectories and key frames; Wenjun Tan et al.; 2010 2nd International Conference on Advanced Computer Control; 2010-06-17; pp. 163-167 *
Dynamic gesture recognition based on the Kinect sensor; Yu Xu et al.; China Master's Theses Full-text Database, Information Science and Technology; 2014-09-15; I138-831 *
Research on real-time sign language recognition technology based on Kinect; Ye Ping; China Master's Theses Full-text Database, Information Science and Technology; 2017-03-15; I138-5562 *
Research on key technologies of gesture recognition based on depth cameras; Hu Lihua; China Master's Theses Full-text Database, Information Science and Technology; 2017-02-15; I138-2924 *

Also Published As

Publication number Publication date
CN107169411A (en) 2017-09-15

Similar Documents

Publication Publication Date Title
CN107169411B (en) Real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW
CN107563286B (en) Dynamic gesture recognition method based on Kinect depth information
Panwar Hand gesture recognition based on shape parameters
Shokoohi-Yekta et al. On the non-trivial generalization of dynamic time warping to the multi-dimensional case
CN103294996B (en) A 3D gesture recognition method
Wu et al. Fusing multi-modal features for gesture recognition
CN104200240B (en) A sketch retrieval method based on content-adaptive hash coding
CN104932804B (en) An intelligent virtual assembly action recognition method
CN103971102A (en) Static gesture recognition method based on finger contours and decision trees
CN104899607B (en) An automatic classification method for traditional moire patterns
CN103093196A (en) Gesture-based character interactive input and recognition method
CN103455794A (en) Dynamic gesture recognition method based on frame fusion technology
Wu et al. Vision-based fingertip tracking utilizing curvature points clustering and hash model representation
CN104966016A (en) Method for collaborative judgment and operation-authorization restriction for mobile-terminal child users
CN108846356B (en) Palm tracking and positioning method based on real-time gesture recognition
CN102622225A (en) Multi-touch application development method supporting user-defined gestures
CN111105443A (en) Method for tracking the motion trajectories of groups of people in video based on feature association
Panwar Hand gesture based interface for aiding visually impaired
CN103336967A (en) Hand motion trajectory detection method and apparatus
He et al. Salient feature point selection for real time RGB-D hand gesture recognition
Xiao et al. Sketch-based human motion retrieval via selected 2D geometric posture descriptor
CN103186241B (en) Interactive desktop touch left/right-hand recognition method
Elakkiya et al. Intelligent system for human computer interface using hand gesture recognition
CN106650554A (en) Static hand gesture recognition method
Lekova et al. Fingers and gesture recognition with kinect v2 sensor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20170915

Assignee: NUPT INSTITUTE OF BIG DATA RESEARCH AT YANCHENG

Assignor: Nanjing University of Posts and Telecommunications

Contract record no.: X2020980007071

Denomination of invention: Real-time dynamic gesture recognition method based on key frames and boundary-constrained DTW

Granted publication date: 20191029

License type: Common License

Record date: 20201026