CN110110674A - Gesture recognition method based on an acceleration micro-electromechanical system (MEMS) and basic strokes - Google Patents


Publication number
CN110110674A
CN110110674A (application CN201910391987.7A)
Authority
CN
China
Prior art keywords: gesture, code, moment, information, feature
Legal status: Granted
Application number
CN201910391987.7A
Other languages
Chinese (zh)
Other versions
CN110110674B (en)
Inventor
吴佳
刘道星
唐文研
李坤
刘宁
Current Assignee
Xiangtan University
Original Assignee
Xiangtan University
Priority date
Application filed by Xiangtan University
Priority: CN201910391987.7A
Publication of CN110110674A
Application granted
Publication of CN110110674B
Legal status: Active


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 — Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 — Movements or behaviour, e.g. gesture recognition
    • G06V40/28 — Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G06V40/30 — Writer recognition; Reading and verifying signatures
    • G06V40/37 — Writer recognition; Reading and verifying signatures based only on signature signals such as velocity or pressure, e.g. dynamic signature recognition
    • G06V40/382 — Preprocessing; Feature extraction
    • G06V40/388 — Sampling; Contour coding; Stroke extraction
    • G06V40/394 — Matching; Classification

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Health & Medical Sciences (AREA)
  • Character Discrimination (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a gesture recognition method based on an acceleration micro-electromechanical system (MEMS) and basic strokes, comprising the following steps: according to the motion trajectory of a gesture, convert the gesture into a sequence of labels and establish a code recognition model based on code features; define the standard target gestures and their corresponding element codes; acquire the gesture signal, feed it into the code recognition model to obtain a label sequence, and apply a multi-moment decision algorithm to obtain the basic strokes of the gesture; pre-classify the meaning of the gesture, compute the similarity between the gesture and each element code having the same number of turning code features, and take the gesture corresponding to the element code with the greatest similarity as the recognition result. The invention describes the relationships among the strokes of a gesture, its element code, and its code features, forming a complete gesture recognition method based on code features. Turning code features enable automatic segmentation of the gesture during recognition; the algorithm is simple and the recognition accuracy is high.

Description

Gesture recognition method based on an acceleration MEMS and basic strokes
Technical field
The present invention relates to gesture recognition methods, and in particular to a gesture recognition method based on an acceleration micro-electromechanical system (MEMS) and basic strokes.
Background art
Gesture recognition is the process of identifying human gestures. It has broad application prospects in fields such as medical monitoring, online education, and electronic devices. Moreover, as a natural, intuitive, and convenient mode of human-computer interaction, gestures greatly simplify the interaction process, and the diversification of interaction modes has in turn driven the development of gesture information collectors. At present, the collectors used in gesture recognition research fall into three categories: 1) camera devices — camera-based recognition systems capture images of the hand and its surroundings and recognize the gestures they contain; 2) wearable devices — built in or attached to the hand, they collect the motion or biological signals of the hand to complete recognition; 3) novel information sources — such signals include WIFI, radar, and so on, representing the most recent gesture recognition methods.
The application background of gesture recognition directly determines the choice of collector. For example, the placement of camera devices can seriously degrade the user experience and also makes real-time tracking difficult to satisfy; methods based on novel information sources are not yet mature, and the gestures they recognize remain simple. When single-object tracking gesture recognition is required, these approaches cannot meet the requirements.
Summary of the invention
To solve the above technical problems, the present invention provides a gesture recognition method based on an acceleration MEMS and basic strokes whose algorithm is simple and whose recognition accuracy is high.
The technical solution of the invention is a gesture recognition method based on an acceleration MEMS and basic strokes, comprising the following steps:
Step 1: train the standard-code recognition model, realizing a one-to-one correspondence between gesture trajectories and standard codes, and convert each gesture into a standard-code sequence.
Step 2: determine the sample gestures and their corresponding element codes.
Step 3: apply the multi-moment decision algorithm to obtain the basic strokes of the gesture.
Step 4: first, pre-classify the meaning of the gesture according to its number of basic strokes; then compute the similarity between the gesture under test and each sample code having the same number of basic strokes, and take the sample gesture corresponding to the element code with the greatest similarity as the recognition result.
In the above gesture recognition method based on an acceleration MEMS and basic strokes, the specific steps of step 1 are:
1-1) According to the actual trajectory of the gesture, each moment is divided into four classes with labels: turning, static, arc, and straight line. Turning indicates the transition period of the gesture trajectory; its theoretical duration is 0; it is coded 4; it embodies the segmentation of the gesture and the gesture's beginning and end. Static indicates that the gesture is at rest, corresponding to the motionless periods before and after the gesture; it is coded 3; it lets the gesture recognition system identify the beginning and end of the gesture. Arc indicates a period in which the gesture trajectory is arc-shaped; it is coded 2; it is a basis on which the recognition system distinguishes gestures. Straight line indicates a period in which the trajectory is linear; it is coded 1; it is likewise a basis on which the recognition system distinguishes gestures. With each moment of the gesture assigned one of these four class labels, every acquired gesture is converted into a sequence composed of the four labels. In theory, the label of the gesture at each moment is obtained from the motion trajectory, completing the transformation from gesture sequence to label sequence. Because this mapping is one-to-one, like an encoding, the class labels are called standard codes, and correspondingly the process of classifying and labeling the gesture information at each moment is called standard-code recognition.
1-2) The raw information at each moment is called moment information, and the standard code corresponding to each moment is called standard-code information. In the standard-code recognition process, to obtain the standard-code information, the correspondence between velocity and the motion curve is exploited: the information at each moment is combined with the information of the following N moments and converted in several ways, yielding transition information consisting of an angle feature (Angle) and a variance feature (Var), where:
Angle_t^{kT} = (1/π) · arccos( (A_t · A_{t+kT}) / (‖A_t‖ ‖A_{t+kT}‖) )
denotes the angle information between moment t and the moment kT later, where k is an integer; T is the information sampling period, T = 0.02 s; A_t is the acceleration at moment t; A_{t+kT} is the acceleration at moment t + kT; and π is a constant.
Angle_t = {Angle_t^{kT} | k = 1, 2, ..., N} denotes the set of angle information of moment t, whose elements are the angle information between moment t and the moments kT (k = 1, 2, ..., N) later.
Var_t^{kT} = Cov(A_t, A_{t+kT}) denotes the covariance information between moment t and the moment kT later.
Var_t = {Var_t^{kT} | k = 1, 2, ..., N} denotes the set of covariance information of moment t, whose elements are the covariance information between moment t and the moments kT (k = 1, 2, ..., N) later.
When a moment is in the standard static state, there is neither velocity nor acceleration, so the values of the angle and covariance information stay near 0; when a moment is in the standard turning state, the velocity changes sharply, so the acceleration information changes markedly and the mean and variance of the angle and covariance information are large. In theory, then, the two code-feature classes static and turning can be identified by setting thresholds.
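The Angle and Var feature sets above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the exact formulas appear only as images in the source, so the angle is taken here as the normalized angle between the acceleration vectors at moments t and t + kT, and the variance feature as the variance of the acceleration magnitude over the intervening window — both hedged interpretations of the stated definitions.

```python
import numpy as np

T = 0.02  # sampling period in seconds, as stated in the patent

def angle_between(a, b):
    """Angle between two acceleration vectors, normalized by pi (assumed form)."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
    return float(np.arccos(np.clip(cos, -1.0, 1.0)) / np.pi)

def moment_features(acc, mag, t, N):
    """Angle_t and Var_t sets for moment t, built from the next N moments.

    acc: (num_samples, 3) accelerometer vectors; mag: per-sample magnitudes.
    """
    angles = [angle_between(acc[t], acc[t + k]) for k in range(1, N + 1)]
    # hedged reading of Var_t^{kT}: variance of the magnitude over [t, t+kT]
    variances = [float(np.var(mag[t:t + k + 1])) for k in range(1, N + 1)]
    return angles, variances
```

A static moment then yields angles and variances near 0, while a sharp turn (acceleration reversing direction) drives the angle toward 1, matching the thresholding logic described above.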
At moment t, considering N time intervals gives the information set F_t = {F_t^{kT} | k = 1, 2, ..., N} about a feature F. For feature F, the mean and variance of this set measure the acceleration and velocity change at moment t. Among the candidate combinations of feature, L, N, and threshold, where L is the turning range, the combination with the highest evaluation value is selected to construct the filter functions.
The filter function of the "turning" standard code is:
VAR(F_t)/E(F_t) ≥ ε + λv
where E(F_t) is the mean of the set F_t, VAR(F_t) is its variance, ε is a constant, λ is the velocity-sensitivity coefficient, and v is the gesture velocity.
The filter function of the "static" standard code is:
E(F_t) ≤ α and VAR(F_t) ≤ β
where α and β are constants.
To identify the "turning" and "static" standard codes with these filter functions, two evaluation parameters are selected, and the feature, L, N, and threshold with the highest evaluation value are chosen.
The first evaluation parameter evaluates the gesture start moment; the closer its value is to 1, the more reliable the algorithm. Here t_1' is the start moment of the gesture stroke identified by the algorithm, t_1 is the theoretical start moment of the gesture stroke, and t_0 is the start moment of the gesture.
The second evaluation parameter evaluates the gesture stroke length; again, the closer to 1, the more reliable the algorithm. Here len_i is the length of the i-th stroke of the gesture, Len_i' is the correctly identified portion of len_i, and i is the stroke index.
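The two filter functions above can be sketched directly. The "turning" test is the inequality given in the text; the "static" test is not reproduced in the source (its formula survives only as the constants α and β), so thresholding the mean and variance near zero is an assumption consistent with the "stable near 0" description.

```python
import numpy as np

def is_turning(F_t, eps, lam, v):
    """'Turning' filter: VAR(F_t) / E(F_t) >= eps + lam * v."""
    F_t = np.asarray(F_t, dtype=float)
    return bool(F_t.var() / (F_t.mean() + 1e-12) >= eps + lam * v)

def is_static(F_t, alpha, beta):
    """'Static' filter (assumed form): both mean and variance near zero."""
    F_t = np.asarray(F_t, dtype=float)
    return bool(F_t.mean() <= alpha and F_t.var() <= beta)
```

The concrete values of ε, λ, α, and β would come from the evaluation-parameter search over feature, L, N, and threshold described above.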
For moments that cannot be identified from the angle and variance features, two-class classification with an SVM is used to obtain the membership degrees of the straight-line and arc code features.
In two-class SVM classification, the key is the mapping of the samples: training by the method of statistical learning yields an optimal hyperplane B that completes the two-class separation of the data. When training the optimal hyperplane, the constraint condition and objective function are:
min_{w,b,δ} (1/2)‖w‖² + C Σ_{j=1}^{Nu} δ_j, subject to y_j(w·x_j + b) ≥ 1 − δ_j, δ_j ≥ 0
where w is the hyperplane normal vector; b is the hyperplane offset term; C is the penalty factor controlling margin misclassification, the larger C the fewer misclassified sample points; δ_j is a non-negative slack variable; and Nu is the number of data points to be classified. When the SVM is used for standard-code recognition, x_j denotes the moment of the j-th point and y_j denotes the standard code of the j-th point, taking the value n, n = 1, 2.
In two-class SVM classification, the set of elements with true standard code n is L_n, and the set of elements the SVM classifier labels n is S_n. Classifying with the SVM, the reliability δ_n of classification result n is:
δ_n = |L_n ∩ S_n| / |S_n|
δ_n measures the reliability of the SVM classification: a moment whose classification result is "n" has standard code n with probability δ_n, and standard code 3 − n with probability 1 − δ_n.
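The reliability δ_n can be computed from a labeled validation set. The exact formula is an image in the source; the fraction below — among moments the classifier labels n, the share whose true standard code is n — is a natural reading of the L_n/S_n definitions and is flagged as an assumption.

```python
import numpy as np

def reliability(y_true, y_pred, n):
    """delta_n: fraction of moments classified as n whose true code is n (assumed form)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    S_n = y_pred == n          # moments the SVM classifier labels n
    if not S_n.any():
        return 0.0
    L_n = y_true == n          # moments whose true standard code is n
    return float((L_n & S_n).sum() / S_n.sum())
```

With class labels n ∈ {1, 2}, a moment labeled n is then treated as code n with probability δ_n and as code 3 − n with probability 1 − δ_n, exactly as the text describes.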
In the above gesture recognition method based on an acceleration MEMS and basic strokes, in step 2 the standard target gestures are the digits 0 through 9, with corresponding element codes: 0 → (3,4,1,4,1,4,1,4,1,4,3); 1 → (3,4,1,4,3); 2 → (3,4,2,4,1,4,1,4,3); 3 → (3,4,2,4,2,4,3); 4 → (3,4,1,4,1,4,1,4,3); 5 → (3,4,1,4,1,4,2,4,3); 6 → (3,4,2,4,2,4,3); 7 → (3,4,1,4,1,4,3); 8 → (3,4,2,4,1,4,2,4,1,4,3); 9 → (3,4,2,4,1,4,3).
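The digit-to-element-code table above can be written down directly (code values: 1 = straight line, 2 = arc, 3 = static, 4 = turning). The stroke_count helper, used for the pre-classification by number of basic strokes described later, is an added convenience, not part of the patent text.

```python
# element codes of the standard target gestures (digits 0-9), from the patent
ELEMENT_CODES = {
    0: (3, 4, 1, 4, 1, 4, 1, 4, 1, 4, 3),
    1: (3, 4, 1, 4, 3),
    2: (3, 4, 2, 4, 1, 4, 1, 4, 3),
    3: (3, 4, 2, 4, 2, 4, 3),
    4: (3, 4, 1, 4, 1, 4, 1, 4, 3),
    5: (3, 4, 1, 4, 1, 4, 2, 4, 3),
    6: (3, 4, 2, 4, 2, 4, 3),
    7: (3, 4, 1, 4, 1, 4, 3),
    8: (3, 4, 2, 4, 1, 4, 2, 4, 1, 4, 3),
    9: (3, 4, 2, 4, 1, 4, 3),
}

def stroke_count(code):
    """Number of basic strokes: the line (1) and arc (2) symbols in the code."""
    return sum(1 for c in code if c in (1, 2))
```

Note that digits 3 and 6 share an element code, which is why the patent additionally compares arc durations for that pair.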
In the above gesture recognition method based on an acceleration MEMS and basic strokes, in step 3 the identification of the basic strokes of the gesture comprises the following steps:
I) Delete from the label sequence all moments whose code feature is static.
II) Assign the code features within L moments on either side of each turning code feature the value turning, preventing a single turning period from being recognized as multiple segmenting turnings.
III) Reduce the turning code features: substitute a single turning for every run of consecutive turnings, obtaining the reduced sequence.
IV) Apply multi-moment decision to each sequence between adjacent turning code features. The reference template Re denotes a fragment of the standard-code sequence consisting of two turning code features and the length-M segment between them. Here μ_t denotes the membership degree, obtained from the SVM during code-feature recognition, that moment t belongs to a straight line, while the membership degree that moment t belongs to an arc is 1 − μ_t. Multi-moment decision aggregates the per-moment memberships to obtain the membership degree that the stroke of this period is a straight line:
μ_i = (1/M) Σ_{t=1}^{M} μ_t
where i denotes the index of the stroke in the gesture sequence, μ_i denotes the membership degree that the i-th stroke of the gesture belongs to a straight line, μ_t denotes the membership degree that moment t belongs to a straight line, and t ranges from 1 to M.
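Steps I-III, followed by the multi-moment decision of step IV, can be sketched as below. The corner half-width L and the averaging form of the multi-moment decision are assumptions: the decision formula appears only as an image in the source.

```python
import numpy as np

def extract_strokes(labels, L=1):
    """Steps I-III: drop static moments, widen corners by L, merge corner runs,
    and return the label segments between adjacent corners."""
    seq = [c for c in labels if c != 3]                 # I) delete static moments
    widened = list(seq)
    for i, c in enumerate(seq):                         # II) widen each corner
        if c == 4:
            for j in range(max(0, i - L), min(len(seq), i + L + 1)):
                widened[j] = 4
    reduced = []
    for c in widened:                                   # III) merge corner runs
        if not (c == 4 and reduced and reduced[-1] == 4):
            reduced.append(c)
    strokes, cur = [], []
    for c in reduced:                                   # split between corners
        if c == 4:
            if cur:
                strokes.append(cur)
            cur = []
        else:
            cur.append(c)
    if cur:
        strokes.append(cur)
    return strokes

def line_membership(mu_t):
    """Step IV (assumed averaging form): stroke-level line membership as the
    mean of the per-moment SVM memberships over the segment."""
    return float(np.mean(mu_t))
```

Widening by L means a few moments at each stroke boundary are absorbed into the corner, which is the intended trade-off: it prevents one physical turn from splitting the gesture twice.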
In the above gesture recognition method based on an acceleration MEMS and basic strokes, in step 4, after multi-moment decision the data in the gesture sequence consist of turnings and straight-line membership degrees: the higher the straight-line membership of a period, the more likely the period is a straight-line stroke; otherwise, the more likely it is an arc stroke. Decision is therefore first made on the number of turning code features, pre-classifying the meaning of the gesture. Let R denote the reduced gesture sequence with elements μ_{i1}, where μ_{i1} denotes the membership degree that the i1-th stroke belongs to the "straight line" code feature; let G denote a sample gesture element code with elements g_{i2}, where g_{i2} denotes the standard code of the i2-th stroke, taking the value 1 (straight line) or 2 (arc); i1 and i2 are the numbers of strokes in the respective gestures. If the equation i1 = i2 is satisfied, the similarity between the two sequences is obtained by accumulating, over all strokes, the degree to which each stroke matches its coded class, i.e. μ_i where g_i = 1 and 1 − μ_i where g_i = 2. Here Sim(R, G) denotes the similarity between gesture R and element code G, and μ_i denotes the membership degree that the i-th stroke belongs to a straight line. Considering several sample gesture element codes forming a set, if sample g satisfies the condition that its similarity to R is the maximum over the set,
then the gesture corresponding to sample g is the recognition result of the gesture under test.
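The pre-classification and maximum-similarity matching of step 4 can be sketched as follows. The similarity formula is an image in the source; the product form below — each stroke contributing its membership in the class its element code prescribes — is one consistent reading and is flagged as an assumption.

```python
def similarity(memberships, element_code):
    """Similarity between a stroke line-membership sequence R and an element
    code G with the same number of basic strokes (assumed product form)."""
    strokes = [c for c in element_code if c in (1, 2)]  # 1 = line, 2 = arc
    if len(strokes) != len(memberships):
        return 0.0  # different stroke counts: pre-classification rejects G
    score = 1.0
    for mu, code in zip(memberships, strokes):
        score *= mu if code == 1 else 1.0 - mu
    return score

def classify(memberships, codebook):
    """Return the codebook key whose element code has maximum similarity."""
    return max(codebook, key=lambda g: similarity(memberships, codebook[g]))
```

The stroke-count check implements the pre-classification step; only codes with the same number of basic strokes compete on similarity.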
The beneficial effects of the present invention are as follows. First, according to the motion trajectory of the gesture, the gesture is converted into a sequence of labels and a code recognition model based on code features is established. Then the standard target gestures and their corresponding element codes are defined. The gesture signal is then acquired and fed into the code recognition model of step 1 to obtain a label sequence, and the multi-moment decision algorithm yields the basic strokes of the gesture. Finally, the meaning of the gesture is pre-classified, the similarity between the gesture and each element code with the same number of turning code features is computed, and the gesture corresponding to the element code with the greatest similarity is taken as the recognition result. The method as a whole describes the relationships among the strokes of a gesture, its element code, and its code features, forming a complete gesture recognition method based on code features. Through the turning code features it realizes automatic segmentation of the gesture during recognition, with the advantages of a simple algorithm and high recognition accuracy.
Description of the drawings
Fig. 1 is the flowchart of the invention.
Fig. 2 is the flowchart of code-feature recognition in the code recognition model of the invention.
Fig. 3 is the flowchart of the SVM classification of the invention.
Fig. 4 is the decision diagram based on the number of turning code features of the invention.
Fig. 5 is the structural block diagram of the gesture acquisition system in the embodiment of the invention.
Fig. 6 shows the gesture trajectories in the embodiment of the invention.
Fig. 7 shows the raw acceleration waveforms in the embodiment of the invention.
Fig. 8 shows the normalized acceleration waveforms in the embodiment of the invention.
Fig. 9 shows the code-feature waveforms in the embodiment of the invention.
Specific embodiments
The present invention is further illustrated below with reference to the accompanying drawings and embodiments.
As shown in Fig. 1, a gesture recognition method based on an acceleration MEMS and basic strokes comprises the following steps:
Step 1: according to the motion trajectory of the gesture, convert the gesture into a sequence of labels and establish the code recognition model based on code features. The specific steps are:
1-1) The infinitesimal is a concept from calculus: a variable that takes 0 as its limit and approaches 0 without bound. In calculus, to obtain the area under a curve, the area is usually summed over segments, and the result as the segment length approaches the infinitesimal is exactly the area under the curve. In practice, however, as long as the segment length of the variable is small to a certain degree, the result of the integration already has very high precision. Borrowing this idea — velocity is the area under the curve formed by the acceleration curve and the time axis — it suffices during gesture recognition to ensure that the information acquisition frequency is high enough; the acceleration at any moment can then directly stand for the velocity information of that moment. Of course, the velocity information of a single moment cannot identify a gesture; only by combining it with the velocity information of the following n (n > 1) moments can the gesture at that moment be classified and its class obtained.
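The infinitesimal argument above amounts to rectangle-rule integration: at a high sampling rate, the running sum of acceleration samples times the period T approximates the velocity curve. A minimal sketch (T = 0.02 s as in the patent; one axis shown):

```python
import numpy as np

T = 0.02  # sampling period in seconds

def velocity_from_acceleration(acc, v0=0.0):
    """Rectangle-rule integral of a 1-D acceleration sample sequence."""
    return v0 + np.cumsum(np.asarray(acc, dtype=float)) * T
```

For example, a constant acceleration of 1 m/s² over 50 samples (1 s) yields a final velocity of about 1 m/s, as the analytic integral predicts.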
Therefore, according to the actual trajectory of the gesture, each moment is divided into four classes with labels: turning, static, arc, and straight line. Turning indicates the transition period of the gesture trajectory; its theoretical duration is 0; it is coded 4; it embodies the segmentation of the gesture and the gesture's beginning and end. Static indicates that the gesture is at rest, corresponding to the motionless periods before and after the gesture; it is coded 3; it lets the gesture recognition system identify the beginning and end of the gesture. Arc indicates a period in which the gesture trajectory is arc-shaped; it is coded 2; it is a basis on which the recognition system distinguishes gestures. Straight line indicates a period in which the trajectory is linear; it is coded 1; it is likewise a basis on which the recognition system distinguishes gestures. With each moment of the gesture assigned one of these four class labels, every acquired gesture is converted into a sequence composed of the four labels. In theory, the label of the gesture at each moment is obtained from the motion trajectory, completing the transformation from gesture sequence to label sequence. Because this mapping is one-to-one, like an encoding, the class labels are called code features, and correspondingly the process of classifying and labeling the gesture information at each moment is called code-feature recognition.
Table 1
1-2) A moment is a point on the time axis, referring to a particular instant, having neither size nor direction. The purpose of code-feature recognition is exactly to obtain the code feature corresponding to each moment. For convenience of description, in the code-feature recognition process the raw information at each moment is called moment information, and the code feature corresponding to each moment is called code-feature information. As shown in Fig. 2, to obtain the code-feature information during recognition, the correspondence between velocity and the motion curve is exploited: the information at each moment is combined with the information of the following N moments and converted in several ways, yielding transition information consisting of an angle feature (Angle) and a variance feature (Var), where:
Angle_t^{kT} = (1/π) · arccos( (A_t · A_{t+kT}) / (‖A_t‖ ‖A_{t+kT}‖) )
denotes the angle information between moment t and the moment kT later, where k is an integer; T is the information sampling period, T = 0.02 s; A_t is the acceleration at moment t; A_{t+kT} is the acceleration at moment t + kT; and π is a constant.
Angle_t = {Angle_t^{kT} | k = 1, 2, ..., N} denotes the set of angle information of moment t, whose elements are the angle information between moment t and the moments kT (k = 1, 2, ..., N) later.
Var_t^{kT} = Cov(A_t, A_{t+kT}) denotes the covariance information between moment t and the moment kT later.
Var_t = {Var_t^{kT} | k = 1, 2, ..., N} denotes the set of covariance information of moment t, whose elements are the covariance information between moment t and the moments kT (k = 1, 2, ..., N) later.
In theory, when a moment is in the standard static state, there is neither velocity nor acceleration, so the values of the angle and covariance information stay near 0; when a moment is in the standard turning state, the velocity changes sharply, so the acceleration information changes markedly and the mean and variance of the angle and covariance information are large. In theory, then, the static and turning code features can be identified by setting thresholds.
At moment t, considering N time intervals gives the information set F_t = {F_t^{kT} | k = 1, 2, ..., N} about a feature F. For feature F, the mean and variance of this set measure the acceleration and velocity change at moment t. Among the candidate combinations of feature, L, N, and threshold, where L is the turning range, the combination with the highest evaluation value is selected to construct the filter functions.
The filter function of the "turning" standard code is:
VAR(F_t)/E(F_t) ≥ ε + λv
where E(F_t) is the mean of the set F_t, VAR(F_t) is its variance, ε is a constant, λ is the velocity-sensitivity coefficient, and v is the gesture velocity.
The filter function of the "static" standard code is:
E(F_t) ≤ α and VAR(F_t) ≤ β
where α and β are constants.
To identify the "turning" and "static" standard codes with these filter functions, two evaluation parameters are selected, and the feature, L, N, and threshold with the highest evaluation value are chosen.
The first evaluation parameter evaluates the gesture start moment; the closer its value is to 1, the more reliable the algorithm. Here t_1' is the start moment of the gesture stroke identified by the algorithm, t_1 is the theoretical start moment of the gesture stroke, and t_0 is the start moment of the gesture.
The second evaluation parameter evaluates the gesture stroke length; again, the closer to 1, the more reliable the algorithm. Here len_i is the length of the i-th stroke of the gesture, Len_i' is the correctly identified portion of len_i, and i is the stroke index.
For moments that cannot be identified from the angle and variance features, two-class classification with an SVM is used, as shown in Fig. 3, to obtain the membership degrees of the straight-line and arc code features.
In two-class SVM classification, the key is the mapping of the samples: training by the method of statistical learning yields an optimal hyperplane B that completes the two-class separation of the data. When training the optimal hyperplane, the constraint condition and objective function are:
min_{w,b,δ} (1/2)‖w‖² + C Σ_{j=1}^{Nu} δ_j, subject to y_j(w·x_j + b) ≥ 1 − δ_j, δ_j ≥ 0
where w is the hyperplane normal vector; b is the hyperplane offset term; C is the penalty factor controlling margin misclassification, the larger C the fewer misclassified sample points; δ_j is a non-negative slack variable; and Nu is the number of data points to be classified. When the SVM is used for standard-code recognition, x_j denotes the moment of the j-th point and y_j denotes the standard code of the j-th point, taking the value n, n = 1, 2.
In two-class SVM classification, the set of elements with true standard code n is L_n, and the set of elements the SVM classifier labels n is S_n. Classifying with the SVM, the reliability δ_n of classification result n is:
δ_n = |L_n ∩ S_n| / |S_n|
δ_n measures the reliability of the SVM classification: a moment whose classification result is "n" has standard code n with probability δ_n, and standard code 3 − n with probability 1 − δ_n.
Step 2: define the standard target gestures and their corresponding element codes.
The element code of a gesture is the shortest, most reduced code-feature sequence obtained from the classification of each moment of the trajectory curve of the standard gesture trajectory. Clearly, the element code of a gesture also matches people's intuitive notion of a "stroke". The element code of a gesture can therefore also be called the basic strokes of the standard gesture, and it does not vary with the personal habits of the writer. The element code is obtained from the standard trajectory of the gesture through two processes: translation and reduction.
Translation of the element code means translating the standard gesture trajectory into a code-feature sequence according to the classification of the gesture at each moment. Translation is similar to conventional code-feature recognition, but it needs no specific algorithm or program to recognize the sequence: it is obtained from people's knowledge of the standard gesture trajectory.
The purpose of reduction of the element code is to remove the influence of written length and to focus on changes in the curve class of the gesture. For example, for a gesture whose standard trajectory is a straight line, no matter how long it is — that is, no matter how many "straight line" moments the translation produces — its element code is a single "straight line". In theory, the same gesture written in different personal styles can be regarded as different standard trajectories; but as long as the writer's personal habits do not affect correct code-feature recognition — that is, do not confuse straight segments, arc segments, and corner segments — the gestures possess the same element code. Likewise, rotating, translating, or scaling a standard gesture trajectory does not change the result of its element code. A gesture recognition method based on element-code features can therefore effectively avoid the influence of personal writing style on recognition.
As shown in Table 2, the standard target gestures are the digits 0 through 9, with element codes: 0 → (3,4,1,4,1,4,1,4,1,4,3); 1 → (3,4,1,4,3); 2 → (3,4,2,4,1,4,1,4,3); 3 → (3,4,2,4,2,4,3); 4 → (3,4,1,4,1,4,1,4,3); 5 → (3,4,1,4,1,4,2,4,3); 6 → (3,4,2,4,2,4,3); 7 → (3,4,1,4,1,4,3); 8 → (3,4,2,4,1,4,2,4,1,4,3); 9 → (3,4,2,4,1,4,3). Different gestures may have the same element code, for example gestures 3 and 6; in that case the durations of the element codes are additionally considered to distinguish them. Gestures 3 and 6 both have two "arc" standard codes: the two arcs of gesture 3 last approximately equally long, whereas the first arc of gesture 6 lasts longer than the second.
Table 2 (P = static, C = turning, L = straight line, A = arc)
0: P, C, L, C, L, C, L, C, L, C, P
1: P, C, L, C, P
2: P, C, A, C, L, C, L, C, P
3: P, C, A, C, A, C, P
4: P, C, L, C, L, C, L, C, P
5: P, C, L, C, L, C, A, C, P
6: P, C, A, C, A, C, P
7: P, C, L, C, L, C, P
8: P, C, A, C, L, C, A, C, L, C, P
9: P, C, A, C, L, C, P
Step 3: hand signal is sent into the code identification model of step 1, obtains label sequence by acquisition hand signal Column, using it is more when decision making algorithm, obtain the basic stroke of gesture.
The meaning of the gesture can then be obtained, under a given background condition, from the strokes of the gesture. The identification of the basic strokes comprises the following steps:
I) Delete from the label sequence all moments whose code feature is "static". During gesture recognition, static intervals correspond to useless sections of the gesture and have no practical significance for recognition; however, the static code feature can be used to detect the start and end moments of the gesture.
II) Assign the code features within L moments on either side of a corner code feature to "corner", preventing a single corner interval from being recognized as multiple separate corners. A corner code feature indicates that the gesture category is changing at the current moment, or marks the start or end of a useful section of the gesture; it embodies the segmentation of the gesture, dividing the sequence into two distinct parts. The value of L affects both the overall code-feature recognition precision and the "corner" recognition rate.
III) Reduce the corner code features, replacing each run of consecutive corners with a single corner, to obtain the reduced sequence.
IV) For the sequence between adjacent corner code features, perform the multi-moment decision. The reference template sequence F denotes a standard code sequence fragment consisting of two corner code features and the M standard codes between them, where the membership degree that moment t belongs to a straight line is obtained with the SVM during code-feature recognition, while the membership degree that moment t belongs to an arc is its complement. Through the multi-moment decision, the membership degree that the stroke category of this period is a straight line is obtained:
where i is the serial number of the stroke in the gesture sequence, the left-hand side denotes the membership degree that the i-th stroke belongs to a straight line, the summand denotes the membership degree that moment t belongs to a straight line, and t ranges from 1 to M.
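The displayed formula is an image in the original; read as a plain average of the per-moment straight-line memberships over the M moments between two corners (our assumption — the patent's formula may weight moments differently), the multi-moment decision is:

```python
def stroke_line_membership(moment_memberships):
    """Multi-moment decision over one inter-corner segment: aggregate
    the per-moment SVM membership that each moment belongs to a
    straight line into one membership degree for the whole stroke.
    Assumes a simple mean over the M moments; the patent's displayed
    formula is an image and may differ."""
    M = len(moment_memberships)
    return sum(moment_memberships) / M

segment = [0.9, 0.8, 0.95, 0.85]      # mostly line-like moments
mu_line = stroke_line_membership(segment)
mu_arc = 1.0 - mu_line                # arc membership is the complement
assert abs(mu_line - 0.875) < 1e-9
```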
Step 4: pre-classify the meaning of the gesture, compute the similarity between the gesture and the element codes that have the same number of corner code features, and take the gesture corresponding to the element code with the maximum similarity as the recognition result.
After the multi-moment decision, the data in the gesture sequence consist of corners and straight-line membership degrees: the higher a stroke's straight-line membership, the more likely the stroke is a straight line; otherwise, an arc stroke is more likely. In this situation, the number of corner code features is used for decision, pre-classifying the meaning of the gesture; the decision diagram is shown in Fig. 4. R denotes the reduced gesture sequence, whose i1-th entry is the membership degree that the i1-th stroke belongs to the "straight line" code feature; G denotes a sample gesture element code, whose i2-th entry is the standard code of the i2-th stroke, taking the values 1 (straight line) or 2 (arc); i1 and i2 are the numbers of strokes in the two gestures. If the equation i1 = i2 holds, the similarity between the two sequences is as follows:
where Sim(R, G) denotes the similarity between gesture R and element code G, and the summand denotes the membership degree that the i-th stroke belongs to a straight line. Considering several sample gesture element codes forming a set, if sample g satisfies the equation:
then the gesture corresponding to sample g is the recognition result of the gesture under test.
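The similarity formula and the arg-max selection are likewise images in the original. The sketch below assumes the natural reading: each stroke contributes its membership in the class the sample element code prescribes, averaged over strokes (the function names and the two-stroke samples are illustrative, not from the patent):

```python
def similarity(memberships, element_code):
    """Similarity between a reduced gesture (per-stroke straight-line
    memberships) and a sample element code (1 = line, 2 = arc).
    Assumed form: each stroke contributes its membership in the class
    the sample prescribes, averaged over all strokes."""
    assert len(memberships) == len(element_code)  # only compared when stroke counts match
    per_stroke = [mu if g == 1 else 1.0 - mu
                  for mu, g in zip(memberships, element_code)]
    return sum(per_stroke) / len(per_stroke)

def classify(memberships, samples):
    """Pre-classify by stroke count, then take the sample with the
    maximum similarity as the recognition result."""
    candidates = {name: code for name, code in samples.items()
                  if len(code) == len(memberships)}
    return max(candidates, key=lambda name: similarity(memberships, candidates[name]))

# Illustrative two-stroke samples (stroke codes only, corners/statics removed):
samples = {"3": (2, 2), "7": (1, 1), "9": (2, 1)}
assert classify([0.1, 0.9], samples) == "9"   # arc-like then line-like strokes
```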
Embodiment
The information acquisition system used in the present invention falls into two classes, wired and wireless; both adequately meet the requirement of portability, and their hardware connections are shown in Fig. 5. In the experiments, the wired acquisition system was selected to verify the accuracy of the gesture recognition method.
The sensor used in the present invention is the JY901 nine-axis gyroscope attitude sensor module developed by Shenzhen WitMotion Intelligent Technology Co., Ltd. During gesture acquisition, the sensor is integrated with a ring-shaped fixture and worn on a finger. Its size is 15.24 mm x 15.24 mm x 2 mm, which effectively guarantees the user experience. The JY901 provides four acceleration measuring ranges, ±2g, ±4g, ±8g and ±16g; its operating voltage is 3.3 V to 5 V, its acceleration stability is 0.01 g, and its attitude measurement stability is 0.01°. The output frequency is adjustable from 0.1 Hz to 200 Hz, and the serial-port baud rate can be selected from 2400, 4800, 9600, 19200, 38400, 57600, 115200, 230400, 460800 and 921600.
In the present invention, the acquired three-dimensional acceleration data are transmitted to a computer. The output frequency is set to 50 Hz, that is, the acquisition period is 0.02 s. According to the extreme values of the measured motion acceleration, the acceleration measuring range was set to ±2g.
The meaning and label of a gesture are the focus of traditional gesture recognition research. The gesture recognition method proposed by the present invention not only obtains the meaning and label of the gesture, but also obtains the gesture curve information at different moments. In order to verify simultaneously the accuracy of the gesture curve information and of the gesture meaning, the gesture trajectories were executed by a robot. Moreover, the motion trajectory and length of each gesture were specified, so that reference values of the gesture curve information could be conveniently obtained. The robot executed the gesture motions at a speed of 8 m/min, and each gesture trajectory was reduced to the most common straight lines and arcs; the trajectories of the gestures and their parameters are shown in Fig. 6.
During gesture data collection, the start and stop of each acquisition were controlled by a switch in the sensor's bundled software. Each gesture had to satisfy the following requirements: 1) during acquisition, from opening the collector to executing the gesture, and from finishing the gesture to closing the collector, the hand remained stationary, each time for at least 4 seconds, ensuring that the algorithm can identify the start and end moments of the gesture; 2) each acquired segment of gesture data contains exactly one gesture.
In this experiment, the data set comprises the 10 digit gestures (0 to 9) and two Chinese-character gestures; each gesture was executed 50 times by the robot along a fixed trajectory, giving 600 gesture samples in total. Partial test results of the method on this data set are shown in Figs. 7 to 9. Fig. 7 shows the raw acceleration waveforms, with the standard code of each period calibrated manually. Fig. 8 shows the waveforms after the raw acceleration has been normalized; a certain separability between the different standard codes can be seen in the figure. Fig. 9 shows the code-feature waveform: moments with standard code value κ = 4 correspond to the "corner" code; κ = 3 corresponds to "static" moments; at moments with κ ≤ 1, the membership degree of "straight line" is κ and the corresponding membership degree of "arc" is 1 − κ. Table 3 gives the confusion matrix of the gesture recognition accuracy.
Table 3
(Confusion matrix of the gesture recognition accuracy over gestures 0 to 9; the numeric entries appear as an image in the original publication and are not reproduced in this text.)
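The κ conventions for reading the code-feature waveform of Fig. 9 can be summarized as a small decoder (a sketch; the dictionary return form for κ ≤ 1 is our own choice):

```python
def interpret_code_value(kappa):
    """Decode one point of the code-feature waveform (Fig. 9):
    4 -> corner, 3 -> static; a value in [0, 1] is the straight-line
    membership degree, with arc membership 1 - kappa."""
    if kappa == 4:
        return "corner"
    if kappa == 3:
        return "static"
    if 0.0 <= kappa <= 1.0:
        return {"line": kappa, "arc": 1.0 - kappa}
    raise ValueError(f"unexpected code value {kappa}")

assert interpret_code_value(4) == "corner"
assert interpret_code_value(0.75) == {"line": 0.75, "arc": 0.25}
```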
To verify the reliability of the gesture curve information obtained when performing gesture recognition with the present method, the following quantities are proposed for evaluation: the Theoretical Total Duration (TTD) of the gesture; the Calculated Total Duration (CTD); the calculated duration of each code feature, Calculated Duration for X (CDX, where X is one of the four code-feature classes L, A, C, P); and the theoretical duration of each curve type in the trajectory, Theoretical Duration for X (TDX). The closer the value of CTD/TTD is to 1, the closer the calculated total gesture duration is to the theoretical gesture duration, and the more reliable the method. For the straight-line and arc curves, the closer CDL/TDL and CDA/TDA are to 1, the more reliable the method. For the corner feature, the closer CDC is to 0, the more reliable the method. The static code feature is not evaluated, since it represents the gesture being at rest and, by the acquisition requirements, static code features in theory lie entirely in the dead sections of the gesture.
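Under illustrative numbers (not measured values from the experiment), the evaluation quantities reduce to a few ratios:

```python
def reliability_ratios(ttd, ctd, cd, td):
    """Evaluation quantities from the text: CTD/TTD, CDL/TDL and
    CDA/TDA should approach 1, and CDC should approach 0, for a
    reliable method. cd and td map code-feature class -> seconds."""
    return {
        "CTD/TTD": ctd / ttd,
        "CDL/TDL": cd["L"] / td["L"],
        "CDA/TDA": cd["A"] / td["A"],
        "CDC": cd["C"],  # theoretical corner duration is 0
    }

# Illustrative durations only, not results from the experiment:
r = reliability_ratios(ttd=6.0, ctd=5.9,
                       cd={"L": 2.9, "A": 2.8, "C": 0.2},
                       td={"L": 3.0, "A": 3.0})
assert 0.9 < r["CTD/TTD"] < 1.0
```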

Claims (5)

1. A gesture recognition method based on an acceleration MEMS and basic strokes, comprising the following steps:
Step 1: train a recognition model of standard codes, realizing a one-to-one correspondence between gesture trajectories and standard codes, and convert the gesture into a standard code sequence;
Step 2: determine the sample gestures and their corresponding element codes;
Step 3: obtain the basic strokes of the gesture using a multi-moment decision algorithm;
Step 4: first, pre-classify the meaning of the gesture according to the number of basic strokes; then compute the similarity between the gesture under test and the sample codes that have the same number of basic strokes, and take the sample gesture corresponding to the element code with the maximum similarity as the result of the gesture recognition.
2. The gesture recognition method based on an acceleration MEMS and basic strokes according to claim 1, characterized in that the specific steps of Step 1 are as follows:
1-1) According to the actual trajectory, each moment of the gesture is assigned to one of 4 classes with the labels: corner, static, arc, straight line. "Corner" indicates a transition of the gesture trajectory; its theoretical duration is 0; it is denoted 4 and embodies the start and end of the gesture as well as the segmentation of the gesture. "Static" indicates that the gesture is at rest, corresponding to the periods before and after the gesture when the hand is motionless; it is denoted 3 and helps the recognition system identify the start and end of the gesture. "Arc" indicates a period in which the trajectory is arc-shaped; it is denoted 2 and is a basis on which the recognition system distinguishes gestures. "Straight line" indicates a period in which the trajectory is linear; it is denoted 1 and is likewise a basis for distinguishing gestures. With each moment assigned one of these four labels, every acquired gesture is converted into a sequence composed of the four classes of labels. In theory, the label of the gesture at each moment is obtained from the motion trajectory, completing the transformation from gesture sequence to label sequence; since this mapping is one-to-one, like an encoding, the classification label is called a standard code, and accordingly, the process of classifying and labelling the gesture information at each moment is called standard code recognition;
1-2) The raw information at each moment is called the moment information, and the standard code corresponding to each moment is called the standard code information. In the standard-code recognition process, in order to obtain the standard code information, the correspondence between speed and motion curve is exploited: the information at each moment is combined with the information at the following N moments through several conversions, yielding transitional information consisting of angle (Angle) and variance (Var), where:
where the angle term denotes the angle information between moment t and the moment spaced kT later, k being an integer; T is the information-acquisition period, T = 0.02 s; A_t is the acceleration at moment t; A_{t+kT} is the acceleration at moment t + kT; π is a constant;
Angle_t denotes the set of angle information at moment t, whose elements are the angle information between moment t and the moments spaced kT later (k = 1, 2, ..., N);
where the variance term denotes the covariance information between moment t and the moment spaced kT later;
Var_t denotes the set of covariance information at moment t, whose elements are the covariance information between moment t and the moments spaced kT later (k = 1, 2, ..., N);
When a moment is in the standard static state, there is no velocity or acceleration, so the values of the angle and covariance information are stable near 0; when a moment is in the standard corner state, the change of velocity is very large, so the change of the acceleration information is obvious, and the mean and variance of the angle and covariance information are very large. In theory, therefore, the two code-feature classes static and corner can be identified by setting thresholds;
At moment t, N time intervals are considered, giving the information set F_t = (F_t^1, F_t^2, ..., F_t^N) for a feature F; the mean and variance of this set measure the acceleration and velocity changes at moment t. Considering different features, values of L and N, and thresholds, where L is the corner range, the feature, L, N and threshold with the highest evaluation value are selected to construct the filter functions:
The filter function of the "corner" standard code is:
VAR(F_t)/E(F_t) ≥ ε + λv
where E(F_t) is the mean of the set F_t, VAR(F_t) is the variance of the set F_t, ε is a constant, λ is the speed-sensitivity coefficient, and v is the gesture speed;
The filter function of the "static" standard code is:
where α and β are constants;
In order to identify the "corner" and "static" standard codes using the filter functions, the two quantities defined below are selected as evaluation parameters, and the feature, L, N and threshold with the highest evaluation value are chosen;
The evaluation parameter for the gesture start moment: the closer its calculated value is to 1, the more reliable the algorithm; here t1' is the start moment of the gesture stroke identified by the algorithm, t1 is the theoretical start moment of the gesture stroke, and t0 is the gesture start moment;
The evaluation parameter for gesture stroke length: the closer its calculated value is to 1, the more reliable the algorithm; here len_i is the length of the i-th stroke of the gesture, Len_i' is the correctly identified length within len_i, and i is the stroke serial number;
For the moment information that cannot be identified using angle and variance, binary classification is performed using an SVM, obtaining the membership degree of the straight-line or arc code feature;
When performing binary classification with the SVM, the key is to map the samples and, through the training method of statistical learning, obtain an optimal hyperplane B that completes the binary classification of the data. In training the optimal hyperplane, the constraint condition and objective function are as follows:
where w is the hyperplane normal vector; b is the hyperplane offset term; C is the penalty factor controlling margin classification errors, a larger C giving fewer misclassified sample points; δ_j is a non-negative slack variable; Nu is the number of data points to be classified. When standard-code recognition is performed with the SVM, x_j denotes the moment information of the j-th point and y_j denotes the standard code of the j-th point, taking the value n, n = 1, 2;
When performing binary classification with the SVM, the set of elements with the same standard code n is denoted L_n, and the set of elements to which the SVM classifier assigns standard code n is denoted S_n. Classifying with the SVM, the reliability δ_n of classification result n is:
δ_n measures the reliability of the SVM classification: a moment whose classification result is standard code "n" has probability δ_n of actually having standard code n, and probability 1 − δ_n of having standard code 3 − n.
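The two filter functions and the reliability δ_n can be sketched together. The "corner" test follows the stated inequality VAR(F_t)/E(F_t) ≥ ε + λv; the "static" test and the δ_n formula are images in the original, so the forms below (mean and variance bounded by the constants α and β; δ_n as the fraction of moments predicted n that are truly n) are our assumptions:

```python
def _mean(xs):
    return sum(xs) / len(xs)

def _var(xs):
    m = _mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def is_corner(F_t, eps, lam, v):
    """'Corner' filter from the claim: VAR(F_t)/E(F_t) >= eps + lam * v,
    where v is the gesture speed and lam the speed-sensitivity coefficient."""
    return _var(F_t) / _mean(F_t) >= eps + lam * v

def is_static(F_t, alpha, beta):
    """'Static' filter (assumed form): the angle/covariance information
    stays near 0 at rest, so bound its mean by alpha and its variance
    by beta; the patent's displayed formula is an image."""
    return _mean(F_t) <= alpha and _var(F_t) <= beta

def classifier_reliability(true_codes, predicted_codes, n):
    """Reliability delta_n (assumed form): among moments the SVM labels n,
    the fraction whose true standard code is n; the remaining 1 - delta_n
    probability mass then belongs to code 3 - n (codes take values 1, 2)."""
    labelled_n = [t for t, p in zip(true_codes, predicted_codes) if p == n]
    if not labelled_n:
        return 0.0
    return sum(1 for t in labelled_n if t == n) / len(labelled_n)

assert is_static([0.01, 0.0, 0.02], alpha=0.05, beta=0.01)      # near-zero set -> static
assert is_corner([0.1, 0.9, 0.1], eps=0.1, lam=0.0, v=0.0)       # high relative variance -> corner
assert abs(classifier_reliability([1, 1, 2, 2, 1, 2], [1, 2, 2, 2, 1, 1], 1) - 2 / 3) < 1e-9
```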
3. The gesture recognition method based on an acceleration MEMS and basic strokes according to claim 2, characterized in that, in Step 2, the standard target gestures are the digits 0 to 9, with the corresponding element codes: 0 corresponds to (3,4,1,4,1,4,1,4,1,4,3); 1 to (3,4,1,4,3); 2 to (3,4,2,4,1,4,1,4,3); 3 to (3,4,2,4,2,4,3); 4 to (3,4,1,4,1,4,1,4,3); 5 to (3,4,1,4,1,4,2,4,3); 6 to (3,4,2,4,2,4,3); 7 to (3,4,1,4,1,4,3); 8 to (3,4,2,4,1,4,2,4,1,4,3); 9 to (3,4,2,4,1,4,3).
4. The gesture recognition method based on an acceleration MEMS and basic strokes according to claim 3, characterized in that, in Step 3, the identification of the basic strokes of the gesture comprises the following steps:
I) delete from the label sequence all moments whose code feature is static;
II) assign the code features within L moments on either side of a corner code feature to "corner", preventing a single corner interval from being recognized as multiple separate corners;
III) reduce the corner code features, replacing each run of consecutive corners with a single corner, to obtain the reduced sequence;
IV) for the sequence between adjacent corner code features, perform the multi-moment decision: the reference template Re denotes a standard code sequence fragment consisting of two corner code features and the M standard codes between them, where the membership degree that moment t belongs to a straight line is obtained with the SVM during code-feature recognition, while the membership degree that moment t belongs to an arc is its complement; through the multi-moment decision, the membership degree that the stroke category of this period is a straight line is obtained:
where i denotes the serial number of the stroke in the gesture sequence, the left-hand side denotes the membership degree that the i-th stroke of the gesture belongs to a straight line, the summand denotes the membership degree that moment t belongs to a straight line, and t ranges from 1 to M.
5. The gesture recognition method based on an acceleration MEMS and basic strokes according to claim 3, characterized in that, in Step 4, after the multi-moment decision the data in the gesture sequence consist of corners and straight-line membership degrees: the higher a stroke's straight-line membership, the more likely the stroke is a straight line; otherwise, an arc stroke is more likely; in this situation, the number of corner code features is used for decision, pre-classifying the meaning of the gesture. R denotes the reduced gesture sequence, whose i1-th entry is the membership degree that the i1-th stroke belongs to the "straight line" code feature; G denotes a sample gesture element code, whose i2-th entry is the standard code of the i2-th stroke, taking the values 1 (straight line) or 2 (arc); i1 and i2 are the numbers of strokes in the two gestures; if the equation i1 = i2 holds, the similarity between the two sequences is as follows:
where Sim(R, G) denotes the similarity between gesture R and element code G, and the summand denotes the membership degree that the i-th stroke belongs to a straight line; considering several sample gesture element codes forming a set, if sample g satisfies the equation:
then the gesture corresponding to sample g is the recognition result of the gesture under test.
CN201910391987.7A 2019-05-13 2019-05-13 Gesture recognition method based on acceleration micro-electromechanical system and basic strokes Active CN110110674B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910391987.7A CN110110674B (en) 2019-05-13 2019-05-13 Gesture recognition method based on acceleration micro-electromechanical system and basic strokes


Publications (2)

Publication Number Publication Date
CN110110674A true CN110110674A (en) 2019-08-09
CN110110674B CN110110674B (en) 2022-12-13

Family

ID=67489581

Country Status (1)

Country Link
CN (1) CN110110674B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103257711A (en) * 2013-05-24 2013-08-21 河南科技大学 Space gesture input method
CN103679213A (en) * 2013-12-13 2014-03-26 电子科技大学 3D gesture recognition method
US20190087654A1 (en) * 2017-09-15 2019-03-21 Huazhong University Of Science And Technology Method and system for csi-based fine-grained gesture recognition




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20190809

Assignee: SUZHOU MEDCOIL HEALTHCARE Co.,Ltd.

Assignor: XIANGTAN University

Contract record no.: X2024980006092

Denomination of invention: A gesture recognition method based on acceleration microelectromechanical system and basic strokes

Granted publication date: 20221213

License type: Common License

Record date: 20240523