CN104484644B - Gesture recognition method and device - Google Patents

Gesture recognition method and device

Info

Publication number
CN104484644B
CN104484644B CN201410621500.7A
Authority
CN
China
Prior art keywords
data sequence
classifier
sample
gesture
noise
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410621500.7A
Other languages
Chinese (zh)
Other versions
CN104484644A (en)
Inventor
陈涛
蒋文明
李敏
李力
范炜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics China R&D Center
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics China R&D Center
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics China R&D Center, Samsung Electronics Co Ltd filed Critical Samsung Electronics China R&D Center
Priority to CN201410621500.7A priority Critical patent/CN104484644B/en
Publication of CN104484644A publication Critical patent/CN104484644A/en
Application granted granted Critical
Publication of CN104484644B publication Critical patent/CN104484644B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/24 - Classification techniques
    • G06F 18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2413 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention proposes a gesture recognition method and device. The method includes: obtaining a data sequence of a user gesture, the data sequence including an acceleration data sequence and an angular velocity data sequence; preprocessing the data sequence to obtain a feature vector; and recognizing the feature vector with a preset predefined classifier to obtain the user gesture corresponding to the feature vector. The present invention achieves a high recognition rate and can recognize gestures based on wrist motion as well as hand-rotation gestures.

Description

Gesture recognition method and device
Technical field
The present invention relates to the fields of pattern recognition and artificial intelligence, and more particularly to a gesture recognition method and device.
Background technology
Traditional gesture recognition technologies fall broadly into computer-vision-based gesture recognition and data-glove-based gesture recognition. The algorithmic complexity of computer-vision-based gesture recognition is generally high, it is highly susceptible to environmental factors, and its recognition rate is unsatisfactory. Data-glove-based gesture recognition achieves a better recognition rate, but because the equipment is expensive, inconvenient to carry, and offers a poor user experience, it has gradually been abandoned by researchers.
With the continuous development of computer hardware, more and more researchers have shifted the focus of gesture recognition toward sensors. In particular, gesture recognition based on acceleration-sensor data has in recent years become an active topic in the fields of pattern recognition and artificial intelligence.
Gesture recognition methods based on acceleration data mainly include template matching, neural networks, and probabilistic statistical analysis. The advantage of template matching is its simple processing logic; its greatest disadvantage is its high computational complexity, which makes it difficult to meet applications with strict real-time requirements. For example, Sven Kratz et al. fused acceleration and angular velocity data and recognized 6 simple gestures by dynamic time warping; their results showed that the real-time performance of dynamic time warping was the worst. Methods based on probabilistic statistical analysis are currently more popular; for example, Jiangfeng Liu et al. used a hidden Markov model to recognize 8 gestures from hand-motion acceleration data and obtained a good recognition rate. Compressed sensing is another currently popular method; it can reduce the dimensionality of the gesture data and simplify the computational complexity of recognition. For example, Ahmad Akl combined dynamic time warping and compressed sensing to recognize 18 gestures.
The deficiencies of the prior art are as follows:
Current gesture recognition technologies achieve a high recognition rate when only a few gesture types are recognized, but the recognition rate drops significantly when more gestures are recognized. Moreover, they mainly recognize hand movement trajectories and do not support gestures without an obvious movement trajectory, such as hand rotation.
Summary of the invention
The present invention provides a gesture recognition method that achieves a high recognition rate and can recognize gestures based on wrist motion as well as hand-rotation gestures.
The present invention also provides a gesture recognition device that achieves a high recognition rate and can recognize gestures based on wrist motion as well as hand-rotation gestures.
The technical solution of the present invention is realized as follows:
A gesture recognition method, including:
obtaining a data sequence of a user gesture, the data sequence including an acceleration data sequence and an angular velocity data sequence;
preprocessing the data sequence to obtain a feature vector;
recognizing the feature vector with a preset predefined classifier to obtain the user gesture corresponding to the feature vector.
A gesture recognition device, including:
a preprocessing module, configured to obtain a data sequence of a user gesture, the data sequence including an acceleration data sequence and an angular velocity data sequence, and to preprocess the data sequence to obtain a feature vector;
a predefined classifier, configured to recognize the feature vector and obtain the user gesture corresponding to the feature vector.
It can be seen that the gesture recognition method and device proposed by the present invention, by taking the acceleration data sequence and angular velocity data sequence of a user gesture as the basis for recognition, can accurately recognize a larger set of gestures and can recognize hand-rotation gestures that have no obvious movement trajectory.
Brief description of the drawings
Fig. 1 is a flowchart of the gesture recognition method proposed by the present invention;
Fig. 2 is a schematic diagram of the system structure of embodiment one;
Fig. 3 is a flowchart of embodiment two;
Fig. 4 is a schematic diagram of dimension standardization;
Fig. 5 is a flowchart of embodiment three;
Fig. 6 is a flowchart of embodiment four;
Fig. 7 is a schematic diagram of the test results of the method of the present invention on the 6DMG open-source database.
Detailed description
The present invention proposes a gesture recognition method. Fig. 1 is a flowchart of the method, which includes:
Step 101: obtaining a data sequence of a user gesture, the data sequence including an acceleration data sequence and an angular velocity data sequence;
Step 102: preprocessing the data sequence to obtain a feature vector;
Step 103: recognizing the feature vector with a preset predefined classifier to obtain the user gesture corresponding to the feature vector.
Since the obtained user gesture data sequence includes an angular velocity data sequence, the gesture recognition method proposed by the present invention can recognize gestures based on wrist motion, such as hand rotation.
The above step 102 may be implemented as follows:
performing denoising smoothing, normalization, and dimension standardization on each data sequence in turn, and combining the results into the feature vector of the user gesture.
In step 103, when the predefined classifier cannot recognize the feature vector, the method may further include: recognizing the feature vector with a user-defined classifier set by the user to obtain the user gesture corresponding to the feature vector.
The present invention can provide the user with a way to define personal gestures that are not in the gesture set of the predefined classifier. Specifically:
obtaining the data sequences of a user gesture that the user inputs two or more times in succession;
preprocessing the data sequences to obtain a feature vector for each input of the user gesture;
when none of the feature vectors can be recognized by the predefined classifier, judging whether the similarity comparison result between the data sequences of any two inputs exceeds a preset threshold; if it does not, taking the data sequence of each input as an initial sample;
performing weighted noise addition on the initial samples, and taking the initial samples together with the samples obtained after weighted noise addition as positive samples of the user-defined classifier; choosing other data sequences as negative samples of the user-defined classifier; and training the user-defined classifier with the positive samples and negative samples as training data.
The weighted noise addition may be performed as follows: each datum in the data sequence is multiplied by a random number within a predetermined range.
Performing weighted noise addition on the initial samples and taking the initial samples together with the samples obtained after weighted noise addition as positive samples of the user-defined classifier may be done as follows:
performing weighted noise addition on the initial samples to obtain new data sequences;
performing weighted noise addition again on the initial samples and the new data sequences to obtain further new data sequences;
and repeating until the number of data sequences meets a preset requirement, at which point the new data sequences and the initial samples are taken together as the positive samples.
Choosing other data sequences as negative samples of the user-defined classifier may be done as follows:
choosing some or all of the data sequences of user gestures that the predefined classifier can recognize, or one or more random sequences, as negative samples.
Specific embodiments are described in detail below with reference to the accompanying drawings.
Embodiment one:
This embodiment introduces an exemplary system that runs the gesture recognition method proposed by the present invention. Fig. 2 is a schematic diagram of the system structure; the system is an example in which a third-party application uses the present invention for human-computer interaction.
The system obtains the acceleration data sequence and angular velocity data sequence of the user's gesture operation through a device with built-in acceleration and angular velocity sensors (i.e., data identification unit 210), and transmits the acceleration data sequence and angular velocity data sequence to the preprocessing module 220 via Bluetooth or another connection.
The preprocessing module 220 performs denoising smoothing on the acquired acceleration data sequence and angular velocity data sequence, normalizes the denoised result, then performs dimension standardization on the normalized result, finally extracts the data mean of each dimension as a feature to build the feature vector, and sends the feature vector to the recognition module 230.
The recognition module 230 may include a preset predefined classifier 231 and may further include a user-defined classifier 232 set by the user. The recognition module 230 recognizes the received feature vector, identifies the corresponding user gesture, and returns the recognition result to the third-party application 240, which maps the recognition result to a specific machine operation, thereby realizing human-computer interaction. In addition, the system allows the third-party application to provide the user with a custom-gesture interface, through which a new personal gesture defined by the user is mapped by the user-defined recognition of this method to an operation of the third-party application.
This embodiment describes the function of each unit as a whole; the following embodiments introduce the specific processing flow of each unit.
Embodiment two:
This embodiment introduces the process by which the preprocessing module 220 preprocesses the raw acceleration data sequence and angular velocity data sequence. In this embodiment, the raw data of one user gesture comprises 6 sequences, denoted AccSeq_x, AccSeq_y, AccSeq_z, AngSeq_x, AngSeq_y, AngSeq_z, where AccSeq denotes an acceleration data sequence, AngSeq denotes an angular velocity data sequence, and the subscripts x, y, z denote the 3 orientation axes of the sensor.
Fig. 3 is a flowchart of this embodiment, which includes the following steps:
Step 301: denoising smoothing. This embodiment uses a mean filter: a sliding window of width 5 centered on the current point is taken as its neighborhood, and the value of the center point is updated to the mean of the data in the window. The calculation formula is:
Value′_i = (Value_{i-2} + Value_{i-1} + Value_i + Value_{i+1} + Value_{i+2}) / 5, where
Value_{i+j} denotes the (i+j)-th value in the raw data sequence (acceleration data sequence or angular velocity data sequence);
Value′_i denotes the i-th value in the data sequence obtained after denoising smoothing;
L denotes the length of the raw data sequence.
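As an illustration of step 301, the following is a minimal sketch of the width-5 mean filter in Python; the function name and the simple boundary truncation are assumptions for illustration, not part of the original disclosure:

```python
def mean_filter(seq, window=5):
    """Smooth a raw data sequence with a centered moving-average window."""
    half = window // 2
    smoothed = []
    for i in range(len(seq)):
        # Truncate the window at the sequence boundaries (assumed handling).
        lo = max(0, i - half)
        hi = min(len(seq), i + half + 1)
        smoothed.append(sum(seq[lo:hi]) / (hi - lo))
    return smoothed
```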
Step 302: normalization.
Because users differ in operating habits and writing styles, the extracted raw acceleration and angular velocity information exhibits similar acceleration or angular velocity trends, but its effective value range fluctuates up and down. To solve this problem, this embodiment normalizes the data sequences after the above denoising smoothing. Specifically:
for each data sequence after denoising smoothing, the maximum value and minimum value in the sequence are extracted, and each datum has the minimum value of its sequence subtracted from it and is divided by the value range of its sequence, yielding the normalized sequence.
The calculation formula is:
Value′_i = (Value_i - min) / (max - min), where
Value_i denotes the i-th value in the data sequence obtained after denoising smoothing;
max and min denote the maximum value and minimum value, respectively, in the data sequence obtained after denoising smoothing;
Value′_i denotes the i-th value in the data sequence obtained after normalization.
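A corresponding sketch of step 302; the handling of a constant sequence (returned as zeros to avoid division by zero) is an assumption for illustration, as the original text does not address this case:

```python
def min_max_normalize(seq):
    """Map a smoothed data sequence into [0, 1] using its own min and max."""
    lo, hi = min(seq), max(seq)
    if hi == lo:
        return [0.0] * len(seq)  # assumed handling of a constant sequence
    return [(v - lo) / (hi - lo) for v in seq]
```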
Step 303: dimension standardization.
Because the extracted raw data sequences have different lengths, this embodiment performs dimension standardization on the data sequences after normalization so that they have a uniform length. Specifically:
a data sequence of length L is evenly divided into N data segments; all the data of one segment form one dimension, each dimension containing L/N data, and the data mean of the segment is extracted as the feature of that single dimension, denoted f. The calculation formula is:
f_i = (N/L) · (Value′_{(i·L/N)+1} + Value′_{(i·L/N)+2} + … + Value′_{(i·L/N)+L/N}), for i = 0, 1, …, N-1, where
Value′_{(i·L/N)+j} denotes the ((i·L/N)+j)-th value in the data sequence obtained after normalization;
f_i denotes the i-th value in the data sequence obtained after dimension standardization.
Fig. 4 is a schematic diagram of dimension standardization, taking N = 8 as an example: each data sequence is divided into 8 dimensions, the mean of the data within each dimension is taken as the feature of that dimension, and thus 8 dimensional features are extracted from the entire data sequence.
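A sketch of step 303, assuming for illustration that any remainder when L is not an exact multiple of N is absorbed into the last segment (the original text does not address this case):

```python
def dimension_standardize(seq, n_dims=8):
    """Split a normalized sequence into n_dims segments and return segment means."""
    L = len(seq)
    seg_len = L // n_dims
    features = []
    for i in range(n_dims):
        start = i * seg_len
        # Last segment absorbs any leftover samples (assumed handling).
        end = L if i == n_dims - 1 else start + seg_len
        segment = seq[start:end]
        features.append(sum(segment) / len(segment))
    return features
```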
Step 304: building the feature vector F.
Through steps 301 to 303, the 6 raw data sequences (i.e., AccSeq_x, AccSeq_y, AccSeq_z, AngSeq_x, AngSeq_y, AngSeq_z) are processed to obtain 6 new data sequences. This step combines these 6 new data sequences into the feature vector F. Specifically:
rows 1 to 6 of F are the data sequences obtained by applying the above processing operations to AccSeq_x, AccSeq_y, AccSeq_z, AngSeq_x, AngSeq_y, and AngSeq_z, respectively, each data sequence containing N values.
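Putting steps 301 to 304 together, the following sketch builds the feature vector F from the six raw sequences, reusing the helper functions sketched above; the function names are assumptions for illustration:

```python
def build_feature_vector(acc_x, acc_y, acc_z, ang_x, ang_y, ang_z, n_dims=8):
    """Return F as a list of 6 rows, one per axis, each with n_dims features."""
    raw_sequences = [acc_x, acc_y, acc_z, ang_x, ang_y, ang_z]
    feature_rows = []
    for seq in raw_sequences:
        smoothed = mean_filter(seq)                            # step 301
        normalized = min_max_normalize(smoothed)               # step 302
        features = dimension_standardize(normalized, n_dims)   # step 303
        feature_rows.append(features)
    return feature_rows                                        # step 304: 6 x N matrix F
```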
In the course of implementation, 30 participants were invited to collect 150 groups of gesture sets, each group containing raw acceleration and angular velocity sensor data samples for 38 gestures. The above data samples were preprocessed, the feature vector corresponding to each sample was extracted and used as a training sample, and cross-validation was performed during training to select optimal parameters, yielding the built-in predefined classifier for the 38 predefined gestures.
Embodiment three:
This embodiment introduces the process by which the recognition module 230 performs gesture recognition on the feature vector obtained by preprocessing. The recognition module 230 includes the predefined classifier 231 and the user-defined classifier 232 set by the user. Gesture recognition is performed first by the predefined classifier 231; when the gesture input by the user is recognized by the predefined classifier 231 as one of the 38 predefined gestures, the module exits and returns the recognition result to the third-party application or the user. When the predefined classifier 231 cannot recognize the gesture or recognizes it as a negative sample, the user-defined classifier 232 continues to determine whether the current input gesture is a user-defined gesture. The detailed operations are shown in Fig. 5 and the steps are as follows.
Step 501: the predefined classifier 231 recognizes the feature vector of the gesture input by the user. For the sake of user experience, this embodiment may use a support vector machine with probability estimation and filter the recognition result according to the probability. The support vector machine produces both the estimated class and the estimated probability for the feature vector.
Step 502: result filtering according to probability. Specifically, if the result with the current maximum estimated probability exceeds the threshold Threshold1, and the difference between the maximum estimated probability and the second-largest estimated probability exceeds the threshold Threshold2, the current recognition result is accepted and the module exits. Otherwise the input is regarded as a negative sample, i.e., the current input gesture is not any of the 38 built-in gestures, and step 503 is executed. Specifically, the following rule is used:
res = the class with probability P_max, if P_max > Threshold1 and P_max - P_sec > Threshold2; otherwise res = negative sample;
where res denotes the recognition result, P_max denotes the maximum estimated probability over the gesture classes, and P_sec denotes the second-largest estimated probability.
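A sketch of steps 501 and 502 using scikit-learn's SVC with probability estimates; the threshold values, the label -1 for "negative sample", and the feature-flattening step are assumptions for illustration, not values given in the original text:

```python
import numpy as np
from sklearn.svm import SVC

# Predefined classifier trained on the 38 predefined gestures (step 501).
predefined_clf = SVC(kernel="rbf", probability=True)
# predefined_clf.fit(train_features, train_labels)  # 6*N features flattened per sample

def classify_with_rejection(clf, feature_vector, thr1=0.5, thr2=0.2):
    """Return the predicted gesture class, or -1 if the input is treated as a negative sample."""
    probs = clf.predict_proba([np.ravel(feature_vector)])[0]
    order = np.argsort(probs)[::-1]
    p_max, p_sec = probs[order[0]], probs[order[1]]
    if p_max > thr1 and (p_max - p_sec) > thr2:   # step 502 filtering rule
        return clf.classes_[order[0]]
    return -1  # negative sample: fall through to the user-defined classifier (step 503)
```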
Step 503: the user-defined classifier 232 recognizes the feature vector of the gesture input by the user and further determines whether the feature vector of the current input gesture corresponds to a user-defined gesture.
Here, the user-defined classifier 232 provides the user with a way to define personal gestures that are not in the predefined gesture set. The specific way of training the user-defined classifier 232 is introduced below.
Embodiment four:
This embodiment introduces the specific way of training the user-defined classifier 232. For the sake of user experience, the user-defined classifier generally cannot require the user to input a gesture many times (more than 3 times) to obtain enough training samples, but too few training samples inevitably weaken the classification ability of the classifier. To solve this problem, this embodiment uses the following steps. In this embodiment, the gesture input twice in succession by the user is taken as the initial samples for the purpose of explanation; the user may also input the gesture three or more times in succession, and the present invention does not restrict the number of gesture inputs.
Fig. 6 is a flowchart of this embodiment, which includes:
Step 601: after the user-defined mode is enabled, the user inputs the self-defined gesture twice in succession as required, and the acceleration data sequences and angular velocity data sequences of the two gestures are obtained; the preprocessing module 220 preprocesses the acceleration data sequences and angular velocity data sequences to obtain the feature vector of each gesture.
Step 602: the recognition module 230 recognizes the feature vectors. If they can be recognized, this shows that the gesture input by the user is already included in the predefined gesture set or the user-defined gesture set, and the user is asked to input a different gesture; if they cannot be recognized, step 603 is executed.
Step 603: comparing the similarity of the gesture data sequences input before and after. Specifically, the dynamic time warping (DTW) algorithm may be used for the comparison; DTW is a nonlinear warping technique that combines time alignment and distance measurement and is mainly used to measure the similarity of two time series.
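A minimal DTW distance sketch for the comparison in step 603: a plain quadratic-time dynamic-programming implementation with absolute difference as the local cost (the cost choice is an assumption for illustration):

```python
def dtw_distance(seq_a, seq_b):
    """Dynamic time warping distance between two 1-D sequences."""
    n, m = len(seq_a), len(seq_b)
    INF = float("inf")
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(seq_a[i - 1] - seq_b[j - 1])
            dp[i][j] = cost + min(dp[i - 1][j],      # insertion
                                  dp[i][j - 1],      # deletion
                                  dp[i - 1][j - 1])  # match
    return dp[n][m]
```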
Step 604: judging whether the comparison result exceeds a preset threshold. If it does, the two input gestures are considered dissimilar and the user is asked to input again; if it does not, step 605 is executed.
Step 605: taking the data sequences of the gesture input twice by the user as initial samples and extending them to obtain positive samples.
Specifically, iterative weighted noise addition may be used: each datum in a data sequence, with its own value as the weight, is multiplied by a random number within a preset range (for example between 0.5 and 1.5) as noise to rebuild a sample. This method extends the samples while preserving the variation trend of the data sequence. The current extended samples and the initial samples are then used together as initial samples for further extension, until the number of positive samples meets the preset requirement, at which point step 606 is executed.
The above weighted noise addition may use the following formula:
value′_i = value_i × Random(0.5~1.5), where
value_i denotes the i-th value in the data sequence to be processed;
value′_i denotes the i-th value in the data sequence after processing.
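A sketch of the iterative weighted noise addition of step 605; the noise range follows the example values in the text, while the function names and the target sample count are assumptions for illustration:

```python
import random

def weighted_noise(seq, low=0.5, high=1.5):
    """Rebuild one sample by scaling each datum with a random factor in [low, high]."""
    return [v * random.uniform(low, high) for v in seq]

def extend_positive_samples(initial_samples, target_count=50):
    """Iteratively augment the initial samples until target_count positive samples exist."""
    samples = list(initial_samples)
    while len(samples) < target_count:
        # Each round, every current sample (initial or extended) spawns a noisy copy.
        noisy_copies = [weighted_noise(s) for s in samples]
        samples.extend(noisy_copies)
    return samples[:target_count]
```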
Step 606: choosing negative samples.
Specifically, some or all of the data sequences of user gestures that the predefined classifier can recognize, and/or random data sequences, may be chosen as negative samples.
Step 607: training the user-defined classifier 232 with the above positive samples and negative samples as training data. In this way, positive samples can be recognized effectively, and the false recognition rate caused by negative samples or invalid gestures can be reduced.
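A sketch of steps 606 and 607, training a binary user-defined classifier from the extended positive samples and the chosen negative samples; the use of an SVM here mirrors the predefined classifier, and the labels 1/0 and the feature-flattening step are assumptions for illustration:

```python
import numpy as np
from sklearn.svm import SVC

def train_custom_classifier(positive_feature_vectors, negative_feature_vectors):
    """Train the user-defined classifier on positive vs. negative gesture samples."""
    X = [np.ravel(f) for f in positive_feature_vectors + negative_feature_vectors]
    y = [1] * len(positive_feature_vectors) + [0] * len(negative_feature_vectors)
    custom_clf = SVC(kernel="rbf", probability=True)
    custom_clf.fit(X, y)
    return custom_clf
```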
The gesture recognition method proposed by the present invention has been described above. The present invention also proposes a gesture recognition device, including:
a preprocessing module, configured to obtain a data sequence of a user gesture, the data sequence including an acceleration data sequence and an angular velocity data sequence, and to preprocess the data sequence to obtain a feature vector;
a predefined classifier, configured to recognize the feature vector and obtain the user gesture corresponding to the feature vector.
In the above device, the preprocessing module may preprocess the data sequence to obtain the feature vector as follows:
performing denoising smoothing, normalization, and dimension standardization on each data sequence in turn, and combining the results into the feature vector of the user gesture.
The above device may further include:
a user-defined classifier, configured to recognize the feature vector when the predefined classifier cannot recognize it, and obtain the user gesture corresponding to the feature vector.
The above user-defined classifier may also be configured to:
obtain the data sequences of a user gesture that the user inputs two or more times in succession; when none of the feature vectors of the user gesture can be recognized by the predefined classifier, judge whether the similarity comparison result between the data sequences of any two inputs exceeds a preset threshold; if it does not, take the data sequence of each input as an initial sample;
perform weighted noise addition on the initial samples and take the initial samples together with the samples obtained after weighted noise addition as positive samples of the user-defined classifier; choose other data sequences as negative samples of the user-defined classifier; and train itself with the positive samples and negative samples as training data.
The user-defined classifier may perform weighted noise addition as follows: each datum in a data sequence is multiplied by a random number within a preset range.
The user-defined classifier may perform weighted noise addition on the initial samples and take the initial samples together with the samples obtained after weighted noise addition as positive samples as follows: performing weighted noise addition on the initial samples to obtain new data sequences; performing weighted noise addition again on the initial samples and the new data sequences to obtain further new data sequences; and repeating until the number of data sequences meets a preset requirement, at which point the new data sequences and the initial samples are taken together as the positive samples.
The user-defined classifier may choose other data sequences as negative samples as follows: choosing some or all of the data sequences of user gestures that the predefined classifier can recognize, or one or more random sequences, as negative samples.
In summary, the gesture recognition method proposed by the present invention can effectively recognize both gestures based on arm motion and gestures based on wrist motion, meeting the needs of users with different operating habits. The present invention uses a support vector machine as the classifier and builds the feature vector with an algorithm based on single-dimension whole-segment features, which yields a high recognition rate and good generalization ability. In addition, compared with other gesture recognition technologies, a significant difference of the present invention is that it can provide the user with a custom gesture recognition function and supports an online learning mode, bringing a better and more interesting user experience. To verify the validity of the algorithm, the open-source 6DMG acceleration and angular velocity data were selected for 20 tests, with 80% of the samples used as training data and the remaining 20% as test data. The results show that the average recognition rate reaches 97.92%, demonstrating that the invention is effective and accurate; detailed information is shown in Fig. 7.
The foregoing is merely a description of preferred embodiments of the present invention and is not intended to limit the present invention. Any modification, equivalent substitution, improvement, and the like made within the spirit and principle of the present invention shall be included within the scope of protection of the present invention.

Claims (8)

1. A gesture recognition method, characterized in that the method includes:
obtaining a data sequence of a user gesture, the data sequence including an acceleration data sequence and an angular velocity data sequence;
preprocessing the data sequence to obtain a feature vector;
recognizing the feature vector with a preset predefined classifier to obtain the user gesture corresponding to the feature vector;
when the predefined classifier cannot recognize the feature vector, the method further includes:
recognizing the feature vector with a user-defined classifier set by the user to obtain the user gesture corresponding to the feature vector;
the method further includes:
obtaining the data sequences of a user gesture that the user inputs two or more times in succession;
preprocessing the data sequences to obtain a feature vector for each input of the user gesture;
when none of the feature vectors of the user gesture can be recognized by the predefined classifier, judging whether the similarity comparison result between the data sequences of any two inputs exceeds a preset threshold; if it does not, taking the data sequence of each input as an initial sample;
performing weighted noise addition on the initial samples, and taking the initial samples together with the samples obtained after weighted noise addition as positive samples of the user-defined classifier; choosing other data sequences as negative samples of the user-defined classifier; and training the user-defined classifier with the positive samples and negative samples as training data.
2. The method according to claim 1, characterized in that the data sequence is preprocessed to obtain the feature vector as follows:
performing denoising smoothing, normalization, and dimension standardization on each data sequence in turn;
combining the operation results into the feature vector of the user gesture.
3. The method according to claim 1, characterized in that the weighted noise addition is performed as follows: each datum in the data sequence is multiplied by a random number within a preset range;
and the initial samples are subjected to weighted noise addition and the initial samples together with the samples obtained after weighted noise addition are taken as positive samples of the user-defined classifier as follows:
performing weighted noise addition on the initial samples to obtain new data sequences;
performing weighted noise addition again on the initial samples and the new data sequences to obtain further new data sequences;
and repeating until the number of data sequences meets a preset requirement, at which point the new data sequences and the initial samples are taken together as the positive samples.
4. The method according to claim 1, characterized in that other data sequences are chosen as negative samples of the user-defined classifier as follows:
choosing some or all of the data sequences of user gestures that the predefined classifier can recognize, or one or more random sequences, as negative samples.
5. A gesture recognition device, characterized in that the device includes:
a preprocessing module, configured to obtain a data sequence of a user gesture, the data sequence including an acceleration data sequence and an angular velocity data sequence, and to preprocess the data sequence to obtain a feature vector;
a predefined classifier, configured to recognize the feature vector and obtain the user gesture corresponding to the feature vector;
the device further includes:
a user-defined classifier, configured to recognize the feature vector when the predefined classifier cannot recognize it and obtain the user gesture corresponding to the feature vector;
the user-defined classifier is further configured to:
obtain the data sequences of a user gesture that the user inputs two or more times in succession; when none of the feature vectors of the user gesture can be recognized by the predefined classifier, judge whether the similarity comparison result between the data sequences of any two inputs exceeds a preset threshold; if it does not, take the data sequence of each input as an initial sample;
perform weighted noise addition on the initial samples, and take the initial samples together with the samples obtained after weighted noise addition as positive samples of the user-defined classifier; choose other data sequences as negative samples of the user-defined classifier; and train itself with the positive samples and negative samples as training data.
6. The device according to claim 5, characterized in that the preprocessing module preprocesses the data sequence to obtain the feature vector as follows:
performing denoising smoothing, normalization, and dimension standardization on each data sequence in turn;
combining the operation results into the feature vector of the user gesture.
7. The device according to claim 5, characterized in that the user-defined classifier performs weighted noise addition as follows: each datum in the data sequence is multiplied by a random number within a preset range;
and the user-defined classifier performs weighted noise addition on the initial samples and takes the initial samples together with the samples obtained after weighted noise addition as positive samples as follows:
performing weighted noise addition on the initial samples to obtain new data sequences;
performing weighted noise addition again on the initial samples and the new data sequences to obtain further new data sequences;
and repeating until the number of data sequences meets a preset requirement, at which point the new data sequences and the initial samples are taken together as the positive samples.
8. The device according to claim 5, characterized in that the user-defined classifier chooses other data sequences as negative samples as follows:
choosing some or all of the data sequences of user gestures that the predefined classifier can recognize, or one or more random sequences, as negative samples.
CN201410621500.7A 2014-11-06 2014-11-06 Gesture recognition method and device Active CN104484644B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410621500.7A CN104484644B (en) 2014-11-06 2014-11-06 Gesture recognition method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410621500.7A CN104484644B (en) 2014-11-06 2014-11-06 Gesture recognition method and device

Publications (2)

Publication Number Publication Date
CN104484644A CN104484644A (en) 2015-04-01
CN104484644B true CN104484644B (en) 2018-10-16

Family

ID=52759185

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410621500.7A Active CN104484644B (en) 2014-11-06 2014-11-06 Gesture recognition method and device

Country Status (1)

Country Link
CN (1) CN104484644B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160320850A1 (en) * 2015-04-29 2016-11-03 Samsung Electronics Co., Ltd. User interface control using impact gestures
WO2017050140A1 (en) * 2015-09-23 2017-03-30 歌尔股份有限公司 Method for recognizing a human motion, method for recognizing a user action and smart terminal
CN105912910A (en) * 2016-04-21 2016-08-31 武汉理工大学 Cellphone sensing based online signature identity authentication method and system
CN107638165B (en) * 2016-07-20 2021-01-26 平安科技(深圳)有限公司 Sleep detection method and device
CN106950927B (en) * 2017-02-17 2019-05-17 深圳大学 A kind of method and intelligent wearable device controlling smart home
CN108108015A (en) * 2017-11-20 2018-06-01 电子科技大学 A kind of action gesture recognition methods based on mobile phone gyroscope and dynamic time warping
CN109491507A (en) * 2018-11-14 2019-03-19 南京邮电大学 Gesture identifying device based on FDC2214
CN109902644A (en) * 2019-03-07 2019-06-18 北京海益同展信息科技有限公司 Face identification method, device, equipment and computer-readable medium
CN110618754B (en) * 2019-08-30 2021-09-14 电子科技大学 Surface electromyogram signal-based gesture recognition method and gesture recognition armband
CN112614263A (en) * 2020-12-30 2021-04-06 浙江大华技术股份有限公司 Method and device for controlling gate, computer equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103809748A (en) * 2013-12-16 2014-05-21 天津三星通信技术研究有限公司 Portable terminal and gesture recognition method thereof
CN103995592A (en) * 2014-05-21 2014-08-20 上海华勤通讯技术有限公司 Wearable equipment and terminal information interaction method and terminal

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9959463B2 (en) * 2002-02-15 2018-05-01 Microsoft Technology Licensing, Llc Gesture recognition system using depth perceptive sensors
US10242255B2 (en) * 2002-02-15 2019-03-26 Microsoft Technology Licensing, Llc Gesture recognition system using depth perceptive sensors

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103809748A (en) * 2013-12-16 2014-05-21 天津三星通信技术研究有限公司 Portable terminal and gesture recognition method thereof
CN103995592A (en) * 2014-05-21 2014-08-20 上海华勤通讯技术有限公司 Wearable equipment and terminal information interaction method and terminal

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"一种基于MEMS惯用传感器的手势识别方法";肖茜 等,;《传感技术学报》;20130531;第26卷(第5期);摘要、第2节、图1 *

Also Published As

Publication number Publication date
CN104484644A (en) 2015-04-01

Similar Documents

Publication Publication Date Title
CN104484644B (en) Gesture recognition method and device
Kim et al. deepGesture: Deep learning-based gesture recognition scheme using motion sensors
Li et al. Deep Fisher discriminant learning for mobile hand gesture recognition
WO2018040757A1 (en) Wearable device and method of using same to monitor motion state
WO2017050140A1 (en) Method for recognizing a human motion, method for recognizing a user action and smart terminal
Tian et al. MEMS-based human activity recognition using smartphone
CN109993093A (en) Road anger monitoring method, system, equipment and medium based on face and respiratory characteristic
CN103413113A (en) Intelligent emotional interaction method for service robot
CN107316067A (en) A kind of aerial hand-written character recognition method based on inertial sensor
CN116226691B (en) Intelligent finger ring data processing method for gesture sensing
KR20120052610A (en) Apparatus and method for recognizing motion using neural network learning algorithm
CN107273726B (en) Equipment owner's identity real-time identification method and its device based on acceleration cycle variation law
Shin et al. Korean sign language recognition using EMG and IMU sensors based on group-dependent NN models
Kim et al. Finger language recognition based on ensemble artificial neural network learning using armband EMG sensors
Du et al. Gesture recognition method based on deep learning
Xu et al. Air-writing characters modelling and recognition on modified CHMM
CN107582077A (en) A kind of human body state of mind analysis method that behavior is touched based on mobile phone
CN110443113A (en) A kind of virtual reality Writing method, system and storage medium
CN110327050A (en) People's tumble state embedded intelligence detection method for wearable equipment
CN108108015A (en) A kind of action gesture recognition methods based on mobile phone gyroscope and dynamic time warping
CN108647657A (en) A kind of high in the clouds instruction process evaluation method based on pluralistic behavior data
CN110443309A (en) A kind of electromyography signal gesture identification method of combination cross-module state association relation model
CN112732092A (en) Surface electromyogram signal identification method based on double-view multi-scale convolution neural network
CN107358646A (en) A kind of fatigue detecting system and method based on machine vision
Li et al. Hand gesture recognition and real-time game control based on a wearable band with 6-axis sensors

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant