CN108196668A - A kind of portable gesture recognition system and method - Google Patents
- Publication number: CN108196668A (application CN201711271029.3A)
- Authority
- CN
- China
- Prior art keywords
- gesture
- arm
- target
- finger
- turn signal
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses a portable gesture recognition system and method. The system includes: a sensor armlet, which acquires the motion signals of the arm muscles and the arm's spatial rotation signal during execution of a target gesture and sends them to a data processor; gyroscope gloves, which acquire the spatial rotation signals of each finger joint during execution of the target gesture and send them to the data processor; and the data processor, which extracts from the motion signal, the arm spatial rotation signal, and the finger spatial rotation signals the target distinguishing feature data corresponding to the start point and end point of the target gesture, matches the target distinguishing feature data against the distinguishing feature data of each gesture in a pre-stored gesture classification set, and determines the gesture category of the target gesture according to the matching result. The system and method are highly reliable, recognize gestures effectively, and are highly practical.
Description
Technical field
The present invention relates to the field of gesture recognition, and more particularly to a portable gesture recognition system and method.
Background technology
With the development of electronic and computer technology, traditional mouse and keyboard input can no longer meet users' needs. In recent years, given that gestures are intuitive and natural, input based on user gestures has become an important means of human-computer interaction. Gesture-based interaction puts the user at the center and better matches users' natural communication habits, providing a natural and effective mode of interaction, while also requiring the system to be easy to carry and pleasant to use. Since the real-time performance and accuracy of gesture recognition are critical for natural interaction, research on gesture recognition methods is of great significance to natural human-computer interaction.
However, conventional gesture recognition systems and methods rely on gloves fitted with gyroscope sensors to recognize gestures; the large number of gyroscopes and associated devices reduces reliability, the circuit design is complex, and the equipment is heavy and inconvenient to wear. A new recognition system is urgently needed to solve these problems.
Invention content
In view of the above technical problems, the present invention provides a portable gesture recognition system and method that are highly reliable and can recognize gestures effectively.
To solve the above technical problems, the technical solution adopted by the present invention is as follows. A portable gesture recognition system is provided, including a data processor and a sensor armlet and gyroscope gloves connected to the data processor;
the sensor armlet acquires, during execution of a target gesture, the motion signals of the arm muscles and the arm's spatial rotation signal, and sends the motion signal and the arm spatial rotation signal to the data processor;
the gyroscope gloves acquire, during execution of the target gesture, the spatial rotation signals of each finger joint, and send the finger spatial rotation signals to the data processor;
the data processor extracts, from the motion signal, the arm spatial rotation signal, and the finger spatial rotation signals, the target distinguishing feature data corresponding to the start point and end point of the target gesture, matches the target distinguishing feature data against the distinguishing feature data of each gesture in a pre-stored gesture classification set, and determines the gesture category of the target gesture according to the matching result.
With the above technical solution, the present invention achieves the following technical effects: through the provided sensor armlet and gyroscope gloves, the motion signals of the arm muscles, the arm's spatial rotation signal, and the spatial rotation signals of each finger joint can be effectively detected during a gesture; from the detected signals the data processor extracts the target distinguishing feature data of the gesture, matches the extracted feature data against the distinguishing feature data of each gesture in the pre-stored gesture classification set, and determines the gesture category of the current target gesture according to the matching result. The system is relatively simple in structure, recognizes gestures effectively with a high recognition rate, and is highly practical.
Preferably, in the above technical solution, the sensor armlet includes 8 electromyography (EMG) signal sensors, 1 acceleration sensor, and 3 first gyroscope sensors, all connected to the data processor;
the EMG signal sensors are arranged on the inner side of the sensor armlet so as to contact the forearm muscles, acquire the motion signals, and send the motion signals to the data processor;
the acceleration sensor is arranged in the sensor armlet, acquires the arm's movement angle and movement direction in space, and sends them to the data processor;
the first gyroscope sensors are arranged in the sensor armlet, acquire the arm's rotation parameters in space, and send the rotation parameters to the data processor.
Preferably, in the above technical solution, the gyroscope gloves include N second gyroscope sensors connected to the data processor, where N is a positive integer greater than or equal to 5;
each second gyroscope sensor is arranged in the gyroscope gloves so as to contact the back of the tip of the corresponding finger, acquires the rotation angle and rotation direction of each finger joint in space, and sends the rotation angle and rotation direction to the data processor.
Preferably, in the above technical solution, the recognition system further includes a communication module connecting the EMG signal sensors, the acceleration sensor, the first gyroscope sensors, and the second gyroscope sensors to the data processor;
the communication module forwards the data detected by the EMG signal sensors, the acceleration sensor, the first gyroscope sensors, and the second gyroscope sensors to the data processor.
A portable gesture recognition method is also provided, including the following steps:
acquiring, during execution of a target gesture, the motion signals of the arm muscles, the arm's spatial rotation signal, and the spatial rotation signals of each finger joint;
extracting, from the motion signals of the arm muscles, the arm spatial rotation signal, and the finger spatial rotation signals of the target gesture, the target distinguishing feature data corresponding to the gesture's start point and end point; matching the target distinguishing feature data against the distinguishing feature data of each gesture in a pre-stored gesture classification set; and determining the gesture category of the target gesture according to the matching result.
With the above technical solution, the present invention achieves the following technical effects: by extracting, from the acquired motion signals of the arm muscles, the arm spatial rotation signal, and the finger-joint spatial rotation signals during execution of the target gesture, the target distinguishing feature data corresponding to the gesture's start and end points, and matching the target distinguishing feature data against the distinguishing feature data of each gesture in the pre-stored gesture classification set, the gesture category of the current target gesture is determined according to the matching result. The method recognizes gestures effectively with a high recognition rate and is highly practical.
Preferably, in the above technical solution, before acquiring the motion signals of the target arm muscles, the arm spatial rotation signal, and the spatial rotation signals of each finger joint, the method further includes the following steps:
acquiring, during execution of a standard gesture, the motion signals of the arm muscles, the arm's spatial rotation signal, and the spatial rotation signals of each finger joint;
extracting the distinguishing feature data corresponding to the start point and end point of the standard gesture, building a gesture classification set from the distinguishing feature data corresponding to the standard gestures, and storing the built gesture classification set.
Description of the drawings
The invention is further described below with reference to the accompanying drawings:
Fig. 1 is a schematic structural diagram of the portable gesture recognition system provided by the invention;
Fig. 2 is a schematic flow chart of one embodiment of the portable gesture recognition method provided by the invention;
Fig. 3 is a schematic flow chart of another embodiment of the portable gesture recognition method provided by the invention.
Specific embodiments
As shown in Fig. 1, the portable gesture recognition system provided by the invention includes a data processor and a sensor armlet and gyroscope gloves connected to the data processor;
the sensor armlet acquires, during execution of a target gesture, the motion signals of the arm muscles and the arm's spatial rotation signal, and sends the motion signal and the arm spatial rotation signal to the data processor;
the gyroscope gloves acquire, during execution of the target gesture, the spatial rotation signals of each finger joint, and send the finger spatial rotation signals to the data processor;
the data processor extracts, from the motion signal, the arm spatial rotation signal, and the finger spatial rotation signals, the target distinguishing feature data corresponding to the start point and end point of the target gesture, matches the target distinguishing feature data against the distinguishing feature data of each gesture in the pre-stored gesture classification set, and determines the gesture category of the target gesture according to the matching result.
Here, the target gesture is the gesture whose gesture category is to be determined.
Preferably, in the above technical solution, the sensor armlet includes 8 EMG signal sensors, 1 acceleration sensor, and 3 first gyroscope sensors connected to the data processor;
the EMG signal sensors are arranged on the inner side of the sensor armlet so as to contact the forearm muscles, acquire the motion signals, and send the motion signals to the data processor;
the acceleration sensor is arranged in the sensor armlet, acquires the arm's movement angle and movement direction in space, and sends them to the data processor;
the first gyroscope sensors are arranged in the sensor armlet, acquire the arm's rotation parameters in space, and send the rotation parameters to the data processor.
Preferably, in the above technical solution, the gyroscope gloves include N second gyroscope sensors connected to the data processor, where N is a positive integer greater than or equal to 5;
each second gyroscope sensor is arranged in the gyroscope gloves so as to contact the back of the tip of the corresponding finger, acquires the rotation angle and rotation direction of each finger joint in space, and sends the rotation angle and rotation direction to the data processor.
Preferably, in the above technical solution, the recognition system further includes a communication module connecting the EMG signal sensors, the acceleration sensor, the first gyroscope sensors, and the second gyroscope sensors to the data processor;
the communication module forwards the data detected by the EMG signal sensors, the acceleration sensor, the first gyroscope sensors, and the second gyroscope sensors to the data processor.
As shown in Fig. 2, the present invention also provides a portable gesture recognition method. The devices used in the method and the operations they perform are described in detail in the embodiments corresponding to Fig. 1 and are not repeated in the method embodiments.
The portable gesture recognition method is as follows:
S110: acquiring, during execution of a target gesture, the motion signals of the arm muscles, the arm's spatial rotation signal, and the spatial rotation signals of each finger joint;
S120: extracting, from the motion signals of the arm muscles, the arm spatial rotation signal, and the finger spatial rotation signals of the target gesture, the target distinguishing feature data corresponding to the gesture's start point and end point; matching the target distinguishing feature data against the distinguishing feature data of each gesture in the pre-stored gesture classification set; and determining the gesture category of the target gesture according to the matching result.
In the above technical solution, by extracting from the acquired motion signals of the arm muscles, the arm spatial rotation signal, and the finger-joint spatial rotation signals during execution of the target gesture, the target distinguishing feature data corresponding to the target gesture's start and end points can be determined; by matching the target distinguishing feature data against the distinguishing feature data of each gesture in the pre-stored gesture classification set, the gesture category of the current target gesture is determined according to the matching result. Gestures can be recognized effectively, with a high recognition rate and high practicability.
As shown in Fig. 3, the following improvement is made on the basis of the embodiment corresponding to Fig. 2:
S105: acquiring, during execution of a standard gesture, the motion signals of the arm muscles, the arm's spatial rotation signal, and the spatial rotation signals of each finger joint; extracting the distinguishing feature data corresponding to the start point and end point of the standard gesture; building a gesture classification set from the distinguishing feature data corresponding to the standard gestures; and storing the built gesture classification set;
S110: acquiring, during execution of a target gesture, the motion signals of the arm muscles, the arm's spatial rotation signal, and the spatial rotation signals of each finger joint;
S120: extracting, from the motion signals of the arm muscles, the arm spatial rotation signal, and the finger spatial rotation signals of the target gesture, the target distinguishing feature data corresponding to the gesture's start point and end point; matching the target distinguishing feature data against the distinguishing feature data of each gesture in the pre-stored gesture classification set; and determining the gesture category of the target gesture according to the matching result.
Here, the standard gesture is a specifically performed reference gesture. By extracting from the acquired motion signals of the reference gesture's arm muscles, the arm spatial rotation signal, and the finger-joint spatial rotation signals, the distinguishing feature data corresponding to the reference gesture's start point and end point can be clearly determined, and a gesture classification set is built from the acquired distinguishing feature data. The gesture classification set contains the distinguishing feature data of multiple standard gestures, each item of distinguishing feature data corresponding to one gesture category.
In the above technical solution, by acquiring the motion signals of the arm muscles, the arm spatial rotation signal, and the finger-joint spatial rotation signals during execution of standard gestures, a gesture classification set carrying the distinguishing feature data of the standard gestures can be built from the acquired signals, providing a better reference for verifying target gestures and enabling more accurate recognition of target gestures.
Further, after acquiring the motion signals of the target gesture's arm muscles, the arm spatial rotation signal, and the finger spatial rotation signals, the acquired signals can also be screened. Each target gesture comprises 3 different kinds of signals, and the signals corresponding to one target gesture form one group; during screening, any group lacking one of the signals is rejected, i.e., invalid target gesture signals are discarded.
In the above technical solution, rejecting invalid target gesture signals effectively simplifies the recognition flow and improves the accuracy of recognizing target gesture signals.
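The screening step above can be sketched minimally as follows, assuming each sample group is a dictionary with one entry per signal type; the field names are illustrative, as the patent does not specify a data format:

```python
def screen_gesture_samples(samples):
    """Reject sample groups missing any of the three required signal types.

    Each group should hold an EMG motion signal, an arm spatial-rotation
    signal, and a finger spatial-rotation signal; a group lacking any one
    of them is treated as an invalid target gesture and dropped.
    (Field names are illustrative, not taken from the patent.)
    """
    required = ("emg", "arm_rotation", "finger_rotation")
    return [s for s in samples if all(s.get(k) is not None for k in required)]
```

A group with a missing or empty-valued signal entry is simply filtered out before feature extraction, which matches the patent's description of rejecting incomplete groups.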
A gesture comprises finger movement and arm movement, and the two must be combined to determine the gesture. In finger-movement recognition, the data that play a key role are acquired by the 8 EMG signal sensors, 1 acceleration sensor, and 3 first gyroscope sensors connected to the data processor in the sensor armlet, together with the N second gyroscope sensors connected to the data processor in the gyroscope gloves, where N is a positive integer greater than or equal to 5. In arm-movement recognition, the data that play a key role are acquired by the 8 EMG signal sensors, 1 acceleration sensor, and 3 first gyroscope sensors in the sensor armlet.
Training data generation, i.e., acquisition of standard gesture data:
1. Finger movement data acquisition algorithm:
(1) sample the instantaneous angular velocity of each second gyroscope sensor along the x, y, and z axes at a frequency of 400 Hz;
(2) apply trapezoidal integration to the angular velocities obtained in (1) at time intervals of 2.5×10⁻³ s to obtain the angle values along the x, y, and z axes;
(3) in a concrete implementation, the second gyroscope sensors comprise gyroscope sensors Nos. 0-4 at the tips of the five fingers and gyroscope sensor No. 5 on the back of the hand. Subtract the three axial angle values of sensor No. 5 from the x, y, z angle values of sensors Nos. 1-4; for sensor No. 0, swap the x and z angular velocity values first and then subtract the three axial angle values of sensor No. 5. The resulting three axial angle values are the distinguishing feature data of the finger angles within the distinguishing feature data;
(4) store the result obtained in (3);
(5) perform steps (1)-(4) for all gestures; this yields the distinguishing feature data of the finger angles during execution of all gestures, and the acquired distinguishing feature data of the finger rotation angles are stored.
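Steps (1)-(3) can be sketched as follows; the sketch assumes angular velocities arrive as (x, y, z) tuples sampled at 400 Hz, and the function names are illustrative rather than taken from the patent:

```python
def integrate_angles(omega_samples, dt=2.5e-3):
    """Trapezoidal integration of per-axis angular velocity samples
    (taken at 400 Hz, so dt = 2.5e-3 s) into accumulated rotation angles.

    omega_samples: sequence of (x, y, z) angular-velocity tuples.
    Returns the accumulated (x, y, z) angle after each interval.
    """
    angles = [0.0, 0.0, 0.0]
    prev = omega_samples[0]
    out = []
    for cur in omega_samples[1:]:
        for ax in range(3):
            # trapezoid rule: average of the two endpoint velocities times dt
            angles[ax] += 0.5 * (prev[ax] + cur[ax]) * dt
        prev = cur
        out.append(tuple(angles))
    return out

def relative_finger_angles(finger_angles, hand_angles):
    """Subtract the back-of-hand sensor's (No. 5) angles so each finger
    reading is expressed relative to the hand, as step (3) describes."""
    return [tuple(f - h for f, h in zip(fa, ha))
            for fa, ha in zip(finger_angles, hand_angles)]
```

For a constant angular velocity of 1 rad/s about x, four 2.5 ms intervals accumulate 0.01 rad, which is a quick sanity check on the integration.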
2. Training data generation
(1) classify the distinguishing feature data of the finger rotation angles obtained in 1 by gesture type;
(2) after classification, count for each gesture the total number of distinct values that occur, denoted k;
(3) for each axial value that occurs, count its frequency of occurrence within each class of the gesture library, obtaining Ai (i = 0, 1, 2, ..., k);
(4) for each gesture after classification, count its total number of data points, denoted Si (i = 1, 2, ..., k);
(5) compute the occurrence probability of each value for each gesture: Pi = Ai / Si;
(6) cluster the Pi of the different gestures by gesture class, obtaining nj (j = 0, 1, 2, 3, 4, 5) classification ranges of Pi, where j is the number of the corresponding gyroscope;
(7) take the midpoint value of each classification range as the weight of that range, denoted Wn, where n is the range number obtained in (6);
(8) for each gesture, record the weight Wn and the upper and lower bounds of the range to which each group corresponding to each gyroscope sensor of the gesture belongs; writing nj records per gesture forms the gesture recognition library R, i.e., the gesture classification set.
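The patent does not say how the Pi are clustered into ranges; the sketch below uses simple equal-width binning as one plausible reading of steps (1)-(8), with all names illustrative:

```python
from collections import Counter

def build_gesture_library(labeled_values, n_ranges=6):
    """Rough sketch of training-data generation: per gesture, turn the
    observed angle values into occurrence probabilities Pi = Ai / Si,
    then bucket the probabilities into ranges whose midpoints serve as
    the range weights Wn. Equal-width binning stands in for the patent's
    unspecified clustering of the Pi; this is an assumption.
    """
    library = {}
    for gesture, values in labeled_values.items():
        counts = Counter(values)                  # Ai per distinct value
        total = len(values)                       # Si
        probs = {v: c / total for v, c in counts.items()}  # Pi = Ai / Si
        lo, hi = min(probs.values()), max(probs.values())
        width = (hi - lo) / n_ranges or 1.0       # avoid zero-width bins
        ranges = []
        for j in range(n_ranges):
            lo_j, hi_j = lo + j * width, lo + (j + 1) * width
            ranges.append({"bounds": (lo_j, hi_j),
                           "weight": (lo_j + hi_j) / 2})  # midpoint weight Wn
        library[gesture] = ranges
    return library
```

Each gesture's entry in the resulting dictionary plays the role of its records in the recognition library R: a set of probability ranges with midpoint weights.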
3. Finger movement recognition algorithm:
(1) sample the instantaneous angular velocities of the gyroscope sensors along the x, y, and z axes at 400 Hz;
(2) apply trapezoidal integration to the angular velocities obtained in (1) at time intervals of 2.5×10⁻³ s to obtain the z, y, and x axial angle values;
(3) subtract the three axial angle values of gyroscope sensor No. 5 from the x, y, z angle values of gyroscopes Nos. 1-4; for gyroscope No. 0, swap the x and z axial angular velocity values and then subtract the three axial angle values of sensor No. 5; denote the results XAj, YAj, ZAj (j = 0, 1, 2, 3, 4, 5), and denote each group Dj = [XAj, YAj, ZAj]. The resulting array is the target distinguishing feature data of the acquired target gesture;
(4) match each axial reading of each gyroscope sensor, i.e., the target distinguishing feature data in the corresponding finger readings Dj, against the distinguishing feature data of the finger motion parameters in the gesture recognition library R. For each gesture, test whether the three axial components of Dj fall within the effective range recorded for the corresponding recognition position in R; if they fall within the corresponding effective range, the weight of the range they belong to is taken as the second eigenvalue of that recognition position for the gesture, denoted t2, which then contributes to determining the finger movement corresponding to the gesture;
(5) if in step (4) the real-time data does not fall within any effective range, classify it against the recognition-position weights using the KNN algorithm to determine the weight for that recognition position, and store this group of data in a personalized database Rc;
(6) match the obtained readings against all gestures in the manner of step (4), obtaining for each gesture the second eigenvalue t2j of each recognition position for that reading; each group of eigenvalues forms the eigenvector Tkj = [t21j, t22j, t23j, t24j, t25j];
(7) compare the eigenvectors Tkj produced by all gestures; for the gesture corresponding to the maximum value of each element, the first eigenvalue of that recognition position is set to 1, otherwise 0, yielding each gesture's first eigenvector t1j = [t11, t12, t13, t14, t15];
(8) sum the first eigenvalues produced by the reading for each gesture, denoted T1;
(9) check whether there is a unique highest T1 value; if so, return that gesture as the recognition result;
(10) if there is no unique highest T1, compute the second eigenvalue T2j = Tkj * E for the gestures sharing the highest T1 value;
(11) among the gestures with the same highest T1 value, return the gesture with the unique highest T2 value as the recognition result;
(12) if T1 and T2 combined still yield no unique gesture, return the record with the lowest record number in database order.
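The two-stage scoring in steps (4) and (8)-(11) can be sketched as follows; range records are simplified to (low, high, weight) triples per sensor, and the KNN fallback of step (5) is omitted, so this is a reading of the scoring logic rather than the full algorithm:

```python
def classify_gesture(readings, library):
    """Two-stage scoring sketch: each sensor reading that falls inside a
    gesture's recorded range scores 1 toward the first-stage sum (T1) and
    the range's weight toward the second-stage sum (T2). The gesture with
    the unique highest T1 wins; ties on T1 are broken by T2.

    library maps gesture name -> list of (low, high, weight) per sensor.
    """
    scores = {}
    for gesture, ranges in library.items():
        t1 = t2 = 0.0
        for value, (low, high, weight) in zip(readings, ranges):
            if low <= value <= high:
                t1 += 1          # first eigenvalue contribution
                t2 += weight     # second eigenvalue contribution
        scores[gesture] = (t1, t2)
    best_t1 = max(s[0] for s in scores.values())
    candidates = [g for g, s in scores.items() if s[0] == best_t1]
    if len(candidates) == 1:
        return candidates[0]
    # tie on T1: fall back to the weighted second-stage score T2
    return max(candidates, key=lambda g: scores[g][1])
```

A final tie on both scores would, per step (12), be broken by record order; the sketch simply returns `max`'s first winner in that case.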
4. Arm movement data acquisition:
While the above process is performed, support from the 8 EMG signal sensors, 1 acceleration sensor, and 3 first gyroscope sensors in the sensor armlet is also required. While a gesture is being performed, part of the arm musculature also moves: the EMG signal sensors acquire the motion signals of the forearm muscles, the acceleration sensor acquires the arm's movement angle and movement direction in space, and the first gyroscope sensors acquire the arm's rotation parameters in space.
4.1 Arm movement data acquisition algorithm
The active segment is extracted from the amplitude of the EMG signal, with the start point and end point of the EMG signal taken as the start point and end point of the pressure signal. Active-segment detection is applied to the EMG signal using a moving-average method, and any gesture signal whose duration falls below a given threshold T is regarded as an unconscious movement of the user; the threshold T is determined after acquisition by analyzing multiple complete gesture signals, and T = 800 ms. The mean and standard deviation SD of the pressure signal are chosen to represent the pressure-signal features, and the median frequency MF and standard deviation SD of the EMG signal are chosen to represent the EMG-signal features. SD and MF are computed as in formulas (1), (2), (3), which are absent from this text; in their standard form, for N samples Xi with mean X̄, SD = sqrt((1/N) Σ (Xi − X̄)²), and MF is the frequency at which the cumulative power of PSD(x) reaches half the total power:
where Xi denotes the currently acquired signal and PSD(x) is the power spectral density of the EMG signal. The EMG feature vector of one valid gesture is expressed as Eemg = [e1, e2], where e1 and e2 are the median frequency and standard deviation of the EMG signal; the pressure-signal feature vector of one valid gesture is Efsr = [e3, e4], where e3 and e4 are the mean and standard deviation of the pressure signal; and the spatial-rotation feature vector of one valid gesture is Emng = [e5, e6]. The EMG feature vector Eemg = [e1, e2], the gesture pressure-signal feature vector Efsr = [e3, e4], and the spatial-rotation feature vector Emng = [e5, e6] are labeled as the arm distinguishing feature data within the distinguishing feature data, and an arm feature recognition library is built from each group of arm distinguishing feature data.
4.2 Arm movement recognition
When the target gesture is compared with the standard gestures, the feature vectors of the three parameters during execution of the target gesture, i.e., the target arm distinguishing feature data within the target distinguishing feature data, are likewise obtained with the above formulas and compared with the arm distinguishing feature data in the stored distinguishing feature data. It is judged whether the three feature vectors of the target arm distinguishing feature data fall within the three feature vectors of the arm distinguishing feature data; if all three do, the arm movement corresponding to the target arm distinguishing feature data can be determined, and the final gesture is determined from the finger movement and the arm movement.
By combining the determined finger movement with the corresponding arm movement, the complete action of the entire gesture can be accurately determined. Obtaining and judging the EMG feature vector, the gesture pressure-signal feature vector, and the gesture spatial-rotation feature vector allows the gesture to be judged more accurately.
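The comparison above can be sketched as follows; the patent says the target vectors must "fall within" the reference vectors without defining the test, so component-wise agreement within a tolerance is used here as an assumption, with all names illustrative:

```python
def match_arm_action(target, reference, tolerance=0.1):
    """Sketch of the arm-movement comparison: the target's three feature
    vectors (Eemg, Efsr, Emng) are checked component-wise against the
    stored reference vectors; the match succeeds only if every component
    of all three vectors agrees within `tolerance`. The tolerance-based
    reading of "falls within" is an assumption, not the patent's text."""
    for tvec, rvec in zip(target, reference):
        for t, r in zip(tvec, rvec):
            if abs(t - r) > tolerance:
                return False
    return True
```

A successful match identifies the arm movement, which is then combined with the recognized finger movement to yield the complete gesture.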
The above embodiments are intended to illustrate the invention so that those skilled in the art can implement or use it; modifications to the above embodiments will be readily apparent to those skilled in the art. The invention therefore includes, but is not limited to, the above embodiments; any method, process, or product that conforms to the claims or the description and accords with the principles and the novel and inventive features disclosed herein falls within the scope of protection of the invention.
Claims (6)
1. A portable gesture recognition system, characterized by comprising a data processor, and a sensor armband and a gyroscope glove connected to the data processor;
the sensor armband is configured to acquire, during execution of a target gesture, a motion signal of the arm muscles and a spatial-rotation signal of the arm, and to send the motion signal and the arm spatial-rotation signal to the data processor;
the gyroscope glove is configured to acquire, during execution of the target gesture, a spatial-rotation signal for each finger joint, and to send the finger spatial-rotation signal to the data processor;
the data processor is configured to extract, from the motion signal, the arm spatial-rotation signal and the finger spatial-rotation signal, the target distinguishing-feature data corresponding to the start point and the end point of the target gesture, to match the target distinguishing-feature data against the distinguishing-feature data of the gestures in a pre-stored gesture classification set, and to determine the gesture class of the target gesture according to the matching result.
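Claim 1 describes extracting distinguishing-feature data at the start point and end point of the target gesture from the combined signal streams. A hypothetical sketch of that step follows; the energy threshold, the segmentation rule and the "difference feature" definition are assumptions for illustration only, not the patent's actual algorithm.

```python
def find_start_end(emg_energy, threshold=0.5):
    """Return (start, end) sample indices where muscle activity crosses the threshold."""
    active = [i for i, e in enumerate(emg_energy) if e >= threshold]
    if not active:
        return None  # no gesture detected in this window
    return active[0], active[-1]

def difference_feature(rotation_stream, start, end):
    """Per-axis change in spatial rotation between gesture start and gesture end."""
    return [e - s for s, e in zip(rotation_stream[start], rotation_stream[end])]
```

Here the EMG energy stream acts as the gesture segmenter, and the spatial-rotation readings at the two detected instants yield a compact feature for matching against the pre-stored set.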
2. The portable gesture recognition system of claim 1, characterized in that the sensor armband comprises eight electromyographic (EMG) sensors, one acceleration sensor and three first gyroscope sensors connected to the data processor;
the EMG sensors are arranged on the inner side of the sensor armband so as to contact the forearm musculature, and are configured to acquire the motion signal and send the motion signal to the data processor;
the acceleration sensor is arranged in the sensor armband and is configured to acquire the motion angle and motion direction of the arm in space and send the motion angle and motion direction to the data processor;
the first gyroscope sensors are arranged in the sensor armband and are configured to acquire the rotation parameters of the arm in space and send the rotation parameters to the data processor.
3. The portable gesture recognition system of claim 2, characterized in that the gyroscope glove comprises N second gyroscope sensors connected to the data processor, wherein N is a positive integer greater than or equal to 5;
each second gyroscope sensor is arranged in the gyroscope glove so as to contact the back of the distal segment of a corresponding finger, and is configured to acquire the rotation angle and rotation direction of each finger joint in space and send the rotation angle and rotation direction to the data processor.
4. The portable gesture recognition system of claim 2 or 3, characterized in that the recognition system further comprises a communication module connecting the EMG sensors, the acceleration sensor, the first gyroscope sensors and the second gyroscope sensors to the data processor;
the communication module is configured to forward the data detected by the EMG sensors, the acceleration sensor, the first gyroscope sensors and the second gyroscope sensors to the data processor.
5. A portable gesture recognition method, characterized by comprising the following steps:
acquiring, during execution of a target gesture, the motion signal of the arm muscles, the spatial-rotation signal of the arm and the spatial-rotation signal of each finger joint;
extracting, from the motion signal of the arm muscles, the arm spatial-rotation signal and the finger spatial-rotation signal of the target gesture, the target distinguishing-feature data corresponding to the start point and the end point of the target gesture; matching the target distinguishing-feature data against the distinguishing-feature data of the gestures in a pre-stored gesture classification set; and determining the gesture class of the target gesture according to the matching result.
6. The portable gesture recognition method of claim 5, characterized by further comprising, before acquiring the motion signal of the target arm muscles, the arm spatial-rotation signal and the spatial-rotation signal of each finger joint, the following steps:
acquiring, during execution of a standard gesture, the motion signal of the arm muscles, the spatial-rotation signal of the arm and the spatial-rotation signal of each finger joint;
extracting the distinguishing-feature data corresponding to the start point and the end point of the standard gesture, building a gesture classification set from the distinguishing-feature data corresponding to the standard gestures, and storing the built gesture classification set.
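Claim 6 describes an enrollment phase: recording standard (reference) gestures, extracting each one's distinguishing-feature data, and storing the result as the gesture classification set used later for matching. A minimal sketch under assumed names follows; the patent does not specify the data structure, and the dictionary representation here is purely illustrative.

```python
def build_gesture_set(labeled_recordings, extract_feature):
    """Build the pre-stored gesture classification set.

    labeled_recordings: iterable of (label, recording) pairs for standard gestures.
    extract_feature:    callable turning one recording into its distinguishing feature.
    """
    gesture_set = {}
    for label, recording in labeled_recordings:
        gesture_set[label] = extract_feature(recording)
    return gesture_set
```

At recognition time (claim 5), the same `extract_feature` would be applied to the target gesture and the result compared against every entry of the stored set.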
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711271029.3A CN108196668B (en) | 2017-12-05 | 2017-12-05 | Portable gesture recognition system and method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711271029.3A CN108196668B (en) | 2017-12-05 | 2017-12-05 | Portable gesture recognition system and method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108196668A true CN108196668A (en) | 2018-06-22 |
CN108196668B CN108196668B (en) | 2021-08-03 |
Family
ID=62573761
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711271029.3A Active CN108196668B (en) | 2017-12-05 | 2017-12-05 | Portable gesture recognition system and method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108196668B (en) |
- 2017-12-05: CN application CN201711271029.3A filed; granted as CN108196668B (status: Active)
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7565295B1 (en) * | 2003-08-28 | 2009-07-21 | The George Washington University | Method and apparatus for translating hand gestures |
CN103777752A (en) * | 2013-11-02 | 2014-05-07 | 上海威璞电子科技有限公司 | Gesture recognition device based on arm muscle current detection and motion sensor |
CN104485037A (en) * | 2015-01-12 | 2015-04-01 | 重庆中电大宇卫星应用技术研究所 | Gesture sound making talking glove for the deaf and dumb |
CN105138133A (en) * | 2015-09-14 | 2015-12-09 | 李玮琛 | Biological signal gesture recognition device and method |
CN106383579A (en) * | 2016-09-14 | 2017-02-08 | 西安电子科技大学 | EMG and FSR-based refined gesture recognition system and method |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109656358A (en) * | 2018-11-23 | 2019-04-19 | 南京麦丝特精密仪器有限公司 | A kind of multidimensional sign Language Recognition Method |
CN110703910A (en) * | 2019-09-26 | 2020-01-17 | 深圳大学 | Gesture recognition method and system based on smart watch |
CN110703910B (en) * | 2019-09-26 | 2022-07-12 | 深圳大学 | Gesture recognition method and system based on smart watch |
CN110794961A (en) * | 2019-10-14 | 2020-02-14 | 无锡益碧医疗科技有限公司 | Wearable gesture analysis system |
CN113031775A (en) * | 2021-03-24 | 2021-06-25 | Oppo广东移动通信有限公司 | Gesture data acquisition method and device, terminal and storage medium |
WO2022199312A1 (en) * | 2021-03-24 | 2022-09-29 | Oppo广东移动通信有限公司 | Gesture data acquisition method and apparatus, terminal, and storage medium |
CN113419622A (en) * | 2021-05-25 | 2021-09-21 | 西北工业大学 | Submarine operation instruction control system interaction method and device based on gesture operation |
CN114167996A (en) * | 2022-02-14 | 2022-03-11 | 浙江强脑科技有限公司 | Sensor-based action pre-judging method and device and storage medium |
CN114167996B (en) * | 2022-02-14 | 2022-05-17 | 浙江强脑科技有限公司 | Sensor-based action pre-judging method and device and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN108196668B (en) | 2021-08-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108196668A (en) | A kind of portable gesture recognition system and method | |
US7259756B2 (en) | Method and apparatus for selecting information in multi-dimensional space | |
CN106897592B (en) | User authentication method, user authentication device, and writing instrument | |
Wang et al. | Human activity recognition with user-free accelerometers in the sensor networks | |
JP6064280B2 (en) | System and method for recognizing gestures | |
JP2011523730A (en) | Method and system for identifying a user of a handheld device | |
CN105446461A (en) | Gesture recognition method, palm virtual keyboard using same, and input method | |
US20150370327A1 (en) | Virtual input device and virtual input method | |
CN107678550A (en) | A kind of sign language gesture recognition system based on data glove | |
CN110471529A (en) | Act methods of marking and device | |
CN108268132B (en) | Gesture recognition method based on glove acquisition and man-machine interaction device | |
CN108371545A (en) | A kind of human arm action cognitive method based on Doppler radar | |
CN108629170A (en) | Personal identification method and corresponding device, mobile terminal | |
CN107368820A (en) | One kind becomes more meticulous gesture identification method, device and equipment | |
KR20070060580A (en) | Apparatus and method for handwriting recognition using acceleration sensor | |
CN107533371A (en) | Controlled using the user interface for influenceing gesture | |
Plouffe et al. | Natural human-computer interaction using static and dynamic hand gestures | |
CN110412566A (en) | A kind of fine granularity human arm motion's recognition methods based on Doppler radar time and frequency domain characteristics | |
CN110866468A (en) | Gesture recognition system and method based on passive RFID | |
CN111580660B (en) | Operation triggering method, device, equipment and readable storage medium | |
Iyer et al. | Generalized hand gesture recognition for wearable devices in IoT: Application and implementation challenges | |
Kakkoth et al. | Survey on real time hand gesture recognition | |
CN107992193A (en) | Gesture confirmation method, device and electronic equipment | |
CN110929766A (en) | Self-adaptive pedestrian mobile phone attitude identification method based on Gaussian mixture clustering algorithm | |
JP3678388B2 (en) | Pen-type input device and pattern recognition method for pen-type input device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CP03 | Change of name, title or address | ||

Address after: 400000 # 28-19, No. 28, Wusi Road, Yuzhong District, Chongqing
Patentee after: Chongqing Zhongdian Dayu Satellite Application Technology Research Institute Co.,Ltd.
Address before: 400065 No.39, Wenfeng section, Huangshan Town, Nan'an District, Chongqing
Patentee before: CHONGQING ZHONGDIAN DAYU SATELLITE APPLICATION TECHNOLOGY INSTITUTE