CN108268132A - Gesture recognition method and human-computer interaction device based on glove-acquired data - Google Patents

Gesture recognition method and human-computer interaction device based on glove-acquired data Download PDF

Info

Publication number
CN108268132A
CN108268132A (application number CN201711432163.7A)
Authority
CN
China
Prior art keywords
axis
gesture
acceleration
angular velocity
feature sequence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711432163.7A
Other languages
Chinese (zh)
Other versions
CN108268132B (en)
Inventor
崔剑
王凡瑜
刘妍
王丹盈
王林波
魏怡琳
罗震宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University
Priority to CN201711432163.7A
Publication of CN108268132A
Application granted
Publication of CN108268132B
Active legal status
Anticipated expiration

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a gesture recognition method and a human-computer interaction device based on data acquired by a glove, belonging to the technical field of gesture recognition. The device comprises a data-acquisition glove, a signal conditioner, and a processing module. The recognition method first counts the acquired data; if the counter value is not below a preset threshold, it finds the axis with the maximum angular-velocity energy, extracts the feature sequences of the corresponding acceleration and/or angular velocity, and matches them against the feature sequences of the user-defined gestures. If they do not match, the acceleration energies of the X, Y, and Z axes are computed respectively and the maximum-energy axis is found, and the steps above are repeated. If there is still no match, the feature sequences of at least one acceleration and/or angular velocity corresponding to the maximum axes are combined in turn and matching continues until it succeeds; the bending state of each finger is then judged and the recognition result is given. The invention has good application potential in smart home appliance control, in industrial control, and even on the battlefield.

Description

Gesture recognition method and human-computer interaction device based on glove-acquired data
Technical field
The invention belongs to the technical field of gesture recognition, and specifically relates to a gesture recognition method and a human-computer interaction device based on data acquired by a glove.
Background technology
With the rapid development of electronic products in recent years, all kinds of smart devices keep entering people's field of view. Human-computer interaction is one of the key links in the operation of smart devices, but traditional input devices such as keyboards, mice, and remote controls fall slightly short in portability, which affects the user experience to some extent. Therefore, novel interaction modes such as gesture control and voice control have attracted more and more attention. For users, gesture control has the inherent advantages of being natural and intuitive, making it a very good mode of human-computer interaction.
Gesture recognition technology is the key problem of gesture control; at present there are mainly two routes, vision-based and glove-based. As artificial intelligence grows ever hotter, vision-based gesture recognition has matured considerably and has even been applied in Samsung smart TVs. By contrast, glove-based schemes have received far less attention. For example, Document 1: Xiao Qian, Yang Ping, Xu Libo. A gesture recognition method based on MEMS inertial sensors [J]. Chinese Journal of Sensors and Actuators, 2013(05): 611-615; Document 2: Chen Yi, Yang Ping, Chen Xuguang. A gesture recognition method based on acceleration feature extraction [J]. Chinese Journal of Sensors and Actuators, 2012(08): 1073-1078; Document 3: Chen Yi. Research on gesture recognition for mobile electronic devices based on MEMS inertial sensors [D]. University of Electronic Science and Technology of China, 2013. The researchers of these three documents employed decision tree classifiers and made useful explorations and improvements. Document 4: Wang Yuan, Tang Yongming, Wang Baoping. Research on improved gesture recognition algorithms for large gesture sets based on acceleration sensors [J]. Chinese Journal of Sensors and Actuators, 2013(10): 1345-1351; these researchers employed a weighted tree-structured template library and made useful explorations and improvements. Document 5: Wang Wanliang, Yang Jingwei, Jiang Yibo. Gesture recognition based on motion sensors [J]. Chinese Journal of Sensors and Actuators, 2011, 24(12): 1723-1727; these researchers made useful explorations and improvements using hidden Markov models. However, the progress achieved in recognition rate and recognition speed is not yet significant enough, and this remains an important constraint on practical adoption of glove-based gesture recognition.
Vision-based gesture recognition usually requires the user to face the device at all times, with no obstacle blocking the line of sight to the image acquisition device, and the recognition rate may drop significantly in low-light conditions.
Glove-based schemes effectively overcome the above shortcomings. They not only have advantages in smart home appliance control, but also have good application potential in industrial fields such as robotic arm control and unmanned vehicle control, and could even be developed into individual-soldier communication equipment for battlefield use.
Summary of the invention
In view of the above problems, and in order to improve the recognition rate and recognition speed of glove-based gesture recognition schemes, the present invention proposes a gesture recognition method and a human-computer interaction device based on glove-acquired data.
The glove-based gesture recognition human-computer interaction device comprises a data-acquisition glove, a signal conditioner, and a processing module; the processing module connects to external equipment through a serial interface, a USB interface, or the like.
Bending sensors and an inertial measurement unit are arranged on the glove. The processing module comprises an analog-to-digital converter and a main processor, the two integrated on one chip. The conditioner connects the bending sensors to the analog-to-digital converter, while the inertial measurement unit is connected directly to the main processor.
The bending sensors and the inertial measurement unit each acquire raw data. Each bending sensor sends its output resistance to the conditioner, which converts it into a voltage fed to the analog-to-digital converter; the analog-to-digital converter converts the analog voltage into a digital quantity and submits it to the main processor.
The raw data acquired by the inertial measurement unit is transferred directly to the main processor over a communication protocol. The main processor processes the received data, executes the gesture recognition algorithm, and outputs the recognition result; it also receives instructions and performs the corresponding operations.
The gesture recognition method of the present invention based on glove-acquired data proceeds as follows:
Step 1: the user defines a series of gestures, represented by the set G = {G1, G2, G3, …};
Step 2: for an arbitrary action performed by the user, acquire the data output by the bending sensors and the inertial measurement unit of the gesture recognition human-computer interaction device.
The data comprise: the output resistance of each bending sensor, and the acceleration and angular velocity of the X, Y, and Z axes of the inertial measurement unit.
Step 3: the data-length counter in the processing module counts the data output by the inertial measurement unit and judges whether the counter value is below a preset threshold. If so, the acquired data are discarded and recognition ends; otherwise, proceed to Step 4 for subsequent recognition.
The threshold is set to different values for different components;
Step 4: compute the angular-velocity energies of the X, Y, and Z axes respectively and find the axis with the maximum angular-velocity energy.
The angular-velocity energy is defined as
E_{k,av} = Σ_{i=1}^{l} ω_{ki}^2
where E_{k,av} is the angular-velocity energy of axis k (k = X, Y, Z); l is the length of the angular-velocity data output by the inertial measurement unit; and ω_{ki} is the k-axis angular velocity in the i-th frame of data.
The axis with the maximum angular-velocity energy is found as follows:
If the angular-velocity energy of one axis is greater than the sum of the products of the other two axes' angular-velocity energies and their corresponding correction coefficients, that axis is judged to be the maximum angular-velocity-energy axis; if no axis satisfies this condition, it is judged that no maximum angular-velocity-energy axis exists.
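The Step 4 test can be sketched in Python (the helper names and frame layout are my own, not from the patent; the correction coefficients follow the embodiment described later, where only the X-axis comparison is weighted by 2):

```python
def axis_energies(frames):
    """Per-axis energy: sum of squared samples over all frames.

    frames: list of (x, y, z) angular-velocity samples.
    Returns [E_X, E_Y, E_Z].
    """
    return [sum(f[k] ** 2 for f in frames) for k in range(3)]


def max_energy_axis(energies, coeff=(2.0, 1.0, 1.0)):
    """Return the index (0=X, 1=Y, 2=Z) of the maximum-energy axis, or
    None when no axis exceeds the coefficient-weighted sum of the other
    two axes' energies (i.e. no maximum axis exists)."""
    for k in range(3):
        other_sum = sum(energies[j] for j in range(3) if j != k)
        if energies[k] > coeff[k] * other_sum:
            return k
    return None
```

The same pair of functions applies unchanged to the acceleration energies of Step 7, once the gravitational component has been removed from each frame.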
Step 5: for the cases where the maximum angular-velocity-energy axis is the X, Y, or Z axis, extract the feature sequences of the acceleration and/or angular velocity corresponding to the maximum axis.
A feature sequence is either a sign sequence or a differential sign sequence.
For raw angular-velocity data d_{av,1}, d_{av,2}, …, d_{av,l} of length l, choose a zero upper bound ub and a zero lower bound lb, and process the data as follows:
d_i' = 1 if d_{av,i} > ub; d_i' = -1 if d_{av,i} < lb; d_i' = 0 otherwise (i = 1, 2, …, l)
to obtain the sequence d_1', d_2', …, d_l'; adjacent equal terms of this new sequence are then merged to obtain the sign sequence.
The raw angular-velocity data d_{av,1}, d_{av,2}, …, d_{av,l} are also processed as Δ_i = d_{av,i+1} - d_{av,i} (i = 1, 2, …, l-1) to obtain the difference sequence Δ_1, Δ_2, …, Δ_{l-1}; the sign sequence of this difference sequence is then extracted to obtain the differential sign sequence.
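A sketch of the Step 5 feature extraction (function names and the example bounds are mine; the patent does not specify numeric values for ub and lb):

```python
def sign_sequence(data, ub, lb):
    """Map each sample to +1 (above ub), -1 (below lb) or 0, then merge
    adjacent equal terms, yielding the sign sequence of Step 5."""
    signs = [1 if d > ub else (-1 if d < lb else 0) for d in data]
    merged = [signs[0]]
    for s in signs[1:]:
        if s != merged[-1]:
            merged.append(s)
    return merged


def diff_sign_sequence(data, ub=0.0, lb=0.0):
    """Differential sign sequence: the sign sequence of the first
    differences of the raw data."""
    diffs = [data[i + 1] - data[i] for i in range(len(data) - 1)]
    return sign_sequence(diffs, ub, lb)
```

The same extraction applies to the acceleration data of Step 8 with its own bounds ub' and lb'.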
Step 6: match the feature sequences of the acceleration and/or angular velocity corresponding to the maximum angular-velocity-energy axis against the feature sequences of the user-defined gestures, and judge whether the feature sequences extracted for the maximum axis are identical to those of a corresponding gesture. If so, go to Step 11; otherwise, go to Step 7.
If the maximum angular-velocity-energy axis is the X axis, extract the feature sequences of the acceleration and/or angular velocity of at least one axis, match them against the feature sequences of the preset gesture G1, and judge whether the action is gesture G1; if so, go to Step 11, otherwise go to Step 7.
If the maximum angular-velocity-energy axis is the Y axis, extract the feature sequences of the acceleration and/or angular velocity of at least one axis, match them against the feature sequences of the preset gesture G2, and judge whether the action is gesture G2; if so, go to Step 11, otherwise go to Step 7.
If the maximum angular-velocity-energy axis is the Z axis, extract the feature sequences of the acceleration and/or angular velocity of at least one axis, match them against the feature sequences of the preset gesture G3, and judge whether the action is gesture G3; if so, go to Step 11, otherwise go to Step 7.
Step 7: if no maximum angular-velocity-energy axis exists, or the feature sequences of the acceleration and/or angular velocity corresponding to the maximum angular-velocity-energy axis do not match the feature sequence of any user-defined gesture, then compute the acceleration energies of the X, Y, and Z axes respectively and find the axis with the maximum acceleration energy.
The acceleration energy is defined as
E_{k,ac} = Σ_{i=1}^{l} (a_{ki} - a_{kgi})^2
where E_{k,ac} is the acceleration energy of axis k; a_{ki} is the k-axis acceleration in the i-th frame of data; and a_{kgi} is the k-axis gravitational acceleration component in the i-th frame of data.
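Reading the symbol definitions as removing the gravitational component frame by frame before squaring (my interpretation, since the original formula image is absent), the Step 7 energies could be computed as:

```python
def accel_energies(frames, gravity):
    """Per-axis acceleration energy with the gravitational component
    a_kg subtracted frame by frame.

    frames, gravity: equal-length lists of (x, y, z) tuples.
    Returns [E_X, E_Y, E_Z].
    """
    return [
        sum((a[k] - g[k]) ** 2 for a, g in zip(frames, gravity))
        for k in range(3)
    ]
```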
The axis with the maximum acceleration energy is found as follows:
If the acceleration energy of one axis is greater than the sum of the products of the other two axes' acceleration energies and their corresponding correction coefficients, that axis is judged to be the maximum acceleration-energy axis; if no axis satisfies this condition, it is judged that no maximum acceleration-energy axis exists.
Step 8: for the cases where the maximum acceleration-energy axis is the X, Y, or Z axis, extract the feature sequences of the acceleration and/or angular velocity corresponding to the maximum axis.
A feature sequence is either a sign sequence or a differential sign sequence.
For raw acceleration data d_{ac,1}, d_{ac,2}, …, d_{ac,l} of length l, choose a zero upper bound ub' and a zero lower bound lb', and process the data as follows:
d_i'' = 1 if d_{ac,i} > ub'; d_i'' = -1 if d_{ac,i} < lb'; d_i'' = 0 otherwise (i = 1, 2, …, l)
to obtain the sequence d_1'', d_2'', …, d_l''; adjacent equal terms of this new sequence are then merged to obtain the sign sequence.
The raw acceleration data d_{ac,1}, d_{ac,2}, …, d_{ac,l} are also processed as Δ_i' = d_{ac,i+1} - d_{ac,i} (i = 1, 2, …, l-1) to obtain the difference sequence Δ_1', Δ_2', …, Δ_{l-1}'; the sign sequence of this difference sequence is then extracted to obtain the differential sign sequence.
Step 9: match the feature sequences of the acceleration and/or angular velocity corresponding to the maximum acceleration-energy axis against the feature sequences of the user-defined gestures, and judge whether the feature sequences extracted for the maximum axis are identical to those of a corresponding gesture. If so, go to Step 11; otherwise, go to Step 10.
If the maximum acceleration-energy axis is the X axis, extract the feature sequences of the acceleration and/or angular velocity of at least one axis, match them against the feature sequences of the preset gesture G4, and judge whether the action is gesture G4; if so, go to Step 11, otherwise go to Step 10.
If the maximum acceleration-energy axis is the Y axis, extract the feature sequences of the acceleration and/or angular velocity of at least one axis, match them against the feature sequences of gesture G5, and judge whether the action is gesture G5; if so, go to Step 11, otherwise go to Step 10.
If the maximum acceleration-energy axis is the Z axis, extract the feature sequences of the acceleration and/or angular velocity of at least one axis, match them against the feature sequences of gesture G6, and judge whether the action is gesture G6; if so, go to Step 11, otherwise go to Step 10.
Step 10: if no maximum acceleration-energy axis exists, or the action has still not been judged to be any gesture, then combine the extracted feature sequences of at least one acceleration and/or angular velocity corresponding to the maximum axes in turn, and continue matching against the feature sequences of the subsequently set gestures G7, G8, ….
Step 11: when an extracted feature sequence matches the feature sequence of a user-defined gesture, judge the bending state of each finger.
First, it is assumed that for the same finger in the same stretched or bent state, the output of the bending sensor follows a normal distribution, i.e.
F_{p,state} ~ N(μ_{p,state}, σ_{p,state}^2)
where F_{p,state} is the bending-sensor output, p denotes the finger number, state denotes the finger state, μ_{p,state} is the population mean, and σ_{p,state}^2 is the population variance; the population mean μ_{p,state} and population variance σ_{p,state}^2 are preset.
Then, if the bending-sensor output of the q-th finger satisfies F_q ∈ (μ_{q,stretch} - 3σ_{q,stretch}, μ_{q,stretch} + 3σ_{q,stretch}), the finger is judged to be in the stretched state; similarly, if it satisfies F_q ∈ (μ_{q,fist} - 3σ_{q,fist}, μ_{q,fist} + 3σ_{q,fist}), the finger is judged to be in the bent state.
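The 3-sigma test of Step 11 can be sketched as follows (the function name and state labels are illustrative, not from the patent):

```python
def finger_state(f, mu_stretch, sigma_stretch, mu_fist, sigma_fist):
    """Classify one bending-sensor reading using the 3-sigma intervals
    around the preset stretched-state and bent-state means."""
    if mu_stretch - 3 * sigma_stretch < f < mu_stretch + 3 * sigma_stretch:
        return "stretched"
    if mu_fist - 3 * sigma_fist < f < mu_fist + 3 * sigma_fist:
        return "bent"
    return "unknown"
```

A reading falling in neither interval is not covered by the patent text; returning a third label here is one possible design choice.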
Step 12: combine the gesture type obtained above with the bending state of each finger to give the recognition result of the action made by the user.
The advantages of the invention are:
1) The glove-based gesture recognition human-computer interaction device acquires, through the bending sensors and the inertial measurement unit, data related both to the degree of finger bending and to the motion state of the hand. This increases the types of data available for gesture recognition, allows gestures to be distinguished more finely, and thereby increases the number and variety of recognizable gestures and gesture combinations.
2) The glove-based gesture recognition method can recognize a fairly large number and variety of predefined gestures and gesture combinations; it features a small computational load, fast recognition speed, and a natural, intuitive decision process.
Description of the drawings
Fig. 1 is a schematic diagram of the glove-based gesture recognition human-computer interaction device of the present invention;
Fig. 2 is a flow chart of the glove-based gesture recognition method of the present invention.
Specific embodiments
The specific implementation of the present invention is described in detail below with reference to the accompanying drawings.
The present invention proposes a gesture recognition method and a human-computer interaction device based on glove-acquired data. As shown in Fig. 1, the gesture recognition human-computer interaction device comprises a data-acquisition glove, a signal conditioner, and a processing module; the processing module connects to external equipment through a serial interface, a USB interface, or the like.
Bending sensors and an inertial measurement unit are arranged on the glove. The processing module comprises an analog-to-digital converter and a main processor, the two integrated on one chip. The conditioner connects the bending sensors to the analog-to-digital converter, while the inertial measurement unit is connected directly to the main processor.
The bending sensors and the inertial measurement unit each acquire raw data. Each bending sensor sends its output resistance to the conditioner, which converts it into a voltage fed to the analog-to-digital converter; the analog-to-digital converter converts the analog voltage into a digital quantity and submits it to the main processor.
The raw data acquired by the inertial measurement unit is transferred directly to the main processor. The main processor processes the submitted data, executes the gesture recognition algorithm, and outputs the recognition result; it also receives instructions and performs the corresponding operations.
The gesture recognition method of the present invention based on glove-acquired data employs a decision tree classifier built around feature quantities and feature sequences, judging level by level, as shown in Fig. 2. The steps are as follows:
Step 1: the user defines a series of gestures, represented by the set G = {G1, G2, G3, …};
Step 2: for an arbitrary action performed by the user, acquire the data output by the bending sensors and the inertial measurement unit of the gesture recognition human-computer interaction device.
The data comprise: the output resistance of each bending sensor, and the acceleration and angular velocity of the X, Y, and Z axes of the inertial measurement unit.
In the present embodiment, the output data of the inertial measurement unit plays the major role, while the output data of the bending sensors plays a supporting role. For example, relying only on the output of the inertial measurement unit, an arm-swing gesture can be recognized; after the bending-sensor data is introduced, the arm-swing gesture can be subdivided into swinging the arm while signing the number 1, swinging the arm while signing the number 2, and so on. From a permutation-and-combination perspective, the inertial measurement unit and the bending sensors complement each other, significantly increasing the number of recognizable gestures.
Step 3: the data that the data length count device in processing module exports the Inertial Measurement Unit of acquisition are united Meter, judges whether the value of counter is less than preset fixed value;If it is, by the rejection of data of statistics, terminate identification; Otherwise, four are entered step to continue subsequently to identify.
Fixed value is according to the different set different value of component;
Step 4: compute the angular-velocity energies of the X, Y, and Z axes of the inertial measurement unit respectively and find the axis with the maximum angular-velocity energy.
The angular-velocity energy is defined as
E_{k,av} = Σ_{i=1}^{l} ω_{ki}^2
where E_{k,av} is the angular-velocity energy of axis k (k = X, Y, Z), the subscript av denoting angular velocity; l is the length of the angular-velocity data output by the inertial measurement unit; and ω_{ki} is the k-axis angular velocity in the i-th frame within the data length.
The axis with the maximum angular-velocity energy is found as follows:
If the angular-velocity energy of one axis is greater than the sum of the products of the other two axes' angular-velocity energies and their corresponding correction coefficients, that axis is judged to be the maximum angular-velocity-energy axis.
Specifically: if the X-axis angular-velocity energy is greater than twice the sum of the Y-axis and Z-axis angular-velocity energies, the X axis is judged to be the maximum angular-velocity-energy axis; if the Y-axis angular-velocity energy is greater than the sum of the X-axis and Z-axis angular-velocity energies, the Y axis is judged to be the maximum angular-velocity-energy axis; if the Z-axis angular-velocity energy is greater than the sum of the X-axis and Y-axis angular-velocity energies, the Z axis is judged to be the maximum angular-velocity-energy axis;
If none of the above conditions is satisfied, it is judged that no maximum angular-velocity-energy axis exists.
Step 5: when the maximum angular-velocity-energy axis is the X, Y, or Z axis, extract the feature sequences of the acceleration and/or angular velocity corresponding to that axis.
A feature sequence is either a sign sequence or a differential sign sequence.
For raw angular-velocity data d_{av,1}, d_{av,2}, …, d_{av,l} of length l, choose a zero upper bound ub and a zero lower bound lb, and process the data as follows:
d_i' = 1 if d_{av,i} > ub; d_i' = -1 if d_{av,i} < lb; d_i' = 0 otherwise (i = 1, 2, …, l)
to obtain the sequence d_1', d_2', …, d_l'; adjacent equal terms of this new sequence are then merged to obtain the sign sequence.
The raw angular-velocity data d_{av,1}, d_{av,2}, …, d_{av,l} are also processed as Δ_i = d_{av,i+1} - d_{av,i} (i = 1, 2, …, l-1) to obtain the difference sequence Δ_1, Δ_2, …, Δ_{l-1}; the sign sequence of this difference sequence is then extracted to obtain the differential sign sequence.
Step 6: match the feature sequences of the acceleration and/or angular velocity corresponding to the maximum angular-velocity-energy axis against the feature sequences of the user-defined gestures, and judge whether the feature sequences extracted for the maximum axis are identical to those of a corresponding gesture. If so, go to Step 11; otherwise, go to Step 7.
Matching feature sequences refers to the process of comparing whether two feature sequences are identical; the feature sequence of each gesture is preset and does not change with the raw data collected during gesture recognition.
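Since matching is defined here as an exact comparison of two feature sequences, a minimal template lookup could look like this (the dictionary layout and function name are my own assumptions):

```python
def match_gesture(extracted, templates):
    """Return the name of the first preset gesture whose feature
    sequence equals the extracted one, or None if nothing matches.

    templates: dict mapping gesture name -> preset feature sequence.
    """
    for name, template in templates.items():
        if list(extracted) == list(template):
            return name
    return None
```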
If the maximum angular-velocity-energy axis is the X axis, extract the feature sequences of the acceleration and/or angular velocity of one or more axes for the action, match them against the feature sequences of gesture G1, and judge whether the action is gesture G1.
If the maximum angular-velocity-energy axis is the Y axis, extract the feature sequences of the acceleration and/or angular velocity of one or more axes for the action, match them against the feature sequences of gesture G2, and judge whether the action is gesture G2.
If the maximum angular-velocity-energy axis is the Z axis, extract the feature sequences of the acceleration and/or angular velocity of one or more axes for the action, match them against the feature sequences of gesture G3, and judge whether the action is gesture G3.
Step 7: if no maximum angular-velocity-energy axis exists, or the feature sequences of the acceleration and/or angular velocity corresponding to the maximum angular-velocity-energy axis do not match the feature sequence of any user-defined gesture, compute the acceleration energies of the X, Y, and Z axes respectively and find the axis with the maximum acceleration energy.
The acceleration energy is defined as
E_{k,ac} = Σ_{i=1}^{l} (a_{ki} - a_{kgi})^2
where E_{k,ac} is the acceleration energy of axis k (k = X, Y, Z), the subscript ac denoting acceleration; a_{ki} is the k-axis acceleration in the i-th frame of data; and a_{kgi} is the k-axis gravitational acceleration component in the i-th frame of data.
The axis with the maximum acceleration energy is found as follows:
If the acceleration energy of one axis is greater than the sum of the products of the other two axes' acceleration energies and their corresponding correction coefficients, that axis is judged to be the maximum acceleration-energy axis; if no axis satisfies this condition, it is judged that no maximum acceleration-energy axis exists.
Specifically: if the X-axis acceleration energy is greater than twice the sum of the Y-axis and Z-axis acceleration energies, the X axis is judged to be the maximum acceleration-energy axis; if the Y-axis acceleration energy is greater than the sum of the X-axis and Z-axis acceleration energies, the Y axis is judged to be the maximum acceleration-energy axis; if the Z-axis acceleration energy is greater than the sum of the X-axis and Y-axis acceleration energies, the Z axis is judged to be the maximum acceleration-energy axis.
Step 8: for the cases where the maximum acceleration-energy axis is the X, Y, or Z axis, extract the feature sequences of the acceleration and/or angular velocity corresponding to the maximum axis.
A feature sequence is either a sign sequence or a differential sign sequence.
For raw acceleration data d_{ac,1}, d_{ac,2}, …, d_{ac,l} of length l, choose a zero upper bound ub' and a zero lower bound lb', and process the data as follows:
d_i'' = 1 if d_{ac,i} > ub'; d_i'' = -1 if d_{ac,i} < lb'; d_i'' = 0 otherwise (i = 1, 2, …, l)
to obtain the sequence d_1'', d_2'', …, d_l''; adjacent equal terms of this new sequence are then merged to obtain the sign sequence.
The raw acceleration data d_{ac,1}, d_{ac,2}, …, d_{ac,l} are also processed as Δ_i' = d_{ac,i+1} - d_{ac,i} (i = 1, 2, …, l-1) to obtain the difference sequence Δ_1', Δ_2', …, Δ_{l-1}'; the sign sequence of this difference sequence is then extracted to obtain the differential sign sequence.
Step 9: match the feature sequences obtained for the maximum acceleration-energy axis against the feature sequences of the user-defined gestures, and judge whether the action from which the maximum-axis feature sequences were extracted is identical to a corresponding gesture. If so, go to Step 11; otherwise, go to Step 10.
Specifically:
If the maximum acceleration-energy axis is the X axis, extract the feature sequences of the acceleration and/or angular velocity of one or more axes for the action, match them against the feature sequences of gesture G4, and judge whether the action is gesture G4.
If the maximum acceleration-energy axis is the Y axis, extract the feature sequences of the acceleration and/or angular velocity of one or more axes for the action, match them against the feature sequences of gesture G5, and judge whether the action is gesture G5.
If the maximum acceleration-energy axis is the Z axis, extract the feature sequences of the acceleration and/or angular velocity of one or more axes for the action, match them against the feature sequences of gesture G6, and judge whether the action is gesture G6.
Step 10: if no maximum acceleration-energy axis exists, or the action has still not been judged to be any gesture, combine the extracted feature sequences of at least one axis's acceleration and/or angular velocity corresponding to the maximum axes in turn, and continue matching against the feature sequences of the subsequently set gestures G7, G8, ….
Step 11: when the feature sequence extracted for some maximum axis matches the feature sequence of a user-defined gesture, judge the bending state of each finger.
For the same finger in the same state (stretched or bent), the bending-sensor output is assumed to follow a normal distribution, i.e.
F_{p,state} ~ N(μ_{p,state}, σ_{p,state}^2)
where F_{p,state} is the bending-sensor output, p denotes the finger number, state denotes the finger state, μ_{p,state} is the population mean, and σ_{p,state}^2 is the population variance; the population mean μ_{p,state} and population variance σ_{p,state}^2 are preset.
Then, if the bending-sensor output of the q-th finger satisfies F_q ∈ (μ_{q,stretch} - 3σ_{q,stretch}, μ_{q,stretch} + 3σ_{q,stretch}), the finger is judged to be in the stretched state; if it satisfies F_q ∈ (μ_{q,fist} - 3σ_{q,fist}, μ_{q,fist} + 3σ_{q,fist}), the finger is judged to be in the bent state.
Step 12: combine the gesture type with the bending state of each finger to give the recognition result of the action made by the user.
Embodiment:
The bending sensors are 2.2-inch Flex Sensors; the inertial measurement unit is an MPU6050 with the sampling frequency set to 100 Hz; the processing module is a chipKIT Max32 development board built around the Microchip PIC32MX795F512L single-chip microcomputer; the conditioner consists of a fixed resistor with a resistance of about 68 kΩ and an LM324 integrated operational amplifier, where the fixed resistor provides voltage division and the inverting input of the operational amplifier is connected to its output so that it acts as a voltage follower.
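For the conditioner, the voltage-divider arithmetic can be sketched as follows (the orientation, with the flex sensor on the ground side of the 68 kΩ fixed resistor, is my assumption; the patent does not state it):

```python
def divider_output(v_supply, r_fixed, r_flex):
    """Output voltage of a resistive divider: fixed resistor on the
    supply side, flex sensor to ground. The LM324 voltage follower
    buffers this node without changing its voltage."""
    return v_supply * r_flex / (r_fixed + r_flex)
```

For example, with a 3.3 V supply and the flex sensor at its nominal unbent resistance of about 68 kΩ, the divider sits near mid-supply, giving the ADC the widest swing in both bending directions.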
The MPU6050 inertial measurement unit provides a static-event feature. The raw acceleration data it collects are passed through a built-in high-pass filter with a cutoff frequency of 5 Hz (set by program) and fed to the built-in motion-processing unit. If all three high-pass-filtered axis accelerations stay below a programmable static-event acceleration threshold (e.g. 0.01 g, where g is the acceleration of gravity) for longer than a programmable static-event time threshold (e.g. 128 ms), the motion-processing unit considers the inertial measurement unit to be at rest. Whenever the unit changes from a non-static state to the static state, or conversely from the static state to a non-static state, the motion-processing unit is said to have detected a static event.
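The stillness test the MPU6050's motion-processing unit performs internally can be sketched on the host side as follows, assuming already high-pass-filtered samples, the 0.01 g threshold, and a 128 ms window approximated as 13 frames at the 100 Hz sample rate:

```c
#include <math.h>
#include <stdbool.h>

#define ACCEL_THRESHOLD_G 0.01  /* example threshold from the text */
#define STILL_FRAMES      13    /* ~128 ms at 100 Hz (illustrative) */

/* Stateful stillness detector: reports true once all three axes have
 * stayed below the threshold for STILL_FRAMES consecutive frames. */
typedef struct { int quiet_frames; } StillDetector;

bool still_update(StillDetector *d, double ax, double ay, double az)
{
    if (fabs(ax) < ACCEL_THRESHOLD_G &&
        fabs(ay) < ACCEL_THRESHOLD_G &&
        fabs(az) < ACCEL_THRESHOLD_G)
        d->quiet_frames++;
    else
        d->quiet_frames = 0;  /* any motion restarts the quiet window */
    return d->quiet_frames >= STILL_FRAMES;
}
```

A static event in the patent's sense is then a transition of this boolean from false to true or from true to false.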
The MPU6050 also provides a data-ready interrupt feature: whenever it finishes sampling the three-axis acceleration and three-axis angular speed and has stored the sampled values in the corresponding registers, it can issue an interrupt request on its INT pin, which manifests as a change of the INT pin level.
The processing module uses the static-event feature and the data-ready interrupt feature described above to acquire the data output by the bending sensors and the inertial measurement unit;
Specifically: the data-ready interrupt of the MPU6050 is enabled, and the processing module monitors its INT pin. When the pin level changes, the MPU6050 is considered to have finished acquiring one frame of data, and the processing module reads the MPU6050 registers related to the static-event feature to judge whether the MPU6050 is at rest. If the MPU6050 is not at rest, the processing module reads the MPU6050 data registers to obtain the three-axis acceleration and three-axis angular speed data and stores them; at the same time it starts the analog-to-digital converter, which converts the analog voltage output by the conditioner to a digital quantity that is also stored in the processing module. If the MPU6050 is at rest and the processing module already holds three-axis acceleration and three-axis angular speed data, an action is considered to have ended and data acquisition stops; if the MPU6050 is at rest but the processing module does not yet hold such data, nothing is done.
Before each new acquisition of the data output by the bending sensors and the inertial measurement unit, the data stored in the processing module must first be cleared. Using the static-event feature provided by the inertial measurement unit to determine the start and end of an action reduces the computational burden on the processing module and speeds up gesture recognition.
Meanwhile processing module is provided with data length count device, three axis read when changing per secondary response INT pin levels Acceleration and three axis angular rate data are known as a frame data, often read and store a frame data, and data length count device increases one certainly. After stopping data collection, if data length count device value is less than 30, that is, 30 frame of data deficiencies for reading and storing, then not to having stored up The data deposited carry out subsequent processing, directly give up, and start to acquire next time.Acquisition bending sensor and Inertial Measurement Unit every time Before the data of output, it is necessary to first reset data length count device.
For the data output by the inertial measurement unit, the angular-speed energies of the X, Y and Z axes are computed separately to find the angular-speed-energy maximum axis:
If the angular-speed-energy maximum axis is the X axis, the symbol sequence of the action's X-axis angular speed is extracted; if that sequence is "-1,1", the action is judged to be a wrist-shake gesture. If the maximum axis is the Y axis, the symbol sequence of the Y-axis angular speed is extracted; if it is "1" or "-1", the action is judged to be a flip gesture. If the maximum axis is the Z axis, the symbol sequence of the Z-axis angular speed is extracted; if it matches any of "1", "-1", "1,-1" or "-1,1", the action is judged to be an arm-wave gesture.
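The symbol-sequence extraction these rules rely on (threshold each sample against a dead zone, then merge adjacent equal symbols) might look like this; the dead-zone bounds UB/LB are hypothetical, since the method leaves the zero upper bound ub and lower bound lb to be chosen:

```c
#include <stddef.h>

/* Hypothetical dead-zone bounds for the zero symbol. */
#define UB  20.0
#define LB -20.0

/* Map each sample to a symbol (1 above UB, -1 below LB, 0 in between),
 * then merge runs of adjacent equal symbols; returns the merged length.
 * The out buffer must hold at least n ints. */
size_t symbol_sequence(const double *data, size_t n, int *out)
{
    size_t m = 0;
    for (size_t i = 0; i < n; i++) {
        int s = (data[i] > UB) ? 1 : (data[i] < LB) ? -1 : 0;
        if (m == 0 || out[m - 1] != s)  /* merge adjacent equal items */
            out[m++] = s;
    }
    return m;
}
```

A wrist shake whose X-axis angular speed swings strongly negative then positive would thus yield the merged sequence {-1, 1}.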
If no angular-speed-energy maximum axis exists, or the characteristic sequence of the acceleration and/or angular speed corresponding to that axis fails to match any given gesture characteristic sequence, the acceleration energies of the X, Y and Z axes are computed separately to judge whether the Z axis is the acceleration-energy maximum axis.
If the Z-axis acceleration energy exceeds the sum of the X-axis and Y-axis acceleration energies, the Z axis is judged to be the acceleration-energy maximum axis; otherwise no acceleration-energy maximum axis is considered to exist. In this embodiment, because none of the predefined gestures takes the X or Y axis as its acceleration-energy maximum axis, only the Z axis needs to be tested.
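A sketch of this energy test, with the axis energy taken as the sum of squared samples and correction coefficients of 1 assumed for the X and Y terms:

```c
#include <stddef.h>

/* Energy of one axis: sum of squared samples over the capture. */
double axis_energy(const double *samples, size_t n)
{
    double e = 0.0;
    for (size_t i = 0; i < n; i++)
        e += samples[i] * samples[i];
    return e;
}

/* Z is the acceleration-energy maximum axis only when its energy exceeds
 * the sum of the X and Y energies; otherwise, in this embodiment, no
 * maximum axis exists. */
int z_is_energy_max(double ex, double ey, double ez)
{
    return ez > ex + ey;
}
```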
If the acceleration-energy maximum axis is the Z axis, the symbol sequence of the action's Z-axis acceleration is extracted and the hand posture is taken into account: with the palm facing up, a sequence of "-1" or "-1,1" is judged a move-up gesture, and "1" or "1,-1" a move-down gesture; with the palm facing down, "-1" or "-1,1" is judged a move-down gesture, and "1" or "1,-1" a move-up gesture; with the palm facing left, "-1" or "-1,1" is judged a move-left gesture, and "1" or "1,-1" a move-right gesture; with the palm facing right, "-1" or "-1,1" is judged a move-right gesture, and "1" or "1,-1" a move-left gesture.
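These eight palm/sign-sequence cases reduce to a small decision table. A sketch follows; the enum names and the string encoding of the merged symbol sequence are our own:

```c
#include <string.h>

typedef enum { PALM_UP, PALM_DOWN, PALM_LEFT, PALM_RIGHT } Palm;
typedef enum { G_NONE, G_UP, G_DOWN, G_LEFT, G_RIGHT } Gesture;

/* seq is the merged Z-axis acceleration symbol sequence, e.g. "-1" or
 * "-1,1".  The same sequence maps to opposite gestures depending on
 * the palm orientation, as described in the text. */
Gesture z_gesture(Palm palm, const char *seq)
{
    int neg_first = (strcmp(seq, "-1") == 0 || strcmp(seq, "-1,1") == 0);
    int pos_first = (strcmp(seq, "1")  == 0 || strcmp(seq, "1,-1") == 0);
    if (!neg_first && !pos_first) return G_NONE;
    switch (palm) {
    case PALM_UP:    return neg_first ? G_UP    : G_DOWN;
    case PALM_DOWN:  return neg_first ? G_DOWN  : G_UP;
    case PALM_LEFT:  return neg_first ? G_LEFT  : G_RIGHT;
    case PALM_RIGHT: return neg_first ? G_RIGHT : G_LEFT;
    }
    return G_NONE;
}
```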
If no acceleration-energy maximum axis exists, or the action's characteristic sequence fails to match any given sequence, the symbol sequences of the action's X-axis and Z-axis accelerations are extracted and judged jointly: if the X-axis sequence is "-1,1,-1,1" or "1,-1,1,-1" and the Z-axis sequence is "-1,1,-1", the action is judged to be a circle-drawing gesture; if the X-axis sequence is "-1,1,-1,1" or "1,-1,1,-1" and the Z-axis sequence is "-1,1,0,1,-1", the action is judged to be a triangle-drawing gesture.
Once the action has been judged to be a particular gesture, the bending state of each finger is judged.
Specifically, with the total voltage drop across the conditioner's divider circuit and the comparison voltage of the 10-bit analog-to-digital converter both at 5 V, the converter outputs digital values in the range 0-1023. After the analog output of each bending sensor has been conditioned, the digital value produced by the converter characterizes the bending degree of the corresponding finger: if the digital value is greater than or equal to a critical value, the finger is judged to be extended; if it is below the critical value, the finger is judged to be bent. More specifically, the critical value is 473 for the thumb, 502 for the index finger, 529 for the middle finger, 550 for the ring finger and 640 for the little finger.
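The fixed-threshold variant used in this embodiment is a one-line comparison per finger:

```c
/* Decision thresholds from the embodiment: thumb 473, index 502,
 * middle 529, ring 550, little 640 (10-bit ADC values, 0-1023). */
static const int BEND_THRESHOLD[5] = { 473, 502, 529, 550, 640 };

/* finger: 0 = thumb ... 4 = little finger.
 * Returns 1 (extended) when the ADC value reaches the threshold,
 * 0 (bent) otherwise. */
int finger_extended(int finger, int adc_value)
{
    return adc_value >= BEND_THRESHOLD[finger];
}
```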
Finally, the judged gesture type and the bending state of each finger are sent to a computer in an agreed format via a universal asynchronous receiver-transmitter, USB, Wi-Fi, Bluetooth or the like.
Specifically, each gesture to be recognized is assigned a number, and the judged gesture type is represented by one byte of numeric data. The bending states of the fingers are carried in the low five bits of a second byte: a bit set to 1 means the corresponding finger is bent, and a bit set to 0 means it is extended.
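Packing the two-byte result can be sketched as below; the bit-to-finger assignment within the low five bits is an assumption, since the text does not specify the order:

```c
#include <stdint.h>

/* Pack the recognition result: one byte for the gesture number, one
 * byte whose low five bits flag bent fingers (bit set = bent, bit
 * clear = extended).  bent[0..4] = thumb..little (assumed order). */
void pack_result(uint8_t gesture_id, const int bent[5], uint8_t out[2])
{
    out[0] = gesture_id;
    out[1] = 0;
    for (int i = 0; i < 5; i++)
        if (bent[i])
            out[1] |= (uint8_t)(1u << i);
}
```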

Claims (8)

1. A gesture-recognition human-computer interaction device based on glove acquisition, characterized by comprising an acquisition glove, a conditioner and a processing module; the processing module is connected to external equipment through a serial interface, a USB interface or the like;
bending sensors and an inertial measurement unit are provided on the acquisition glove; the processing module comprises an analog-to-digital converter and a main processor, the two being integrated on one chip; the conditioner is connected to both the bending sensors and the analog-to-digital converter; the inertial measurement unit is connected directly to the main processor;
the bending sensors and the inertial measurement unit each acquire raw data; the bending sensors output their resistance to the conditioner, which converts it to a voltage supplied to the analog-to-digital converter; the analog-to-digital converter converts this analog voltage to a digital quantity and delivers it to the main processor;
the raw data acquired by the inertial measurement unit are transferred directly to the main processor through a communication protocol; the main processor processes the received data, executes the gesture-recognition algorithm and outputs the recognition result, and at the same time receives instructions and performs the corresponding operations.
2. A gesture-recognition method using the glove-acquisition gesture-recognition human-computer interaction device of claim 1, characterized by the following steps:
Step 1: the user defines a series of gestures, denoted by the set G = {G1, G2, G3, ...};
Step 2: for an action the user performs at will, acquire the data output by the bending sensors and the inertial measurement unit of the gesture-recognition human-computer interaction device;
Step 3: the data-length counter in the processing module counts the data output by the inertial measurement unit; judge whether the counter value is less than a preset fixed value; if so, discard the acquired data and end the recognition; otherwise, proceed to step 4 to continue the recognition;
Step 4: compute the angular-speed energies of the X, Y and Z axes separately and find the angular-speed-energy maximum axis;
the angular-speed energy is defined as

Ek,av = Σi=1..l (ωki)²

where Ek,av is the k-axis angular-speed energy (k = X, Y, Z); l is the length of the angular-speed data output by the inertial measurement unit; ωki is the k-axis angular speed in the i-th frame of data;
the angular-speed-energy maximum axis is found as follows:
if the angular-speed energy of one axis exceeds the sum of the products of the angular-speed energies of the other two axes and their corresponding correction coefficients, that axis is judged to be the angular-speed-energy maximum axis; if no axis satisfies this condition, no angular-speed-energy maximum axis is considered to exist;
Step 5: for the cases where the angular-speed-energy maximum axis is the X, Y or Z axis respectively, extract the characteristic sequence of the acceleration and/or angular speed corresponding to the maximum axis;
Step 6: match the characteristic sequence of the acceleration and/or angular speed corresponding to the angular-speed-energy maximum axis against the characteristic sequences of the user-defined gestures; judge whether the characteristic sequence extracted for the maximum axis is identical to the characteristic sequence of the corresponding gesture; if so, go to step 11; otherwise, go to step 7;
Step 7: if no angular-speed-energy maximum axis exists, or the characteristic sequence of the acceleration and/or angular speed corresponding to the angular-speed-energy maximum axis matches none of the characteristic sequences of the user-defined gestures, compute the acceleration energies of the X, Y and Z axes separately and find the acceleration-energy maximum axis;
the acceleration energy is defined as

Ek,ac = Σi=1..l (aki − akgi)²

where Ek,ac is the k-axis acceleration energy; aki is the k-axis acceleration in the i-th frame of data; akgi is the k-axis gravitational acceleration component in the i-th frame of data;
the acceleration-energy maximum axis is found as follows:
if the acceleration energy of one axis exceeds the sum of the products of the acceleration energies of the other two axes and their corresponding correction coefficients, that axis is judged to be the acceleration-energy maximum axis; if no axis satisfies this condition, no acceleration-energy maximum axis is considered to exist;
Step 8: for the cases where the acceleration-energy maximum axis is the X, Y or Z axis respectively, extract the characteristic sequence of the acceleration and/or angular speed corresponding to the maximum axis;
Step 9: match the characteristic sequence of the acceleration and/or angular speed corresponding to the acceleration-energy maximum axis against the characteristic sequences of the user-defined gestures; judge whether the characteristic sequence extracted for the maximum axis is identical to the characteristic sequence of the corresponding gesture; if so, go to step 11; otherwise, go to step 10;
Step 10: if no acceleration-energy maximum axis exists, or the action has not yet been judged to be any gesture, extract the characteristic sequences of at least one acceleration and/or angular speed corresponding to the maximum axis, combine them in turn, and continue matching against the characteristic sequences of the subsequently defined gestures G7, G8, ...;
Step 11: when an extracted characteristic sequence matches the characteristic sequence of a user-defined gesture, judge the bending state of each finger;
Step 12: combine the gesture type obtained by the judgment with the bending state of each finger, and output the recognition result for the action the user made.
3. The glove-acquisition gesture-recognition method of claim 2, characterized in that the data in step 2 comprise: the output resistance of the bending sensors, and the accelerations and angular speeds of the X, Y and Z axes of the inertial measurement unit.
4. The glove-acquisition gesture-recognition method of claim 2, characterized in that the characteristic sequences in step 5 are divided into symbol sequences and differential symbol sequences;
for raw angular-speed data dav,1, dav,2, …, dav,l of length l, a zero upper bound ub and a zero lower bound lb are chosen and the data are processed as follows:

di' = 1 if dav,i > ub; di' = -1 if dav,i < lb; di' = 0 otherwise (i = 1, 2, …, l)

giving the sequence d1', d2', …, dl'; adjacent equal items of this sequence are then merged to obtain the symbol sequence;
the raw angular-speed data dav,1, dav,2, …, dav,l are also processed as Δi = dav,i+1 − dav,i (i = 1, 2, …, l−1) to obtain the difference sequence Δ1, Δ2, …, Δl−1, whose symbol sequence is then extracted to obtain the differential symbol sequence.
5. The glove-acquisition gesture-recognition method of claim 2, characterized in that, in step 6:
if the angular-speed-energy maximum axis is the X axis, the characteristic sequence of the acceleration and/or angular speed of at least one axis is extracted and matched against the characteristic sequence of the preset gesture G1 to judge whether the action is gesture G1;
if the angular-speed-energy maximum axis is the Y axis, the characteristic sequence of the acceleration and/or angular speed of at least one axis is extracted and matched against the characteristic sequence of the preset gesture G2 to judge whether the action is gesture G2;
if the angular-speed-energy maximum axis is the Z axis, the characteristic sequence of the acceleration and/or angular speed of at least one axis is extracted and matched against the characteristic sequence of the preset gesture G3 to judge whether the action is gesture G3.
6. The glove-acquisition gesture-recognition method of claim 2, characterized in that the characteristic sequences in step 8 are divided into symbol sequences and differential symbol sequences;
for raw acceleration data dac,1, dac,2, …, dac,l of length l, a zero upper bound ub' and a zero lower bound lb' are chosen and the data are processed as follows:

di'' = 1 if dac,i > ub'; di'' = -1 if dac,i < lb'; di'' = 0 otherwise (i = 1, 2, …, l)

giving the sequence d1'', d2'', …, dl''; adjacent equal items of this sequence are then merged to obtain the symbol sequence;
the raw acceleration data dac,1, dac,2, …, dac,l are also processed as Δi' = dac,i+1 − dac,i (i = 1, 2, …, l−1) to obtain the difference sequence Δ1', Δ2', …, Δl−1', whose symbol sequence is then extracted to obtain the differential symbol sequence.
7. The glove-acquisition gesture-recognition method of claim 2, characterized in that, in step 9:
if the acceleration-energy maximum axis is the X axis, the characteristic sequence of the acceleration and/or angular speed of at least one axis is extracted and matched against the characteristic sequence of the preset gesture G4 to judge whether the action is gesture G4;
if the acceleration-energy maximum axis is the Y axis, the characteristic sequence of the acceleration and/or angular speed of at least one axis is extracted and matched against the characteristic sequence of gesture G5 to judge whether the action is gesture G5;
if the acceleration-energy maximum axis is the Z axis, the characteristic sequence of the acceleration and/or angular speed of at least one axis is extracted and matched against the characteristic sequence of gesture G6 to judge whether the action is gesture G6.
8. The glove-acquisition gesture-recognition method of claim 2, characterized in that step 11 is specifically:
first, it is assumed that when a given finger is held in a given extended or bent state, the output of its bending sensor follows a normal distribution, i.e.
Fp,state ~ N(μp,state, σ²p,state)
where Fp,state is the bending-sensor output, p is the finger number, state is the finger state, μp,state is the population mean and σ²p,state is the population variance, both of which are preset;
then, if the bending-sensor output of the q-th finger satisfies Fq ∈ (μq,stretch − 3σq,stretch, μq,stretch + 3σq,stretch), the finger is judged to be extended; if it satisfies Fq ∈ (μq,fist − 3σq,fist, μq,fist + 3σq,fist), the finger is judged to be bent.
CN201711432163.7A 2017-12-26 2017-12-26 Gesture recognition method based on glove acquisition and man-machine interaction device Active CN108268132B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711432163.7A CN108268132B (en) 2017-12-26 2017-12-26 Gesture recognition method based on glove acquisition and man-machine interaction device


Publications (2)

Publication Number Publication Date
CN108268132A true CN108268132A (en) 2018-07-10
CN108268132B CN108268132B (en) 2020-03-03

Family

ID=62772550

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711432163.7A Active CN108268132B (en) 2017-12-26 2017-12-26 Gesture recognition method based on glove acquisition and man-machine interaction device

Country Status (1)

Country Link
CN (1) CN108268132B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109325464A (en) * 2018-10-16 2019-02-12 上海翎腾智能科技有限公司 A kind of finger point reading character recognition method and interpretation method based on artificial intelligence
CN110414473A (en) * 2019-08-06 2019-11-05 青海师范大学 A kind of data glove Gesture Recognition Algorithm based on mathematical statistics
CN111124126A (en) * 2019-12-25 2020-05-08 北京航空航天大学 Unmanned aerial vehicle gesture control method
CN113050797A (en) * 2021-03-26 2021-06-29 深圳市华杰智通科技有限公司 Method for realizing gesture recognition through millimeter wave radar
CN113238661A (en) * 2021-07-09 2021-08-10 呜啦啦(广州)科技有限公司 Data processing method and system for data glove, electronic equipment and medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN204740561U (en) * 2015-07-13 2015-11-04 刘述亮 Data glove
CN105929940A (en) * 2016-04-13 2016-09-07 哈尔滨工业大学深圳研究生院 Rapid three-dimensional dynamic gesture recognition method and system based on character value subdivision method
CN106445130A (en) * 2016-09-19 2017-02-22 武汉元生创新科技有限公司 Motion capture glove for gesture recognition and calibration method thereof



Also Published As

Publication number Publication date
CN108268132B (en) 2020-03-03

Similar Documents

Publication Publication Date Title
CN108268132A (en) A kind of gesture identification method and human-computer interaction device based on gloves acquisition
Preis et al. Gait recognition with kinect
CN103676604B (en) Watch and running method thereof
CN104850773B (en) Method for authenticating user identity for intelligent mobile terminal
CN109597485B (en) Gesture interaction system based on double-fingered-area features and working method thereof
WO2018040757A1 (en) Wearable device and method of using same to monitor motion state
JP6064280B2 (en) System and method for recognizing gestures
US20080036737A1 (en) Arm Skeleton for Capturing Arm Position and Movement
US20100023314A1 (en) ASL Glove with 3-Axis Accelerometers
CN105929940A (en) Rapid three-dimensional dynamic gesture recognition method and system based on character value subdivision method
CN107678550A (en) A kind of sign language gesture recognition system based on data glove
CN104731307B (en) A kind of body-sensing action identification method and human-computer interaction device
CN107016342A (en) A kind of action identification method and system
CN110837792B (en) Three-dimensional gesture recognition method and device
CN205721628U (en) A kind of quick three-dimensional dynamic hand gesture recognition system and gesture data collecting device
CN106563260A (en) Table tennis intelligent motion system based on attitude sensor and computing method based on table tennis intelligent motion system
CN106778477A (en) Tennis racket action identification method and device
CN108196668B (en) Portable gesture recognition system and method
CN111708433B (en) Gesture data acquisition glove and sign language gesture recognition method based on gesture data acquisition glove
CN111722713A (en) Multi-mode fused gesture keyboard input method, device, system and storage medium
Du et al. Gesture recognition method based on deep learning
CN105404390A (en) Modeling and gesture action identification method of wireless data glove
CN110866468A (en) Gesture recognition system and method based on passive RFID
Li et al. Hand gesture recognition and real-time game control based on a wearable band with 6-axis sensors
Iyer et al. Generalized hand gesture recognition for wearable devices in IoT: Application and implementation challenges

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant