CN105573498A - Gesture recognition method based on Wi-Fi signal - Google Patents

Gesture recognition method based on Wi-Fi signal

Info

Publication number
CN105573498A
Authority
CN
China
Prior art keywords
gesture
signal
data
hand signal
template
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510939043.0A
Other languages
Chinese (zh)
Other versions
CN105573498B (en)
Inventor
刘东东
王亮
李伟
陈晓江
汤战勇
彭瑶
张洁
王安文
任宇辉
郭松涛
何刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwest University
Original Assignee
Northwest University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwest University filed Critical Northwest University
Priority to CN201510939043.0A priority Critical patent/CN105573498B/en
Publication of CN105573498A publication Critical patent/CN105573498A/en
Application granted granted Critical
Publication of CN105573498B publication Critical patent/CN105573498B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a gesture recognition method based on a Wi-Fi signal. The method extracts a gesture signal (GS) to be recognized and matches it against a set of template gesture signals to obtain a matching distance between the GS to be recognized and each template gesture signal; the gesture represented by the template gesture signal with the smallest matching distance is taken as the gesture represented by the GS to be recognized. The method thereby avoids the recognition errors and slow system operation caused by large amounts of redundant data or by incorrectly extracted gesture data.

Description

Gesture recognition method based on Wi-Fi signals
Technical field
The invention belongs to the fields of human-computer interaction and machine learning, and relates to a gesture recognition method based on Wi-Fi signals.
Background technology
With the rapid development of computer and information technology, human-computer interaction plays an increasingly important role in daily life. Gestures are one of the most intuitive ways for people to communicate with the outside world: people express their ideas intuitively, concisely, and naturally through body movements or hand gestures. Gesture-based human-computer interaction, i.e., gesture recognition, has therefore become a focus of current research.
Current gesture recognition techniques fall into two categories. In the first, the target carries a dedicated sensor or device, i.e., active gesture recognition. Active gesture recognition collects hand-shape data or tracks the spatial movement of the hand mainly through sensors carried by the target, such as three-axis accelerometers, gyroscopes, and electronic compasses. Among active approaches, the data glove is the most widely used: the target wears a glove containing multiple sensors, which record the movement trajectory of the hand in space and the motion of the finger joints, and the gesture is recognized from these data. In China, Wu Jiangqin, Gao Wen, and others at Harbin Institute of Technology used a CyberGlove data glove composed of 18 sensors in a Chinese Sign Language recognition system and, combining Hidden Markov Models (HMM) and Artificial Neural Networks (ANN), achieved a recognition rate of 90% for isolated words and 92% for simple sentences. Active gesture recognition can directly obtain the spatial coordinates of the hand and finger-motion information, so its data are accurate, it can recognize many gestures, and its recognition precision is high; however, because the user must carry special equipment, it is inconvenient to operate and unsuitable for remote operation, which greatly limits its application scenarios.
To give users a better experience without relying on auxiliary devices such as sensors or specialized equipment, researchers at home and abroad have begun to study gesture recognition in which the target carries no sensor or device at all, i.e., passive gesture recognition. Because of its low cost, simple operation, and conformity with user habits, it has become a research focus worldwide. Existing passive gesture recognition techniques mainly include gesture recognition based on computer vision, on acoustic waves, and on Wi-Fi signals. Computer-vision-based recognition captures the action sequence of the target gesture with a camera, applies complex processing to the gesture signal, and then recognizes the gesture with a pattern-matching algorithm. Video-based recognition is quite mature and has been successfully applied in daily life, for example in Microsoft's Xbox 360 game console; however, it is affected by ambient light, the volume of video data to be processed is large, deployment is costly, and privacy is easily compromised, which limits its prospects. For acoustic-wave-based gesture recognition, the most typical success was jointly proposed by the University of Washington and Microsoft Research: the loudspeaker and microphone of a smartphone or laptop transmit and receive an 18 kHz acoustic wave to sense the user's gesture movements, but the small sensing range of sound waves limits its applications and scenarios. Today, with Wi-Fi infrastructure nearly ubiquitous, researchers at home and abroad have begun to exploit the universality of Wi-Fi signals and to study gesture recognition centered on sensing disturbances of the Wi-Fi signal. For example, Q. Pu used USRP (Universal Software Radio Peripheral) devices to transmit and receive Wi-Fi signals, applied OFDM modulation to the Wi-Fi signal, analyzed the Doppler shift of the subcarriers, and recognized nine kinds of gestures. As this international frontier work shows, techniques exploiting Wi-Fi signals will greatly improve human life and quality of life in the future.
Gesture recognition based on wireless signals follows the trend of future human-computer interaction; it has important research significance and practical value, and its great development potential has attracted the interest of many experts at home and abroad. Over the past three years, gesture recognition based on wireless signals has developed rapidly and is mainly divided into two research directions:
(1) Gesture recognition using RFID tags
In 2012, Dr. Parvin Asadzadeh of the University of Melbourne used a receiver to track the trajectories of passive tags and achieved a gesture recognition precision of 94%. In 2013, Dr. Rasmus Krigslund of Aalborg University tracked passive tags with a multi-antenna receiver and realized simple 3D gesture recognition. In 2014, Dr. Jue Wang of MIT attached RFID tags to the fingers to realize virtual handwriting; by analyzing the phase information of the tags at the receiver, a word recognition rate of 96.8% was achieved. Bryce Kellogg and Dr. Vamsi Talla of the University of Washington independently developed hardware that receives tag information through a mobile phone and applies amplitude-envelope processing to the received signal, achieving 97% recognition accuracy for eight kinds of gestures at low power consumption and low latency.
(2) Gesture recognition using Wi-Fi signals
In 2013, Professor D. Katabi of MIT connected multiple antennas to USRP (Universal Software Radio Peripheral) devices to form a MIMO system, transmitting and receiving 2.4 GHz Wi-Fi signals via MIMO technology to realize simple through-wall gesture detection. Qifan Pu of the University of Washington modified the underlying protocol of the USRP-N210, applied OFDM processing to the original signal, transmitted a 5 GHz signal, observed the Doppler shift of each subcarrier, and recognized nine kinds of gestures with an average accuracy of 94%. In 2014, Dr. Stephan Sigg of the University of Göttingen used the RSSI values of wireless Wi-Fi signals to recognize eleven kinds of gestures with 72% accuracy. Dr. Pedro Melgarejo of the University of Wisconsin used a directional antenna at the receiving end to receive the AP signal and studied the recognition of 25 kinds of gestures under two scenarios, achieving 92% recognition accuracy under high signal-to-noise ratio and 84% under low signal-to-noise ratio.
As can be seen from the above work, fine-grained gesture recognition can be achieved with RFID tags, but the target must carry the RFID tag, which restricts the user's freedom, prevents natural and unconstrained human-computer interaction, and gives a poor user experience. Gesture recognition based on Wi-Fi signals, thanks to the ubiquity of Wi-Fi infrastructure, allows human-computer interaction without constraining the user and has therefore been favored by experts at home and abroad. However, researchers currently realize Wi-Fi-based gesture recognition mainly by transmitting and receiving Wi-Fi signals with USRP devices, which has its own limitations: USRP is not common equipment and is expensive, so the actual deployment cost is high.
Summary of the invention
In view of the above defects or deficiencies of the prior art, the object of the present invention is to propose a gesture recognition method based on Wi-Fi signals.
To achieve this object, the present invention adopts the following technical scheme:
A gesture recognition method based on Wi-Fi signals, comprising the following steps:
Step 1: the transmitting end sends a signal; the user makes various gestures between the transmitting end and the receiving end, disturbing the signal; two receiving antennas of the receiving end respectively receive the gesture-disturbed signals G1 and G3.
Step 2: pre-process signals G1 and G3, the pre-processing comprising normalization, conjugate processing, and smoothing, to obtain the pre-processed signal S1.
Step 3: extract the gesture signal GS to be recognized from signal S1.
Step 4: build a template gesture signal library containing multiple template gesture signals; match the gesture signal GS to be recognized against each template gesture signal to obtain the matching distance between GS and each template gesture signal; the gesture represented by the template gesture signal with the smallest matching distance is taken as the gesture represented by the gesture signal to be recognized.
Specifically, the implementation of step 2 comprises:
Step 2.1: normalize signals G1 and G3 to obtain the normalized signals G4 and G5, respectively.
Step 2.2: apply conjugate processing to the normalized signals G4 and G5 to obtain the de-noised signal S.
Step 2.3: smooth the de-noised signal S to obtain the smoothed signal S1.
Specifically, the implementation of step 3 comprises:
Step 3.1: from signal S1, compute the sliding-window amplitude-sum matrix A[n], where n is the number of data samples.
Step 3.2: determine the gesture-data constraint condition.
Step 3.3: obtain the gesture signal GS to be recognized from the sliding-window amplitude-sum matrix A[n] and the gesture-data constraint condition.
Specifically, the implementation of step 4 comprises:
Step 4.1: build a template gesture signal library containing multiple template gesture signals ref[k], where k denotes the k-th template gesture signal; the data of a single template gesture signal are denoted ref(i), 1 ≤ i ≤ d, where d is the number of data points of a single template gesture.
Step 4.2: match the gesture signal GS to be recognized against each template gesture signal to obtain the matching distance between GS and each template gesture signal, specifically as follows:
The gesture signal data to be recognized are GS(j), 1 ≤ j ≤ n, where n is the number of data points of the gesture data to be recognized.
If d = n, compute the matching distance DS[k] between the gesture signal GS to be recognized and every template gesture signal ref[k] by the Euclidean distance formula, where k denotes the k-th template gesture; the gesture represented by the template gesture signal with the smallest matching distance is taken as the gesture represented by the gesture signal to be recognized.
If d ≠ n, build a d×n distance matrix D(i, j), 1 ≤ i ≤ d, 1 ≤ j ≤ n; obtain the matching distance DS[k] between the gesture signal to be recognized and every template gesture signal from the distance matrix D(i, j), where k denotes the k-th template gesture; the gesture represented by the template gesture signal with the smallest matching distance is taken as the gesture represented by the gesture signal to be recognized.
Specifically, the implementation of step 3.3 comprises:
Step 3.3.1: collect a segment of non-gesture data and take its average as the threshold value threshold.
Step 3.3.2: extract the gesture signal GS using the threshold obtained in step 3.3.1.
Step 3.3.2.1: starting from A[1], the first element of the sliding-window amplitude-sum matrix A, check one by one whether |A[n]| exceeds the threshold; find the first interval A[T_start] to A[T_end] in which |A[n]| continuously exceeds the threshold, and let num be the number of samples in A[T_start] to A[T_end]. Check whether num lies in the interval [Min, Max], where Min is the minimum gesture duration and Max is the maximum gesture duration; if so, go to step 3.3.2.2, otherwise go to step 3.3.2.3.
Step 3.3.2.2: take the data S1[T_start*offset] to S1[T_end*offset] in signal S1 corresponding to A[T_start] to A[T_end], where offset is the step by which the sliding window advances each time, and compute V, the sum of the absolute values of all data in S1[T_start*offset] to S1[T_end*offset]. If V exceeds the minimum threshold min_gesture, the data points S1[T_start*offset] to S1[T_end*offset] are regarded as gesture data GS; otherwise they are non-gesture data, and the procedure returns to step 3.3.2.1 with A[T_end+1] as the starting point.
Step 3.3.2.3: if num is greater than Max, the data points S1[T_start*offset] to S1[T_end*offset] are non-gesture data; return to step 3.3.2.1 with A[T_end+1] as the starting point. If num is less than Min, continue checking the data after A[T_end] one by one for whether |A[n]| exceeds the threshold, find the first interval A[T_end+1] to A[T_end1] that is continuously below the threshold, and let num1 be the number of samples in A[T_end+1] to A[T_end1]. If num1 is greater than Min, go to step 3.3.2.4; otherwise go to step 3.3.2.5.
Step 3.3.2.4: by the gesture-duration property, the interval A[T_start] to A[T_end1] is non-gesture data; return to step 3.3.2.1 with A[T_end1+1] as the starting point.
Step 3.3.2.5: continue checking the data after A[T_end1] one by one for whether |A[n]| exceeds the threshold, find the second interval A[T_end1+1] to A[T_end2] that continuously exceeds the threshold, and let num2 be the number of samples in A[T_end1+1] to A[T_end2]. Check whether the sum of num, num1, and num2 lies in the interval [Min, Max]; if so, the data S1[T_start*offset] to S1[T_end2*offset] in signal S1 corresponding to the interval A[T_start] to A[T_end2] are gesture data GS; otherwise they are non-gesture data, and the procedure returns to step 3.3.2.1 with A[T_end2+1] as the starting point.
Compared with the prior art, the present invention has the following technical effects:
1. Gesture recognition is realized at minimal deployment cost without requiring the user to carry any sensor or other specialized equipment, so the user can carry out human-computer interaction in a natural environment.
2. With the method of the invention, the gesture signal is extracted correctly; the gesture signal GS to be recognized is matched against multiple template gesture signals to obtain the matching distance between GS and each template gesture signal, and the gesture represented by the template gesture signal with the smallest matching distance is taken as the gesture represented by the gesture signal to be recognized. This effectively avoids the recognition errors and slow system operation caused by large amounts of redundant data or incorrectly extracted gesture data.
Brief description of the drawings
Fig. 1 is the algorithm flow chart of the present invention;
Fig. 2 is the experiment scene diagram;
Fig. 3 shows the experiment gestures;
Fig. 4 shows gesture data extraction;
Fig. 5 shows the DTW algorithm search path.
The solution of the present invention is explained and illustrated in further detail below in conjunction with the drawings and embodiments.
Embodiment
In accordance with the above technical scheme, the gesture recognition method based on Wi-Fi signals of the present invention comprises the following steps:
Step 1: the transmitting end sends a signal; the user makes various gestures between the transmitting end and the receiving end, disturbing the signal; two receiving antennas of the receiving end respectively receive the gesture-disturbed signals G1 and G3.
Existing Wi-Fi infrastructure is used to send the signal; the Wi-Fi infrastructure device is a router. The user performs five kinds of gestures in a 4 m × 4 m region: a one-hand forward push, a forward jump, a two-hand back-and-forth push, a left-hand circle, and a two-hand opening motion. Different gestures performed in this region produce different disturbances to the Wi-Fi signal emitted by the Wi-Fi infrastructure device. The Wi-Fi signal sent by the router is received by an Intel 5300 NIC connected to a laptop; the Intel 5300 NIC has three receiving antennas, whose received signals are G1, G2, and G3 respectively, as shown in Fig. 1. Such a deployment is both common and inexpensive and has good market prospects, for example in somatosensory games and smart homes.
Step 2: signals G1 and G3 are affected by multipath, noise, and the external environment, so they must be pre-processed to obtain the pre-processed signal S1, specifically as follows:
Step 2.1: because of differences in personal habits and in the surrounding environment, the same gesture performed at different times produces different signals. Signals G1 and G3 are therefore normalized to reduce spatio-temporal effects and improve the robustness of the method, as follows:
where G1[i] denotes the i-th sample of signal G1 and G4 denotes the normalized G1; G3[i] denotes the i-th sample of signal G3 and G5 denotes the normalized G3.
Step 2.2: because multipath interference affects the three antennas of the receiving end identically, the invention exploits common-mode rejection to remove this noise. The signals received by antenna 1 and antenna 3 contain both the gesture signal and the multipath signal, so the normalized signals of antenna 1 and antenna 3 are conjugate-multiplied to obtain the de-noised signal S, computed as:
S[i] = G4[i] * conj(G5[i])  (3)
Step 2.3: the de-noised signal S still contains occasional outliers, which would introduce large errors into the subsequent classification and gesture-data extraction, so S is further smoothed to remove spike artifacts and improve the fidelity of the gesture data. A 5-3 (five-point cubic) least-squares smoothing filter is applied to S to obtain the smoothed signal S1, as follows:
where x_1 denotes the first sample of signal S, x_2 the second sample, and x_i the i-th sample of S with 3 ≤ i ≤ m; y_i denotes the i-th sample of the smoothed signal S1, 1 ≤ i ≤ m.
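For illustration only, the following Python sketch (not part of the patent) outlines the pre-processing of steps 2.1 to 2.3. The normalization formula is not reproduced in the text above, so zero-mean, unit-variance normalization is assumed here; taking the magnitude of the conjugate product before smoothing is likewise an assumption, and the 5-3 smoothing uses the standard five-point cubic least-squares coefficients with the two samples at each end left unchanged.

```python
import numpy as np

def normalize(x):
    # Assumed zero-mean / unit-variance normalization; the patent's exact
    # normalization formula is not reproduced in the text above.
    x = np.asarray(x)
    return (x - np.mean(x)) / np.std(x)

def preprocess(g1, g3):
    """Steps 2.1-2.3: normalize, conjugate-multiply (Eq. (3)), and smooth."""
    g4 = normalize(g1)                    # step 2.1
    g5 = normalize(g3)
    s = g4 * np.conj(g5)                  # step 2.2: S[i] = G4[i] * conj(G5[i])
    s = np.abs(s)                         # assumption: smooth the magnitude of S
    # Step 2.3: 5-3 (five-point cubic) least-squares smoothing; interior points
    # use the standard coefficients (-3, 12, 17, 12, -3)/35, and the two samples
    # at each end are left unchanged for simplicity.
    s1 = s.copy()
    for i in range(2, len(s) - 2):
        s1[i] = (-3*s[i-2] + 12*s[i-1] + 17*s[i] + 12*s[i+1] - 3*s[i+2]) / 35.0
    return s1
```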
Step 3: extract the gesture signal GS to be recognized from the signal S1 obtained in step 2.
Because the moment at which the user makes a gesture is unpredictable, the start point and end point of the gesture signal must be detected; the purpose is to locate the gesture signal within the received Wi-Fi signal. Accurate endpoint detection provides accurate data for the subsequent feature extraction and pattern recognition, eliminates redundant data, reduces the complexity and running time of the pattern matching, and improves the recognition precision of the system. In practical use, a gesture action is required to be preceded by a brief stationary pose and followed by another stationary pose after the action ends. The invention measures the average amplitude with a sliding window, sets a threshold, and detects the start and end points according to the threshold, as follows:
Step 3.1: compute the sliding-window amplitude-sum matrix.
For the signal S1 obtained in step 2, a sliding window is used to compute the sliding-window amplitude-sum matrix A:
where A[n] is the sliding-window amplitude-sum matrix with n data samples obtained after the sliding window has passed over the data; the sliding-window computation yields n values of A, win is the size of the sliding window, and offset is the step by which the window advances each time. Experiments show that win = 4 and offset = 2 give the best results. Fig. 5 shows the amplitude of the raw data after sliding-window processing.
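As an illustration of step 3.1 (not part of the patent), the sketch below computes the sliding-window amplitude sums. The formula for A is not reproduced in the text above, so each element A[k] is assumed here to be the sum of |S1| over a window of length win starting at sample k*offset, which matches the mapping A[T] → S1[T*offset] used in step 3.3.

```python
import numpy as np

def window_amplitude_sums(s1, win=4, offset=2):
    """Step 3.1 (assumed form): sliding-window amplitude sums over S1.

    win=4 and offset=2 are the values reported above as giving the best results.
    """
    s1 = np.asarray(s1)
    n = max(0, (len(s1) - win) // offset + 1)   # number of windows
    return np.array([np.sum(np.abs(s1[k*offset : k*offset + win]))
                     for k in range(n)])
```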
Step 3.2: according to general gesture characteristics, the gesture data satisfy the following constraint condition:
T_last ∈ [Min, Max]  (6)
where T_last is the gesture data duration, with Min = 6 and Max = 9.
Step 3.3: obtain the gesture signal GS from the sliding-window amplitude-sum matrix A and the gesture-data constraint condition.
Step 3.3.1: the user first collects a segment of non-gesture data (i.e., data collected while the user is in the Wi-Fi region but performs no gesture) and takes its average as the threshold value threshold; experiments give threshold = 0.2.
Step 3.3.2: extract the gesture signal GS using the threshold obtained in step 3.3.1.
Step 3.3.2.1: starting from A[1], the first element of the sliding-window amplitude-sum matrix A, check one by one whether |A[n]| exceeds the threshold; find the first interval A[T_start] to A[T_end] in which |A[n]| continuously exceeds the threshold, where T_start is the first data point continuously above the threshold and T_end the last; let num be the number of samples in A[T_start] to A[T_end]. Check whether num lies in the interval [Min = 6, Max = 9]; if so, go to step 3.3.2.2, otherwise go to step 3.3.2.3.
Step 3.3.2.2: take the data S1[T_start*offset] to S1[T_end*offset] in signal S1 corresponding to A[T_start] to A[T_end], where offset is the step by which the sliding window advances each time, and compute V, the sum of the absolute values of all data in S1[T_start*offset] to S1[T_end*offset]. If V exceeds the minimum threshold min_gesture, the data points S1[T_start*offset] to S1[T_end*offset] are regarded as gesture data GS; otherwise they are non-gesture data, and the procedure returns to step 3.3.2.1 with A[T_end+1] as the starting point.
Step 3.3.2.3: if num is greater than Max, the data points S1[T_start*offset] to S1[T_end*offset] are non-gesture data; return to step 3.3.2.1 with A[T_end+1] as the starting point. If num is less than Min, then, because a complete gesture signal may contain samples below the threshold, continue checking the data after A[T_end] one by one for whether |A[n]| exceeds the threshold, find the first interval A[T_end+1] to A[T_end1] that is continuously below the threshold, and let num1 be the number of samples in A[T_end+1] to A[T_end1]. If num1 is greater than Min, go to step 3.3.2.4; otherwise go to step 3.3.2.5.
Step 3.3.2.4: by the general gesture-duration property, the interval A[T_start] to A[T_end1] is non-gesture data; return to step 3.3.2.1 with A[T_end1+1] as the starting point.
Step 3.3.2.5: continue checking the data after A[T_end1] one by one for whether |A[n]| exceeds the threshold, find the second interval A[T_end1+1] to A[T_end2] that continuously exceeds the threshold, and let num2 be the number of samples in A[T_end1+1] to A[T_end2]. Check whether the sum of num, num1, and num2 lies in the interval [Min = 6, Max = 9]; if so, the data S1[T_start*offset] to S1[T_end2*offset] in signal S1 corresponding to the interval A[T_start] to A[T_end2] are gesture data GS; otherwise they are non-gesture data, and the procedure returns to step 3.3.2.1 with A[T_end2+1] as the starting point.
For example, consider the left-hand circle gesture shown in Fig. 4. During endpoint extraction, the figure marks the candidate gesture motions detected, which are then judged step by step against the constraints: gesture 1 is discarded because its duration is short and its amplitude is below the minimum gesture threshold min_gesture; gesture 2 is discarded because its amplitude is below min_gesture; gesture motion 4 exceeds the maximum gesture duration and is therefore judged to be noise. Finally, gesture motion 3 is identified as the real gesture data.
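A minimal Python sketch of this endpoint-detection procedure follows (for illustration only, not part of the patent). The value of min_gesture is not specified in the text, so the default used here is a placeholder; the mapping from window index T to sample index T*offset follows steps 3.3.2.2 to 3.3.2.5, and Python's zero-based indexing replaces the one-based indexing of the description.

```python
import numpy as np

def extract_gesture(s1, A, threshold=0.2, min_gesture=1.0, Min=6, Max=9, offset=2):
    """Step 3.3: threshold-based endpoint detection on the window sums A.

    Returns the first detected gesture segment of S1, or None.
    min_gesture=1.0 is a placeholder; the patent does not give its value.
    """
    absA = np.abs(np.asarray(A))
    p = 0                                        # current starting point in A
    while p < len(absA):
        # 3.3.2.1: first run of |A| > threshold at or after p
        above = np.where(absA[p:] > threshold)[0]
        if above.size == 0:
            return None
        t_start = p + above[0]
        t_end = t_start
        while t_end + 1 < len(absA) and absA[t_end + 1] > threshold:
            t_end += 1
        num = t_end - t_start + 1

        if Min <= num <= Max:
            # 3.3.2.2: amplitude check on the corresponding S1 samples
            seg = s1[t_start * offset : t_end * offset + 1]
            if np.sum(np.abs(seg)) > min_gesture:
                return seg                       # gesture data GS
            p = t_end + 1
        elif num > Max:
            # 3.3.2.3, first case: interval too long, treat as noise
            p = t_end + 1
        else:
            # 3.3.2.3, second case: interval too short, examine the gap after it
            t_end1 = t_end
            while t_end1 + 1 < len(absA) and absA[t_end1 + 1] <= threshold:
                t_end1 += 1
            num1 = t_end1 - t_end
            if num1 > Min:
                # 3.3.2.4: gap too long, discard the whole interval
                p = t_end1 + 1
            else:
                # 3.3.2.5: merge with the next above-threshold run
                t_end2 = t_end1
                while t_end2 + 1 < len(absA) and absA[t_end2 + 1] > threshold:
                    t_end2 += 1
                num2 = t_end2 - t_end1
                if Min <= num + num1 + num2 <= Max:
                    return s1[t_start * offset : t_end2 * offset + 1]
                p = t_end2 + 1
    return None
```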
Step 4: build a template gesture signal library containing multiple template gesture signals; match the gesture signal GS to be recognized against each template gesture signal to obtain the matching distance between GS and each template gesture signal; the gesture represented by the template gesture signal with the smallest matching distance is taken as the gesture represented by the gesture signal to be recognized.
In classification, the recognition algorithm matches the collected gesture data against predefined template gesture data, compares their similarity, and finally identifies the concrete gesture. Gestures always differ in time and space: when the same user repeats the same gesture, or different users perform the same gesture, differences in arm length and gesture speed cause large differences in gesture amplitude and duration. Dynamic Time Warping (DTW) is mainly used to handle differing speech rates in speech recognition; here it is used to handle the differing time spans of the test data and the sample templates in this system. Based on the dynamic programming (DP) idea, the DTW algorithm solves the template-matching problem when the test gesture data and the sample gesture data have different lengths: a distance matrix is built from Euclidean distances, the shortest-path distance sum is computed, and the concrete gesture is finally identified from the minimum distance. The implementation steps are as follows:
Step 4.1: build the template gesture signal library containing multiple template gesture signals.
First, in the Wi-Fi region, the user performs the five kinds of gesture actions; the gesture data produced by these five gesture actions are then extracted manually and stored as the template gesture signals ref[k], where k denotes the k-th template gesture signal. The data of a single template gesture signal are denoted ref(i), 1 ≤ i ≤ d, where d is the number of data points of the template gesture data.
Step 4.2: match the gesture signal GS to be recognized against each template gesture signal to obtain the matching distance between GS and each template gesture signal, as follows:
The gesture signal data to be recognized are GS(j), 1 ≤ j ≤ n, where n is the number of data points of the gesture data to be recognized.
If d = n, the matching distance DS between the gesture signal GS to be recognized and the template gesture signal ref[k] is computed by the Euclidean distance formula:
where i = [1, d], j = [1, n], and DS denotes the distance between the gesture data and the template data. The matching distance DS[k] between the gesture signal to be recognized and every template gesture signal is obtained, where k denotes the k-th template gesture signal, and the gesture represented by the template gesture signal with the smallest matching distance is taken as the gesture represented by the gesture signal to be recognized.
If d ≠ n, ref(i) and GS(j) must be mapped into alignment. The DTW algorithm uses the dynamic programming (DP) idea: the gesture data to be recognized are placed on the horizontal axis of a rectangular coordinate system and the template gesture data on the vertical axis, forming a d×n grid matrix; the distance between each sample of the template gesture data and each sample of the gesture data to be recognized is computed to build the d×n distance matrix D(i, j):
D(i, j) = (ref(i) − GS(j))²  (8)
where 1 ≤ i ≤ d, 1 ≤ j ≤ n.
The matching distance DS is obtained from the distance matrix D(i, j); the warping path must start at point (1, 1) of the two-dimensional coordinate system and end at point (d, n).
Let DS(1, 1) = D(1, 1) and compute the first column DS(i, 1) as:
DS(i, 1) = DS(i−1, 1) + D(i, 1), where 2 ≤ i ≤ d  (8)
Compute the first row DS(1, j) as:
DS(1, j) = DS(1, j−1) + D(1, j), where 2 ≤ j ≤ n  (9)
DS(i, j) is then computed according to the shortest-path selection constraint:
DS(i, j) = D(i, j) + min[DS(i−1, j), DS(i−1, j−1), DS(i, j−1)]  (10)
where 2 ≤ i ≤ d, 2 ≤ j ≤ n.
Finally, DS(d, n) is the matching distance; the DTW search path is shown in Fig. 5. The matching distance DS[k] between the gesture data and every template is obtained, where k denotes the k-th template gesture, and the gesture represented by the template gesture signal with the smallest matching distance is taken as the gesture represented by the gesture signal to be recognized.
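For illustration, the following Python sketch (not part of the patent) implements the matching of step 4.2: plain Euclidean distance when d = n, and the DTW recurrences of Eqs. (8)-(10) when d ≠ n. The Euclidean formula itself is not reproduced in the text above, so the usual root-sum-of-squares form is assumed, and gesture signals are treated as real-valued sequences.

```python
import numpy as np

def dtw_distance(ref, gs):
    """DTW matching distance between a template ref (length d) and a
    gesture gs (length n), following the recurrences of Eqs. (8)-(10)."""
    ref, gs = np.asarray(ref, float), np.asarray(gs, float)
    d, n = len(ref), len(gs)
    D = (ref[:, None] - gs[None, :]) ** 2          # distance matrix D(i, j)
    DS = np.zeros((d, n))
    DS[0, 0] = D[0, 0]
    DS[1:, 0] = D[0, 0] + np.cumsum(D[1:, 0])      # first column
    DS[0, 1:] = D[0, 0] + np.cumsum(D[0, 1:])      # first row
    for i in range(1, d):
        for j in range(1, n):
            DS[i, j] = D[i, j] + min(DS[i-1, j], DS[i-1, j-1], DS[i, j-1])
    return DS[-1, -1]                              # matching distance DS(d, n)

def classify(gs, templates):
    """Step 4.2: index of the template with the smallest matching distance.

    templates is assumed to be the list of manually extracted template
    gesture signals ref[k] from step 4.1.
    """
    gs = np.asarray(gs, float)
    scores = []
    for ref in templates:
        ref = np.asarray(ref, float)
        if len(ref) == len(gs):                    # d = n: Euclidean distance
            scores.append(np.sqrt(np.sum((ref - gs) ** 2)))
        else:                                      # d != n: DTW
            scores.append(dtw_distance(ref, gs))
    return int(np.argmin(scores))
```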
Experimental verification
The experimental site was the ground floor of the Information Science building at Northwest University; the selected region measured 4 m × 4 m, and the user stood in the middle of the region to perform gesture interaction. The transmitting end of the experimental platform is an existing WLAN device (a TP-LINK router (AP) with two transceiver antennas); the receiving end uses an Intel 5300 NIC (with three receiving antennas) to receive CSI data from the AP and runs Ubuntu 10.04.4. The experiment scene is shown in Fig. 2. Five kinds of gesture actions were collected to verify the method: a one-hand forward push, a forward jump, a two-hand back-and-forth push, a left-hand circle, and a two-hand opening motion, as shown in Fig. 3. Each gesture was collected 50 times, 250 groups of data in total; for each gesture, 20 groups were chosen as reference gesture templates and the remaining 30 groups were used as gesture data to be recognized. After building the gesture prior library, the method was verified; the recognition results are shown in Table 1:
Table 1
In the table, the rows and columns represent the test data and the template data respectively. For example, the 28 in the first row means that the one-hand forward push test data matched the one-hand forward push template 28 times, while the 2 mis-recognized one-hand forward pushes were judged to be the left-hand circle. As can be seen, similar gestures are easily confused, which causes misjudgments. Therefore, when defining specific gestures, gesture actions that are as distinguishable as possible should be chosen, so that the system realizes gesture recognition with higher precision.
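As a usage illustration tying the sketches above together (hypothetical data and file names, not part of the patent):

```python
import numpy as np

# Hypothetical pre-recorded CSI streams from receiving antennas 1 and 3, and a
# template library built as in step 4.1; the file names are for illustration only.
g1 = np.load("antenna1_csi.npy")
g3 = np.load("antenna3_csi.npy")
templates = [np.load(f"template_gesture_{k}.npy") for k in range(5)]

s1 = preprocess(g1, g3)                 # steps 2.1-2.3
A = window_amplitude_sums(s1)           # step 3.1 (win=4, offset=2)
gs = extract_gesture(s1, A)             # step 3.3
if gs is not None:
    k = classify(gs, templates)         # step 4.2
    print("recognized template gesture:", k)
```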

Claims (5)

1. A gesture recognition method based on Wi-Fi signals, characterized by comprising the following steps:
step 1: the transmitting end sends a signal; the user makes various gestures between the transmitting end and the receiving end, disturbing the signal; two receiving antennas of the receiving end respectively receive the gesture-disturbed signals G1 and G3;
step 2: pre-process signals G1 and G3, the pre-processing comprising normalization, conjugate processing, and smoothing, to obtain the pre-processed signal S1;
step 3: extract the gesture signal GS to be recognized from signal S1;
step 4: build a template gesture signal library containing multiple template gesture signals; match the gesture signal GS to be recognized against each template gesture signal to obtain the matching distance between GS and each template gesture signal; the gesture represented by the template gesture signal with the smallest matching distance is taken as the gesture represented by the gesture signal to be recognized.
2. The gesture recognition method based on Wi-Fi signals as claimed in claim 1, characterized in that the implementation of step 2 comprises:
step 2.1: normalize signals G1 and G3 to obtain the normalized signals G4 and G5, respectively;
step 2.2: apply conjugate processing to the normalized signals G4 and G5 to obtain the de-noised signal S;
step 2.3: smooth the de-noised signal S to obtain the smoothed signal S1.
3. The gesture recognition method based on Wi-Fi signals as claimed in claim 2, characterized in that the implementation of step 3 comprises:
step 3.1: from signal S1, compute the sliding-window amplitude-sum matrix A[n], where n is the number of data samples;
step 3.2: determine the gesture-data constraint condition;
step 3.3: obtain the gesture signal GS to be recognized from the sliding-window amplitude-sum matrix A[n] and the gesture-data constraint condition.
4. The gesture recognition method based on Wi-Fi signals as claimed in claim 3, characterized in that the implementation of step 4 comprises:
step 4.1: build a template gesture signal library containing multiple template gesture signals ref[k], where k denotes the k-th template gesture signal; the data of a single template gesture signal are denoted ref(i), 1 ≤ i ≤ d, where d is the number of data points of a single template gesture;
step 4.2: match the gesture signal GS to be recognized against each template gesture signal to obtain the matching distance between GS and each template gesture signal, specifically as follows:
the gesture signal data to be recognized are GS(j), 1 ≤ j ≤ n, where n is the number of data points of the gesture data to be recognized;
if d = n, compute the matching distance DS[k] between the gesture signal GS to be recognized and every template gesture signal ref[k] by the Euclidean distance formula, where k denotes the k-th template gesture; the gesture represented by the template gesture signal with the smallest matching distance is taken as the gesture represented by the gesture signal to be recognized;
if d ≠ n, build a d×n distance matrix D(i, j), 1 ≤ i ≤ d, 1 ≤ j ≤ n; obtain the matching distance DS[k] between the gesture signal to be recognized and every template gesture signal from the distance matrix D(i, j), where k denotes the k-th template gesture; the gesture represented by the template gesture signal with the smallest matching distance is taken as the gesture represented by the gesture signal to be recognized.
5. The gesture recognition method based on Wi-Fi signals as claimed in claim 3, characterized in that the implementation of step 3.3 comprises:
step 3.3.1: collect a segment of non-gesture data and take its average as the threshold value threshold;
step 3.3.2: extract the gesture signal GS using the threshold obtained in step 3.3.1;
step 3.3.2.1: starting from A[1], the first element of the sliding-window amplitude-sum matrix A, check one by one whether |A[n]| exceeds the threshold; find the first interval A[T_start] to A[T_end] in which |A[n]| continuously exceeds the threshold, and let num be the number of samples in A[T_start] to A[T_end]; check whether num lies in the interval [Min, Max], where Min is the minimum gesture duration and Max is the maximum gesture duration; if so, go to step 3.3.2.2, otherwise go to step 3.3.2.3;
step 3.3.2.2: take the data S1[T_start*offset] to S1[T_end*offset] in signal S1 corresponding to A[T_start] to A[T_end], where offset is the step by which the sliding window advances each time, and compute V, the sum of the absolute values of all data in S1[T_start*offset] to S1[T_end*offset]; if V exceeds the minimum threshold min_gesture, the data points S1[T_start*offset] to S1[T_end*offset] are regarded as gesture data GS; otherwise they are non-gesture data, and the procedure returns to step 3.3.2.1 with A[T_end+1] as the starting point;
step 3.3.2.3: if num is greater than Max, the data points S1[T_start*offset] to S1[T_end*offset] are non-gesture data; return to step 3.3.2.1 with A[T_end+1] as the starting point; if num is less than Min, continue checking the data after A[T_end] one by one for whether |A[n]| exceeds the threshold, find the first interval A[T_end+1] to A[T_end1] that is continuously below the threshold, and let num1 be the number of samples in A[T_end+1] to A[T_end1]; if num1 is greater than Min, go to step 3.3.2.4, otherwise go to step 3.3.2.5;
step 3.3.2.4: by the gesture-duration property, the interval A[T_start] to A[T_end1] is non-gesture data; return to step 3.3.2.1 with A[T_end1+1] as the starting point;
step 3.3.2.5: continue checking the data after A[T_end1] one by one for whether |A[n]| exceeds the threshold, find the second interval A[T_end1+1] to A[T_end2] that continuously exceeds the threshold, and let num2 be the number of samples in A[T_end1+1] to A[T_end2]; check whether the sum of num, num1, and num2 lies in the interval [Min, Max]; if so, the data S1[T_start*offset] to S1[T_end2*offset] in signal S1 corresponding to the interval A[T_start] to A[T_end2] are gesture data GS; otherwise they are non-gesture data, and the procedure returns to step 3.3.2.1 with A[T_end2+1] as the starting point.
CN201510939043.0A 2015-12-15 2015-12-15 A kind of gesture identification method based on Wi-Fi signal Active CN105573498B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510939043.0A CN105573498B (en) 2015-12-15 2015-12-15 A kind of gesture identification method based on Wi-Fi signal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510939043.0A CN105573498B (en) 2015-12-15 2015-12-15 A kind of gesture identification method based on Wi-Fi signal

Publications (2)

Publication Number Publication Date
CN105573498A true CN105573498A (en) 2016-05-11
CN105573498B CN105573498B (en) 2018-05-08

Family

ID=55883728

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510939043.0A Active CN105573498B (en) 2015-12-15 2015-12-15 A kind of gesture identification method based on Wi-Fi signal

Country Status (1)

Country Link
CN (1) CN105573498B (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103971108A (en) * 2014-05-28 2014-08-06 北京邮电大学 Wireless communication-based human body posture recognition method and device
CN104503575A (en) * 2014-12-18 2015-04-08 大连理工大学 Method for designing low-power-consumption gesture recognition circuit device

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106066995B (en) * 2016-05-25 2019-10-11 西安交通大学 A kind of wireless unbundling human body behavioral value algorithm
CN106066995A (en) * 2016-05-25 2016-11-02 西安交通大学 A kind of wireless unbundling human body behavioral value algorithm
CN106774894A (en) * 2016-12-16 2017-05-31 重庆大学 Interactive teaching methods and interactive system based on gesture
CN107067031A (en) * 2017-03-29 2017-08-18 西北大学 A kind of calligraphy posture automatic identifying method based on Wi Fi signals
CN107067031B (en) * 2017-03-29 2020-10-23 西北大学 Calligraphy posture automatic identification method based on Wi-Fi signal
CN107102729A (en) * 2017-04-05 2017-08-29 河南师范大学 A kind of PPT Demonstration Control Systems based on CSI gesture identifications
CN107241698B (en) * 2017-07-17 2020-03-24 北京大学 Non-contact perception tracking method
CN108459706A (en) * 2018-01-24 2018-08-28 重庆邮电大学 Wi-Fi gesture identification methods based on relative movement orbit tracking
CN109407833A (en) * 2018-09-30 2019-03-01 Oppo广东移动通信有限公司 Manipulate method, apparatus, electronic equipment and the storage medium of electronic equipment
CN109460716A (en) * 2018-10-19 2019-03-12 大连理工大学 A kind of sign language wireless-identification device and method
CN110110580A (en) * 2019-03-12 2019-08-09 西北大学 A kind of network struction of sign language isolated word recognition and classification method towards Wi-Fi signal
US11442550B2 (en) * 2019-05-06 2022-09-13 Samsung Electronics Co., Ltd. Methods for gesture recognition and control
CN110458118A (en) * 2019-08-14 2019-11-15 南京邮电大学 Simple sign Language Recognition Method based on channel state information
CN110458118B (en) * 2019-08-14 2022-08-12 南京邮电大学 Simple sign language identification method based on channel state information
CN114208216A (en) * 2020-07-15 2022-03-18 谷歌有限责任公司 Detecting contactless gestures using radio frequency
CN112218303A (en) * 2020-09-28 2021-01-12 上海交通大学 Signal conversion method based on Wi-Fi identification system
CN112218303B (en) * 2020-09-28 2022-02-18 上海交通大学 Signal conversion method based on Wi-Fi identification system

Also Published As

Publication number Publication date
CN105573498B (en) 2018-05-08

Similar Documents

Publication Publication Date Title
CN105573498A (en) Gesture recognition method based on Wi-Fi signal
Ding et al. RF-net: A unified meta-learning framework for RF-enabled one-shot human activity recognition
Chen et al. WristCam: A wearable sensor for hand trajectory gesture recognition and intelligent human–robot interaction
US10061389B2 (en) Gesture recognition system and gesture recognition method
Guo et al. HuAc: Human activity recognition using crowdsourced WiFi signals and skeleton data
CN111399642B (en) Gesture recognition method and device, mobile terminal and storage medium
EP3497467A1 (en) Control system and control processing method and apparatus
CN109325967A (en) Method for tracking target, device, medium and equipment
Chen et al. Human behavior recognition using Wi-Fi CSI: Challenges and opportunities
WO2019061949A1 (en) Motion-behavior-assisted indoor fusion positioning method and apparatus and storage medium
CN103353935A (en) 3D dynamic gesture identification method for intelligent home system
CN107992792A (en) A kind of aerial handwritten Chinese character recognition system and method based on acceleration transducer
Xiao et al. Motion-fi: Recognizing and counting repetitive motions with passive wireless backscattering
CN105844216A (en) Detection and matching mechanism for recognition of handwritten letters using WiFi signals
CN110113116B (en) Human behavior identification method based on WIFI channel information
Jannat et al. Efficient Wi-Fi-based human activity recognition using adaptive antenna elimination
EP2996067A1 (en) Method and device for generating motion signature on the basis of motion signature information
US11630518B2 (en) Ultrasound based air-writing system and method
CN106685590A (en) Channel state information and KNN (K-Nearest Neighbor)-based indoor human body orientation recognition method
CN110866468A (en) Gesture recognition system and method based on passive RFID
CN116057408A (en) Angle of arrival capability in electronic devices
Uysal et al. RF-Wri: An efficient framework for RF-based device-free air-writing recognition
US11841447B2 (en) 3D angle of arrival capability in electronic devices with adaptability via memory augmentation
Zhu et al. Wi-ATCN: Attentional temporal convolutional network for human action prediction using WiFi channel state information
Pan et al. Dynamic hand gesture detection and recognition with WiFi signal based on 1d-CNN

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CB03 Change of inventor or designer information
CB03 Change of inventor or designer information

Inventor after: Chen Xiaojiang

Inventor after: Wang Anwen

Inventor after: Ren Yuhui

Inventor after: Guo Songtao

Inventor after: He Gang

Inventor after: Xu Dan

Inventor after: Liu Dongdong

Inventor after: Chen Feng

Inventor after: Wang Liang

Inventor after: Li Wei

Inventor after: Tang Zhanyong

Inventor after: Peng Yao

Inventor after: Zhang Jie

Inventor before: Liu Dongdong

Inventor before: Guo Songtao

Inventor before: He Gang

Inventor before: Wang Liang

Inventor before: Li Wei

Inventor before: Chen Xiaojiang

Inventor before: Tang Zhanyong

Inventor before: Peng Yao

Inventor before: Zhang Jie

Inventor before: Wang Anwen

Inventor before: Ren Yuhui