CN107103302A - Behavior extraction method based on optimal detection threshold - Google Patents

Behavior extraction method based on optimal detection threshold

Info

Publication number
CN107103302A
CN107103302A, CN107103302B (application CN201710282897.5A)
Authority
CN
China
Prior art date
Legal status
Granted
Application number
CN201710282897.5A
Other languages
Chinese (zh)
Other versions
CN107103302B (en)
Inventor
Tian Zengshan (田增山)
Wang Xiangyong (王向勇)
He Ailin (何艾琳)
Guo Keke (郭可可)
Zhou Xiangdong (周祥东)
Gao Luoying (高罗莹)
Current Assignee
Chongqing University of Post and Telecommunications
Original Assignee
Chongqing University of Post and Telecommunications
Priority date
Filing date
Publication date
Application filed by Chongqing University of Post and Telecommunications
Priority to CN201710282897.5A
Publication of CN107103302A
Application granted
Publication of CN107103302B
Legal status: Active
Anticipated expiration

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 — Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 — Movements or behaviour, e.g. gesture recognition
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2218/00 — Aspects of pattern recognition specially adapted for signal processing
    • G06F 2218/08 — Feature extraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Alarm Systems (AREA)
  • Other Investigation Or Analysis Of Materials By Electrical Means (AREA)

Abstract

The invention discloses a behavior extraction method based on an optimal detection threshold. By analyzing the difference in CSI signal features between the target-present and target-absent states, the optimal detection threshold is obtained by kernel density estimation, and the behavior data of the detected target are extracted by combining the obtained threshold with a time-sequence buffer. The invention can be applied in human activity recognition to extract the data of the behavior execution phase, and solves the inaccuracy and inconvenience of manually observed detection thresholds.

Description

Behavior extraction method based on optimal detection threshold
Technical field
The present invention relates to behavior extraction technology, and in particular to a behavior extraction method based on an optimal detection threshold.
Background art
With the rapid development of science and technology and the popularization of computer equipment in the 21st century, human-computer interaction (HCI) has become a focus of attention and research in many countries. Human-computer interaction refers to the process by which a user and a computer device complete specified tasks and exchange information through predefined interaction modes such as computer hardware, body movements, and sound. Human activity recognition, as an important field of HCI research, plays a major role in the development of interaction technology and is of great significance for improving human production and daily life. Traditional human activity recognition technologies generally require dedicated equipment, for example recognition based on computer vision, on radar, or on wearable sensor devices. Computer-vision-based recognition captures video or images of a monitored area with camera equipment, detects and tracks the key body parts of the monitored target (mainly a person), and determines the current behavior from the posture parameters of those parts. However, vision-based recognition works only under sufficient lighting; its accuracy at night or in dim places is very low. In addition, the camera requires a line of sight, and video surveillance raises personal privacy concerns, which limits its application to some extent. Radar-based recognition acquires the current posture of the human body with 60 GHz radar equipment; its precision is high, but its range is only 10 cm and the equipment is extremely expensive, so it cannot be popularized. Wearable-sensor-based recognition obtains the user's current behavior from sensor devices worn on the body, which is inconvenient for the user. The expensive equipment required by these technologies has greatly hindered the popularization of human activity recognition. Meanwhile, with the rapid development of intelligent manufacturing, wearable devices, assisted driving, smart homes, virtual reality, and somatosensory games, touch-based interaction is increasingly limited, and contactless interaction is becoming more and more important. Breaking away from hardware constraints has therefore become a key focus of human activity recognition research. At the same time, the continued spread of WLANs offers a great development opportunity for WLAN-based human activity recognition.
In human activity recognition based on WLAN channel state information (CSI), the data of the behavior execution phase must first be extracted. Existing behavior extraction approaches mainly set a threshold manually, by directly observing the difference in the CSI signal between the behavior execution phase and the silent phase, in order to detect the start and end points of a behavior and extract the corresponding data. With this approach, whenever the environment changes or a new behavior must be recognized, the researcher has to observe the signal again and set a new threshold; moreover, such a simple method is easily disturbed by occasional errors, causing the extracted behavior data to be wrong or incomplete.
Summary of the invention
The object of the present invention is to provide a behavior extraction method based on an optimal detection threshold, which can determine the detection threshold automatically and effectively extract the behavior data in human activity recognition.
The behavior extraction method of the present invention based on the optimal detection threshold comprises the following steps:
Step 1: Extract the variance features x_t from the CSI signal stream of the training data using a sliding window; the variance feature vector in the silent state is denoted x_0, and the variance feature vector in the behavior state is denoted x_1.
Step 2: Apply kernel density estimation separately to the elements of x_0 and x_1 to estimate their probability densities, obtaining the distribution functions f̂_0(x) and f̂_1(x) of the signal variance in the silent and behavior states.
Step 3: Traverse candidate thresholds and compute, for each threshold κ_j, the false alarm rate P_{e0,j} and the missed detection rate P_{e1,j}.
Step 4: Take the κ_j that minimizes P_{e,j} = P_{e0,j} + P_{e1,j} as the optimal detection threshold κ.
Step 5: Set the buffer size S and initialize the buffer: cache = 0.
Step 6: Compute the variance δ_t of the current data using the sliding window.
Step 7: Compare δ_t with κ; if δ_t ≥ κ, go to step 8; otherwise, go to step 5.
Step 8: Increment the buffer: cache = cache + 1.
Step 9: Check whether the buffer is full; if cache > S, the buffer is full, go to step 10; otherwise, go to step 6.
Step 10: Record the current time as the behavior start time t_0.
Step 11: Clear the buffer: cache = 0.
Step 12: Compute δ_t at the current time using the sliding window.
Step 13: Compare the current δ_t with the optimal detection threshold κ; if δ_t < κ, go to step 14; otherwise, go to step 11.
Step 14: Increment the buffer: cache = cache + 1.
Step 15: Check whether the buffer is full; if cache > S, the buffer is full, go to step 16; otherwise, go to step 12.
Step 16: Record the current time as the behavior end time t_1.
Step 17: The computation ends; return the data of the period t_0 to t_1.
In step 1:

$$x_0 = [x_{0,1}, x_{0,2}, \ldots, x_{0,n_0}], \qquad x_1 = [x_{1,1}, x_{1,2}, \ldots, x_{1,n_1}]$$

$$x_{0,t} = \frac{1}{l}\sum_{i=0}^{l-1}\left(|H_0(t+i)| - \frac{1}{l}\sum_{i=0}^{l-1}|H_0(t+i)|\right)^2, \quad t = 1, 2, \ldots, n_0$$

$$x_{1,t} = \frac{1}{l}\sum_{i=0}^{l-1}\left(|H_1(t+i)| - \frac{1}{l}\sum_{i=0}^{l-1}|H_1(t+i)|\right)^2, \quad t = 1, 2, \ldots, n_1$$

where l is the sliding window length, |H_0(t+i)| and |H_1(t+i)| are the amplitudes of the CSI data received at time t+i in the silent state and the behavior state respectively, and n_0 and n_1 are the numbers of CSI signal samples received in the silent state and the behavior state respectively.
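The sliding-window variance feature of step 1 can be sketched in Python as follows. This is an illustrative implementation, not the patent's code; the function name and the use of NumPy are assumptions.

```python
import numpy as np

def sliding_variance(amplitudes, l):
    """Variance of the CSI amplitude over each length-l sliding window,
    normalized by 1/l (a biased variance), one value per window start."""
    a = np.asarray(amplitudes, dtype=float)
    out = np.empty(len(a) - l + 1)
    for t in range(len(out)):
        w = a[t:t + l]
        out[t] = np.mean((w - w.mean()) ** 2)  # (1/l) * sum of squared deviations
    return out
```

A constant amplitude stream yields zero variance in every window, while a fluctuating stream (e.g. during movement) yields positive values, which is what makes the variance usable as a presence feature.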
In step 2:

$$\hat{f}_0(x) = \frac{1}{n_0 h_0}\sum_{t=1}^{n_0} K\left(\frac{x - x_{0,t}}{h_0}\right), \qquad \hat{f}_1(x) = \frac{1}{n_1 h_1}\sum_{t=1}^{n_1} K\left(\frac{x - x_{1,t}}{h_1}\right)$$

with bandwidths

$$h_0 = 3.49 \times \sqrt{\frac{1}{n_0}\sum_{t=1}^{n_0}\left(x_{0,t} - \frac{1}{n_0}\sum_{t=1}^{n_0} x_{0,t}\right)^2} \times n_0^{-1/5}, \qquad h_1 = 3.49 \times \sqrt{\frac{1}{n_1}\sum_{t=1}^{n_1}\left(x_{1,t} - \frac{1}{n_1}\sum_{t=1}^{n_1} x_{1,t}\right)^2} \times n_1^{-1/5}$$

and the Gaussian kernel

$$K(u) = \frac{1}{\sqrt{2\pi}}\, e^{-u^2/2}$$

where x is the point at which the density is estimated.
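Step 2's kernel density estimate, with the Gaussian kernel and the 3.49·σ·n^(−1/5) bandwidth rule given in the claims, might be sketched as below. The code and names are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def kde(samples, h=None):
    """Return a Gaussian kernel density estimate f_hat(x).
    If h is None, use the bandwidth rule from the claims:
    h = 3.49 * sigma * n^(-1/5), with sigma the population std of the samples."""
    s = np.asarray(samples, dtype=float)
    n = len(s)
    if h is None:
        sigma = np.sqrt(np.mean((s - s.mean()) ** 2))
        h = 3.49 * sigma * n ** (-0.2)
    def f_hat(x):
        u = (np.asarray(x, dtype=float)[..., None] - s) / h   # (x - x_t) / h
        # (1/(n*h)) * sum of Gaussian kernel values
        return np.mean(np.exp(-u ** 2 / 2) / np.sqrt(2 * np.pi), axis=-1) / h
    return f_hat
```

The returned closure accepts a scalar or an array of evaluation points, so the same estimator can later be evaluated on a dense grid when sweeping thresholds.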
In step 3:

$$P_{e0,j} = P(x > \kappa_j) = \int_{\kappa_j}^{+\infty} \hat{f}_0(x)\,dx, \qquad P_{e1,j} = P(x < \kappa_j) = \int_{-\infty}^{\kappa_j} \hat{f}_1(x)\,dx$$

where κ_j is a candidate threshold.
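Steps 3 and 4 amount to sweeping candidate thresholds and integrating the two estimated densities. A minimal numerical sketch follows; the grid-based Riemann-sum integration and all names are my assumptions, not the patent's implementation.

```python
import numpy as np

def optimal_threshold(f0, f1, grid):
    """Sweep thresholds kappa_j over a dense, evenly spaced, sorted grid and
    return the one minimizing P_e0,j + P_e1,j, where
    P_e0,j = integral of f0 above kappa_j (false alarm rate) and
    P_e1,j = integral of f1 below kappa_j (missed detection rate)."""
    grid = np.asarray(grid, dtype=float)
    d0 = np.asarray(f0(grid), dtype=float)
    d1 = np.asarray(f1(grid), dtype=float)
    dx = grid[1] - grid[0]
    pe0 = d0[::-1].cumsum()[::-1] * dx   # right-tail mass above each grid point
    pe1 = d1.cumsum() * dx               # left-tail mass below each grid point
    return grid[np.argmin(pe0 + pe1)]
```

For two well-separated unit-variance Gaussian densities centered at 0 and 10, the returned threshold lies near their crossing point at 5, where the derivative of the summed error, f1(κ) − f0(κ), changes sign.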
In step 6:

$$\delta_t = \frac{1}{l}\sum_{i=0}^{l-1}\left(|H_r(t-i)| - \frac{1}{l}\sum_{i=1}^{l}|H_r(t-i)|\right)^2$$

where |H_r(t−i)| is the amplitude of the CSI data collected in real time at time t−i.
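The buffered start/end detection of steps 5 through 17 can be sketched as a simple state machine over the stream of window variances δ_t. This is a hypothetical illustration; the function and variable names are not from the patent.

```python
def segment_behavior(variances, kappa, S):
    """Return (t0, t1): start and end indices of the detected behavior,
    or None if no sufficiently long behavior is found. The buffer ("cache")
    of size S debounces threshold crossings, so isolated noise spikes
    shorter than the buffer never trigger a detection."""
    cache, t0 = 0, None
    # Phase 1 (steps 5-10): find the start time t0.
    for t, v in enumerate(variances):
        if v >= kappa:
            cache += 1            # step 8: another above-threshold sample
            if cache > S:         # step 9: buffer full -> behavior confirmed
                t0 = t            # step 10: record start time
                break
        else:
            cache = 0             # back to step 5: reset on a quiet sample
    if t0 is None:
        return None
    # Phase 2 (steps 11-16): find the end time t1.
    cache = 0                     # step 11: clear the buffer
    for t in range(t0 + 1, len(variances)):
        if variances[t] < kappa:
            cache += 1            # step 14
            if cache > S:         # step 15: sustained quiet -> behavior over
                return (t0, t)    # step 16: record end time t1
        else:
            cache = 0             # back to step 11 on an active sample
    return (t0, len(variances) - 1)
```

With S = 3, for example, spikes shorter than the buffer never confirm a start, which is exactly the debouncing role the buffer plays in the method.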
Beneficial effects of the present invention: the method determines the detection threshold automatically and effectively extracts the behavior data in human activity recognition. By analyzing the characteristics of the CSI signal in the silent and behavior states, the sliding-window variance is extracted as the feature for deciding whether a behavior is present, and kernel density estimation is applied to the sliding-window variances of the silent and behavior states to estimate their probability densities. The optimal detection threshold for behavior extraction is then obtained from the probability density functions of the two states, and finally a time-sequence buffer is used to extract the data corresponding to the behavior phase.
Brief description of the drawings
Fig. 1 is a flow chart of steps 1 to 9 of the present invention;
Fig. 2 is a flow chart of steps 10 to 17 of the present invention;
Fig. 3 is a schematic diagram of the first real test environment of the present invention (outdoor open environment);
Fig. 4 is a schematic diagram of the second real test environment of the present invention (indoor multipath environment);
Fig. 5 shows the kernel density estimation results of the signal variance in the silent and behavior states for the first environment (outdoor open environment);
Fig. 6 shows the kernel density estimation results of the signal variance in the silent and behavior states for the second environment (indoor multipath environment);
Fig. 7 shows the false alarm rate, missed detection rate, and F-score of each behavior computed with the optimal detection threshold in the first environment (outdoor open environment);
Fig. 8 shows the false alarm rate, missed detection rate, and F-score of each behavior computed with the optimal detection threshold in the second environment (indoor multipath environment).
Detailed description of the embodiments
The present invention is described in detail below with reference to the accompanying drawings.
As shown in Fig. 1 and Fig. 2, the behavior extraction method based on the optimal detection threshold comprises the following steps:

Step 1: Extract the variance features x_t from the CSI signal stream of the training data using a sliding window. The variance feature vector in the silent state is

$$x_0 = [x_{0,1}, x_{0,2}, \ldots, x_{0,n_0}]$$

and the variance feature vector in the behavior state is

$$x_1 = [x_{1,1}, x_{1,2}, \ldots, x_{1,n_1}]$$

where

$$x_{0,t} = \frac{1}{l}\sum_{i=0}^{l-1}\left(|H_0(t+i)| - \frac{1}{l}\sum_{i=0}^{l-1}|H_0(t+i)|\right)^2, \quad t = 1, 2, \ldots, n_0$$

$$x_{1,t} = \frac{1}{l}\sum_{i=0}^{l-1}\left(|H_1(t+i)| - \frac{1}{l}\sum_{i=0}^{l-1}|H_1(t+i)|\right)^2, \quad t = 1, 2, \ldots, n_1$$

Here l is the sliding window length, |H_0(t+i)| and |H_1(t+i)| are the amplitudes of the CSI data received at time t+i in the silent state and the behavior state respectively, and n_0 and n_1 are the numbers of CSI signal samples received in the two states.

Step 2: Apply kernel density estimation separately to the elements of x_0 and x_1 to estimate their probability densities, obtaining the distribution functions of the signal variance in the silent and behavior states:

$$\hat{f}_0(x) = \frac{1}{n_0 h_0}\sum_{t=1}^{n_0} K\left(\frac{x - x_{0,t}}{h_0}\right), \qquad \hat{f}_1(x) = \frac{1}{n_1 h_1}\sum_{t=1}^{n_1} K\left(\frac{x - x_{1,t}}{h_1}\right)$$

where x is the point at which the density is estimated, the bandwidths are

$$h_0 = 3.49 \times \sqrt{\frac{1}{n_0}\sum_{t=1}^{n_0}\left(x_{0,t} - \frac{1}{n_0}\sum_{t=1}^{n_0} x_{0,t}\right)^2} \times n_0^{-1/5}, \qquad h_1 = 3.49 \times \sqrt{\frac{1}{n_1}\sum_{t=1}^{n_1}\left(x_{1,t} - \frac{1}{n_1}\sum_{t=1}^{n_1} x_{1,t}\right)^2} \times n_1^{-1/5}$$

and K is the Gaussian kernel K(u) = (1/√(2π)) e^(−u²/2).

Step 3: Traverse candidate thresholds and compute, for each threshold κ_j, the false alarm rate and the missed detection rate:

$$P_{e0,j} = P(x > \kappa_j) = \int_{\kappa_j}^{+\infty} \hat{f}_0(x)\,dx, \qquad P_{e1,j} = P(x < \kappa_j) = \int_{-\infty}^{\kappa_j} \hat{f}_1(x)\,dx$$

Step 4: Take the κ_j that minimizes P_{e,j} = P_{e0,j} + P_{e1,j} as the optimal detection threshold κ.

Step 5: Set the buffer size S and initialize the buffer: cache = 0.

Step 6: Compute the variance δ_t of the current data using the sliding window:

$$\delta_t = \frac{1}{l}\sum_{i=0}^{l-1}\left(|H_r(t-i)| - \frac{1}{l}\sum_{i=1}^{l}|H_r(t-i)|\right)^2$$

where |H_r(t−i)| is the amplitude of the CSI data collected in real time at time t−i.

Step 7: Compare δ_t with κ; if δ_t ≥ κ, go to step 8; otherwise, go to step 5.

Step 8: Increment the buffer: cache = cache + 1.

Step 9: Check whether the buffer is full; if cache > S, the buffer is full, go to step 10; otherwise, go to step 6.

Step 10: Record the current time as the behavior start time t_0.

Step 11: Clear the buffer: cache = 0.

Step 12: Compute δ_t at the current time using the sliding window.

Step 13: Compare the current δ_t with the optimal detection threshold κ; if δ_t < κ, go to step 14; otherwise, go to step 11.

Step 14: Increment the buffer: cache = cache + 1.

Step 15: Check whether the buffer is full; if cache > S, the buffer is full, go to step 16; otherwise, go to step 12.

Step 16: Record the current time as the behavior end time t_1.

Step 17: The computation ends; return the data of the period t_0 to t_1.
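To illustrate how the pieces fit together, here is a self-contained synthetic experiment loosely mirroring the embodiment's parameters (1000 Hz sampling, l = 200, S = 200). The signal model, the midpoint threshold standing in for the KDE-based optimum of steps 2 to 4, and all names are assumptions for demonstration only, not the patent's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)
l, S = 200, 200  # window length (0.2 s at 1000 Hz) and buffer size, as in the text

def win_var(a, l):
    """Sliding-window variance of the amplitude stream (step 1 / step 6)."""
    return np.array([np.mean((a[t:t + l] - a[t:t + l].mean()) ** 2)
                     for t in range(len(a) - l + 1)])

# Synthetic CSI amplitudes: 2 s silence, 2 s "behavior" with larger
# fluctuations, then 2 s silence again.
silent = 1.0 + 0.01 * rng.standard_normal(2000)
active = 1.0 + 0.20 * rng.standard_normal(2000)
stream = np.concatenate([silent, active, 1.0 + 0.01 * rng.standard_normal(2000)])

# Training features from known silent / behavior recordings.
x0, x1 = win_var(silent, l), win_var(active, l)

# Midpoint of the two mean variances stands in for the KDE-based optimal
# threshold of steps 2-4 (a deliberate simplification for this sketch).
kappa = (x0.mean() + x1.mean()) / 2

# Steps 5-17: buffered detection of the behavior start t0 and end t1.
cache, t0, t1 = 0, None, None
for t, v in enumerate(win_var(stream, l)):
    if t0 is None:                          # looking for the start
        cache = cache + 1 if v >= kappa else 0
        if cache > S:
            t0, cache = t, 0                # step 10, then step 11
    else:                                   # looking for the end
        cache = cache + 1 if v < kappa else 0
        if cache > S:
            t1 = t                          # step 16
            break

print(t0, t1)  # start and end indices bracketing the active segment
```

The detected interval lags the true transitions by roughly the buffer length, which is the price of the debouncing: a behavior is only confirmed after S consecutive windows agree.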
The method was tested in two typical environments: an outdoor open environment (Fig. 3), 57.6 m × 51.0 m in size with 10 m between transmitter and receiver, and an indoor multipath environment (Fig. 4), 13.3 m × 7.8 m in size with 7.6 m between transmitter and receiver. Five behaviors that people often perform in daily life were collected: walking, running, sitting down, squatting down, and falling. A behavior database was built in each environment. Taking one environment as an example, the database contains the five behaviors with 30 groups each, 150 groups in total, with varying acquisition durations; to determine the optimal detection threshold, 10 minutes of silent data were also collected. For testing, different volunteers were invited to perform the five behaviors, 100 groups per behavior, 500 test groups in total. In the experiments the data reception rate was 1000 Hz, the sliding window length was l = 200 (0.2 s), and the buffer size was S = 200.
To verify the validity and reliability of the proposed behavior extraction method based on the optimal detection threshold, Fig. 5 and Fig. 6 show the kernel density estimation results of the signal variance in the silent and behavior states in the two environments; the probability density curves of the two states, computed by kernel density estimation, differ markedly.
Fig. 7 and Fig. 8 show the false alarm rate, missed detection rate, and F-score of each behavior computed with the optimal detection threshold in the two environments. Judging the variance features of each behavior against the threshold obtained by the proposed method yields low false alarm and missed detection rates, and the F-score reaches a high level.
Tables 1 and 2 list the extraction success rates of each behavior obtained by the proposed method in the two environments; the results show that almost all behaviors were extracted completely.
Table 1 Extraction accuracy in the outdoor environment
Table 2 Extraction accuracy in the indoor environment

Claims (5)

1. A behavior extraction method based on an optimal detection threshold, characterized in that it comprises the following steps:
Step 1: extract the variance features x_t from the CSI signal stream of the training data using a sliding window, where the variance feature vector in the silent state is denoted x_0 and the variance feature vector in the behavior state is denoted x_1;
Step 2: apply kernel density estimation separately to the elements of x_0 and x_1 to estimate their probability densities, obtaining the distribution functions f̂_0(x) and f̂_1(x) of the signal variance in the silent and behavior states;
Step 3: traverse candidate thresholds and compute, for each threshold κ_j, the false alarm rate P_{e0,j} and the missed detection rate P_{e1,j};
Step 4: take the κ_j that minimizes P_{e,j} = P_{e0,j} + P_{e1,j} as the optimal detection threshold κ;
Step 5: set the buffer size S and initialize the buffer: cache = 0;
Step 6: compute the variance δ_t of the current data using the sliding window;
Step 7: compare δ_t with κ; if δ_t ≥ κ, go to step 8; otherwise, go to step 5;
Step 8: increment the buffer: cache = cache + 1;
Step 9: check whether the buffer is full; if cache > S, the buffer is full, go to step 10; otherwise, go to step 6;
Step 10: record the current time as the behavior start time t_0;
Step 11: clear the buffer: cache = 0;
Step 12: compute δ_t at the current time using the sliding window;
Step 13: compare the current δ_t with the optimal detection threshold κ; if δ_t < κ, go to step 14; otherwise, go to step 11;
Step 14: increment the buffer: cache = cache + 1;
Step 15: check whether the buffer is full; if cache > S, the buffer is full, go to step 16; otherwise, go to step 12;
Step 16: record the current time as the behavior end time t_1;
Step 17: the computation ends; return the data of the period t_0 to t_1.
2. The behavior extraction method based on an optimal detection threshold according to claim 1, characterized in that in step 1:

$$x_0 = [x_{0,1}, x_{0,2}, \ldots, x_{0,n_0}];$$

$$x_1 = [x_{1,1}, x_{1,2}, \ldots, x_{1,n_1}];$$

$$x_{0,t} = \frac{1}{l}\sum_{i=0}^{l-1}\left(|H_0(t+i)| - \frac{1}{l}\sum_{i=0}^{l-1}|H_0(t+i)|\right)^2, \quad t = 1, 2, \ldots, n_0;$$

$$x_{1,t} = \frac{1}{l}\sum_{i=0}^{l-1}\left(|H_1(t+i)| - \frac{1}{l}\sum_{i=0}^{l-1}|H_1(t+i)|\right)^2, \quad t = 1, 2, \ldots, n_1;$$

where l is the sliding window length, |H_0(t+i)| and |H_1(t+i)| are the amplitudes of the CSI data received at time t+i in the silent state and the behavior state respectively, and n_0 and n_1 are the numbers of CSI signal samples received in the silent state and the behavior state respectively.
3. The behavior extraction method based on an optimal detection threshold according to claim 2, characterized in that in step 2:

$$\hat{f}_0(x) = \frac{1}{n_0 h_0}\sum_{t=1}^{n_0} K\left(\frac{x - x_{0,t}}{h_0}\right);$$

$$\hat{f}_1(x) = \frac{1}{n_1 h_1}\sum_{t=1}^{n_1} K\left(\frac{x - x_{1,t}}{h_1}\right);$$

$$h_0 = 3.49 \times \sqrt{\frac{1}{n_0}\sum_{t=1}^{n_0}\left(x_{0,t} - \frac{1}{n_0}\sum_{t=1}^{n_0} x_{0,t}\right)^2} \times n_0^{-1/5};$$

$$h_1 = 3.49 \times \sqrt{\frac{1}{n_1}\sum_{t=1}^{n_1}\left(x_{1,t} - \frac{1}{n_1}\sum_{t=1}^{n_1} x_{1,t}\right)^2} \times n_1^{-1/5};$$

$$K\left(\frac{x - x_{0,t}}{h_0}\right) = \frac{1}{\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x - x_{0,t}}{h_0}\right)^2};$$

$$K\left(\frac{x - x_{1,t}}{h_1}\right) = \frac{1}{\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x - x_{1,t}}{h_1}\right)^2};$$

where x is the point at which the density is estimated.
4. The behavior extraction method based on an optimal detection threshold according to claim 3, characterized in that in step 3:

$$P_{e0,j} = P(x > \kappa_j) = \int_{\kappa_j}^{+\infty} \hat{f}_0(x)\,dx;$$

$$P_{e1,j} = P(x < \kappa_j) = \int_{-\infty}^{\kappa_j} \hat{f}_1(x)\,dx;$$

where κ_j is a candidate threshold.
5. The behavior extraction method based on an optimal detection threshold according to claim 4, characterized in that in step 6:

$$\delta_t = \frac{1}{l}\sum_{i=0}^{l-1}\left(|H_r(t-i)| - \frac{1}{l}\sum_{i=1}^{l}|H_r(t-i)|\right)^2;$$

where |H_r(t−i)| is the amplitude of the CSI data collected in real time at time t−i.
CN201710282897.5A 2017-04-26 2017-04-26 Behavior extraction method based on optimal detection threshold Active CN107103302B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710282897.5A CN107103302B (en) 2017-04-26 2017-04-26 Behavior extraction method based on optimal detection threshold

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710282897.5A CN107103302B (en) 2017-04-26 2017-04-26 Behavior extraction method based on optimal detection threshold

Publications (2)

Publication Number Publication Date
CN107103302A true CN107103302A (en) 2017-08-29
CN107103302B CN107103302B (en) 2020-04-17

Family

ID=59656410

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710282897.5A Active CN107103302B (en) 2017-04-26 2017-04-26 Behavior extraction method based on optimal detection threshold

Country Status (1)

Country Link
CN (1) CN107103302B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111683102A (en) * 2020-06-17 2020-09-18 绿盟科技集团股份有限公司 FTP behavior data processing method, and method and device for identifying abnormal FTP behavior

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101984364A (en) * 2010-10-15 2011-03-09 北京航空航天大学 GPS weak signal capturing method based on sequential probability ratio
CN102129692A (en) * 2011-03-31 2011-07-20 中国民用航空总局第二研究所 Method and system for detecting motion target in double threshold scene
CN104955149A (en) * 2015-06-10 2015-09-30 重庆邮电大学 Indoor WLAN (wireless local area network) passive intrusion detection and positioning method based on fuzzy rule updating

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101984364A (en) * 2010-10-15 2011-03-09 北京航空航天大学 GPS weak signal capturing method based on sequential probability ratio
CN102129692A (en) * 2011-03-31 2011-07-20 中国民用航空总局第二研究所 Method and system for detecting motion target in double threshold scene
CN104955149A (en) * 2015-06-10 2015-09-30 重庆邮电大学 Indoor WLAN (wireless local area network) passive intrusion detection and positioning method based on fuzzy rule updating

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZENGSHAN TIAN: "A Highly-accurate Device-free Passive Motion Detection System Using Cellular Network", 《IEEE WIRELESS COMMUNICATIONS AND NETWORKING CONFERENCE (WCNC 2016)》 *
YU GU: "Research on passive human activity recognition based on WiFi background noise" (基于WIFI背景噪音的被动式人体行为识别研究), Journal of University of Science and Technology of China (中国科学技术大学学报) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111683102A (en) * 2020-06-17 2020-09-18 绿盟科技集团股份有限公司 FTP behavior data processing method, and method and device for identifying abnormal FTP behavior
CN111683102B (en) * 2020-06-17 2022-12-06 绿盟科技集团股份有限公司 FTP behavior data processing method, and method and device for identifying abnormal FTP behavior

Also Published As

Publication number Publication date
CN107103302B (en) 2020-04-17

Similar Documents

Publication Publication Date Title
WO2020052319A1 (en) Target tracking method, apparatus, medium, and device
CN107289949B (en) Indoor guidance identification device and method based on face identification technology
CN111161320B (en) Target tracking method, target tracking device and computer readable medium
CN102831439B (en) Gesture tracking method and system
CN105825524B (en) Method for tracking target and device
EP3633615A1 (en) Deep learning network and average drift-based automatic vessel tracking method and system
CN110113116B (en) Human behavior identification method based on WIFI channel information
CN105590099B (en) A kind of more people&#39;s Activity recognition methods based on improvement convolutional neural networks
CN103105924B (en) Man-machine interaction method and device
WO2017177903A1 (en) Online verification method and system for real-time gesture detection
CN105980963A (en) System and method for controlling playback of media using gestures
CN104821010A (en) Binocular-vision-based real-time extraction method and system for three-dimensional hand information
CN112560723A (en) Fall detection method and system based on form recognition and speed estimation
CN112303848A (en) Air conditioner regulation and control method, device and system
CN113052127A (en) Behavior detection method, behavior detection system, computer equipment and machine readable medium
CN114821753B (en) Eye movement interaction system based on visual image information
Avola et al. Machine learning for video event recognition
CN112995757B (en) Video clipping method and device
CN107103302A (en) Behavior extracting method based on optimum detection thresholding
WO2024012367A1 (en) Visual-target tracking method and apparatus, and device and storage medium
CN111950500A (en) Real-time pedestrian detection method based on improved YOLOv3-tiny in factory environment
CN112541403A (en) Indoor personnel falling detection method utilizing infrared camera
CN116701954A (en) Infrastructure state identification method based on IMU data
CN116704547A (en) Human body posture detection method based on GCN-LSTM under privacy protection
Zheng Gesture recognition real-time control system based on YOLOV4

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant