CN107103302B - Behavior extraction method based on optimal detection threshold - Google Patents

Behavior extraction method based on optimal detection threshold

Info

Publication number
CN107103302B
CN107103302B · CN201710282897A
Authority
CN
China
Prior art keywords
cache
behavior
detection threshold
entering
moment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710282897.5A
Other languages
Chinese (zh)
Other versions
CN107103302A (en)
Inventor
田增山
王向勇
何艾琳
郭可可
周祥东
高罗莹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing University of Post and Telecommunications
Original Assignee
Chongqing University of Post and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing University of Post and Telecommunications filed Critical Chongqing University of Post and Telecommunications
Priority to CN201710282897.5A priority Critical patent/CN107103302B/en
Publication of CN107103302A publication Critical patent/CN107103302A/en
Application granted granted Critical
Publication of CN107103302B publication Critical patent/CN107103302B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 - Movements or behaviour, e.g. gesture recognition
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00 - Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/08 - Feature extraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Alarm Systems (AREA)
  • Other Investigation Or Analysis Of Materials By Electrical Means (AREA)

Abstract

The invention discloses a behavior extraction method based on an optimal detection threshold. By analyzing the difference in signal characteristics of CSI signals between the target-present and target-absent states, the optimal detection threshold is obtained with a kernel density estimation method, and the behavior data of the detection target are extracted by combining the obtained optimal detection threshold with a time-sequence cache. The method can be applied in human behavior recognition to extract the data of the behavior execution stage, and it solves the inaccuracy and inconvenience of setting the detection threshold by manual observation.

Description

Behavior extraction method based on optimal detection threshold
Technical Field
The invention relates to a behavior extraction technology, in particular to a behavior extraction method based on an optimal detection threshold.
Background
With the rapid development of science and technology and the popularization of computer equipment in the twenty-first century, human-computer interaction (HCI) has become a major focus of research in many countries. Human-computer interaction refers to the process of information exchange in which a user completes a specified task with computer equipment through a pre-specified interaction mode, such as computer hardware, behavior and gesture, or voice. As an important research field, human behavior recognition plays a very important role in the development of human-computer interaction technology and is of great significance for improving human production and life. Traditional human behavior recognition technologies generally require special equipment, for example human behavior recognition based on computer vision, on radar, or on wearable sensor devices. Human behavior recognition based on computer vision obtains video or photo information in a monitored area through shooting equipment such as cameras, detects and tracks key parts of a monitored target (mainly a person), and parameterizes the postures of these parts, so as to judge the current behavior of the human body. However, this technology can only operate in an environment with sufficient light, and its recognition accuracy is low at night and in dim places. In addition, the camera must be within line-of-sight range to work, and video monitoring raises personal privacy concerns, which limits its application to some extent.
Radar-based human behavior recognition collects the current behavior posture of a human body with 60 GHz radar equipment; its precision is high, but its operating range is only 10 cm, and the equipment is extremely expensive, so it cannot be popularized. Human behavior recognition based on wearable sensor devices obtains the user's current behavior through sensors worn on the body, which is inconvenient for the user. The devices required by these technologies are expensive, which greatly hinders the popularization of human behavior recognition. With the rapid development of technologies such as intelligent manufacturing, wearable devices, assisted driving, smart homes, virtual reality and motion-sensing games, touch-based interaction is increasingly limited, and the development of contactless interaction modes becomes more and more important. Therefore, eliminating dedicated hardware has become a key point of research on human behavior recognition. Meanwhile, the continuous popularization of wireless local area networks provides a great development opportunity for WLAN-based human behavior recognition.
In human behavior recognition based on WLAN channel state information (CSI), the data of the behavior execution stage must be extracted first. In the existing behavior extraction approach, the differences of the CSI signal between the execution stage of a behavior and the silent state are observed directly, and a threshold is set manually to detect the start and end points of the behavior, so as to extract the data corresponding to the behavior stage. After the environment changes or a new behavior to be recognized is added, a researcher must observe the signal again and set a new threshold; moreover, this method alone is easily disturbed by accidental errors, so that behavior extraction is erroneous or incomplete.
Disclosure of Invention
The invention aims to provide a behavior extraction method based on an optimal detection threshold, which can automatically obtain the detection threshold and effectively extract behavior data in human behavior recognition.
The behavior extraction method based on the optimal detection threshold comprises the following steps:
Step one: extract the variance feature x_t from the CSI signal stream of the training data using a sliding-window technique, wherein the variance feature matrix in the silent state is denoted x_0 and the variance feature matrix in the behavior state is denoted x_1;
Step two: use the kernel density estimation method to perform probability density estimation on the elements of x_0 and x_1 respectively, obtaining the probability density functions f̂_0(x) and f̂_1(x) of the signal variance in the silent state and the behavior state;
Step three: traverse the candidate thresholds and obtain, for each threshold κ_j, the false alarm rate P_{e0,j} and the missed alarm rate P_{e1,j};
Step four: take the κ_j that minimizes P_{e,j} = P_{e0,j} + P_{e1,j} as the optimal detection threshold κ;
Step five: set the cache band size S and initialize the cache band, cache = 0;
Step six: calculate the data variance δ_t at the current moment using the sliding-window technique;
Step seven: compare δ_t with κ; if δ_t ≥ κ, go to step eight; otherwise, return to step five;
Step eight: increment the cache, cache = cache + 1;
Step nine: check whether the cache band is full; if cache > S, the cache band is full, go to step ten; otherwise, return to step six;
Step ten: record the current moment as the action start moment t_0;
Step eleven: empty the cache band, cache = 0;
Step twelve: calculate δ_t at the current moment using the sliding-window technique;
Step thirteen: compare δ_t at the current moment with the optimal detection threshold κ; if δ_t < κ, go to step fourteen; otherwise, return to step eleven;
Step fourteen: increment the cache, cache = cache + 1;
Step fifteen: check whether the cache band is full; if cache > S, the cache band is full, go to step sixteen; otherwise, return to step twelve;
Step sixteen: record the current moment as the action end moment t_1;
Step seventeen: end, and return the data of the t_0~t_1 time period.
In the first step:

x_0 = [x_{0,1}, x_{0,2}, …, x_{0,n_0−l}]
x_1 = [x_{1,1}, x_{1,2}, …, x_{1,n_1−l}]
x_{0,t} = (1/l) Σ_{i=1}^{l} ( |H_0(t+i)| − (1/l) Σ_{k=1}^{l} |H_0(t+k)| )²
x_{1,t} = (1/l) Σ_{i=1}^{l} ( |H_1(t+i)| − (1/l) Σ_{k=1}^{l} |H_1(t+k)| )²

wherein l represents the length of the sliding window; |H_0(t+i)| and |H_1(t+i)| represent the amplitude of the CSI data received at moment t+i in the silent state and the behavior state respectively; and n_0 and n_1 represent the amount of CSI signal data received in the silent state and the behavior state respectively.
In the second step:

f̂_0(x) = (1/((n_0−l)·h_0)) Σ_{t=1}^{n_0−l} K( (x − x_{0,t}) / h_0 )
f̂_1(x) = (1/((n_1−l)·h_1)) Σ_{t=1}^{n_1−l} K( (x − x_{1,t}) / h_1 )

wherein K(·) is the kernel function (commonly the Gaussian kernel K(u) = (1/√(2π)) e^{−u²/2}), h_0 and h_1 are the kernel bandwidths, and x represents the feature point to be estimated.
In the third step:

P_{e0,j} = ∫_{κ_j}^{+∞} f̂_0(x) dx
P_{e1,j} = ∫_{−∞}^{κ_j} f̂_1(x) dx

wherein κ_j represents a candidate threshold.
In the sixth step:

δ_t = (1/l) Σ_{i=1}^{l} ( |H_r(t−i)| − (1/l) Σ_{k=1}^{l} |H_r(t−k)| )²

wherein |H_r(t−i)| represents the amplitude of the CSI data acquired in real time at moment t−i.
The invention has the following beneficial effects: the method automatically obtains the detection threshold and effectively extracts behavior data in human behavior recognition. By analyzing the characteristics of the CSI signal in the silent state and the behavior state, the sliding-window variance is extracted as the feature for distinguishing whether a behavior is present, and probability density estimation is performed on the sliding-window variance of the two states with a kernel density estimation method. The probability density functions of the two states are then used to solve for the optimal detection threshold for behavior extraction. Finally, the data corresponding to the behavior stage are extracted in combination with a time-sequence caching technique.
Drawings
FIG. 1 is a flow chart of steps one through nine of the present invention;
FIG. 2 is a flow chart of steps ten through seventeen of the present invention;
FIG. 3 is a schematic diagram of the first real experimental environment of the present invention (open outdoor environment);
FIG. 4 is a schematic diagram of the second real experimental environment of the present invention (indoor multipath environment);
FIG. 5 shows the kernel density estimation results of the signal variance in the silent state and the behavior state (open outdoor environment);
FIG. 6 shows the kernel density estimation results of the signal variance in the silent state and the behavior state (indoor multipath environment);
FIG. 7 shows the false alarm rate, missed alarm rate and F-score of each behavior calculated with the optimal detection threshold (open outdoor environment);
FIG. 8 shows the false alarm rate, missed alarm rate and F-score of each behavior calculated with the optimal detection threshold (indoor multipath environment).
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings.
The behavior extraction method based on the optimal detection threshold as shown in fig. 1 and fig. 2 includes the following steps:
Step one: extract the variance feature x_t from the CSI signal stream of the training data using a sliding-window technique, wherein the variance feature matrix in the silent state is expressed as

x_0 = [x_{0,1}, x_{0,2}, …, x_{0,n_0−l}]

and the variance feature matrix in the behavior state is expressed as

x_1 = [x_{1,1}, x_{1,2}, …, x_{1,n_1−l}]

with

x_{0,t} = (1/l) Σ_{i=1}^{l} ( |H_0(t+i)| − (1/l) Σ_{k=1}^{l} |H_0(t+k)| )²
x_{1,t} = (1/l) Σ_{i=1}^{l} ( |H_1(t+i)| − (1/l) Σ_{k=1}^{l} |H_1(t+k)| )²

wherein l represents the length of the sliding window; |H_0(t+i)| and |H_1(t+i)| represent the amplitude of the CSI data received at moment t+i in the silent state and the behavior state respectively; and n_0 and n_1 represent the amount of CSI signal data received in the silent state and the behavior state respectively.
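The sliding-window variance feature of step one can be sketched in Python as follows (the function name and the use of NumPy are illustrative choices of this sketch, not part of the patent):

```python
import numpy as np

def sliding_window_variance(amplitude, l=200):
    """Variance of the CSI amplitude |H| over each length-l sliding window.

    amplitude : 1-D sequence of |H(t)| values
    returns   : array of n - l + 1 variance features x_t
    """
    a = np.asarray(amplitude, dtype=float)
    if len(a) < l:
        raise ValueError("signal shorter than window length l")
    # Per-window population variance (mean squared deviation from the window mean).
    return np.array([np.var(a[t:t + l]) for t in range(len(a) - l + 1)])
```

With l = 200 samples at the 1000 Hz receiving frequency used in the experiments, each window covers 0.2 s of CSI amplitude data.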
Step two: use the kernel density estimation method to perform probability density estimation on the elements of x_0 and x_1 respectively, obtaining the probability density functions of the signal variance in the silent state and the behavior state:

f̂_0(x) = (1/((n_0−l)·h_0)) Σ_{t=1}^{n_0−l} K( (x − x_{0,t}) / h_0 )
f̂_1(x) = (1/((n_1−l)·h_1)) Σ_{t=1}^{n_1−l} K( (x − x_{1,t}) / h_1 )

wherein K(·) is the kernel function (commonly the Gaussian kernel K(u) = (1/√(2π)) e^{−u²/2}), h_0 and h_1 are the kernel bandwidths, and x represents the feature point to be estimated.
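Step two can be sketched with a hand-rolled Gaussian kernel density estimate. The Gaussian kernel and Silverman's rule-of-thumb bandwidth are common defaults assumed here, since the patent text does not reproduce its exact kernel and bandwidth choices:

```python
import numpy as np

def kde_pdf(samples, grid, h=None):
    """Gaussian-kernel density estimate evaluated at each point of `grid`.

    h defaults to Silverman's rule of thumb: h = 1.06 * sigma * n^(-1/5).
    The kernel is K(u) = exp(-u^2/2) / sqrt(2*pi).
    """
    s = np.asarray(samples, dtype=float)
    g = np.asarray(grid, dtype=float)
    n = len(s)
    if h is None:
        h = 1.06 * s.std(ddof=1) * n ** (-0.2)
    # One row per grid point, one column per sample; sum kernels over samples.
    u = (g[:, None] - s[None, :]) / h
    return np.exp(-0.5 * u ** 2).sum(axis=1) / (n * h * np.sqrt(2.0 * np.pi))
```

Evaluating f̂_0 on the silent-state variances and f̂_1 on the behavior-state variances over a common grid yields the two density curves compared in figs. 5 and 6.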
Step three: traverse the candidate thresholds and obtain, for each threshold κ_j, the false alarm rate P_{e0,j} and the missed alarm rate P_{e1,j}:

P_{e0,j} = ∫_{κ_j}^{+∞} f̂_0(x) dx
P_{e1,j} = ∫_{−∞}^{κ_j} f̂_1(x) dx

wherein κ_j represents a candidate threshold.
Step four: take the κ_j that minimizes P_{e,j} = P_{e0,j} + P_{e1,j} as the optimal detection threshold κ.
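Steps three and four (traversing candidate thresholds κ_j and keeping the one that minimizes P_{e,j} = P_{e0,j} + P_{e1,j}) can be sketched as follows, with the integrals discretized on a shared grid; function and variable names are ours:

```python
import numpy as np

def optimal_threshold(pdf_silent, pdf_behavior, grid):
    """Pick the kappa that minimizes false-alarm rate + missed-alarm rate.

    pdf_silent, pdf_behavior : densities f0, f1 evaluated on `grid`
    P_e0(kappa) = mass of f0 above kappa (silence mistaken for behavior)
    P_e1(kappa) = mass of f1 below kappa (behavior missed)
    """
    dx = grid[1] - grid[0]
    # Tail mass of f0 from grid[j] upward, via a reversed cumulative sum.
    pe0 = np.cumsum(pdf_silent[::-1])[::-1] * dx
    # Mass of f1 strictly below grid[j].
    pe1 = np.concatenate(([0.0], np.cumsum(pdf_behavior)[:-1])) * dx
    j = int(np.argmin(pe0 + pe1))
    return grid[j], pe0[j], pe1[j]
```

When the two densities are well separated, as in figs. 5 and 6, the returned κ sits between the silent-state and behavior-state modes and both error rates are small.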
Step five: set the cache band size S and initialize the cache band, cache = 0.
Step six: calculate the data variance δ_t at the current moment using the sliding-window technique:

δ_t = (1/l) Σ_{i=1}^{l} ( |H_r(t−i)| − (1/l) Σ_{k=1}^{l} |H_r(t−k)| )²

wherein |H_r(t−i)| represents the amplitude of the CSI data acquired in real time at moment t−i.
Step seven: compare δ_t with κ; if δ_t ≥ κ, go to step eight; otherwise, return to step five.
Step eight: increment the cache, cache = cache + 1.
Step nine: check whether the cache band is full; if cache > S, the cache band is full, go to step ten; otherwise, return to step six.
Step ten: record the current moment as the action start moment t_0.
Step eleven: empty the cache band, cache = 0.
Step twelve: calculate δ_t at the current moment using the sliding-window technique.
Step thirteen: compare δ_t at the current moment with the optimal detection threshold κ; if δ_t < κ, go to step fourteen; otherwise, return to step eleven.
Step fourteen: increment the cache, cache = cache + 1.
Step fifteen: check whether the cache band is full; if cache > S, the cache band is full, go to step sixteen; otherwise, return to step twelve.
Step sixteen: record the current moment as the action end moment t_1.
Step seventeen: end, and return the data of the t_0~t_1 time period.
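Steps five through seventeen amount to an online detector in which the cache band debounces threshold crossings. A compact sketch of that loop, run here over a precomputed variance sequence (the function name and the offline framing are our own illustrative choices):

```python
def extract_behavior(variances, kappa, S=200):
    """Return (t0, t1): indices where the cache band first fills with S
    consecutive above-threshold (start) / below-threshold (end) variances.
    Returns None if no complete behavior segment is found.
    """
    cache, t0 = 0, None
    for t, d in enumerate(variances):          # steps five to nine
        if d >= kappa:
            cache += 1
            if cache > S:                      # cache band full
                t0 = t                         # step ten: action start moment
                break
        else:
            cache = 0                          # step five: re-initialize cache
    if t0 is None:
        return None
    cache = 0                                  # step eleven: empty cache band
    for t in range(t0 + 1, len(variances)):    # steps twelve to fifteen
        if variances[t] < kappa:
            cache += 1
            if cache > S:
                return (t0, t)                 # step sixteen: action end moment
        else:
            cache = 0                          # step eleven again
    return None
```

Note that t_0 and t_1 lag the true onset and offset by roughly S samples, since each moment is recorded only once the cache band fills; with S = 200 at 1000 Hz that lag is about 0.2 s.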
The testing environments of the present invention include two typical environments. The outdoor open environment shown in fig. 3 measures 57.6 m × 51.0 m, with a distance of 10 m between receiver and transmitter. The indoor multipath environment shown in fig. 4 measures 13.3 m × 7.8 m, with a distance of 7.6 m between receiver and transmitter. Five behaviors that people frequently perform in daily life were collected in the experiment: walking, running, sitting, squatting and falling, and behavior databases were established in the two environments respectively. Taking one environment as an example, the database contains the 5 tested behaviors, with 30 groups per behavior, 150 groups in total; the acquisition time of each group differs, and 10 minutes of silent data were acquired in order to obtain the optimal detection threshold. For the test data, different volunteers were invited to perform these 5 behaviors, with 100 groups collected per action, for a total of 500 test behaviors. In the experiment, the data receiving frequency is 1000 Hz, the sliding window length l is set to 200 (0.2 seconds), and the cache band size S is set to 200.
To verify the effectiveness and reliability of the behavior extraction method based on the optimal detection threshold, fig. 5 and fig. 6 give the kernel density estimation results of the signal variance in the silent state and the behavior state in the two environments respectively; the results show that the probability density curves of the two states calculated by the kernel density estimation method differ markedly.
As shown in fig. 7 and fig. 8, the false alarm rate, missed alarm rate and F-score of each behavior calculated with the optimal detection threshold in the two environments are given respectively. The results show that when the variance features of each behavior are judged with the optimal detection threshold calculated by the method of the present invention, both the false alarm rate and the missed alarm rate are low, and the F-score reaches a high level.
Table 1 and Table 2 show the extraction success rate of each behavior in the two environments as determined by the method of the present invention; the results show that almost all behaviors can be extracted by the method.
TABLE 1 Outdoor extraction accuracy (table rendered as an image in the original publication)
TABLE 2 Indoor extraction accuracy (table rendered as an image in the original publication)

Claims (1)

1. A behavior extraction method based on an optimal detection threshold, characterized by comprising the following steps:
Step one: extract the variance feature x_t from the CSI signal stream of the training data using a sliding-window technique, wherein the variance feature matrix in the silent state is denoted x_0 and the variance feature matrix in the behavior state is denoted x_1, with:

x_0 = [x_{0,1}, x_{0,2}, …, x_{0,n_0−l}]
x_1 = [x_{1,1}, x_{1,2}, …, x_{1,n_1−l}]
x_{0,t} = (1/l) Σ_{i=1}^{l} ( |H_0(t+i)| − (1/l) Σ_{k=1}^{l} |H_0(t+k)| )²
x_{1,t} = (1/l) Σ_{i=1}^{l} ( |H_1(t+i)| − (1/l) Σ_{k=1}^{l} |H_1(t+k)| )²

wherein l represents the length of the sliding window; |H_0(t+i)| and |H_1(t+i)| represent the amplitude of the CSI data received at moment t+i in the silent state and the behavior state respectively; and n_0 and n_1 represent the amount of CSI signal data received in the silent state and the behavior state respectively;
Step two: use the kernel density estimation method to perform probability density estimation on the elements of x_0 and x_1 respectively, obtaining the probability density functions of the signal variance in the silent state and the behavior state:

f̂_0(x) = (1/((n_0−l)·h_0)) Σ_{t=1}^{n_0−l} K( (x − x_{0,t}) / h_0 )
f̂_1(x) = (1/((n_1−l)·h_1)) Σ_{t=1}^{n_1−l} K( (x − x_{1,t}) / h_1 )

wherein K(·) is the kernel function (commonly the Gaussian kernel K(u) = (1/√(2π)) e^{−u²/2}), h_0 and h_1 are the kernel bandwidths, and x represents the feature point to be estimated;
Step three: traverse the candidate thresholds and obtain, for each threshold κ_j, the false alarm rate P_{e0,j} and the missed alarm rate P_{e1,j}:

P_{e0,j} = ∫_{κ_j}^{+∞} f̂_0(x) dx
P_{e1,j} = ∫_{−∞}^{κ_j} f̂_1(x) dx

wherein κ_j represents a candidate threshold;
Step four: take the κ_j that minimizes P_{e,j} = P_{e0,j} + P_{e1,j} as the optimal detection threshold κ;
Step five: set the cache band size S and initialize the cache band, cache = 0;
Step six: calculate the data variance δ_t at the current moment using the sliding-window technique:

δ_t = (1/l) Σ_{i=1}^{l} ( |H_r(t−i)| − (1/l) Σ_{k=1}^{l} |H_r(t−k)| )²

wherein |H_r(t−i)| represents the amplitude of the CSI data acquired in real time at moment t−i;
Step seven: compare δ_t with κ; if δ_t ≥ κ, go to step eight; otherwise, return to step five;
Step eight: increment the cache, cache = cache + 1;
Step nine: check whether the cache band is full; if cache > S, the cache band is full, go to step ten; otherwise, return to step six;
Step ten: record the current moment as the action start moment t_0;
Step eleven: empty the cache band, cache = 0;
Step twelve: calculate δ_t at the current moment using the sliding-window technique;
Step thirteen: compare δ_t at the current moment with the optimal detection threshold κ; if δ_t < κ, go to step fourteen; otherwise, return to step eleven;
Step fourteen: increment the cache, cache = cache + 1;
Step fifteen: check whether the cache band is full; if cache > S, the cache band is full, go to step sixteen; otherwise, return to step twelve;
Step sixteen: record the current moment as the action end moment t_1;
Step seventeen: end, and return the data of the t_0~t_1 time period.
CN201710282897.5A 2017-04-26 2017-04-26 Behavior extraction method based on optimal detection threshold Active CN107103302B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710282897.5A CN107103302B (en) 2017-04-26 2017-04-26 Behavior extraction method based on optimal detection threshold


Publications (2)

Publication Number Publication Date
CN107103302A CN107103302A (en) 2017-08-29
CN107103302B true CN107103302B (en) 2020-04-17

Family

ID=59656410

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710282897.5A Active CN107103302B (en) 2017-04-26 2017-04-26 Behavior extraction method based on optimal detection threshold

Country Status (1)

Country Link
CN (1) CN107103302B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111683102B (en) * 2020-06-17 2022-12-06 绿盟科技集团股份有限公司 FTP behavior data processing method, and method and device for identifying abnormal FTP behavior

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101984364A (en) * 2010-10-15 2011-03-09 北京航空航天大学 GPS weak signal capturing method based on sequential probability ratio
CN102129692A (en) * 2011-03-31 2011-07-20 中国民用航空总局第二研究所 Method and system for detecting motion target in double threshold scene
CN104955149A (en) * 2015-06-10 2015-09-30 重庆邮电大学 Indoor WLAN (wireless local area network) passive intrusion detection and positioning method based on fuzzy rule updating


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A Highly-accurate Device-free Passive Motion Detection System Using Cellular Network; Zengshan Tian; IEEE Wireless Communications and Networking Conference (WCNC 2016); 2016-09-15; Sections 1-3 *
Research on passive human behavior recognition based on WiFi background noise; Gu Yu; Journal of University of Science and Technology of China; 2015-04; Vol. 15, No. 4; full text *

Also Published As

Publication number Publication date
CN107103302A (en) 2017-08-29

Similar Documents

Publication Publication Date Title
Ling et al. Ultragesture: Fine-grained gesture sensing and recognition
CN111399642B (en) Gesture recognition method and device, mobile terminal and storage medium
CN105242779B (en) A kind of method and mobile intelligent terminal of identification user action
Saengsri et al. TFRS: Thai finger-spelling sign language recognition system
CN102640085A (en) System and method for recognizing gestures
Ko et al. Online context recognition in multisensor systems using dynamic time warping
CN113609976B (en) Direction-sensitive multi-gesture recognition system and method based on WiFi equipment
CN104394588B (en) Indoor orientation method based on Wi Fi fingerprints and Multidimensional Scaling
CN112001347B (en) Action recognition method based on human skeleton morphology and detection target
Alaoui et al. Fall detection for elderly people using the variation of key points of human skeleton
Xu et al. Attention-based gait recognition and walking direction estimation in wi-fi networks
Zhang et al. WiFiMap+: high-level indoor semantic inference with WiFi human activity and environment
CN109805936B (en) Human body tumbling detection system based on ground vibration signal
CN108182418A (en) A kind of thump recognition methods based on multidimensional acoustic characteristic
CN114038012A (en) Fall detection method and system based on millimeter wave radar and machine learning
CN109657572A (en) Goal behavior recognition methods after a kind of wall based on Wi-Fi
CN107103302B (en) Behavior extraction method based on optimal detection threshold
CN111262637A (en) Human body behavior identification method based on Wi-Fi channel state information CSI
CN111475030A (en) Micro-gesture recognition method using near-infrared sensor
CN112069483A (en) User identification and authentication method of intelligent wearable device
CN106792505A (en) A kind of target tracking system and method
CN110674694B (en) Activity signal separation method based on commercial WiFi
CN109308133A (en) Intelligent interaction projects interaction technique
Chen et al. Long term hand tracking with proposal selection
Melnyk et al. Towards computer assisted international sign language recognition system: a systematic survey

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant