CN110275161B - Wireless human body posture recognition method applied to intelligent bathroom - Google Patents

Wireless human body posture recognition method applied to intelligent bathroom

Info

Publication number
CN110275161B
CN110275161B
Authority
CN
China
Prior art keywords
data
steps
energy
characteristic
matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910570977.XA
Other languages
Chinese (zh)
Other versions
CN110275161A (en)
Inventor
苏瀚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Taizhou Ruilian Technology Co ltd
Original Assignee
Taizhou Ruilian Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Taizhou Ruilian Technology Co ltd filed Critical Taizhou Ruilian Technology Co ltd
Priority to CN201910570977.XA priority Critical patent/CN110275161B/en
Publication of CN110275161A publication Critical patent/CN110275161A/en
Application granted granted Critical
Publication of CN110275161B publication Critical patent/CN110275161B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Remote Sensing (AREA)
  • General Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a wireless human body posture recognition method applied to an intelligent bathroom, which comprises the following steps: step 1, hardware preprocessing design; step 2, data preprocessing; step 3, feature extraction; step 4, training-model building; and step 5, real-time prediction. The method thus covers five stages: hardware design, data preprocessing, feature extraction, model training, and real-time prediction. On the one hand, the data communication system is simple and flexible to build, with strong universality and expansion capability; on the other hand, the data processing capability is strong, the detection precision is high, and the computational load is relatively small, so the specified actions can be recognized accurately. Extensive testing shows that the accuracy for moving forward and backward reaches 98%, for sitting down and standing up 98%, and for swinging up and down 95%.

Description

Wireless human body posture recognition method applied to intelligent bathroom
Technical Field
The invention relates to the technical field of computers, and in particular to a wireless human body posture recognition method applied to an intelligent bathroom.
Background
Gesture recognition, as a human-computer interaction method, has long been one of the main research subjects in computer science. The technology allows a computer to understand human intent without traditional interaction hardware such as a mouse and keyboard. Traditional gesture recognition systems are mainly based on cameras and image processing algorithms. Although camera-based gesture recognition systems provide reliable recognition rates, they have limitations, the most obvious being their sensitivity to lighting conditions. Furthermore, when processor and battery resources are limited, their high computing and power demands restrict their applications. Moreover, camera-based recognition systems inherently raise privacy concerns when used in public.
Recently, radar-based gesture recognition has attracted wide interest. Compared with traditional methods, it has unique advantages. First, a camera struggles to capture clear images in dim light, whereas radar signals are unaffected and can be used in dark environments. Second, a continuous-wave Doppler radar sensor detects the Doppler effect of moving objects through the time-frequency spread of the signal, which can be realized with a low-cost architecture: the Doppler frequency shift caused by human gestures is limited to a few hertz, so the analog-to-digital converter (ADC) and baseband devices are inexpensive. Radar-based gesture systems therefore have significant advantages in practical applications. However, the microwave sensors currently on the market mainly detect human presence; they have poor universality and expansion capability, limited data processing capability, poor detection precision, and a relatively heavy computational load, and cannot accurately recognize specified actions.
Disclosure of Invention
In view of the above defects in the prior art, the present invention provides a wireless human body posture recognition method applied to an intelligent bathroom, so as to overcome the defects in the prior art.
To achieve this purpose, the invention provides a wireless human body posture recognition method applied to an intelligent bathroom, which comprises the following steps:
step 1, hardware preprocessing design, wherein a sensor receives the returned microwave signal, and the microwave signal is amplified, hardware-filtered, analog-to-digital converted, and then sent to a back-end data processing unit;
step 2, data preprocessing, namely acquiring I-channel and Q-channel data, storing the data in a buffer, and designing an FIR low-pass filter to remove high-frequency components;
step 3, feature extraction, namely extracting the I and Q data within a time window and extracting the original features;
step 4, training-model building, wherein the training data set acquires complete gesture waveforms by real-time sampling, the start and end points are trimmed, the middle segment is kept as the effective waveform, and labels are annotated manually;
and step 5, real-time prediction, wherein prediction uses a sliding window and an SVM classifier is applied again to the SVM prediction results as a secondary decision model.
Further, the step 2 data preprocessing specifically comprises: collecting I-channel and Q-channel data from the signal every 100 ms, storing the data in a buffer, and processing the buffered data every 500 ms; the walking frequency band is 15-20 Hz and the arm-swinging frequency band is 70-80 Hz, so an FIR low-pass filter is designed to remove high-frequency components, with design parameters: cut-off frequency 200 Hz, order 32.
Further, the step 3 feature extraction specifically comprises: extracting the I and Q data within a 500 ms time window and combining them into the following complex form, where s is the target signal, I is the I-channel data, Q is the Q-channel data, and 1i is the imaginary unit:
s = I + 1i·Q
A short-time Fourier transform (STFT) is then applied, the specific calculation formula being:
SP(m, k) = Σ_n s(n)·w(n − m)·e^(−j2πkn/N), where w(n) is the analysis window and N is the FFT length
the feature points obtained after STFT are denoted as sp, and are a matrix with 20 rows and 12 columns, where a frequency band from 0 to 2048 is a positive frequency direction and from 2049 to 4096 is a negative frequency direction, and the steps of extracting the original features are mainly as follows:
1) extracting sp positive frequency from 3 to 20 points and negative frequency from 4076 to 4096 points, and adding adjacent characteristic points by an image characteristic point selection rule;
2) carrying out logarithmic operation on the obtained matrix, wherein the span between matrix values is reduced;
3) and performing matrix expansion, connecting each row of the matrix to the end of the previous row, wherein the dimension of the expanded array is 240, and the original characteristic values are 240.
Further, the step 3 feature extraction also includes extracting energy information: experiments show that the energy values of different gesture motions differ significantly, so the energy is trained as an additional feature point. The energy calculation formula is as follows:
[energy calculation formula, shown only as an image in the original publication]
All feature signals, consisting of the original information and the energy, are spliced into a matrix in which each row represents one sample: the first 240 columns are the original feature values and the 241st column is the energy information. Each row is then normalized, specifically:
1) find the points whose energy value is greater than -1; when the energy is less than -1 it is assumed not to reach the gesture threshold and the point is treated as disturbance;
2) add 1 to all energies and divide by the ceiling of the maximum of these shifted energies over all frequency points, as given by the following formula:
E_norm = (E + 1) / ⌈E_max + 1⌉, where E_max is the maximum energy over all frequency points
further, the establishing of the training model in the step 4 specifically includes: the method comprises the steps that a training data set obtains a complete waveform of gesture actions through real-time sampling, starting and ending points are intercepted, the middle section is used as an effective waveform, labels are labeled manually and respectively comprise double-click, upward-swing, downward-swing, forward-moving, backward-moving, sitting down and standing up, the characteristics of the waveform are extracted through a characteristic extraction mode in the step 3, the characteristic numerical value is between-1 and 1, the characteristics and the labels are used as training samples, a svm classification algorithm is adopted, and values of a penalty factor and a Gaussian kernel parameter under the best model are obtained through cross validation and comparison, namely the best svm model is obtained.
Further, the step 5 real-time prediction specifically uses a sliding window: 100 ms of newly acquired data are combined with 400 ms of buffered data into 500 ms of data, which is classified once, so a decision result is produced every 100 ms. Because consecutive sequences such as swing-down, swing-down may appear, a sequence of length 5 is taken to return the actual result: an SVM classifier is used again, the length-5 sequence and the actual label result are put into this classifier, and a secondary-decision model is trained.
The invention has the beneficial effects that:
the invention relates to a wireless human body posture recognition method applied to an intelligent bathroom, which comprises five steps of hardware design, data preprocessing, feature extraction, model training, real-time prediction and the like, wherein on one hand, a data communication system is simple and flexible in construction structure and strong in universality and expansion capability, on the other hand, the data processing capability is strong, the detection precision is high, and the data calculation amount is relatively small, so that the accurate recognition of the specified action can be effectively realized, through a large number of tests, the forward and backward movement accuracy can reach 98%, the sitting and standing accuracy can reach 98%, and the upward and downward waving accuracy can reach 95% and 95%.
The conception, specific structure, and technical effects of the present invention are further described below with reference to the accompanying drawings, so that the objects, features, and effects of the present invention can be fully understood.
Drawings
FIG. 1 is a flow chart of the present invention.
FIG. 2 is a time-domain diagram for the feature extraction step of the present invention.
FIG. 3 is a frequency-domain diagram for the feature extraction step of the present invention.
FIG. 4 is a time-domain diagram of the waving motion in the feature extraction step of the present invention.
FIG. 5 is a frequency-domain diagram of the waving motion in the feature extraction step of the present invention.
FIG. 6 is a flow chart of the matrix expansion operation in the feature extraction step of the present invention.
Detailed Description
As shown in FIG. 1, a wireless human body posture recognition method applied to an intelligent bathroom comprises the following steps:
step 1, hardware preprocessing design, wherein a sensor receives the returned microwave signal, and the microwave signal is amplified, hardware-filtered, analog-to-digital converted, and then sent to a back-end data processing unit;
step 2, data preprocessing, namely acquiring I-channel and Q-channel data, storing the data in a buffer, and designing an FIR low-pass filter to remove high-frequency components;
step 3, feature extraction, namely extracting the I and Q data within a time window and extracting the original features;
step 4, training-model building, wherein the training data set acquires complete gesture waveforms by real-time sampling, the start and end points are trimmed, the middle segment is kept as the effective waveform, and labels are annotated manually;
and step 5, real-time prediction, wherein prediction uses a sliding window and an SVM classifier is applied again to the SVM prediction results as a secondary decision model.
The step 2 data preprocessing specifically comprises the following steps: I-channel and Q-channel data are collected from the signal every 100 ms and stored in a buffer, and the buffered data are processed every 500 ms; the walking frequency band is 15-20 Hz and the arm-swinging frequency band is 70-80 Hz, so an FIR low-pass filter is designed to remove high-frequency components, with design parameters: cut-off frequency 200 Hz, order 32.
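As an illustration of this filter design, a minimal Python/SciPy sketch is given below. The 200 Hz cut-off and order 32 come from the text above; the sampling rate FS and the use of scipy.signal.firwin are assumptions for the sketch, since the patent does not specify the sampling rate or a particular design routine.

```python
import numpy as np
from scipy.signal import firwin, lfilter

FS = 4096          # sampling rate in Hz -- assumed for this sketch; not stated in the patent
CUTOFF_HZ = 200    # cut-off frequency from the patent
ORDER = 32         # filter order from the patent

# An order-32 FIR filter has ORDER + 1 = 33 taps.
taps = firwin(ORDER + 1, CUTOFF_HZ, fs=FS)

def preprocess(i_buf: np.ndarray, q_buf: np.ndarray):
    """Low-pass filter one 500 ms buffer of I-channel and Q-channel data."""
    return lfilter(taps, 1.0, i_buf), lfilter(taps, 1.0, q_buf)
```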
The step 3 feature extraction specifically comprises the following steps: the buffered 500 ms of I-channel and Q-channel data are combined into complex form by
s = I + 1i·Q
and a short-time Fourier transform (STFT) is then applied, the specific calculation formula being:
SP(m, k) = Σ_n s(n)·w(n − m)·e^(−j2πkn/N), where w(n) is the analysis window and N is the FFT length
the feature points obtained after STFT are denoted as sp, and are a matrix with 20 rows and 12 columns, where a frequency band from 0 to 2048 is a positive frequency direction and from 2049 to 4096 is a negative frequency direction, and the steps of extracting the original features are mainly as follows:
1) extracting sp positive frequency from 3 to 20 points and negative frequency from 4076 to 4096 points, and adding adjacent characteristic points by an image characteristic point selection rule;
2) carrying out logarithmic operation on the obtained matrix, wherein the span between matrix values is reduced as much as possible;
3) and performing matrix expansion, connecting each row of the matrix to the end of the previous row, wherein the dimension of the expanded array is 240, and the original characteristic values are 240.
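The sketch below illustrates steps 1) to 3) in Python. It is not the patent's exact implementation: the STFT window length, hop size, and the mapping of the quoted bin indices onto the sketch's smaller FFT are assumptions, and the final vector is simply padded or truncated to the 240 values described above.

```python
import numpy as np
from scipy.signal import stft

def extract_original_features(i_buf: np.ndarray, q_buf: np.ndarray, fs: float = 4096.0) -> np.ndarray:
    """Return a 240-dimensional original feature vector for one 500 ms window."""
    s = i_buf + 1j * q_buf                      # complex baseband signal s = I + 1i*Q

    # Two-sided STFT; nperseg/noverlap are assumptions chosen for a small time-frequency grid.
    _, _, sp = stft(s, fs=fs, nperseg=256, noverlap=128, return_onesided=False)
    mag = np.abs(sp)

    # Keep a band of positive-frequency bins and a band of negative-frequency bins
    # (the patent quotes bins 3-20 and 4076-4096; the slices below are placeholders
    # scaled to this sketch's FFT size).
    band = np.vstack([mag[3:21, :], mag[-21:, :]])

    # 1) sum adjacent feature points, 2) take the logarithm to reduce the value spread,
    # 3) "matrix expansion": flatten row by row into a fixed-length vector.
    band = band[:, :-1] + band[:, 1:]
    vec = np.log(band + 1e-12).flatten()

    out = np.zeros(240)
    n = min(240, vec.size)
    out[:n] = vec[:n]
    return out
```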
The step 3 feature extraction further comprises extracting energy information: experiments show that the energy values of different gesture motions differ significantly, so the energy is trained as an additional feature point. The energy calculation formula is as follows:
[energy calculation formula, shown only as an image in the original publication]
All feature signals, consisting of the original information and the energy, are spliced into a matrix in which each row represents one sample: the first 240 columns are the original feature values and the 241st column is the energy information. Each row is then normalized, specifically (a code sketch of the energy feature and this normalization follows the formula below):
1) find the points whose energy value is greater than -1; when the energy is less than -1 it is assumed not to reach the gesture threshold and the point is treated as disturbance;
2) add 1 to all energies and divide by the ceiling of the maximum of these shifted energies over all frequency points, as given by the following formula:
E_norm = (E + 1) / ⌈E_max + 1⌉, where E_max is the maximum energy over all frequency points
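A sketch of the energy feature and the normalization rule is shown below. The exact energy formula appears only as an image in the original publication, so the log-power form used here is an assumption; the threshold at -1 and the add-1-then-normalize rule follow the text above, and applying them to the energy column only is one reading of the description.

```python
import numpy as np

def energy_feature(sp: np.ndarray) -> float:
    """Energy of one STFT window; the log-power form is an assumed reading of the patent."""
    return float(np.log10(np.sum(np.abs(sp) ** 2) + 1e-12))

def build_samples(original_feats: np.ndarray, energies: np.ndarray) -> np.ndarray:
    """Stack samples into a matrix: 240 original features plus 1 energy column per row."""
    X = np.hstack([original_feats, energies.reshape(-1, 1)])   # shape (n_samples, 241)

    # 1) keep points whose energy is greater than -1; lower energy is treated as disturbance.
    X = X[X[:, -1] > -1]

    # 2) add 1 to the energies and divide by the ceiling of the maximum shifted energy.
    e = X[:, -1] + 1
    X[:, -1] = e / np.ceil(e.max())
    return X
```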
The step 4 training-model building specifically comprises: the training data set acquires complete gesture waveforms by real-time sampling; the start and end points are trimmed and the middle segment is kept as the effective waveform; labels are annotated manually as double-click, swing up, swing down, move forward, move backward, sit down, and stand up; the waveform features are extracted by the feature extraction method of step 3, with feature values between -1 and 1; the features and labels serve as training samples, an SVM classification algorithm is adopted, and the values of the penalty factor and the Gaussian kernel parameter of the best model are obtained by cross-validation and comparison, yielding the best SVM model.
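A sketch of this model-building step with scikit-learn is given below: the penalty factor C and the Gaussian (RBF) kernel parameter gamma are selected by cross-validated grid search. The label names and the parameter grid are illustrative; the patent does not give the candidate values.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Label names from the patent; y below holds indices into this list.
LABELS = ["double_click", "swing_up", "swing_down",
          "move_forward", "move_backward", "sit_down", "stand_up"]

def train_gesture_svm(X: np.ndarray, y: np.ndarray) -> SVC:
    """Cross-validate C (penalty factor) and gamma (Gaussian kernel parameter) for an RBF SVM."""
    grid = {"C": [0.1, 1, 10, 100], "gamma": [0.001, 0.01, 0.1, 1]}   # illustrative grid
    search = GridSearchCV(SVC(kernel="rbf"), grid, cv=5)
    search.fit(X, y)          # X: (n_samples, 241) feature matrix, y: label indices
    return search.best_estimator_
```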
The step 5 real-time prediction specifically uses a sliding window: 100 ms of newly acquired data are combined with 400 ms of buffered data into 500 ms of data, which is classified once, so a decision result is produced every 100 ms. Because consecutive sequences such as swing-up, swing-down may appear, a sequence of length 5 is taken to return the actual result: an SVM classifier is used again, the length-5 sequence and the actual label result are put into this classifier, and a secondary-decision model is trained.
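The sketch below illustrates the sliding-window prediction and the secondary decision over length-5 label sequences. The class and method names are hypothetical; the 100 ms chunking, the 500 ms window, and the reuse of an SVM as the secondary classifier follow the description above.

```python
from collections import deque
import numpy as np

class RealTimePredictor:
    """Hypothetical wrapper around the two SVMs described in step 5."""

    def __init__(self, primary_svm, secondary_svm, extract_features):
        self.primary = primary_svm        # per-window gesture classifier (step 4)
        self.secondary = secondary_svm    # maps a length-5 label sequence to the final gesture
        self.extract = extract_features   # e.g. the feature pipeline sketched above
        self.iq_chunks = deque(maxlen=5)  # five 100 ms chunks form one sliding 500 ms window
        self.labels = deque(maxlen=5)     # last five per-window decisions

    def push(self, i_chunk: np.ndarray, q_chunk: np.ndarray):
        """Feed one 100 ms chunk of I/Q data; return a final decision once enough data exist."""
        self.iq_chunks.append((i_chunk, q_chunk))
        if len(self.iq_chunks) < 5:
            return None                                   # less than 500 ms buffered so far
        i_win = np.concatenate([c[0] for c in self.iq_chunks])
        q_win = np.concatenate([c[1] for c in self.iq_chunks])
        feat = self.extract(i_win, q_win).reshape(1, -1)
        self.labels.append(int(self.primary.predict(feat)[0]))
        if len(self.labels) < 5:
            return None
        # Secondary decision: classify the sequence of the last five per-window labels.
        return int(self.secondary.predict(np.array(self.labels).reshape(1, -1))[0])
```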
The wireless human body posture recognition method applied to an intelligent bathroom of the invention comprises five steps: hardware design, data preprocessing, feature extraction, model training, and real-time prediction. The specific application in this embodiment shows that, on the one hand, the data communication system is simple and flexible to build, with strong universality and expansion capability, and, on the other hand, the data processing capability is strong, the detection precision is high, and the computational load is relatively small, so the specified actions can be recognized accurately. Extensive testing shows that the accuracy for moving forward and backward reaches 98%, for sitting down and standing up 98%, and for waving 95%.
The foregoing is a detailed description of preferred embodiments of the invention. Many modifications and variations will be apparent to those of ordinary skill in the art in light of the above teachings without undue experimentation. Therefore, technical solutions that can be obtained by those skilled in the art through logical analysis, reasoning, or limited experimentation on the basis of the prior art and in accordance with the concept of the present invention shall fall within the scope of protection defined by the claims.

Claims (5)

1. A wireless human body posture recognition method applied to an intelligent bathroom, characterized by comprising the following steps:
step 1, hardware preprocessing design, wherein a sensor receives the returned microwave signal, and the microwave signal is amplified, hardware-filtered, analog-to-digital converted, and then sent to a back-end data processing unit;
step 2, data preprocessing, namely acquiring I-channel and Q-channel data, storing the data in a buffer, and designing an FIR low-pass filter to remove high-frequency components;
step 3, feature extraction, namely extracting the I and Q data within a time window and extracting the original features;
step 4, training-model building, wherein the training data set acquires complete gesture waveforms by real-time sampling, the start and end points are trimmed, the middle segment is kept as the effective waveform, and labels are annotated manually;
and step 5, real-time prediction, namely predicting in a sliding-window manner and applying an SVM classifier again to the SVM prediction results as a secondary decision model; wherein the step 3 feature extraction specifically comprises: extracting the I and Q data within a 500 ms time window and combining them into the following complex form, where s is the target signal, I is the I-channel data, Q is the Q-channel data, and 1i is the imaginary unit:
s = I + 1i·Q
a short-time Fourier transform (STFT) is then applied, the specific calculation formula being:
SP(m, k) = Σ_n s(n)·w(n − m)·e^(−j2πkn/N), where w(n) is the analysis window and N is the FFT length
the feature points obtained after the STFT are denoted sp, a matrix with 20 rows and 12 columns, where bins 0 to 2048 correspond to the positive frequency direction and bins 2049 to 4096 to the negative frequency direction, and the main steps for extracting the original features are:
1) extracting points 3 to 20 of the positive frequencies and points 4076 to 4096 of the negative frequencies of sp, and summing adjacent feature points according to an image feature-point selection rule;
2) taking the logarithm of the resulting matrix, which reduces the spread between matrix values;
3) performing matrix expansion, appending each row of the matrix to the end of the previous row, wherein the expanded array has dimension 240, giving 240 original feature values.
2. The wireless human body posture recognition method applied to an intelligent bathroom according to claim 1, characterized in that the step 2 data preprocessing specifically comprises: collecting I-channel and Q-channel data from the signal every 100 ms, storing the data in a buffer, and processing the buffered data every 500 ms, wherein the walking frequency band is 15-20 Hz and the arm-swinging frequency band is 70-80 Hz; an FIR low-pass filter is designed to remove high-frequency components, with design parameters: cut-off frequency 200 Hz, order 32.
3. The wireless human body posture recognition method applied to an intelligent bathroom according to claim 1, wherein the step 3 feature extraction further comprises extracting energy information: experiments show that the energy values of different gesture motions differ significantly, so the energy is trained as an additional feature point, and the energy calculation formula is as follows:
[energy calculation formula, shown only as an image in the original publication]
all feature signals, consisting of the original information and the energy, are spliced into a matrix in which each row represents one sample: the first 240 columns are the original feature values and the 241st column is the energy information; each row is then normalized, specifically:
1) finding the points whose energy value is greater than -1, wherein energy less than -1 is assumed not to reach the gesture threshold and the point is treated as disturbance;
2) adding 1 to all energies and normalizing by the ceiling of the maximum of these shifted energies over all frequency points, according to the following formula:
E_norm = (E + 1) / ⌈E_max + 1⌉, where E_max is the maximum energy over all frequency points
4. The wireless human body posture recognition method applied to an intelligent bathroom according to claim 1, wherein the step 4 training-model building specifically comprises: the training data set acquires complete gesture waveforms by real-time sampling; the start and end points are trimmed and the middle segment is kept as the effective waveform; labels are annotated manually as double-click, swing up, swing down, move forward, move backward, sit down, and stand up; the waveform features are extracted by the feature extraction method of step 3, with feature values between -1 and 1; the features and labels serve as training samples, an SVM classification algorithm is adopted, and the values of the penalty factor and the Gaussian kernel parameter of the best model are obtained by cross-validation and comparison, yielding the best SVM model.
5. The method according to claim 1, wherein the step 5 real-time prediction specifically comprises: predicting in a sliding-window manner, combining 100 ms of newly acquired data with 400 ms of buffered data into 500 ms of data and classifying it once, so that a decision result is obtained every 100 ms; taking a sequence of length 5 and returning the actual result; and using an SVM classifier again, putting the length-5 sequence and the actual label result into the classifier, and training a model for secondary decision.
CN201910570977.XA 2019-06-28 2019-06-28 Wireless human body posture recognition method applied to intelligent bathroom Active CN110275161B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910570977.XA CN110275161B (en) 2019-06-28 2019-06-28 Wireless human body posture recognition method applied to intelligent bathroom

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910570977.XA CN110275161B (en) 2019-06-28 2019-06-28 Wireless human body posture recognition method applied to intelligent bathroom

Publications (2)

Publication Number Publication Date
CN110275161A CN110275161A (en) 2019-09-24
CN110275161B true CN110275161B (en) 2021-12-07

Family

ID=67963560

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910570977.XA Active CN110275161B (en) 2019-06-28 2019-06-28 Wireless human body posture recognition method applied to intelligent bathroom

Country Status (1)

Country Link
CN (1) CN110275161B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112180357A (en) * 2020-09-15 2021-01-05 珠海格力电器股份有限公司 Safety protection method and system
CN115884120A (en) * 2022-11-22 2023-03-31 南方医科大学珠江医院 AD patient dynamic posture recognition system based on wireless signal


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108062170A (en) * 2017-12-15 2018-05-22 南京师范大学 Multi-class human posture recognition method based on convolutional neural networks and intelligent terminal
CN108509897A (en) * 2018-03-29 2018-09-07 同济大学 A kind of human posture recognition method and system

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002020287A1 (en) * 2000-09-08 2002-03-14 Automotive Technologies International, Inc. Vehicle wireless sensing and communication system
CN101897640A (en) * 2010-08-10 2010-12-01 北京师范大学 Novel movement imagery electroencephalogram control-based intelligent wheelchair system
KR20120089948A (en) * 2010-12-30 2012-08-16 인제대학교 산학협력단 Real-time gesture recognition using mhi shape information
CN103914149A (en) * 2014-04-01 2014-07-09 复旦大学 Gesture interaction method and gesture interaction system for interactive television
CN104020848A (en) * 2014-05-15 2014-09-03 中航华东光电(上海)有限公司 Static gesture recognizing method
CN105423558A (en) * 2014-09-09 2016-03-23 芜湖美的厨卫电器制造有限公司 Water heater, water heater system and control method for water heater
CN104279766A (en) * 2014-11-07 2015-01-14 苏州市格普斯电器有限公司 Control equipment of water heater
CN105138131A (en) * 2015-09-01 2015-12-09 冯仕昌 General gesture command transmitting and operating device and method
CN106127110A (en) * 2016-06-15 2016-11-16 中国人民解放军第四军医大学 A kind of human body fine granularity motion recognition method based on UWB radar with optimum SVM
CN109214431A (en) * 2018-08-15 2019-01-15 深圳先进技术研究院 Sample training method, classification method, recognition methods, device, medium and system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Human Gait Recognition Based on Motion Analysis; Han Su and Feng-Gang Huang; Proceedings of the Fourth International Conference on Machine Learning and Cybernetics; 2005-08-21; pp. 4464-4468 *
Research on CSI-Based Indoor Intrusion Detection and Behavior Recognition in Wireless Sensing Networks; 周健 (Zhou Jian); China Master's Theses Full-text Database, Information Science and Technology; 2019-02-15; Chapters 3 and 4 *

Also Published As

Publication number Publication date
CN110275161A (en) 2019-09-24

Similar Documents

Publication Publication Date Title
CN110309690B (en) Gesture recognition detection method based on time frequency spectrum and range-Doppler spectrum
Li et al. Towards domain-independent and real-time gesture recognition using mmwave signal
Zhang et al. Dynamic hand gesture classification based on radar micro-Doppler signatures
Lahiani et al. Hand gesture recognition method based on HOG-LBP features for mobile devices
CN110275161B (en) Wireless human body posture recognition method applied to intelligent bathroom
CN111178331B (en) Radar image recognition system, method, apparatus, and computer-readable storage medium
CN111445500B (en) Analysis method, device, equipment and storage medium for experimental living body behaviors
Zhao et al. Continuous human motion recognition using micro-Doppler signatures in the scenario with micro motion interference
WO2023029390A1 (en) Millimeter wave radar-based gesture detection and recognition method
Jin et al. Action recognition using vague division DMMs
CN113064483A (en) Gesture recognition method and related device
Xie et al. Radar target detection using convolutional neutral network in clutter
Magno et al. Fanncortexm: An open source toolkit for deployment of multi-layer neural networks on arm cortex-m family microcontrollers: Performance analysis with stress detection
CN116482680B (en) Body interference identification method, device, system and storage medium
Hayajneh et al. Channel state information based device free wireless sensing for IoT devices employing TinyML
CN117092592A (en) Gesture recognition method based on wavelet analysis and CBAM attention mechanism improvement
CN110309689B (en) Gabor domain gesture recognition detection method based on ultra-wideband radar
CN111680540A (en) Dynamic gesture recognition method and device
CN115754956A (en) Millimeter wave radar gesture recognition method based on envelope data time sequence
Sheng et al. Dyliteradhar: Dynamic lightweight slowfast network for human activity recognition using mmwave radar
Zhou et al. Efficiently user-independent ultrasonic-based gesture recognition algorithm
Amaravati et al. A light-powered smart camera with compressed domain gesture detection
Lei et al. Automatic recognition of basic strokes based on FMCW radar system
Gu et al. Millimeter Wave Radar-based Human Activity Recognition for Healthcare Monitoring Robot
Jian et al. A robust real-time human activity recognition method based on attention-augmented GRU

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant