CN114384999B - User-independent myoelectric gesture recognition system based on self-adaptive learning - Google Patents


Info

Publication number
CN114384999B
CN114384999B (application CN202111376022.4A)
Authority
CN
China
Prior art keywords
samples
sample
user
value
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111376022.4A
Other languages
Chinese (zh)
Other versions
CN114384999A (en)
Inventor
李玉榕
郑楠
张文萱
李吉祥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fuzhou University
Original Assignee
Fuzhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuzhou University
Priority to CN202111376022.4A
Publication of CN114384999A
Application granted
Publication of CN114384999B
Legal status: Active
Anticipated expiration: legal status pending

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/23 Clustering techniques
    • G06F18/232 Non-hierarchical techniques
    • G06F18/2321 Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213 Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24147 Distances to closest patterns, e.g. nearest neighbour classification
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/011 Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/08 Feature extraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/12 Classification; Matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Dermatology (AREA)
  • General Health & Medical Sciences (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • Human Computer Interaction (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a user-independent myoelectric gesture recognition system based on self-adaptive learning, comprising a data acquisition unit, a clustering unit, a self-adaptive KNN neighbor classifier, and a risk evaluator, connected in sequence. The data acquisition unit acquires existing user data and performs data processing. The clustering unit clusters the processed signal data with K-Means to find the clustering centers of the different actions, and extracts the N samples of each action of each user closest to the clustering centers as the training set for training the self-adaptive KNN neighbor classifier. The self-adaptive KNN neighbor classifier assigns labels to new user data. The risk evaluator evaluates the new user data; qualified samples replace the farthest samples of the training set, and the weights of the training-set samples are updated. The invention solves the problem of model non-universality caused by the individual variability of electromyographic signals, requires no user retraining step, greatly improves the user experience, and dynamically improves recognition accuracy.

Description

User-independent myoelectric gesture recognition system based on self-adaptive learning
Technical Field
The invention relates to the field of gesture recognition, in particular to a user-independent myoelectric gesture recognition system based on self-adaptive learning.
Background
With the growth of device intelligence, the former computer-centric human-machine interaction (HMI) paradigm is shifting toward a human-centered, natural interaction paradigm in which people can communicate with a computer through gestures or intentions alone. Surface electromyographic signals directly reflect the degree of muscle activity, and a user's movement intention can be obtained by analyzing them, so they are widely applied in the field of human-computer interaction.
Currently, human-computer interaction systems based on surface electromyographic signals generally comprise two main processes: 1) offline training of a gesture classification model on a user's own training samples; 2) online gesture recognition with the trained model, whose output serves as the control input of the interaction, so that the system achieves good recognition performance across gestures. However, this procedure is user-dependent: the model's training data and test data come from the same user. Owing to differences in skin impedance, muscle morphology, physiology, psychology, and the like, the electromyographic signals of different users performing the same action can differ substantially, so a user-dependent model classifies a new user's gestures poorly. To guarantee good performance of the interaction system, each new user must therefore spend considerable time on data acquisition, processing, and modeling, a tedious process that greatly degrades the user experience. A user-independent method that simplifies onboarding of new users and suits wearable human-computer interaction systems is therefore needed.
Disclosure of Invention
Therefore, the invention aims to provide a user-independent myoelectric gesture recognition system based on self-adaptive learning, which solves the problem of model non-universality caused by the individual variability of myoelectric signals, requires no user retraining step, greatly improves the user experience, and dynamically improves recognition accuracy.
In order to achieve the above purpose, the invention adopts the following technical scheme:
a user-independent myoelectric gesture recognition system based on self-adaptive learning comprises a data acquisition unit, a clustering unit, a self-adaptive KNN neighbor classifier and a risk evaluator which are connected in sequence;
the data acquisition unit acquires the existing user data and performs data processing;
the clustering unit clusters the preprocessed signal data with K-Means to find the clustering centers of the different actions, and extracts the N samples of each action of each user closest to the clustering centers as the training set for training the self-adaptive KNN neighbor classifier;
the self-adaptive KNN neighbor classifier is used for obtaining corresponding labels according to new user data;
the risk evaluator evaluates the new user data; qualified samples replace the farthest samples of the training set, and the weights of the training-set samples are updated.
Further, the data processing comprises removing 50 Hz power-frequency noise from the signal, full-wave rectification, third-order Butterworth low-pass filtering, active-segment extraction, feature extraction, and normalization.
Further, during active-segment extraction, a sliding window with a window length of 200 ms and a step of 50 ms is applied to the low-pass-filtered envelope; each window is denoted T(t). The threshold is set to 0.015 times the peak within the window; when the data within T(t) stay above the threshold for 180 ms, the segment is determined to be an active segment; when the above-threshold duration within T(t) is less than 180 ms, the segment is determined to be a rest segment;
for each active segment, muscle synergies are extracted using a non-negative matrix factorization algorithm:

V ≈ W H,  v_ij ≈ Σ_{o=1}^{O} w_io h_oj

wherein V is the I×J sample-set matrix, W the I×O muscle synergy matrix, and H the O×J activation matrix; i = 1, 2, ..., I, o = 1, 2, ..., O, j = 1, 2, ..., J; I is the number of muscles, J the number of sampling points, and O the number of muscle synergies;
the number of muscle synergies is determined by the VAF value.
Further, the normalization is applied to each muscle synergy W:

W_norm = (W − W_min) / (W_max − W_min)

wherein W_max and W_min are the maximum and minimum of W, respectively.
Further, K-Means clustering is used to find the cluster centers of the different actions, specifically:
let the samples be muscle synergies W and the class clusters be C_1, C_2, C_3, C_4, with μ_i the centroid of cluster C_i; the optimization goal is to minimize the error E:

E = Σ_{i=1}^{4} Σ_{W∈C_i} ‖W − μ_i‖²,  μ_i = (1/|C_i|) Σ_{W∈C_i} W

the initial centroids are set to the vector mean of each gesture class; the class centroids of each generation are recorded, and iteration ends when the iteration count reaches its upper limit or the centroids move less than a preset value between consecutive generations;
the final class centroids serve as templates; the N samples of each action of each user closest to the template are selected as representative samples and form the training set.
Further, the adaptive KNN classifier automatically selects an appropriate K value, specifically:
for a new user sample W to be classified, first compute the Euclidean distance between W and every sample of the training set Y = [y_1, y_2, ..., y_l] obtained in the second part:

d_i = ‖W − y_i‖₂,  i = 1, 2, ..., l

where l is the number of samples in the training set;
then sort the distance matrix D in ascending order; the neighbor count k takes multiples of 5, and for each k the counts of the largest and second-largest classes among the k nearest neighbors are recorded until k reaches its upper limit; then judge whether the two count curves have crossing points, and if so, take the optimal K at the first crossing point p_first;
if there is no crossing, compute a score for each k:

score_k = s / d

where s is the count of the largest class and d is the farthest distance from the sample under test within the largest class; the scores are then sorted, and the k corresponding to the maximum is K.
Further, the risk assessment is specifically as follows:
find the training set of the gesture corresponding to the test sample's label, use the Euclidean distance to locate the point of that training set farthest from the class template, denoted p, and calculate the risk of replacing the farthest point p with the sample under test W:

risk(A) = Σ_{j=1}^{I} |a_j − μ_{1,j}| / ( Σ_{j=1}^{I} |a_j − μ_{1,j}| + Σ_{c=2}^{4} Σ_{j=1}^{I} |a_j − μ_{c,j}| )

wherein the first term of the denominator represents the degree of intra-class similarity and the second term the degree of inter-class difference; A is the sample under test or the farthest sample; μ_1 is the template of the label's class and μ_2 to μ_4 are the templates of the other classes; I is the feature length, i.e., the number of muscles;
if the risk value of the test sample is smaller than that of the farthest point, the sample's intra-class and inter-class distances are better than the farthest point's; the test sample is then qualified and replaces the farthest-point sample;
meanwhile, the weights of the neighbors within K that share the qualified sample's label are updated from q_ago to q_now1, and the weights of neighbors with a different label are updated to q_now2, wherein q_ago denotes the weight before the update and q_now1 and q_now2 the updated weights.
Compared with the prior art, the invention has the following beneficial effects:
the invention solves the problem of model non-universal caused by individual difference of electromyographic signals. The data volume is reduced by the step of extracting the representative sample, and the recognition speed and recognition accuracy are improved. The feature distribution of the new user is automatically learned by the self-adaptive learning method, the user retraining step is not needed, the use experience of the user is greatly improved, and the recognition accuracy is dynamically improved.
Drawings
FIG. 1 is a schematic diagram of the control of the system of the present invention;
FIG. 2 is a schematic diagram of the data processing flow in an embodiment of the invention;
FIG. 3 is a flow chart of adaptive K-value selection in an embodiment of the invention;
FIG. 4 is a schematic diagram of the risk-value ratio of the test sample to the farthest point in an embodiment of the invention.
Detailed Description
The invention will be further described with reference to the accompanying drawings and examples.
Referring to fig. 1, the invention provides a user-independent myoelectric gesture recognition system based on adaptive learning, which comprises a data acquisition unit, a clustering unit, an adaptive KNN neighbor classifier and a risk evaluator which are sequentially connected;
the data acquisition unit acquires the existing user data and processes the data;
the clustering unit clusters the processed signal data with K-Means to find the clustering centers of the different actions, and extracts the N samples of each action of each user closest to the clustering centers as the training set for training the self-adaptive KNN neighbor classifier;
the self-adaptive KNN neighbor classifier is used for obtaining corresponding labels according to new user data;
the risk evaluator evaluates the new user data; qualified samples replace the farthest samples of the training set, and the weights of the training-set samples are updated.
In this embodiment, referring to fig. 2, the data processing is specifically as follows:
for the obtained original data, data processing is needed, and the method mainly comprises the steps of filtering power frequency noise of 50Hz, full-wave rectification, 3-order Butterworth low-pass filtering with a cut-off frequency of 3Hz (the size of the cut-off frequency can be determined according to the actual filtering condition), active segment extraction and the like. In the extraction process of the active segment, in order to ensure the real-time performance of gesture recognition, a sliding window with the window length of 200ms and the step length of 50ms is set, sliding window processing is carried out on the envelope line after low-pass filtering, and each sliding window processing is marked as T (T); taking the threshold to be 0.015 times the peak in the window, when the data lasting 180ms in T (T) is all greater than the threshold, the segment is determined to be an active segment. When the data duration within T (T) greater than the threshold is less than 180ms, the segment is determined to be a rest segment.
Then, for each active segment, muscle synergies are extracted using a non-negative matrix factorization (NMF) algorithm:

V ≈ W H,  v_ij ≈ Σ_{o=1}^{O} w_io h_oj

where V is the I×J sample-set matrix, W the I×O muscle synergy matrix, and H the O×J activation matrix; i = 1, 2, ..., I, o = 1, 2, ..., O, j = 1, 2, ..., J; I is the number of muscles, J the number of sampling points, and O the number of muscle synergies.
The number of synergies can be determined from the VAF value: the larger the VAF, the smaller the difference between the reconstructed matrix V' = W H and the data matrix V for that synergy count. The VAF is generally required to reach 85%-90% and can be expressed as:

VAF = 1 − ‖V − V'‖² / ‖V‖²
for each muscle co-ordination W, normalization was performed according to the following formula:
wherein W_max and W_min are the maximum and minimum of W, respectively.
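To make the factorization step concrete, here is a from-scratch sketch using multiplicative-update NMF (the patent does not say which solver was used, so this choice is an assumption, as are the function name, iteration count, and random initialization). It grows the synergy count O until the VAF reaches the target band, then applies the min-max normalization above to each synergy:

```python
import numpy as np

def extract_synergies(V, vaf_target=0.90, max_synergies=None,
                      n_iter=500, seed=0):
    """Factor the I x J matrix V into W (I x O) and H (O x J) with
    multiplicative-update NMF, increasing O until
    VAF = 1 - ||V - WH||^2 / ||V||^2 reaches vaf_target, then
    min-max normalise each synergy column of W."""
    eps = 1e-9
    I, J = V.shape
    max_synergies = max_synergies or I
    rng = np.random.default_rng(seed)
    for O in range(1, max_synergies + 1):
        W = rng.random((I, O)) + eps
        H = rng.random((O, J)) + eps
        for _ in range(n_iter):                      # multiplicative updates
            H *= (W.T @ V) / (W.T @ W @ H + eps)
            W *= (V @ H.T) / (W @ H @ H.T + eps)
        vaf = 1 - np.sum((V - W @ H) ** 2) / np.sum(V ** 2)
        if vaf >= vaf_target:
            break
    # per-synergy min-max normalisation: (W - W_min) / (W_max - W_min)
    Wn = (W - W.min(axis=0)) / (W.max(axis=0) - W.min(axis=0) + eps)
    return Wn, H, O, vaf
```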
In this embodiment, the clustering unit extracts representative samples, specifically as follows:
assuming that the sample to be detected is W, and the class cluster is C 1 ,C 2 ,C 3 ,C 4 ,μ i For category cluster C i Is optimized for minimizing the error E:
in order to solve the equation, setting the initial centroid as the vector mean value of each gesture category, the iteration times as 100 generations, recording the category centroid of each generation, and ending when the iteration times reach the upper limit or the difference between the last generation centroid and the last generation centroid is smaller than 0.02.
The final class centroid serves as a template, 30 samples, closest to the template, of each action of each user are selected to serve as representative samples, the representative samples of a plurality of users form a training set for the self-adaptive KNN algorithm, and the weight of each sample is set to be 1.
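The template-building step above can be sketched as follows (one K-means centroid per gesture class, initialized at the class means, then keeping the nearest samples per class with weight 1); the function and argument names are illustrative only:

```python
import numpy as np

def build_training_set(samples, labels, n_rep=30, iters=100, tol=0.02):
    """K-means with one centroid per gesture class, initialised at the
    class means; stops after `iters` generations or when centroids move
    less than `tol`.  The final centroids act as templates, and the
    n_rep samples closest to their template per class become the
    representative training set, each with weight 1."""
    classes = np.unique(labels)
    centroids = np.array([samples[labels == c].mean(axis=0) for c in classes])
    for _ in range(iters):
        d = np.linalg.norm(samples[:, None, :] - centroids[None], axis=2)
        assign = d.argmin(axis=1)
        new = np.array([samples[assign == k].mean(axis=0)
                        if np.any(assign == k) else centroids[k]
                        for k in range(len(classes))])
        moved = np.abs(new - centroids).max()
        centroids = new
        if moved < tol:
            break
    X, y, w = [], [], []
    for k, c in enumerate(classes):
        idx = np.where(labels == c)[0]
        dist = np.linalg.norm(samples[idx] - centroids[k], axis=1)
        keep = idx[np.argsort(dist)[:n_rep]]       # n_rep nearest to template
        X.append(samples[keep])
        y.extend([c] * len(keep))
        w.extend([1.0] * len(keep))                # initial weight 1
    return np.vstack(X), np.array(y), np.array(w), centroids
```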
In this embodiment, referring to fig. 3, to remedy the shortcomings of the conventional KNN algorithm, a method that automatically selects an appropriate K value is proposed, as follows:
For a new user sample W to be classified, first compute the Euclidean distance between W and every sample of the training set Y = [y_1, y_2, ..., y_l] obtained in the second part:

d_i = ‖W − y_i‖₂,  i = 1, 2, ..., l

where l is the number of samples in the training set.
The distances D are then sorted in ascending order. The neighbor count k takes multiples of 5, and for each k the counts of the largest and second-largest classes are recorded until k reaches its upper limit. Next, judge whether the two count curves have crossing points; if so, the optimal K is taken at the first crossing point p_first.
If there is no crossing, a score is computed for each k:

score_k = s / d

where s is the count of the largest class and d is the farthest distance from the sample under test within the largest class.
The scores are then sorted, and the k corresponding to the maximum is K.
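The K-selection loop can be sketched as below. The crossing test (the two class-count curves meeting) and the score formula score = s / d are our reading of the description, since the original expressions are not reproduced in this text:

```python
import numpy as np
from collections import Counter

def adaptive_k(query, X, y, k_max=20):
    """Try k = 5, 10, ..., k_max over the distance-sorted neighbours;
    return k at the first point where the largest and second-largest
    class counts meet, otherwise the k maximising score = s / d
    (majority count over its farthest neighbour distance)."""
    order = np.argsort(np.linalg.norm(X - query, axis=1))
    ks = list(range(5, k_max + 1, 5))
    top, second, scores = [], [], []
    for k in ks:
        neigh = order[:k]
        counts = Counter(y[neigh].tolist()).most_common()
        maj, s = counts[0]
        top.append(s)
        second.append(counts[1][1] if len(counts) > 1 else 0)
        # farthest majority-class neighbour from the query
        d = np.linalg.norm(X[neigh][y[neigh] == maj] - query, axis=1).max()
        scores.append(s / d)
    for i, k in enumerate(ks):
        if top[i] == second[i]:        # first "crossing" of the count curves
            return k
    return ks[int(np.argmax(scores))]  # no crossing: best score wins
```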
Referring to fig. 4, in this embodiment, new user data passed through the representative samples and the adaptive KNN classifier obtain an estimated label. It must then be evaluated whether that label is reliable and whether updating the training set with the corresponding sample carries minimal risk. The training set of the gesture corresponding to the test sample's label is therefore located, and the point of that set farthest from the class template under the Euclidean distance is denoted p. The risk of replacing the farthest point p with the sample under test W is calculated as:

risk(A) = Σ_{j=1}^{I} |a_j − μ_{1,j}| / ( Σ_{j=1}^{I} |a_j − μ_{1,j}| + Σ_{c=2}^{4} Σ_{j=1}^{I} |a_j − μ_{c,j}| )

In this expression, the first term of the denominator represents the degree of intra-class similarity and the second term the degree of inter-class difference. A is either the sample under test or the farthest sample; μ_1 is the template of the label's class, and μ_2 to μ_4 are the templates of the other classes. I is the feature length, i.e., the number of muscles.
If the risk value of the test sample is smaller than that of the farthest point, the sample's intra-class and inter-class distances are better than the farthest point's; the test sample is then qualified and replaces the farthest-point sample.
Meanwhile, the weights of the neighbors within K that share the qualified sample's label are updated from q_ago to q_now1, and the weights of neighbors with a different label are updated to q_now2, where q_ago denotes the weight before the update and q_now1 and q_now2 the updated weights.
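Finally, a sketch of the risk gate and weight update. The exact risk formula and update magnitudes are figures not reproduced in this text, so the ratio-style risk (intra-class distance over intra- plus inter-class distance) and the fixed ±delta weight nudge below are assumptions that merely follow the described behavior:

```python
import numpy as np

def risk(a, mu_own, mu_others):
    """Assumed ratio-style risk: intra-class distance over the sum of
    intra- and inter-class distances.  Smaller is better."""
    intra = np.linalg.norm(a - mu_own)
    inter = sum(np.linalg.norm(a - m) for m in mu_others)
    return intra / (intra + inter)

def maybe_replace(sample, X, y, w, label, templates, neigh_idx, delta=0.1):
    """If the new sample's risk is lower than that of the farthest
    same-class training point p, swap them in place and nudge the
    K-neighbour weights: +delta for neighbours sharing the label,
    -delta otherwise (update magnitude is our assumption)."""
    mu_own = templates[label]
    mu_others = [templates[c] for c in templates if c != label]
    same = np.where(y == label)[0]
    far = same[np.argmax(np.linalg.norm(X[same] - mu_own, axis=1))]
    if risk(sample, mu_own, mu_others) < risk(X[far], mu_own, mu_others):
        X[far] = sample                      # qualified: replace farthest point
        for i in neigh_idx:
            w[i] += delta if y[i] == label else -delta
        return True
    return False
```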
The foregoing description is only of the preferred embodiments of the invention, and all changes and modifications that come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.

Claims (4)

1. The user-independent myoelectric gesture recognition system based on self-adaptive learning is characterized by comprising a data acquisition unit, a clustering unit, a self-adaptive KNN neighbor classifier and a risk evaluator which are connected in sequence;
the data acquisition unit acquires the existing user data and performs data processing;
the clustering unit clusters the processed signal data with K-Means to find the clustering centers of the different actions, and extracts the N samples of each action of each user closest to the clustering centers as the training set for training the self-adaptive KNN neighbor classifier;
the self-adaptive KNN neighbor classifier is used for obtaining corresponding labels according to new user data;
the risk evaluator evaluates the new user data; qualified samples replace the farthest samples of the training set, and the weights of the training-set samples are updated;
the method adopts K-Means clustering to find the clustering centers of different actions, and specifically comprises the following steps:
let the samples be muscle synergies W and the class clusters be C_1, C_2, C_3, C_4, with μ_i the centroid of cluster C_i; the optimization goal is to minimize the error E:

E = Σ_{i=1}^{4} Σ_{W∈C_i} ‖W − μ_i‖²

the initial centroids are set to the vector mean of each gesture class; the class centroids of each generation are recorded, and iteration ends when the iteration count reaches its upper limit or the centroids move less than a preset value between consecutive generations;
the final class centroids serve as templates; the N samples of each action of each user closest to the template are selected as representative samples and form the training set;
the self-adaptive KNN classifier can automatically select an appropriate K value, specifically:
for a new user sample W to be classified, first compute the Euclidean distance between W and every sample of the training set Y = [y_1, y_2, ..., y_l] obtained in the second part:

d_i = ‖W − y_i‖₂,  i = 1, 2, ..., l

where l is the number of samples in the training set;
then sort the distances D in ascending order; the neighbor count k takes multiples of 5, and for each k the counts of the largest and second-largest classes are recorded until k reaches its upper limit; then judge whether the two count curves have crossing points, and if so, take the optimal K at the first crossing point p_first;
if there is no crossing, compute a score for each k:

score_k = s / d

where s is the count of the largest class and d is the farthest distance from the sample under test within the largest class; the scores are then sorted, and the k corresponding to the maximum is K;
the risk assessment is specifically as follows:
finding the training set of the gesture corresponding to the test sample's label, using the Euclidean distance to locate the point of that set farthest from the class template, denoted p, and calculating the risk of replacing the farthest point p with the sample under test W:

risk(A) = Σ_{j=1}^{I} |a_j − μ_{1,j}| / ( Σ_{j=1}^{I} |a_j − μ_{1,j}| + Σ_{c=2}^{4} Σ_{j=1}^{I} |a_j − μ_{c,j}| )

wherein the first term of the denominator represents the degree of intra-class similarity and the second term the degree of inter-class difference; A is the sample under test or the farthest point; μ_1 is the template of the label's class, and μ_2 to μ_4 are the templates of the other classes; A and μ_1 to μ_4 are all 1×I row vectors, where I is the feature length, i.e., the number of muscles;
if the risk value of the test sample is smaller than that of the furthest point, the inter-class distance and the intra-class distance of the sample are better than those of the furthest point, and the test sample is a qualified sample, so that the test sample can be replaced by the furthest point sample;
meanwhile, the weights of the neighbors within K that share the qualified sample's label are updated from q_ago to q_now1, and the weights of neighbors with a different label are updated to q_now2, wherein q_ago denotes the weight before the update and q_now1 and q_now2 the updated weights.
2. The adaptive learning-based user-independent myoelectric gesture recognition system of claim 1, wherein the data processing includes removing 50 Hz power-frequency noise from the signal, full-wave rectification, third-order Butterworth low-pass filtering, active-segment extraction, feature extraction, and normalization.
3. The adaptive learning-based user-independent myoelectric gesture recognition system according to claim 2, wherein during active-segment extraction, a sliding window with a window length of 200 ms and a step of 50 ms is applied to the low-pass-filtered envelope, each window being denoted T(t); the threshold is set to 0.015 times the peak within the window; when the data within T(t) stay above the threshold for 180 ms, the segment is determined to be an active segment; when the above-threshold duration within T(t) is less than 180 ms, the segment is determined to be a rest segment;
for each active segment, muscle synergies are extracted using a non-negative matrix factorization algorithm:

V ≈ W H,  v_ij ≈ Σ_{o=1}^{O} w_io h_oj

wherein V is the sample-set matrix, W the muscle synergy matrix, and H the activation matrix; i = 1, 2, 3 ... I, o = 1, 2, 3 ... O, j = 1, 2, 3 ... J; I is the number of muscles, J the number of sampling points, and O the number of muscle synergies;
the number of muscle synergies is determined by the VAF value.
4. The adaptive learning-based user-independent myoelectric gesture recognition system according to claim 2, wherein the normalization is applied to each muscle synergy W:

W_norm = (W − W_min) / (W_max − W_min)

wherein W_max and W_min are the maximum and minimum of W, respectively.
CN202111376022.4A 2021-11-19 2021-11-19 User-independent myoelectric gesture recognition system based on self-adaptive learning Active CN114384999B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111376022.4A CN114384999B (en) 2021-11-19 2021-11-19 User-independent myoelectric gesture recognition system based on self-adaptive learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111376022.4A CN114384999B (en) 2021-11-19 2021-11-19 User-independent myoelectric gesture recognition system based on self-adaptive learning

Publications (2)

Publication Number Publication Date
CN114384999A CN114384999A (en) 2022-04-22
CN114384999B true CN114384999B (en) 2023-07-21

Family

ID=81195464

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111376022.4A Active CN114384999B (en) 2021-11-19 2021-11-19 User-independent myoelectric gesture recognition system based on self-adaptive learning

Country Status (1)

Country Link
CN (1) CN114384999B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114625257B (en) * 2022-05-16 2022-08-16 浙江强脑科技有限公司 Action recognition method and device based on electromyographic signals
CN115204242B (en) * 2022-09-09 2022-12-09 深圳市心流科技有限公司 Method and device for adjusting action template comparison threshold and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111723367A (en) * 2020-06-12 2020-09-29 国家电网有限公司 Power monitoring system service scene disposal risk evaluation method and system
CN112732090A (en) * 2021-01-20 2021-04-30 福州大学 Muscle cooperation-based user-independent real-time gesture recognition method
CN112821559A (en) * 2021-01-22 2021-05-18 西安理工大学 Non-invasive household appliance load depth re-identification method

Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
A fast clustering-based feature subset selection algorithm for high-dimensional data; SONG Q; IEEE; Vol. 01; full text *
An image classification method based on shape features; Wei Dongxing; Chen Xiaoyun; Xu Rongcong; Microcomputer Information (No. 21); full text *
Application of an improved classification algorithm to filtering undesirable information; Liu Zhigang; Du Juan; Yi Zhi'an; Microcomputer Applications (No. 02); full text *
A network intrusion detection algorithm fusing K-means and KNN; Hua Huiyou; Chen Qimai; Liu Hai; Zhang Yang; Yuan Peiquan; Computer Science (No. 03); full text *
A file type recognition algorithm based on principal component analysis and K-nearest neighbors; Yan Mengdi; Qin Linlin; Wu Gang; Journal of Computer Applications (No. 11); full text *
An improved KNN algorithm based on intra-class K-means clustering; Xu Qigong; Guo Hong; Woodworking Machine Tool (No. 04); full text *
Gesture recognition based on adaptive multi-classifier fusion; Liu Xiao; Yuan Guan; Zhang Yanmei; Yan Qiuyan; Wang Zhixiao; Computer Science (No. 07); full text *

Also Published As

Publication number Publication date
CN114384999A (en) 2022-04-22

Similar Documents

Publication Publication Date Title
WO2021143353A1 (en) Gesture information processing method and apparatus, electronic device, and storage medium
CN111134666B (en) Emotion recognition method of multi-channel electroencephalogram data and electronic device
CN114384999B (en) User-independent myoelectric gesture recognition system based on self-adaptive learning
CN111844032B (en) Electromyographic signal processing and exoskeleton robot control method and device
CN111103976B (en) Gesture recognition method and device and electronic equipment
CN110333783B (en) Irrelevant gesture processing method and system for robust electromyography control
CN107256392A (en) A kind of comprehensive Emotion identification method of joint image, voice
CN110008674B (en) High-generalization electrocardiosignal identity authentication method
CN103186774A (en) Semi-supervised learning-based multi-gesture facial expression recognition method
CN108703824B (en) Bionic hand control system and control method based on myoelectricity bracelet
CN110399846A (en) A kind of gesture identification method based on multichannel electromyography signal correlation
CN104484644A (en) Gesture identification method and device
Li et al. EEG signal classification method based on feature priority analysis and CNN
CN110443113A (en) A kind of virtual reality Writing method, system and storage medium
CN115936944B (en) Virtual teaching management method and device based on artificial intelligence
CN105354532A (en) Hand motion frame data based gesture identification method
Kanoga et al. Subject transfer framework based on source selection and semi-supervised style transfer mapping for sEMG pattern recognition
CN109933202B (en) Intelligent input method and system based on bone conduction
CN111913575B (en) Method for recognizing hand-language words
CN117235576A (en) Method for classifying motor imagery electroencephalogram intentions based on Riemann space
CN111714122A (en) Electromyographic pattern recognition method, electromyographic pattern recognition device, electromyographic pattern recognition equipment and computer-readable storage medium
CN112036357A (en) Upper limb action recognition method and system based on surface electromyogram signal
Tormási et al. Comparing the efficiency of a fuzzy single-stroke character recognizer with various parameter values
Mendes et al. Subvocal speech recognition based on EMG signal using independent component analysis and neural network MLP
CN110547806A (en) gesture action online recognition method and system based on surface electromyographic signals

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant