CN112733721A - Surface electromyographic signal classification method based on capsule network - Google Patents
Surface electromyographic signal classification method based on capsule network
- Publication number
- CN112733721A (application number CN202110034453.6A)
- Authority
- CN
- China
- Prior art keywords
- capsule
- dimensional
- matrix
- signal
- surface electromyographic
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2218/00—Aspects of pattern recognition specially adapted for signal processing
- G06F2218/08—Feature extraction
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/048—Activation functions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2218/00—Aspects of pattern recognition specially adapted for signal processing
- G06F2218/12—Classification; Matching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/15—Biometric patterns based on physiological signals, e.g. heartbeat, blood flow
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Artificial Intelligence (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Computation (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Mathematical Physics (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Computing Systems (AREA)
- Computational Linguistics (AREA)
- Software Systems (AREA)
- Signal Processing (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Computational Biology (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Fuzzy Systems (AREA)
- Physiology (AREA)
- Psychiatry (AREA)
- Pathology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a surface electromyographic signal classification method based on a capsule network. A window analysis method preprocesses the original surface electromyographic signal; features are extracted from the signal to form a feature sequence, which is converted into a feature matrix by a two-dimensionalization method; the original signal is cut and stacked to generate a two-dimensional signal matrix; the two matrices are convolved with convolution kernels of different sizes to obtain feature maps of the same size, which are then stacked along the channel dimension; the resulting abstract feature map is sent to a capsule network for training, and the network weights are saved. The invention improves the method of generating abstract features and offers high accuracy and strong robustness, particularly for actions with similar muscle movements.
Description
Technical Field
The invention relates to the problem of classifying surface electromyographic signals.
Background
The hand plays an extremely important role in daily human life and work. In recent years, as the population has aged and diseases, accidents, and natural disasters have taken their toll, the number of elderly people with impaired hands and of patients with physical disabilities has increased year by year. An intelligent prosthesis can imitate the human hand and provide basic help to the elderly and the disabled in daily life. Human hand movements are governed by consciousness, which is transmitted through the body in the form of bioelectrical signals; the corresponding muscle groups perform the corresponding movements after receiving these signals. Controlling a prosthesis likewise requires a signal source, and the surface electromyographic signal (sEMG) is currently the most common.
Surface electromyographic signals are biological signals of muscle movement recorded by attaching electrodes to the skin over the muscle. Different gestures correspond to different muscle activation patterns and therefore produce different signals, so gesture actions can be predicted by analyzing the surface electromyographic signal. The signal is also non-invasive and easy to acquire, making it an ideal human-computer interaction signal source. Successful application of surface electromyographic signals as a medium for human-computer interaction depends on their accurate identification. Gesture recognition from surface electromyographic signals therefore has practical application value, and research on hand action recognition based on them has great social significance.
Most existing surface electromyographic signal classification methods rely on feature extraction: manually extracted features are fed into a machine learning or deep learning model for training to obtain a prediction result. Some studies instead feed the raw data into deep learning models for training. However, current methods have the following problems: (1) the result of training on features depends to a great extent on the selected features and how they are combined; no matter how appropriate the features, the information of the original signal cannot be fully recovered, so classification works well only for actions with large differences, degrades when the differences are small, and the algorithm's robustness is poor; (2) although feeding the raw signal to the model avoids feature extraction, the raw signal carries so much information that the extracted abstract features are too diffuse, so classification precision is low; (3) the machine learning and deep learning models currently used can only reflect whether certain features exist, not the relations among features or their relative relationships.
Disclosure of Invention
Aiming at these problems, the invention provides a classification method, based on a capsule network, that fuses the original signal with extracted features. It adopts the capsule network, a novel deep learning model, and fuses multi-level features within the network, with the goal of solving the problems of the prior art and ultimately classifying surface electromyographic signals with higher robustness and accuracy.
In order to achieve the purpose, the invention adopts the technical scheme that the surface electromyographic signal classification method based on the capsule network comprises the following steps:
S1: Raw signal data preprocessing, i.e., window analysis. The surface electromyographic signal is intercepted with a window of length w, where w is the window length, t is the increment interval, and τ is the time needed for signal processing and obtaining the prediction result. After each time interval t, data are intercepted sequentially for all channels simultaneously.
S2: and extracting features from each window of each channel to obtain a one-dimensional multi-channel feature sequence.
S3: the signature sequence and the original signal are respectively two-dimensioned. The one-dimensional feature matrix is multiplied by the transpose matrix after being increased by "1" to obtain a two-dimensional feature matrix, which is specifically referred to as S31 to S32. Each windowed original signal is sliced and stacked to obtain a two-dimensional original signal matrix, which is described in detail in S33 to S34.
S31: to the sequence x, 1 is added as shown in equation (2) to give the sequence f.
f = [1, x_1, x_2, ..., x_m]   (1)
S32: and (4) performing feature combination on the feature vectors by using a matrix operation, and specifically converting as shown in formulas (3) and (4).
F = G(α · (f × f^T)^β)   (2)
Where α and β are conversion parameters, both set to 0.5; F is the feature map obtained by the conversion; G is the nonlinear sigmoid activation function G(z) = 1/(1 + e^(−z)), which limits the abstract feature values to [0, 1].
S33: each window with the length w is arranged according to the scale l1Is cut into n1Segment, n1Satisfying formula (4). In this experiment, set l1Is 20, n1Is 15.
n_1 = w / l_1   (4)
S34: n is to be1Short sequences of segments are vertically stacked to obtain1×n1Is used for the two-dimensional matrix of (1).
S4: and (3) convolving the two matrixes obtained by the two-dimensional method by convolution kernels with different sizes respectively to finally obtain two multi-channel two-dimensional matrixes with the same size. Both matrices are stacked on the channels.
S5: and sending the data to a capsule network for training, and storing the network weight.
Compared with the prior art, the invention has the following advantages:
(1) Compared with the traditional feature extraction method, the way abstract features are generated is simple: new abstract features are generated inside the network, so only existing common features need to be selected and no new hand-crafted features need to be devised.
(2) High accuracy. Compared with inputting features directly, fusing the original signal with the features lets the input carry more of the original information; at the same time, the extracted abstract features do not produce the large amount of redundancy that feeding the raw signal directly would, which helps extract more distinctive features.
(3) Strong robustness. The capsule network model helps extract the relative relationships between features, so unlike a convolutional neural network it not only identifies actions with large differences but also accurately identifies actions with small differences, with higher accuracy.
Drawings
FIG. 1 is an overall flow chart of the method of the present invention.
FIG. 2 is a schematic diagram of a window analysis method of the present invention.
FIG. 3 is a diagram of a network model according to the present invention.
Fig. 4 is a dynamic routing mechanism for a capsule network.
Detailed Description
For a better understanding of the present invention, reference is made to the following detailed description taken in conjunction with the accompanying drawings and examples.
Example: An ELONXI electromyography acquisition instrument was used for the study. The device supports up to 16 bipolar channels, with a sampling resolution of 24 bits and a sampling frequency between 1000 Hz and 2000 Hz. In the experiment, 16 channels and a sampling frequency of 1000 Hz were selected, and the system's filter was used to obtain filtered signals. Data for 5 gesture actions of 1 subject were collected; each gesture was held for 10 s and the collection was repeated 4 times. The first 3 collections of each action form the training set, and the last collection forms the test set.
S1: the raw signal data preprocessing, i.e. the window analysis method, as shown in fig. 2, intercepts the signal with a window for the surface electromyogram signal. Where w is the window length set to 300 milliseconds and t is the increment interval set to 50 milliseconds. After every 50 milliseconds, a window of 300 milliseconds length is sequentially intercepted for all channels simultaneously.
S2: and (3) extracting features from each window of each channel to obtain a one-dimensional multi-channel feature sequence x, as shown in formula (5). In this example, m features were collected, m having a value of 14, and the formula used is shown in Table 1, where x isiRepresenting the ith characteristic.
x = [x_1, x_2, ..., x_m]   (5)
Table 1. The discrete features extracted
Where S_i is defined as the signal value at time i, S_f is the spectrum of the signal S obtained by fast Fourier transform, and P(S_f) is the power spectral intensity of the corresponding frequency band.
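Because Table 1 itself is not reproduced in this text, the sketch below stands in for step S2 with four widely used sEMG time-domain features (mean absolute value, root mean square, waveform length, zero crossings). These specific formulas are assumptions for illustration and are not necessarily among the patent's 14 features.

```python
import numpy as np

def example_features(s):
    """Per-window feature extraction (S2): map one window of one channel
    to a small feature vector. Stand-in features, not the patent's Table 1."""
    s = np.asarray(s, dtype=float)
    return np.array([
        np.mean(np.abs(s)),                # mean absolute value (MAV)
        np.sqrt(np.mean(s ** 2)),          # root mean square (RMS)
        np.sum(np.abs(np.diff(s))),        # waveform length (WL)
        np.sum(np.diff(np.sign(s)) != 0),  # zero crossings (ZC)
    ])

x = example_features(np.sin(np.linspace(0.1, 4 * np.pi, 300)))
```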
S3: the signature sequence and the original signal are respectively two-dimensioned. And operating the one-dimensional feature sequence according to S31 to S32 to obtain a two-dimensional feature matrix. And cutting and stacking the windowed original signals according to the steps from S33 to S34 to obtain a two-dimensional original signal matrix.
S31: adding 1 to the sequence x as shown in formula (1) to obtain the sequence f.
f = [1, x_1, x_2, ..., x_m]   (1)
S32: and (3) performing feature combination on the feature vectors by using a matrix operation, and specifically converting as shown in formulas (2) and (3).
F = G(α · (f × f^T)^β)   (2)
Where α and β are conversion parameters, both set to 0.5; F is the feature map obtained by the conversion; G is the nonlinear sigmoid activation function G(z) = 1/(1 + e^(−z)), which limits the abstract feature values to [0, 1].
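Steps S31 to S32 can be sketched as follows. This is a NumPy illustration under stated assumptions: the exponent β is interpreted element-wise on the outer product, and the example features are taken non-negative so the fractional power stays real.

```python
import numpy as np

def feature_matrix(x, alpha=0.5, beta=0.5):
    """Two-dimensionalization of the feature sequence (S31-S32):
    prepend 1 to x (formula (1)), form the outer product f x f^T,
    apply alpha and beta (both 0.5), then the sigmoid G so every
    abstract feature value lies in (0, 1) (formula (2))."""
    f = np.concatenate(([1.0], np.asarray(x, dtype=float)))
    z = alpha * np.power(np.outer(f, f), beta)   # element-wise power assumed
    return 1.0 / (1.0 + np.exp(-z))              # sigmoid G

# 14 features plus the prepended 1 -> a 15 x 15 feature matrix
F = feature_matrix(np.abs(np.random.default_rng(0).standard_normal(14)))
```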
S33: each window with the length w is arranged according to the scale l1Is cut into n1Segment, n1Satisfying formula (4). In this experiment, set l1Is 20, n1Is 15 of。
n_1 = w / l_1   (4)
S34: n is to be1Short sequences of segments are vertically stacked to obtain1×n1Is used for the two-dimensional matrix of (1).
S4: and (4) respectively convolving the two matrixes with convolution kernels with different sizes to finally obtain two multichannel two-dimensional matrixes with the same size. Both matrices are stacked on the channels. For the resulting two-dimensional feature matrix, a convolution kernel of 3 × 8 × 128 with a step size of 2 is used. For the resulting two-dimensional original signal matrix, a convolution kernel of 3 × 3 × 128 with a step size of 2 is used. Stacking to obtain 7 × 7 × 256 characteristic diagram S1。
S5: according to the steps from S51 to S54, the capsule network is sent to train, and the network weight is stored.
S51: checking S by a convolution of 3X 2561Convolution to obtain a characteristic map S2。
S52: will S1And S2And connecting on channels, fusing multi-level features, and reducing the number of the channels by using a convolution kernel of 1 multiplied by 256 to obtain a 7 multiplied by 256 feature map.
S53: the 7 × 7 × 256 characteristic diagram is divided into 32 groups according to 1 group of 8 channels, namely 7 × 7 × 8 × 32, namely 32 primary capsules, and each primary capsule uses UiAnd (4) showing.
S54: the action capsule layer is 5 multiplied by 16, and the parameters between the action capsule and the primary capsule are updated by a dynamic routing mechanism. The dynamic routing mechanism can be specifically described by fig. 4, the number of updates of the dynamic routing is set to be 3, the specific formulas are (6), (7), (8), (9), (10), and the loss function is formula (11).
U′_j|i = W_ij · U_i   (6)
Where W_ij is the connection weight from the i-th low-level capsule to the j-th high-level capsule; U′_j|i is the prediction vector of the i-th low-level capsule for the j-th high-level capsule; c_ij is the coupling coefficient determined by dynamic routing; and b_ij is the log prior probability that the i-th low-level capsule couples to the j-th high-level capsule.

c_ij = exp(b_ij) / Σ_k exp(b_ik)   (7)

s_j = Σ_i c_ij · U′_j|i   (8)

v_j = (‖s_j‖² / (1 + ‖s_j‖²)) · (s_j / ‖s_j‖)   (9)
b_ij = b_ij + U′_j|i · v_j   (10)
The prediction vectors of all low-level capsules are multiplied by their coupling coefficients and accumulated to obtain the input s_j of the high-level capsule, which is then converted into the activation value v_j of the high-level capsule by a nonlinear "compression" activation function (the squash function). This activation function preserves the direction of the input vector while compressing its modulus into [0, 1). Finally, b_ij is updated with the log-likelihood term: a larger product indicates that the direction of the low-level capsule is closer to that of the high-level capsule. Repeating these formulas a certain number of times yields the final prediction result. In this experiment, the number of iterations was set to 3.
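The routing procedure described above can be sketched as follows. This is a NumPy illustration of the standard routing-by-agreement scheme; the softmax over b_ij and the array shapes are conventions assumed here, with 3 update rounds as in the experiment.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-9):
    """Nonlinear 'compression' activation: keeps the direction of s
    while mapping its modulus into [0, 1)."""
    n2 = np.sum(s ** 2, axis=axis, keepdims=True)
    return (n2 / (1.0 + n2)) * s / np.sqrt(n2 + eps)

def dynamic_routing(U_hat, iters=3):
    """Dynamic routing between low- and high-level capsules.
    U_hat[i, j] is the prediction vector U'_{j|i} of low-level capsule i
    for high-level capsule j; returns the activation vectors v_j."""
    b = np.zeros(U_hat.shape[:2])                  # log priors b_ij
    v = None
    for _ in range(iters):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)  # c_ij (softmax)
        s = np.einsum("ij,ijd->jd", c, U_hat)      # s_j = sum_i c_ij U'_{j|i}
        v = squash(s)                              # v_j
        b = b + np.einsum("ijd,jd->ij", U_hat, v)  # agreement update
    return v

# 1568 primary capsules routed to 5 action capsules of dimension 16
v = dynamic_routing(np.random.default_rng(1).standard_normal((1568, 5, 16)))
```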
L_k = T_k · max(0, m+ − ‖v_k‖)² + λ · (1 − T_k) · max(0, ‖v_k‖ − m−)²   (11)

Where N is the number of all actions, 5 in this experiment; T_k = 1 indicates that class k is present, and 0 otherwise; m+ penalizes false positives and is set to 0.9; m− penalizes false negatives and is set to 0.1; λ is a weight parameter used to reduce the influence of mispredicted labels and is set to 0.5.
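The margin loss described above, with the stated constants m+ = 0.9, m− = 0.1, λ = 0.5, and N = 5, can be sketched as follows (the functional form is the standard capsule-network margin loss, assumed here to match the patent's loss function):

```python
import numpy as np

def margin_loss(v_norms, target, m_pos=0.9, m_neg=0.1, lam=0.5):
    """Margin loss over N action capsules: the T_k term penalizes the
    present class when its capsule norm falls below m_pos, and the
    lam * (1 - T_k) term penalizes absent classes whose norms exceed m_neg."""
    T = np.eye(len(v_norms))[target]             # one-hot T_k
    pos = np.maximum(0.0, m_pos - v_norms) ** 2  # term for the present class
    neg = np.maximum(0.0, v_norms - m_neg) ** 2  # term for absent classes
    return float(np.sum(T * pos + lam * (1 - T) * neg))

# a confident, correct prediction for class 2 of N = 5 incurs zero loss
loss = margin_loss(np.array([0.0, 0.0, 1.0, 0.0, 0.0]), target=2)
```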
In summary, the invention provides a capsule-network-based method for classifying surface electromyographic signals. Fusing the original signal with manually extracted features gives the input richer information, from which more abstract features are extracted. In addition, the capsule network represents the relations among features well and mines information that a convolutional neural network cannot, effectively improving the recognition rate.
Although the present invention has been described in conjunction with the accompanying drawings, it is not limited to the embodiments above. The examples and description merely illustrate its principle; various modifications and improvements may be made without departing from the spirit and scope of the invention, and such modifications and improvements fall within the scope of the claimed invention. The scope of the invention is defined by the appended claims and their equivalents.
Claims (5)
1. A surface electromyographic signal classification method based on a capsule network, characterized by comprising the following steps:
S1: preprocessing the raw signal data, namely window analysis: intercepting the surface electromyographic signal with a window of length w, where w is the window length, t is the increment interval, and τ is the time for signal processing and obtaining the prediction result; after each time interval t, sequentially intercepting data for all channels simultaneously;
S2: extracting features from each window of each channel to obtain a one-dimensional multi-channel feature sequence x = [x_1, x_2, ..., x_m], where m denotes the number of features;
S3: two-dimensionalizing the feature sequence and the original signal respectively: prepending a 1 to the one-dimensional feature sequence and multiplying it by its transpose to obtain a two-dimensional feature matrix, as detailed in S31 to S32; cutting and stacking each windowed original signal to obtain a two-dimensional original signal matrix, as detailed in S33 to S34;
S31: prepending 1 to the sequence x to obtain the sequence f, as shown in formula (1);
f = [1, x_1, x_2, ..., x_m]   (1)
S32: combining the features by a matrix operation, the conversion being shown in formula (2);
F = G(α · (f × f^T)^β)   (2)
wherein α and β are conversion parameters, both set to 0.5; F is the feature map obtained by the conversion; G is the nonlinear sigmoid activation function G(z) = 1/(1 + e^(−z)), which limits the abstract feature values to [0, 1];
S33: cutting each window of length w into n_1 segments of scale l_1, with n_1 satisfying formula (4); in this experiment, l_1 is set to 20 and n_1 is 15;
n_1 = w / l_1   (4)
S34: stacking the n_1 short segments vertically to obtain an l_1 × n_1 two-dimensional matrix;
S4: convolving the two matrices obtained by two-dimensionalization with convolution kernels of different sizes to obtain two multi-channel two-dimensional matrices of the same size, and stacking the two matrices along the channel dimension;
S5: sending the result to the capsule network for training, and saving the network weights.
2. The surface electromyographic signal classification method based on a capsule network according to claim 1, wherein the features extracted in step S2 are shown in the following table:
wherein S_i is defined as the signal value at time i, S_f is the spectrum of the signal S obtained by fast Fourier transform, and P(S_f) is the power spectral intensity of the corresponding frequency band.
3. The surface electromyographic signal classification method based on a capsule network according to claim 1, wherein the stacking specifically comprises: for the obtained two-dimensional feature matrix, using a 3 × 8 × 128 convolution kernel with stride 2; for the obtained two-dimensional original signal matrix, using a 3 × 3 × 128 convolution kernel with stride 2; stacking to obtain a 7 × 7 × 256 feature map S_1.
4. The surface electromyographic signal classification method based on a capsule network according to claim 1, wherein step S5 specifically comprises the following steps:
S51: convolving S_1 with a 3 × 256 convolution kernel to obtain feature map S_2;
S52: concatenating S_1 and S_2 along the channel dimension to fuse multi-level features, and reducing the number of channels with a 1 × 1 × 256 convolution kernel to obtain a 7 × 7 × 256 feature map;
S53: dividing the 7 × 7 × 256 feature map into 32 groups of 8 channels each, i.e., 7 × 7 × 8 × 32, giving 32 primary capsules, each denoted U_i;
S54: the action capsule layer is 5 × 16, and the parameters between the action capsules and the primary capsules are updated by the dynamic routing mechanism, which can be described by FIG. 4; the number of dynamic routing updates is set to 3, the specific formulas are (5) to (9), and the loss function is formula (10);
U′_j|i = W_ij · U_i   (5)
wherein W_ij is the connection weight from the i-th low-level capsule to the j-th high-level capsule; U′_j|i is the prediction vector of the i-th low-level capsule for the j-th high-level capsule; c_ij is the coupling coefficient determined by dynamic routing; b_ij is the log prior probability that the i-th low-level capsule couples to the j-th high-level capsule;
b_ij = b_ij + U′_j|i · v_j   (9)
the prediction vectors of all low-level capsules are multiplied by their coupling coefficients and accumulated to obtain the input s_j of the high-level capsule, which is then converted into the activation value v_j of the high-level capsule by a nonlinear "compression" activation function (the squash function); this activation function preserves the direction of the input vector while compressing its modulus into [0, 1); finally, b_ij is updated with the log-likelihood term, a larger product indicating that the direction of the low-level capsule is closer to that of the high-level capsule; repeating these formulas a certain number of times yields the final prediction result; in this experiment, the number of iterations is set to 3;
where N is the number of all actions; T_k = 1 indicates that class k is present, and 0 otherwise; m+ penalizes false positives; m− penalizes false negatives; λ is a weight parameter used to reduce the influence of mispredicted labels.
5. A surface electromyographic signal classification method based on a capsule network, characterized in that: in step S54, N is 5; m+ is 0.9; m− is 0.1; and λ is 0.5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110034453.6A CN112733721B (en) | 2021-01-12 | 2021-01-12 | Surface electromyographic signal classification method based on capsule network |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110034453.6A CN112733721B (en) | 2021-01-12 | 2021-01-12 | Surface electromyographic signal classification method based on capsule network |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112733721A true CN112733721A (en) | 2021-04-30 |
CN112733721B CN112733721B (en) | 2022-03-15 |
Family
ID=75590344
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110034453.6A Active CN112733721B (en) | 2021-01-12 | 2021-01-12 | Surface electromyographic signal classification method based on capsule network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112733721B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113842151A (en) * | 2021-09-30 | 2021-12-28 | 杭州电子科技大学 | Cross-tested EEG (electroencephalogram) cognitive state detection method based on efficient multi-source capsule network |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105997064A (en) * | 2016-05-17 | 2016-10-12 | 成都奥特为科技有限公司 | Method for identifying human lower limb surface EMG signals (electromyographic signals) |
US20180239956A1 (en) * | 2017-01-19 | 2018-08-23 | Mindmaze Holding Sa | Systems, methods, devices and apparatuses for detecting facial expression |
CN110569781A (en) * | 2019-09-05 | 2019-12-13 | 河海大学常州校区 | time sequence classification method based on improved capsule network |
CN111616706A (en) * | 2020-05-20 | 2020-09-04 | 山东中科先进技术研究院有限公司 | Surface electromyogram signal classification method and system based on convolutional neural network |
CN113505822A (en) * | 2021-06-30 | 2021-10-15 | 中国矿业大学 | Multi-scale information fusion upper limb action classification method based on surface electromyographic signals |
-
2021
- 2021-01-12 CN CN202110034453.6A patent/CN112733721B/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105997064A (en) * | 2016-05-17 | 2016-10-12 | 成都奥特为科技有限公司 | Method for identifying human lower limb surface EMG signals (electromyographic signals) |
US20180239956A1 (en) * | 2017-01-19 | 2018-08-23 | Mindmaze Holding Sa | Systems, methods, devices and apparatuses for detecting facial expression |
CN110569781A (en) * | 2019-09-05 | 2019-12-13 | 河海大学常州校区 | time sequence classification method based on improved capsule network |
CN111616706A (en) * | 2020-05-20 | 2020-09-04 | 山东中科先进技术研究院有限公司 | Surface electromyogram signal classification method and system based on convolutional neural network |
CN113505822A (en) * | 2021-06-30 | 2021-10-15 | 中国矿业大学 | Multi-scale information fusion upper limb action classification method based on surface electromyographic signals |
Non-Patent Citations (2)
Title |
---|
YU LIU ET AL: "Multi-channel EEG-based emotion recognition via a multi-level features guided capsule network", 《COMPUTERS IN BIOLOGY AND MEDICINE》 * |
LUO JUNJIN ET AL: "... based on temporal two-dimensionalization and convolutional feature fusion", 《Pattern Recognition and Artificial Intelligence》 *
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113842151A (en) * | 2021-09-30 | 2021-12-28 | 杭州电子科技大学 | Cross-tested EEG (electroencephalogram) cognitive state detection method based on efficient multi-source capsule network |
CN113842151B (en) * | 2021-09-30 | 2024-01-05 | 杭州电子科技大学 | Cross-test EEG cognitive state detection method based on efficient multi-source capsule network |
Also Published As
Publication number | Publication date |
---|---|
CN112733721B (en) | 2022-03-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113288183B (en) | Silent voice recognition method based on facial neck surface myoelectricity | |
Zhao et al. | Noise rejection for wearable ECGs using modified frequency slice wavelet transform and convolutional neural networks | |
CN103294199B (en) | A kind of unvoiced information identifying system based on face's muscle signals | |
Ibrahimy et al. | Design and optimization of Levenberg-Marquardt based neural network classifier for EMG signals to identify hand motions | |
CN111584029B (en) | Electroencephalogram self-adaptive model based on discriminant confrontation network and application of electroencephalogram self-adaptive model in rehabilitation | |
CN112043473B (en) | Parallel nested and autonomous preferred classifier for brain-myoelectricity fusion perception of intelligent artificial limb | |
CN110610172B (en) | Myoelectric gesture recognition method based on RNN-CNN architecture | |
CN110598676B (en) | Deep learning gesture electromyographic signal identification method based on confidence score model | |
Rahimian et al. | Few-shot learning for decoding surface electromyography for hand gesture recognition | |
Montazerin et al. | ViT-HGR: vision transformer-based hand gesture recognition from high density surface EMG signals | |
Xu et al. | Intelligent emotion detection method based on deep learning in medical and health data | |
CN112733721B (en) | Surface electromyographic signal classification method based on capsule network | |
Qureshi et al. | E2cnn: An efficient concatenated cnn for classification of surface emg extracted from upper limb | |
CN115177273B (en) | Multi-head re-attention mechanism-based movement intention recognition method and system | |
CN115271033A (en) | Medical image processing model construction and processing method based on federal knowledge distillation | |
CN112472101A (en) | Deep learning electrocardiogram data classification method and device based on conversion technology | |
CN113116363A (en) | Method for judging hand fatigue degree based on surface electromyographic signals | |
CN110321856B (en) | Time-frequency multi-scale divergence CSP brain-computer interface method and device | |
CN112998725A (en) | Rehabilitation method and system of brain-computer interface technology based on motion observation | |
CN115024735B (en) | Cerebral apoplexy patient rehabilitation method and system based on movement intention recognition model | |
Kurzynski et al. | Multiple classifier system applied to the control of bioprosthetic hand based on recognition of multimodal biosignals | |
Ye et al. | Attention bidirectional LSTM networks based mime speech recognition using sEMG data | |
CN111783669B (en) | Surface electromyogram signal classification and identification method for individual user | |
CN114999461A (en) | Silent voice decoding method based on facial neck surface myoelectricity | |
Chen et al. | Recognition of american sign language gestures based on electromyogram (emg) signal with xgboost machine learning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |