CN109662710A - An EMG feature extraction method based on convolutional neural networks
- Publication number
- CN109662710A
- Authority
- CN
- China
- Prior art keywords
- training
- neural networks
- convolutional neural
- feature
- myoelectricity
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/389—Electromyography [EMG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
Abstract
The invention discloses an EMG feature extraction method based on convolutional neural networks. The method takes raw, unprocessed EMG signals as input samples and uses a training strategy that combines pre-training with fine-tuning to obtain a feature extraction model based on a convolutional neural network. The output of a fully connected layer in the network model serves as a new EMG feature; this feature can be used alone, or combined with traditional EMG features, for EMG pattern classification. The EMG features obtained with the method of the invention can serve as a necessary complement to traditional EMG features, improving the accuracy and robustness of EMG pattern classification.
Description
Technical field
The present invention relates to physiological signal processing and analysis technology, and in particular to a new method for extracting EMG signal features.
Background technique
An EMG signal is the weak electrical signal generated when a muscle contracts. The electrical signal produced during muscle contraction is conducted through body tissue and forms a potential change at the skin surface. When this potential change is amplified, collected, and stored, it is called a surface EMG signal. Surface EMG signals have two main uses: 1) clinical diagnosis and pathological analysis; 2) human-computer interaction, such as the control of prosthetic hands and legs. Because EMG signals are closely tied to the execution of user intent, decoding them in a suitable way can produce intuitive control commands. Compared with EEG signals and neural signals, EMG signals are more stable and larger in amplitude, and are widely regarded as the most promising control signal source for prosthetic terminals.
Nevertheless, EMG signals still suffer interference from various adverse factors, such as muscle fatigue, electrode displacement, and inter-user variability, so that human-machine interfaces based on EMG signals face challenges in terms of stability. An EMG signal is essentially a random signal, and feature extraction is a necessary step in its analysis. Traditional EMG features fall into time-domain, frequency-domain, and time-frequency-domain features. However, even when these features are used in combination, the practical problems of low accuracy and poor stability in EMG pattern classification remain unsolved. Taking myoelectric prosthetic hand control as an example, the successful commercial cases are limited to using two-channel EMG to open and close a prosthetic hand. As things stand, human-computer interaction based on EMG signals is still in an early stage and needs further development. Finding an EMG feature with noise resistance is an important opportunity to break through this bottleneck.
Convolutional neural networks are multilayer neural networks that are good at extracting reliable, task-relevant information from raw data. The present invention proposes a method that uses a convolutional neural network to extract robust features from raw EMG signals. The method takes a large amount of EMG data collected from multiple users over a long period as input, trains a highly robust convolutional neural network, and uses the output of a fully connected layer in the network as a new EMG feature analogous to traditional features. Data analysis shows that the features extracted by this method improve the accuracy of across-day gesture recognition.
Therefore, those skilled in the art are working to develop an EMG signal feature with strong interference resistance, so as to improve the stability of human-machine interfaces based on EMG signals.
Summary of the invention
In view of the above drawbacks of the prior art, the technical problem to be solved by the present invention is that traditional EMG signal features cannot meet the stability requirements of human-machine interface control.
To achieve this goal, the present invention provides an EMG feature extraction method based on convolutional neural networks, comprising pre-training and fine-tuning for training the convolutional neural network, and taking the output of a fully connected layer in the convolutional neural network as the EMG feature value; the convolutional neural network is a multilayer neural network comprising two convolutional layers and fully connected layers.
Further, the pre-training and fine-tuning strategy for training the convolutional neural network includes the following steps:
Step 1: pre-train the convolutional neural network using the EMG data of all subjects;
Step 2: further train the pre-trained neural network using the EMG data of the target subject, obtaining the fine-tuned network model.
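The two-stage strategy above can be sketched as follows. This is a minimal illustration, not the patent's implementation: a plain softmax classifier stands in for the convolutional network, and the arrays `all_subject_X` and `target_X` are synthetic, hypothetical placeholders for the pooled and target-subject EMG data.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def train(W, X, y, epochs, lr=0.1):
    """Gradient-descent training of a softmax classifier (toy stand-in for the CNN)."""
    onehot = np.eye(W.shape[1])[y]
    for _ in range(epochs):
        p = softmax(X @ W)
        W -= lr * X.T @ (p - onehot) / len(X)
    return W

rng = np.random.default_rng(0)
n_classes, n_features = 13, 64            # 13 gestures; toy feature dimension

# Step 1: pre-train on data pooled from all subjects.
all_subject_X = rng.normal(size=(600, n_features))
all_subject_y = rng.integers(0, n_classes, size=600)
W = np.zeros((n_features, n_classes))
W = train(W, all_subject_X, all_subject_y, epochs=500)

# Step 2: fine-tune the pre-trained weights on the target subject only.
target_X = rng.normal(size=(100, n_features))
target_y = rng.integers(0, n_classes, size=100)
W = train(W, target_X, target_y, epochs=500)
```

The key design point is that step 2 starts from the weights produced in step 1 rather than from a fresh initialization, so the fine-tuned model retains what was learned across subjects.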
Further, taking the fully connected layer output as the EMG feature value includes the following steps:
Step 1: use the fine-tuned network model as the feature extraction network, and save the network structure and parameters;
Step 2: add an output interface to the fully connected layer of the network;
Step 3: with real-time EMG data as input, obtain the output of the fully connected layer as the EMG feature.
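The feature extraction steps can be sketched like this. It is a minimal sketch under assumed shapes, not the patent's saved TensorFlow graph: the trained weights are frozen (the random values below are hypothetical), and the hidden fully connected layer's activation, rather than the classifier output, is returned as the feature vector.

```python
import numpy as np

rng = np.random.default_rng(1)

# Frozen parameters of a toy "trained" network (hypothetical values):
# a 128-node fully connected layer followed by a 13-class output layer.
W_hidden = rng.normal(scale=0.01, size=(16 * 256, 128))
W_out = rng.normal(scale=0.01, size=(128, 13))

def extract_feature(emg_segment):
    """Return the fully connected layer's output as the EMG feature."""
    x = emg_segment.reshape(-1)             # flatten the 16x256 raw segment
    hidden = np.maximum(x @ W_hidden, 0.0)  # ReLU activation (an assumption)
    return hidden                           # 128-dim feature; output layer unused

segment = rng.normal(size=(16, 256))        # one real-time raw EMG window
feature = extract_feature(segment)
```

In a real TensorFlow model the same effect is achieved by adding an output interface at the fully connected layer, i.e. running the saved graph only up to that layer.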
Brief description of the drawings
Fig. 1 is the convolutional network structure of the EMG feature extraction method based on convolutional neural networks of the invention, in which 1 is the raw EMG signal, 2 and 3 are convolutional layers, 4 and 5 are fully connected layers, 6 is the output layer, and 7 is the output of the fully connected layer;
Fig. 2 is the execution flow of the EMG feature extraction method based on convolutional neural networks of the invention;
Fig. 3 is the test result obtained by pre-training and fine-tuning in the specific embodiment;
Fig. 4A-Fig. 4C compare the traditional feature space with the feature space based on convolutional neural networks in distinguishing EMG samples;
Fig. 5 illustrates the influence of the features extracted by the invention and of traditional features on gesture classification accuracy.
Specific embodiment
The EMG feature extraction method based on convolutional neural networks of the invention is further described below with reference to the accompanying drawings and a specific embodiment.
The embodiment uses an EMG database as the analysis object. The database contains upper-arm EMG data from 6 subjects; each subject contributed 10 days of EMG data, collected under 13 different gesture motions. The embodiment uses the EMG signals of the first 7 days as training data and those of the last 3 days as test data.
The embodiment builds the network structure on the TensorFlow platform and accelerates execution with a GeForce GTX 1080 Ti graphics card using CUDA 8.0.44.
The network structure of the embodiment includes 1 input layer, 2 convolutional layers, and 2 fully connected layers. The input is a segmented raw EMG signal of dimension 16*256. The 2 convolutional layers contain 32 and 64 filters of size 3*3, respectively. The output of each convolutional layer undergoes dimensionality reduction by 2*2 max pooling. The 2 fully connected layers contain 128 and 13 hidden nodes, respectively. The output of the 128 hidden nodes of the trained network is the EMG feature extracted from the 16*256 raw EMG signal, while the output of the 13 hidden nodes, passed through a softmax function, classifies the 13 gesture motions. During network training, the second convolutional layer and the first fully connected layer are each processed with Dropout, with keep probabilities of 0.8 and 0.5, respectively.
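Under the assumption of 'same' padding for the 3*3 convolutions (the padding scheme is not stated in the text, so spatial size changes only at the pooling steps), the tensor shapes through the described network work out as follows:

```python
def pool(h, w, k=2):
    """2x2 max pooling halves each spatial dimension (floor division)."""
    return h // k, w // k

h, w = 16, 256        # raw EMG input segment: 16 channels x 256 samples
h, w = pool(h, w)     # after conv1 (32 filters, 3x3, 'same' padding assumed) + 2x2 pooling
h, w = pool(h, w)     # after conv2 (64 filters, 3x3) + 2x2 pooling
c2 = 64               # channel count after the second convolutional layer

flat = h * w * c2     # flattened input size feeding the 128-node fully connected layer
print((h, w, c2), flat)
```

So the 128-node layer would see a 16384-dimensional flattened input under these assumptions; a different padding choice would change the spatial sizes slightly but not the overall structure.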
Pre-training in the embodiment uses the training data of all subjects and yields the pre-trained network after 500 iterations. Fine-tuning uses the training data of the target subject and yields the fine-tuned network after another 500 iterations. The change in recognition accuracy during pre-training and fine-tuning is shown in Fig. 3.
Features of the test data are extracted with the feature network obtained above through pre-training and fine-tuning. After the 128-dimensional features are reduced in dimension, the feature distributions shown in Fig. 4 are obtained, where Fig. 4A is the distribution of samples in the traditional feature space, Fig. 4B is the distribution in the feature space based on convolutional neural networks, and Fig. 4C is the distribution in the space formed by stacking the two kinds of features.
Fig. 5 compares traditional features with the stacked features under LDA and SVM classifiers: stacking the features based on convolutional neural networks onto the traditional features yields better classification results.
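The stacking evaluated in Fig. 5 amounts to concatenating the two feature vectors per sample before classification. A minimal sketch under synthetic data — the nearest-centroid classifier here is a toy stand-in for the LDA/SVM classifiers used in the embodiment:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50
traditional = rng.normal(size=(n, 12))   # e.g. time/frequency-domain features (toy dim)
cnn = rng.normal(size=(n, 128))          # 128-dim fully-connected-layer features
labels = rng.integers(0, 13, size=n)     # 13 gesture classes

# Stack the two feature sets per sample before feeding a classifier.
stacked = np.hstack([traditional, cnn])

def nearest_centroid_predict(X, y, query):
    """Toy classifier: assign the query to the class with the closest mean vector."""
    classes = np.unique(y)
    centroids = np.stack([X[y == c].mean(axis=0) for c in classes])
    return classes[np.argmin(np.linalg.norm(centroids - query, axis=1))]

pred = nearest_centroid_predict(stacked, labels, stacked[0])
```

Because stacking is a simple per-sample concatenation, any downstream classifier can consume the combined vector unchanged, which is what lets the CNN features act as a complement to, rather than a replacement for, the traditional ones.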
Claims (3)
1. An EMG feature extraction method based on convolutional neural networks, characterized in that: the convolutional neural network is trained through pre-training and fine-tuning; the output of a fully connected layer in the convolutional neural network serves as the EMG feature value; and the convolutional neural network is a multilayer neural network comprising two convolutional layers and fully connected layers.
2. The EMG feature extraction method based on convolutional neural networks according to claim 1, characterized in that the pre-training and fine-tuning strategy for training the convolutional neural network includes the following steps:
Step 1: pre-train the convolutional neural network using the EMG data of all subjects;
Step 2: further train the pre-trained neural network using the EMG data of the target subject, obtaining the fine-tuned network model.
3. The EMG feature extraction method based on convolutional neural networks according to claim 2, characterized in that taking the fully connected layer output as the EMG feature value includes the following steps:
Step 1: use the fine-tuned network model as the feature extraction network, and save the network structure and parameters;
Step 2: add an output interface to the fully connected layer of the network;
Step 3: with real-time EMG data as input, obtain the output of the fully connected layer as the EMG feature.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN201811489106.7A | 2018-12-06 | 2018-12-06 | An EMG feature extraction method based on convolutional neural networks
Publications (1)
Publication Number | Publication Date
---|---
CN109662710A | 2019-04-23
Family
ID=66143634
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110991223A (en) * | 2019-10-18 | 2020-04-10 | 武汉虹识技术有限公司 | Method and system for identifying beautiful pupil based on transfer learning |
CN111222398A (en) * | 2019-10-28 | 2020-06-02 | 南京航空航天大学 | Myoelectric signal decoding method based on time-frequency feature fusion |
CN112957056A (en) * | 2021-03-16 | 2021-06-15 | 苏州大学 | Method and system for extracting muscle fatigue grade features by utilizing cooperative network |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080161878A1 (en) * | 2003-10-15 | 2008-07-03 | Tehrani Amir J | Device and method to for independently stimulating hemidiaphragms |
US7733224B2 (en) * | 2006-06-30 | 2010-06-08 | Bao Tran | Mesh network personal emergency response appliance |
CN105608432A (en) * | 2015-12-21 | 2016-05-25 | 浙江大学 | Instantaneous myoelectricity image based gesture identification method |
CN105654037A (en) * | 2015-12-21 | 2016-06-08 | 浙江大学 | Myoelectric signal gesture recognition method based on depth learning and feature images |
CN106980367A (en) * | 2017-02-27 | 2017-07-25 | 浙江工业大学 | A kind of gesture identification method based on myoelectricity topographic map |
CN108345873A (en) * | 2018-03-22 | 2018-07-31 | 哈尔滨工业大学 | A kind of multiple degrees of freedom body motion information analytic method based on multilayer convolutional neural networks |
CN108491077A (en) * | 2018-03-19 | 2018-09-04 | 浙江大学 | A kind of surface electromyogram signal gesture identification method for convolutional neural networks of being divided and ruled based on multithread |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20190423