CN113057657A - Electroencephalogram emotion classification method based on multi-scale connectivity characteristics and meta-transfer learning - Google Patents
Electroencephalogram emotion classification method based on multi-scale connectivity characteristics and meta-transfer learning
- Publication number
- CN113057657A (application CN202110299833.2A)
- Authority
- CN
- China
- Prior art keywords
- electroencephalogram
- training
- scale
- residual network
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data involving training the classification device
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Artificial Intelligence (AREA)
- Psychiatry (AREA)
- Molecular Biology (AREA)
- General Health & Medical Sciences (AREA)
- Veterinary Medicine (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physiology (AREA)
- Evolutionary Computation (AREA)
- Fuzzy Systems (AREA)
- Mathematical Physics (AREA)
- Child & Adolescent Psychology (AREA)
- Developmental Disabilities (AREA)
- Educational Technology (AREA)
- Hospice & Palliative Care (AREA)
- Psychology (AREA)
- Social Psychology (AREA)
- Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)
Abstract
The invention discloses an electroencephalogram (EEG) emotion classification method based on multi-scale connectivity characteristics and meta-transfer learning. The method comprises the following steps: collecting 32-lead EEG signals in an evoked state; preprocessing the EEG signals and segmenting them with a sliding window; calculating a phase locking value (PLV) matrix from the EEG signals of each window; constructing a multi-scale residual network; training the multi-scale residual network with a meta-transfer learning framework to reduce individual differences between subjects' EEG signals; and inputting the PLV matrix into the multi-scale residual network for emotion classification. Through the multi-scale residual network, the method fully mines the connectivity characteristics of the EEG, explores the relation between connectivity characteristics and emotional state, and captures the emotional interaction of different brain areas; through the meta-transfer learning framework, it effectively alleviates the problem of large individual EEG differences in cross-subject scenarios.
Description
Technical Field
The invention relates to the field of electroencephalogram (EEG) signal processing, and in particular to an EEG emotion classification method based on multi-scale connectivity characteristics and meta-transfer learning.
Background
Emotion recognition has become a very active problem in the brain-computer interface (BCI) field, and a growing number of researchers are studying it. Because the electroencephalogram (EEG) signal has good time resolution and is easy to acquire, it is widely used in brain-computer interface applications. Although facial expressions or other physiological signals, such as electrocardiogram or electrodermal signals, can also be used for emotion recognition, subjects can easily mask them, making the results less objective. The EEG signal, by contrast, cannot be deliberately disguised; it is more objective than facial expressions and other physiological signals and more truly reflects emotional changes. Therefore, more and more researchers perform emotion recognition through EEG signals.
However, under the same stimulus, the EEG signals sampled from different subjects differ greatly in voltage amplitude, reflecting different degrees of brain activity across subjects. EEG-based emotion recognition therefore suffers from large individual differences, which greatly reduce the accuracy of a model in cross-subject experiments.
A novel emotion classification algorithm is therefore needed that alleviates the problem of large individual EEG differences in cross-subject experimental scenarios and better mines the emotional interactivity among different brain areas.
Disclosure of Invention
In order to overcome the defects and shortcomings of the prior art, the invention provides an electroencephalogram emotion classification method based on multi-scale connectivity characteristics and meta-transfer learning. The method mines the relation between connectivity characteristics and emotional states, effectively captures the emotional interaction of different brain areas through a multi-scale residual network, and, by combining the advantages of meta-learning and transfer learning, reduces the individual differences between subjects' EEG signals and improves the accuracy of the emotion classification model in cross-subject experiments.
The purpose of the invention is realized by the following technical scheme. The electroencephalogram emotion classification method based on multi-scale connectivity characteristics and meta-transfer learning comprises the following steps:
S1, acquiring electroencephalogram signals in an evoked state;
S2, preprocessing the electroencephalogram signals and segmenting them with a sliding window to obtain electroencephalogram segments;
S3, for each electroencephalogram segment, calculating the phase synchrony between the electroencephalogram signals of each pair of channels and averaging the phase differences over time to obtain a phase locking value (PLV) matrix;
S4, constructing a multi-scale residual network to effectively capture the emotional interactivity of different brain areas;
S5, training the multi-scale residual network with a meta-transfer learning framework to reduce individual differences between subjects' electroencephalogram signals;
S6, inputting the PLV matrix into the multi-scale residual network for emotion classification.
In a preferred embodiment, step S4 comprises the following steps:
S41, constructing a convolutional layer with 64 filters, a 3×3 convolution kernel and a stride of 2 for feature extraction;
S42, constructing a max-pooling layer with a 3×3 receptive field and a stride of 2 for feature down-sampling;
S43, constructing a multi-scale module comprising three branches of residual blocks with 3×3, 5×5 and 7×7 convolution kernels, respectively, which capture and fuse the shallow semantic features of the electroencephalogram signals; the different receptive fields fully capture the connectivity features of the electroencephalogram channels, reflect the activity degrees of different brain areas and capture the emotional interaction among them;
S44, constructing a global average pooling layer to compress the features and reduce the computational load of the model;
S45, constructing a fully connected layer for dimension reduction and a softmax layer for distinguishing negative from positive emotional states.
In a preferred embodiment, step S5 comprises the following steps:
S51, in a pre-training stage, training an initial model with all samples of the training set and learning a feature extractor by gradient descent;
S52, repeatedly sampling electroencephalogram data from the training set to construct two data sets: a support set for training the base learner and a query set for training the meta-learner;
S53, loading the weights obtained in the pre-training stage, training the multi-scale residual network with the support set to learn general electroencephalogram features, and updating the parameters of the convolutional layers, completing the first gradient update;
S54, fixing the parameters of the convolutional layers, training the multi-scale residual network with the query set to learn personalized electroencephalogram features, and updating the parameters of the fully connected layers, completing the second gradient update;
S55, loading the weights obtained from meta-training and evaluating the model on the test set.
Compared with the prior art, the invention has the following advantages and beneficial effects:
1. The invention extracts the phase locking value matrix of the electroencephalogram signal, explores the relation between connectivity characteristics and emotional states, and effectively captures the emotional interaction of different brain areas through a multi-scale residual network.
2. Through the meta-transfer learning framework, the invention fully exploits the advantages of meta-learning and transfer learning, effectively alleviates the problem of large individual EEG differences in cross-subject scenarios, and improves the accuracy of electroencephalogram emotion classification.
Drawings
FIG. 1 is a general flow chart of the electroencephalogram emotion classification method based on multi-scale connectivity characteristics and meta-transfer learning according to the present invention;
FIG. 2 is a schematic diagram of the multi-scale residual network in an embodiment of the invention;
FIG. 3 is a diagram of the multi-scale module in the multi-scale residual network in an embodiment of the present invention;
FIG. 4 is a schematic diagram of the meta-transfer learning framework in an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to examples and drawings, but the present invention is not limited to these examples.
Examples
As shown in fig. 1, the electroencephalogram emotion classification method based on multi-scale connectivity characteristics and meta-transfer learning in this embodiment mainly comprises the following steps:
S1, acquiring 32-lead electroencephalogram signal data in an evoked state.
In practical applications, EEG signals can be acquired from the 32 electrode positions specified by the international 10-20 system.
S2, preprocessing the electroencephalogram signals and segmenting them with a sliding window to obtain electroencephalogram segments.
In this embodiment, a band-pass filter is used to remove noise, motion artifacts and the baseline drift of the electroencephalogram signal; the signal is decomposed into four bands (theta, alpha, beta, gamma) and segmented with a non-overlapping sliding window of 3 seconds. Step S2 comprises:
S21, removing noise and motion artifacts of the electroencephalogram signal with a band-pass filter of 4–45 Hz and removing its baseline drift;
S22, decomposing the electroencephalogram signal into four bands: theta (4–7 Hz), alpha (8–13 Hz), beta (14–30 Hz) and gamma (31–50 Hz).
S23, segmenting the electroencephalogram signal with a non-overlapping sliding window of 3 seconds to obtain electroencephalogram segments.
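As a minimal sketch of steps S21–S23 (assuming SciPy and a hypothetical sampling rate of 128 Hz; the patent specifies neither the sampling rate nor the filter order or implementation), the band-pass filtering, band decomposition and non-overlapping 3-second windowing could look like:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 128  # assumed sampling rate (Hz); the patent does not state one
BANDS = {"theta": (4, 7), "alpha": (8, 13), "beta": (14, 30), "gamma": (31, 50)}

def bandpass(sig, low, high, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter (used for S21 and S22)."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, sig, axis=-1)

def preprocess(eeg, fs=FS, win_sec=3):
    """eeg: (channels, samples) array. Returns {band: (n_windows, channels, win)}."""
    eeg = bandpass(eeg, 4, 45, fs)                       # S21: 4-45 Hz, removes drift
    win = win_sec * fs
    n_win = eeg.shape[1] // win                          # S23: non-overlapping windows
    out = {}
    for name, (lo, hi) in BANDS.items():                 # S22: four bands
        filtered = bandpass(eeg, lo, hi, fs)
        out[name] = np.stack(
            [filtered[:, i * win:(i + 1) * win] for i in range(n_win)]
        )
    return out
```

The windowed, band-decomposed segments are the input to the PLV computation of step S3.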
S3, for each electroencephalogram segment, calculating the phase synchrony between the electroencephalogram signals of each pair of channels and averaging the phase differences over time to obtain a phase locking value matrix.
In this embodiment, each segmented electroencephalogram segment is transformed with the Hilbert-Huang transform to obtain the instantaneous phase of each channel; the phase synchrony between every pair of channels is then computed by averaging the phase differences over time, yielding a phase locking value matrix of dimension 32 × 32.
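A minimal NumPy/SciPy sketch of this step (using the plain Hilbert transform to obtain instantaneous phases, the standard way the phase locking value is computed; the patent names the Hilbert-Huang transform): for channels i and j, PLV_ij = |mean over t of exp(i(φ_i(t) − φ_j(t)))|.

```python
import numpy as np
from scipy.signal import hilbert

def plv_matrix(segment):
    """segment: (channels, samples) EEG window.
    Returns the (channels, channels) phase locking value matrix."""
    phase = np.angle(hilbert(segment, axis=-1))     # instantaneous phase per channel
    # pairwise phase differences: (ch, 1, t) - (1, ch, t) -> (ch, ch, t)
    diff = phase[:, None, :] - phase[None, :, :]
    return np.abs(np.exp(1j * diff).mean(axis=-1))  # |time-average of e^{i*dphi}|
```

For a 32-channel segment the result is the 32 × 32 matrix fed to the network; its diagonal is 1 (a channel is perfectly phase-locked with itself) and it is symmetric.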
S4, constructing a multi-scale residual network to effectively capture the emotional interactivity of different brain areas.
In this embodiment, a convolutional layer, a max-pooling layer, a multi-scale module, a global average pooling layer, a fully connected layer and a softmax layer are constructed in sequence to form the multi-scale residual network. The multi-scale module consists of three branches of residual blocks with different convolution kernel sizes: 3×3, 5×5 and 7×7. The multiple scales effectively capture the emotional interactivity of different brain areas. The construction specifically comprises:
S41, constructing a convolutional layer with 64 filters, a 3×3 convolution kernel and a stride of 2 for feature extraction.
S42, constructing a max-pooling layer with a 3×3 receptive field and a stride of 2 for feature down-sampling.
S43, constructing a multi-scale module with three branches of residual blocks whose convolution kernels are 3×3, 5×5 and 7×7; the only difference between the branches is the kernel size, which varies from 3 to 7. Through shortcut connections, the residual blocks better capture and fuse the shallow semantic features of the EEG signal; each residual block is repeated 3 times. The phase locking value matrix reflects the correlation between electroencephalogram channels, and its entries represent the strength of co-activity between channels. Because the convolution kernel size differs between branches, the different receptive fields fully capture the connectivity features of the channels, reflect the activity degrees of different brain areas and capture the emotional interaction among them.
S44, constructing a global average pooling layer to compress the features and reduce the computational load of the model.
S45, constructing a fully connected layer for dimension reduction and a softmax layer for distinguishing negative from positive emotional states.
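The architecture of steps S41–S45 can be sketched in PyTorch as follows. This is an illustrative reconstruction, not the patent's exact implementation: the per-branch channel width, the use of batch normalization, and concatenation as the way of fusing the three branches are assumptions, since the patent does not specify them.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Residual block with a shortcut connection; `kernel` sets the receptive field."""
    def __init__(self, channels, kernel):
        super().__init__()
        pad = kernel // 2
        self.conv1 = nn.Conv2d(channels, channels, kernel, padding=pad)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel, padding=pad)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return torch.relu(out + x)  # shortcut connection

class MultiScaleResNet(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(1, 64, 3, stride=2, padding=1),  # S41: 64 filters, 3x3, stride 2
            nn.MaxPool2d(3, stride=2, padding=1),      # S42: 3x3 max pooling, stride 2
        )
        # S43: three branches with 3x3, 5x5 and 7x7 kernels, each block repeated 3 times
        self.branches = nn.ModuleList(
            [nn.Sequential(*[ResidualBlock(64, k) for _ in range(3)]) for k in (3, 5, 7)]
        )
        self.gap = nn.AdaptiveAvgPool2d(1)             # S44: global average pooling
        self.fc = nn.Linear(64 * 3, n_classes)         # S45: fully connected head

    def forward(self, x):                              # x: (batch, 1, 32, 32) PLV matrix
        x = self.stem(x)
        feats = [self.gap(branch(x)).flatten(1) for branch in self.branches]
        return self.fc(torch.cat(feats, dim=1))        # logits; softmax at inference
```

Following PyTorch convention, the network returns logits and the softmax of S45 is folded into the loss function (`cross_entropy`) during training.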
S5, training the multi-scale residual network with a meta-transfer learning framework to reduce individual differences between subjects' electroencephalogram signals. The meta-transfer learning framework is divided into a pre-training stage, a meta-training stage and a meta-testing stage. The pre-training stage learns the shallow semantic features of the electroencephalogram; the meta-training stage learns general and personalized electroencephalogram features; the meta-testing stage verifies the performance of the model on the test set. The method comprises the following steps:
S51, in the pre-training stage, all training samples of the training set are used to train the initial model so that it converges quickly in the subsequent stages; the feature extractor is learned by gradient descent. Through transfer learning, the model learns the common shallow semantic features of the electroencephalogram.
S52, electroencephalogram data are repeatedly sampled from the training set to construct two data sets: a support set for training the base learner and a query set for training the meta-learner.
S53, the weights obtained in the pre-training stage are loaded; the multi-scale residual network of step S4 is first trained with the support set to learn general electroencephalogram features, updating the parameters of the convolutional layers and completing the first gradient update.
S54, the parameters of the convolutional layers are fixed; the multi-scale residual network of step S4 is then trained with the query set to learn personalized electroencephalogram features, updating the parameters of the fully connected layers and completing the second gradient update.
S55, the weights obtained from meta-training are loaded and the model is evaluated on the test set.
S6, inputting the phase locking value matrix into the multi-scale residual network for emotion classification.
As shown in fig. 2, the multi-scale residual network mainly comprises a convolutional layer, a max-pooling layer, a multi-scale module, a global pooling layer, a fully connected layer and a softmax layer; the 32 × 32 phase locking value matrix is input into the network for training to obtain the emotion classification result.
As shown in fig. 3, the multi-scale module is composed of three branches of residual blocks with 3×3, 5×5 and 7×7 convolution kernels; the only difference between the branches is the kernel size, which varies from 3 to 7. Through shortcut connections, the residual blocks better capture and fuse the shallow semantic features of the EEG signal, and each residual block is repeated 3 times. The phase locking value matrix reflects the correlation between electroencephalogram channels, and its entries represent the strength of co-activity between channels. Because the convolution kernel size of each branch differs, different receptive fields fully capture the connectivity characteristics of the different channels.
As shown in fig. 4, the meta-transfer learning framework is divided into a pre-training stage, a meta-training stage and a meta-testing stage.
In the pre-training stage, all training samples of the training set are used to train the initial model so that it converges quickly in the subsequent stages; the feature extractor is learned by gradient descent. Through transfer learning, the model learns the common shallow semantic features of the electroencephalogram.
In the meta-training stage, electroencephalogram data are first repeatedly sampled from the training set to construct two data sets: a support set for training the base learner and a query set for training the meta-learner. The weights obtained in the pre-training stage are then loaded, the network is trained on the support set to learn general electroencephalogram features, and, keeping those general features fixed, the model is fine-tuned on the query set to learn personalized electroencephalogram features.
In the meta-testing stage, the weights trained in the meta-training stage are loaded and the model is evaluated on the test set.
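One meta-training episode, i.e. the two gradient updates of S53–S54, could be sketched as follows. This assumes a PyTorch model whose fully connected head parameters contain "fc" in their names; the optimizer, learning rate and parameter-naming convention are illustrative assumptions, not details given in the patent.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def meta_train_episode(net, support, query, lr=1e-2):
    """One meta-training episode (S53-S54): the support set updates the
    convolutional feature extractor (first gradient update); then, with the
    convolutional parameters frozen, the query set updates the fully
    connected head (second gradient update)."""
    xs, ys = support
    xq, yq = query
    conv_params = [p for n, p in net.named_parameters() if "fc" not in n]
    fc_params = [p for n, p in net.named_parameters() if "fc" in n]

    # S53: learn general EEG features on the support set
    opt_conv = torch.optim.SGD(conv_params, lr=lr)
    opt_conv.zero_grad()
    F.cross_entropy(net(xs), ys).backward()
    opt_conv.step()

    # S54: freeze the convolutional layers, personalize the head on the query set
    for p in conv_params:
        p.requires_grad_(False)
    opt_fc = torch.optim.SGD(fc_params, lr=lr)
    opt_fc.zero_grad()
    loss = F.cross_entropy(net(xq), yq)
    loss.backward()
    opt_fc.step()
    for p in conv_params:  # unfreeze for the next episode
        p.requires_grad_(True)
    return loss.item()
```

In the full framework this episode would be repeated over many support/query samplings drawn from the training subjects, starting from the pre-trained weights.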
The above embodiments are preferred embodiments of the present invention, but the present invention is not limited to the above embodiments, and any other changes, modifications, substitutions, combinations, and simplifications which do not depart from the spirit and principle of the present invention should be construed as equivalents thereof, and all such changes, modifications, substitutions, combinations, and simplifications are intended to be included in the scope of the present invention.
Claims (6)
1. The electroencephalogram emotion classification method based on multi-scale connectivity characteristics and meta-transfer learning is characterized by comprising the following steps:
S1, acquiring electroencephalogram signals in an evoked state;
S2, preprocessing the electroencephalogram signals and segmenting them with a sliding window to obtain electroencephalogram segments;
S3, for each electroencephalogram segment, calculating the phase synchrony between the electroencephalogram signals of each pair of channels and averaging the phase differences over time to obtain a phase locking value matrix;
S4, constructing a multi-scale residual network to effectively capture the emotional interactivity of different brain areas;
S5, training the multi-scale residual network with a meta-transfer learning framework to reduce individual differences between subjects' electroencephalogram signals;
S6, inputting the phase locking value matrix into the multi-scale residual network for emotion classification.
2. The electroencephalogram emotion classification method according to claim 1, wherein the electroencephalogram signal in the evoked state in step S1 refers to an electroencephalogram signal acquired while a specific emotion of the subject is evoked by a cognitive task.
3. The electroencephalogram emotion classification method according to claim 1, wherein step S2 comprises the following steps:
S21, removing noise, motion artifacts and baseline drift of the electroencephalogram signal with a band-pass filter;
S22, decomposing the electroencephalogram signal into four bands: theta, alpha, beta and gamma;
S23, segmenting the electroencephalogram signal with a non-overlapping sliding window of n seconds to obtain electroencephalogram segments, wherein n is a preset duration.
4. The electroencephalogram emotion classification method according to claim 1, wherein step S3 comprises the following steps:
S31, performing the Hilbert-Huang transform on each segmented electroencephalogram segment to obtain the instantaneous phase of each channel;
S32, calculating the phase synchrony between every pair of channels and averaging the phase differences over time to obtain a phase locking value matrix.
5. The electroencephalogram emotion classification method according to claim 1, wherein step S4 comprises the following steps:
S41, constructing a convolutional layer with 64 filters, a 3×3 convolution kernel and a stride of 2 for feature extraction;
S42, constructing a max-pooling layer with a 3×3 receptive field and a stride of 2 for feature down-sampling;
S43, constructing a multi-scale module comprising three branches of residual blocks with 3×3, 5×5 and 7×7 convolution kernels, respectively, which capture and fuse the shallow semantic features of the electroencephalogram signals; the different receptive fields fully capture the connectivity features of the electroencephalogram channels, reflect the activity degrees of different brain areas and capture the emotional interaction among them;
S44, constructing a global average pooling layer to compress the features and reduce the computational load of the model;
S45, constructing a fully connected layer for dimension reduction and a softmax layer for distinguishing negative from positive emotional states.
6. The electroencephalogram emotion classification method according to claim 1, wherein step S5 comprises the following steps:
S51, in a pre-training stage, training an initial model with all samples of the training set and learning a feature extractor by gradient descent;
S52, repeatedly sampling electroencephalogram data from the training set to construct two data sets: a support set for training the base learner and a query set for training the meta-learner;
S53, loading the weights obtained in the pre-training stage, training the multi-scale residual network with the support set to learn general electroencephalogram features, and updating the parameters of the convolutional layers, completing the first gradient update;
S54, fixing the parameters of the convolutional layers, training the multi-scale residual network with the query set to learn personalized electroencephalogram features, and updating the parameters of the fully connected layers, completing the second gradient update;
S55, loading the weights obtained from meta-training and evaluating the model on the test set.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110299833.2A CN113057657B (en) | 2021-03-22 | 2021-03-22 | Electroencephalogram emotion classification method based on multi-scale connectivity characteristics and meta-transfer learning |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110299833.2A CN113057657B (en) | 2021-03-22 | 2021-03-22 | Electroencephalogram emotion classification method based on multi-scale connectivity characteristics and meta-transfer learning |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113057657A true CN113057657A (en) | 2021-07-02 |
CN113057657B CN113057657B (en) | 2022-09-13 |
Family
ID=76563296
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110299833.2A Active CN113057657B (en) | 2021-03-22 | Electroencephalogram emotion classification method based on multi-scale connectivity characteristics and meta-transfer learning |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113057657B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114145744A (en) * | 2021-11-22 | 2022-03-08 | 华南理工大学 | Cross-device forehead electroencephalogram emotion recognition method and system |
CN114431862A (en) * | 2021-12-22 | 2022-05-06 | 山东师范大学 | Multi-modal emotion recognition method and system based on brain function connection network |
CN115969388A (en) * | 2023-02-28 | 2023-04-18 | 河北工业大学 | Doppler radar heartbeat detection method based on multi-scale residual shrinkage network |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105956546A (en) * | 2016-04-28 | 2016-09-21 | 杭州电子科技大学 | Emotion recognition method based on EEG signals |
CN111513735A (en) * | 2020-05-31 | 2020-08-11 | 天津大学 | Major depressive disorder identification system based on brain-computer interface and deep learning and application |
CN111639679A (en) * | 2020-05-09 | 2020-09-08 | 西北工业大学 | Small sample learning method based on multi-scale metric learning |
CN112057089A (en) * | 2020-08-31 | 2020-12-11 | 五邑大学 | Emotion recognition method, emotion recognition device and storage medium |
CN112120694A (en) * | 2020-08-19 | 2020-12-25 | 中国地质大学(武汉) | Motor imagery electroencephalogram signal classification method based on neural network |
CN112200211A (en) * | 2020-07-17 | 2021-01-08 | 南京农业大学 | Small sample fish identification method and system based on residual error network and transfer learning |
WO2021026400A1 (en) * | 2019-08-06 | 2021-02-11 | Neuroenhancement Lab, LLC | System and method for communicating brain activity to an imaging device |
Non-Patent Citations (1)
Title |
---|
Liu Changyuan, "Research on EEG emotion recognition based on MAResnet", Chinese Journal of Scientific Instrument * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114145744A (en) * | 2021-11-22 | 2022-03-08 | 华南理工大学 | Cross-device forehead electroencephalogram emotion recognition method and system |
CN114145744B (en) * | 2021-11-22 | 2024-03-29 | 华南理工大学 | Cross-device forehead electroencephalogram emotion recognition method and system |
CN114431862A (en) * | 2021-12-22 | 2022-05-06 | 山东师范大学 | Multi-modal emotion recognition method and system based on brain functional connectivity network |
CN115969388A (en) * | 2023-02-28 | 2023-04-18 | 河北工业大学 | Doppler radar heartbeat detection method based on multi-scale residual shrinkage network |
CN115969388B (en) * | 2023-02-28 | 2024-05-03 | 河北工业大学 | Doppler radar heartbeat detection method based on multi-scale residual shrinkage network |
Also Published As
Publication number | Publication date |
---|---|
CN113057657B (en) | 2022-09-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113057657B (en) | Electroencephalogram emotion classification method based on multi-scale connectivity features and meta-transfer learning | |
CN107203692B (en) | Electrocardio data digital signal processing method based on deep convolutional neural network | |
CN109299751B (en) | SSVEP electroencephalogram classification method using a convolutional neural model with EMD data augmentation | |
CN112120694B (en) | Motor imagery electroencephalogram signal classification method based on neural network | |
CN109598222B (en) | Motor imagery electroencephalogram classification method using a wavelet neural network with EEMD data augmentation | |
CN110353702A (en) | Emotion recognition method and system based on a shallow convolutional neural network | |
Suleiman et al. | Features extraction techniques of EEG signal for BCI applications | |
CN113065526B (en) | Electroencephalogram signal classification method based on an improved deep residual grouped convolution network | |
CN114041795B (en) | Emotion recognition method and system based on multimodal physiological information and deep learning | |
Duman et al. | Patient specific seizure prediction algorithm using Hilbert-Huang transform | |
CN111407243A (en) | Pulse signal pressure identification method based on deep learning | |
CN116881762A (en) | Emotion recognition method based on dynamic brain network characteristics | |
CN113158964A (en) | Sleep staging method based on residual learning and multi-granularity feature fusion | |
CN115919330A (en) | EEG Emotional State Classification Method Based on Multi-level SE Attention and Graph Convolution | |
CN113558644B (en) | Emotion classification method, medium and equipment for 3D matrix and multidimensional convolution network | |
CN117520891A (en) | Motor imagery electroencephalogram signal classification method and system | |
CN111543983B (en) | Electroencephalogram signal channel selection method based on neural network | |
CN113317803A (en) | Neural disease feature extraction method based on graph theory and machine learning | |
CN113180659A (en) | Electroencephalogram emotion recognition system based on three-dimensional features and a dilated fully convolutional network | |
CN111783719A (en) | Myoelectric control method and device | |
CN116910464A (en) | Myoelectric signal prosthetic hand control system and method | |
CN116421200A (en) | Brain electricity emotion analysis method of multi-task mixed model based on parallel training | |
CN114305452A (en) | Cross-task cognitive load identification method based on electroencephalogram and field adaptation | |
CN113269159A (en) | Gesture recognition method fusing electromyographic signals and visual images | |
CN115919313B (en) | Facial electromyography emotion recognition method based on spatiotemporal features |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||