AU2021104767A4 - Method for classification of human emotions based on selected scalp region eeg patterns by a neural network - Google Patents
- Publication number
- AU2021104767A4
- Authority
- AU
- Australia
- Prior art keywords
- data
- eeg
- neural network
- training
- testing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/369—Electroencephalography [EEG]
- A61B5/372—Analysis of electroencephalograms
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6814—Head
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
Abstract
The present disclosure relates to a method for classifying human emotions from EEG signals using a neural network approach. The method comprises: collecting data from an online data source, the Database for Emotion Analysis using Physiological signals (DEAP); pre-processing the data to improve the signal-to-noise ratio of the EEG data; extracting features for both happy and sad emotions from selected electrode channels of the EEG data; dividing the input data comprising the extracted features into three partitions, namely training data, validation data, and testing data; training a neural network with the training data, wherein, during the training phase, the neural network is provided with a set of input feature vectors along with respective target vectors to distinguish output classes; and validating the trained network using the validation data before testing it with the testing data.
Description
The present disclosure relates to a method for classification of human emotions from EEG signal by a neural network approach.
Emotion assessment is typically accomplished by examining the user's emotional displays and/or physiological markers. A person's emotional expression is the visible verbal and nonverbal conduct used to convey emotion. Emotion evaluation studies frequently examine a person's facial expressions, words, and gestures to determine their emotional state. Humans communicate mostly through words, but they also use other channels, such as body language and facial expressions, to express their emotions effectively. Recently, considerable work has been conducted on recognizing emotional information using physiological signals such as blood volume pulse, skin temperature, electromyogram (EMG), electrocardiogram (ECG), galvanic skin resistance (GSR), and electroencephalogram (EEG). Among previous studies of brain physiological signals, however, only a few on human emotion detection have been documented.
Electroencephalography (EEG) is a safe and effective signal acquisition technique that records the electrical activity of the brain. EEG has attracted considerable interest because it directly reflects changes in the brain in response to a stimulus, making it a good method for identifying changes in emotional state.
To make the above-mentioned existing solutions more efficient, a method for classification of human emotions from EEG signals by a neural network approach is provided.
The present disclosure relates to a method for classification of human emotions from EEG signals by a neural network approach. To identify emotions, features extracted from EEG signals were used to reflect physiological changes in the participants. To characterize the EEG signals for discriminating two emotions, happy and sad, the mean of the signal, higher-order crossings (HOC), and Hjorth features, as well as the entropy and band power of each frequency band, were extracted. The role of distinct brain areas in emotion classification was investigated using feature vectors derived from different brain regions. A backpropagation neural network (BPNN) classifier was used to classify the feature vectors. The resulting neural network classifier has an average accuracy above 94.45%.
In an embodiment, a method 100 for classification of human emotions from EEG signals by a neural network approach comprises the following steps: at step 102, collecting data from an online data source, the Database for Emotion Analysis using Physiological signals (DEAP); at step 104, pre-processing the data to improve the signal-to-noise ratio of the EEG data; at step 106, extracting features for both happy and sad emotions from selected electrode channels of the EEG data; at step 108, dividing the input data comprising the extracted features into three partitions, namely training data, validation data, and testing data; at step 110, training a neural network with the training data, wherein, during the training phase, the neural network is provided with a set of input feature vectors along with respective target vectors to distinguish output classes; and at step 112, validating the trained network using the validation data before testing it with the testing data.
To further clarify advantages and features of the present disclosure, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which is illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings.
These and other features, aspects, and advantages of the present disclosure will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
Figure 1 illustrates a method for classification of human emotions from EEG signal by a neural network approach in accordance with an embodiment of the present disclosure.
Figure 2 illustrates (a) Flow diagram for classification of emotion using EEG, (b) Valence-arousal dimensional model for basic emotions, and (c) Confusion matrix for the frontal region considering all subjects and confusion matrix for the occipital region considering all subjects, in accordance with an embodiment of the present disclosure.
Further, skilled artisans will appreciate that elements in the drawings are illustrated for simplicity and may not have necessarily been drawn to scale. For example, the flow charts illustrate the method in terms of the most prominent steps involved to help to improve understanding of aspects of the present disclosure. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having benefit of the description herein.
For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiment illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended, such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein being contemplated as would normally occur to one skilled in the art to which the invention relates.
It will be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the invention and are not intended to be restrictive thereof.
Reference throughout this specification to "an aspect", "another aspect" or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, appearances of the phrase "in an embodiment", "in another embodiment" and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such process or method. Similarly, one or more devices or sub-systems or elements or structures or components proceeded by "comprises...a" does not, without more constraints, preclude the existence of other devices or other sub-systems or other elements or other structures or other components or additional devices or additional sub-systems or additional elements or additional structures or additional components.
Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The system, methods, and examples provided herein are illustrative only and not intended to be limiting.
Embodiments of the present disclosure will be described below in detail with reference to the accompanying drawings.
Figure 1 illustrates a method for classification of human emotions from EEG signals by a neural network approach in accordance with an embodiment of the present disclosure. The method 100 comprises the following steps: at step 102, collecting data from an online data source, the Database for Emotion Analysis using Physiological signals (DEAP); at step 104, pre-processing the data to improve the signal-to-noise ratio of the EEG data; at step 106, extracting features for both happy and sad emotions from selected electrode channels of the EEG data; at step 108, dividing the input data comprising the extracted features into three partitions, namely training data, validation data, and testing data; at step 110, training a neural network with the training data, wherein, during the training phase, the neural network is provided with a set of input feature vectors along with respective target vectors to distinguish output classes; and at step 112, validating the trained network using the validation data before testing it with the testing data.
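The patent does not specify the network architecture beyond backpropagation, nor the exact partition ratios. The overall pipeline of steps 108-112 can be sketched as follows; the 70/15/15 split, the hidden-layer size, and the synthetic two-cluster "happy/sad" feature data are all illustrative assumptions, not values from the disclosure.

```python
import numpy as np

rng = np.random.default_rng(0)

def partition(X, y, ratios=(0.7, 0.15, 0.15)):
    """Step 108: split feature vectors into training, validation, testing sets."""
    idx = rng.permutation(len(X))
    n_tr = int(ratios[0] * len(X))
    n_va = int(ratios[1] * len(X))
    tr, va, te = idx[:n_tr], idx[n_tr:n_tr + n_va], idx[n_tr + n_va:]
    return (X[tr], y[tr]), (X[va], y[va]), (X[te], y[te])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_bpnn(X, y, hidden=8, lr=0.5, epochs=1000):
    """Steps 110: one-hidden-layer backpropagation network, 0/1 targets."""
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1));          b2 = np.zeros(1)
    y = y.reshape(-1, 1)
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)                   # forward pass
        out = sigmoid(h @ W2 + b2)
        d_out = (out - y) * out * (1 - out)        # backpropagate output error
        d_h = (d_out @ W2.T) * h * (1 - h)         # backpropagate to hidden layer
        W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(0)
        W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(0)
    return lambda Xn: (sigmoid(sigmoid(Xn @ W1 + b1) @ W2 + b2) > 0.5).ravel()

# Hypothetical demo: two well-separated feature clusters standing in for the
# happy (1) and sad (0) feature vectors of the disclosure.
X = np.vstack([rng.normal(-1.0, 0.3, (100, 4)), rng.normal(1.0, 0.3, (100, 4))])
y = np.r_[np.zeros(100), np.ones(100)]
(X_tr, y_tr), (X_va, y_va), (X_te, y_te) = partition(X, y)
predict = train_bpnn(X_tr, y_tr)
val_acc = (predict(X_va) == y_va).mean()     # step 112: validate, then test
test_acc = (predict(X_te) == y_te).mean()
```

On such cleanly separated synthetic data the sketch reaches near-perfect validation and test accuracy; real EEG feature vectors would of course be far noisier.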
In an embodiment of the method, to elicit emotional reactions, effective stimuli were chosen: a Valence-Arousal plane (V-A plane) was divided into 4 quadrants, and 15 stimuli were selected for each quadrant from both manual and affective tags. Pre-processing comprises: common average referencing, down-sampling the EEG from 512 to 256 Hz, and high-pass filtering with a 2 Hz cut-off frequency using EEGLAB software; removing eye-blink artifacts from the signal using a blind source separation technique; recording a 5 s baseline before each video, wherein the power of the baseline was calculated to nullify stimulus-unrelated variations in power over time; extracting the frequency power of trials and baselines over the 3-47 Hz range with Welch's method using windows of 256 samples, wherein the baseline power was then subtracted from the EEG signal power, yielding the change of power relative to the pre-stimulus period; and eliminating muscle and movement artifacts by further down-sampling the EEG signal to 128 Hz and applying a 4-45 Hz pass-band filter.
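The single-channel parts of this pre-processing chain can be sketched with SciPy. This is a minimal sketch under stated assumptions: the multi-channel steps (common average referencing, blind source separation for eye blinks) are omitted, and Butterworth filters stand in for whatever filters EEGLAB applied; the filter orders and the random test signal are invented.

```python
import numpy as np
from scipy import signal

FS_RAW = 512  # DEAP raw sampling rate, per the disclosure

def preprocess(trial, baseline, fs=FS_RAW):
    """One-channel sketch: 512->256 Hz, 2 Hz high-pass, Welch PSD over
    3-47 Hz with 256-sample windows, baseline-power subtraction, then
    128 Hz resample with a 4-45 Hz band-pass."""
    # 1) down-sample to 256 Hz
    x = signal.decimate(trial, 2)
    base = signal.decimate(baseline, 2)
    fs //= 2
    # 2) high-pass filter, 2 Hz cut-off (Butterworth as a stand-in)
    sos = signal.butter(4, 2, btype="highpass", fs=fs, output="sos")
    x = signal.sosfiltfilt(sos, x)
    base = signal.sosfiltfilt(sos, base)
    # 3) Welch power in 3-47 Hz with 256-sample windows; subtract baseline power
    f, p_trial = signal.welch(x, fs=fs, nperseg=256)
    _, p_base = signal.welch(base, fs=fs, nperseg=256)
    band = (f >= 3) & (f <= 47)
    rel_power = p_trial[band] - p_base[band]   # change relative to pre-stimulus
    # 4) further down-sample to 128 Hz and band-pass 4-45 Hz
    x = signal.decimate(x, 2)
    sos_bp = signal.butter(4, [4, 45], btype="bandpass", fs=fs // 2, output="sos")
    x = signal.sosfiltfilt(sos_bp, x)
    return x, f[band], rel_power

rng = np.random.default_rng(1)
trial = rng.standard_normal(FS_RAW * 10)   # hypothetical 10 s trial
base = rng.standard_normal(FS_RAW * 5)     # 5 s pre-stimulus baseline
x_clean, f_band, rel_power = preprocess(trial, base)
```

With `fs=256` and `nperseg=256`, Welch's frequency grid has 1 Hz resolution, so the 3-47 Hz band yields 45 power values per channel.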
In another embodiment of the method, the electrode channels selected for emotion classification comprise two electrodes from each of the following regions: frontopolar, mid-frontal, occipital, mid-temporal, and parietal. The EEG signal was band-pass filtered to obtain the different brain rhythms and extract features from them. The BPNN classifier combined the electrodes in a particular region, and features were extracted from them, wherein the feature vectors representing each brain region were then split into training, validation, and testing data. Each feature vector is composed of both time- and frequency-domain features, wherein the important features extracted from EEG for emotion classification comprise time-domain features (mean, Hjorth parameters, and higher-order crossings) and frequency-domain features (band power and entropy). The EEG can be filtered with either Finite Impulse Response (FIR) or Infinite Impulse Response (IIR) digital filters, wherein FIR filters are known to have a linear phase response, while IIR filters are characterized by a nonlinear phase response.
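The listed features have standard definitions and can be sketched for a single channel as follows. The disclosure does not fix the spectral estimator or the HOC order, so a plain FFT periodogram and an HOC order of 5 are assumptions here.

```python
import numpy as np

def hjorth(x):
    """Hjorth activity, mobility, and complexity of a 1-D signal."""
    dx = np.diff(x)
    ddx = np.diff(dx)
    activity = np.var(x)
    mobility = np.sqrt(np.var(dx) / np.var(x))
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity

def hoc(x, order=5):
    """Higher-order crossings: zero-crossing counts of the successively
    differenced, mean-removed signal."""
    counts = []
    z = x - x.mean()
    for _ in range(order):
        counts.append(int(np.sum(np.diff(np.signbit(z).astype(int)) != 0)))
        z = np.diff(z)
    return counts

def band_power_and_entropy(x, fs, band=(4, 45)):
    """Band power via an FFT periodogram, plus spectral entropy of the
    normalized in-band PSD (estimator choice is an assumption)."""
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    f = np.fft.rfftfreq(len(x), 1 / fs)
    m = (f >= band[0]) & (f <= band[1])
    power = float(psd[m].sum())
    p = psd[m] / psd[m].sum()
    entropy = float(-(p * np.log2(p + 1e-12)).sum())
    return power, entropy

def feature_vector(x, fs):
    """One channel's time + frequency domain features, as listed in the claim."""
    a, mo, co = hjorth(x)
    return np.r_[x.mean(), a, mo, co, hoc(x), band_power_and_entropy(x, fs)]
```

For a pure 10 Hz sine at 128 Hz sampling, the Hjorth activity equals the signal variance (0.5) and the first HOC count is close to twice the number of cycles, which is a quick sanity check on the implementation.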
In yet another embodiment, among the regions tested, the frontal and occipital regions showed accuracies of 96.7% and 92%, respectively, after testing, and the developed neural network classifier demonstrated an average accuracy above 94.45%.
Figure 2 illustrates (a) Flow diagram for classification of emotion using EEG, (b) Valence-arousal dimensional model for basic emotions, and (c) Confusion matrix for the frontal region considering all subjects and confusion matrix for the occipital region considering all subjects, in accordance with an embodiment of the present disclosure.
The selection of an appropriate stimulus is a critical challenge in eliciting emotional responses. Four quadrants were created in the Valence-Arousal plane (V-A plane, Fig. 2b). From both manual and affective tagging, 15 stimuli were chosen for each quadrant. The 120 videos were then subjected to an online volunteer evaluation, in which an average of 15 people scored each video for valence and arousal. The ratio of the mean to the standard deviation of each video's public ratings was used as the normalized score for valence and arousal. Then, for each quadrant in the normalized V-A space, the ten videos closest to the quadrant's extreme corner were chosen.
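The normalized-score and nearest-to-corner selection described above can be sketched directly. The random ratings, the 1-9 rating scale, and the corner coordinates below are illustrative assumptions; only the mean/standard-deviation ratio and the "ten closest videos" rule come from the text.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical ratings: 120 videos x 15 raters, valence and arousal on 1-9
ratings_v = rng.uniform(1, 9, (120, 15))
ratings_a = rng.uniform(1, 9, (120, 15))

# Normalized score = mean / standard deviation of each video's ratings
score_v = ratings_v.mean(axis=1) / ratings_v.std(axis=1)
score_a = ratings_a.mean(axis=1) / ratings_a.std(axis=1)

def nearest_to_corner(sv, sa, corner, k=10):
    """Indices of the k videos closest to a quadrant's extreme corner
    in the normalized V-A space (corner coordinates are illustrative)."""
    d = np.hypot(sv - corner[0], sa - corner[1])
    return np.argsort(d)[:k]

# High-valence/high-arousal and low-valence/low-arousal selections
hvha = nearest_to_corner(score_v, score_a, (score_v.max(), score_a.max()))
lvla = nearest_to_corner(score_v, score_a, (score_v.min(), score_a.min()))
```

Repeating this for all four quadrant corners yields the 40 stimuli (10 per quadrant) that the evaluation retained.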
One video stimulus out of the 40 was chosen for emotional content analysis from quadrant-1 (high valence-high arousal) and one from quadrant-3 (low valence-low arousal) (Fig. 2b). The video stimulus in the first quadrant is expected to evoke a strong happy emotion, whereas the video stimulus in the third quadrant is expected to elicit a strong negative emotion. The positive emotion is taken to be "happy," and the negative emotion "sad."
Happy and sad emotions were classified using EEG data. The technique for distinguishing these emotions is depicted in Figure 2a. Features were extracted from the EEG signals to characterize the first- and third-quadrant emotions (happy and sad, respectively). The mean, Hjorth parameters, HOC, band power, and entropy make up the feature vectors used to classify the "happy" and "sad" emotions. Electrodes from various brain lobes were grouped together to compare how well each lobe performed in discriminating emotions; the frontal, parietal, temporal, and occipital lobes were involved.
Before delivering the features to the classifier, those derived from the electrodes representing a lobe were combined. When all of the electrodes were taken to represent the complete brain region, the collected features from each electrode were likewise combined. The electrodes in a given location were grouped together, and features were extracted from them. Before the feature vectors representing each brain region were fed to the neural network classifier to identify happy and sad emotions, they were split into training, validation, and testing sets.
The data was partitioned four times with a different ratio each time, and the number of hidden layers was kept to a minimum. The overall average accuracy of the neural network classifier, along with per-region accuracy details, was determined with all 32 subjects taken into account. The frontal and occipital regions, for example, achieved accuracies of 96.7% and 92%, respectively.
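The per-region accuracies come from the confusion matrices of Figure 2c (trace over total). The matrices below are hypothetical counts chosen only to reproduce the reported 96.7% and 92% figures; they are not the patent's actual matrices.

```python
import numpy as np

def accuracy_from_confusion(cm):
    """Overall accuracy = trace / total of a confusion matrix
    (rows: true class, columns: predicted class)."""
    cm = np.asarray(cm, dtype=float)
    return cm.trace() / cm.sum()

# Hypothetical 2x2 (happy vs sad) counts matching the reported figures
frontal = [[59, 1],
           [3, 57]]    # 116 / 120 correct
occipital = [[46, 4],
             [4, 46]]  # 92 / 100 correct

print(round(100 * accuracy_from_confusion(frontal), 1))    # prints 96.7
print(round(100 * accuracy_from_confusion(occipital), 1))  # prints 92.0
```

Averaging such per-region accuracies over all partitions is how a figure like the reported 94.45% overall average would be obtained.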
The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, orders of processes described herein may be changed and are not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts necessarily need to be performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples. Numerous variations, whether explicitly given in the specification or not, such as differences in structure, dimension, and use of material, are possible. The scope of embodiments is at least as broad as given by the following claims.
Benefits, other advantages, and solutions to problems have been described above with regard to specific embodiments. However, the benefits, advantages, solutions to problems, and any component(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature or component of any or all the claims.
Claims (10)
1. A method for classification of human emotions from EEG signals by a neural network approach, the method comprising:
collecting data from an online data source, the Database for Emotion Analysis using Physiological signals (DEAP);
pre-processing the data to improve the signal-to-noise ratio of the EEG data;
extracting features for both happy and sad emotions from selected electrode channels of the EEG data;
dividing the input data comprising the extracted features into three partitions, namely training data, validation data, and testing data;
training a neural network with the training data, wherein, during the training phase, the neural network is provided with a set of input feature vectors along with respective target vectors to distinguish output classes; and
validating the trained network using the validation data before testing it with the testing data.
2. The method as claimed in claim 1, wherein, for data collection, to elicit emotional reactions, effective stimuli were chosen, wherein a Valence-Arousal plane (V-A plane) was divided into 4 quadrants, and wherein 15 stimuli were selected for each quadrant from both manual and affective tags.
3. The method as claimed in claim 1, wherein, pre-processing comprises:
common average referencing, down-sampling the EEG from 512 to 256 Hz, and high-pass filtering with a 2 Hz cut-off frequency using EEGLAB software; removing eye-blink artifacts from the signal using a blind source separation technique; recording a 5 s baseline before each video, wherein the power of the baseline was calculated to nullify stimulus-unrelated variations in power over time; extracting the frequency power of trials and baselines over the 3-47 Hz range with Welch's method using windows of 256 samples, wherein the baseline power was then subtracted from the EEG signal power, yielding the change of power relative to the pre-stimulus period; and eliminating muscle and movement artifacts by further down-sampling the EEG signal to 128 Hz and applying a 4-45 Hz pass-band filter.
4. The method as claimed in claim 1, wherein the electrode channels selected for emotion classification comprise two electrodes from each of the following regions: frontopolar, mid-frontal, occipital, mid-temporal, and parietal.
5. The method as claimed in claim 1, wherein, EEG signal was band-pass filtered to obtain different brain rhythms to extract features from them.
6. The method as claimed in claim 5, wherein a BPNN classifier is trained with electrodes from selected scalp regions, wherein the feature vectors are extracted from the EEG signal; the feature vectors were further split into training, validation, and testing sets.
7. The method as claimed in claim 5, wherein, each feature vector is composed of both time and frequency domain features wherein, the important features extracted from EEG for emotion classification comprise: Time domain features: Mean, Hjorth parameters, and higher order crossings; and Frequency domain features: Band-power and entropy.
8. The method as claimed in claim 5, wherein the EEG can be filtered with either Finite Impulse Response (FIR) or Infinite Impulse Response (IIR) digital filters, wherein FIR filters are known to have a linear phase response, while IIR filters are characterized by a nonlinear phase response.
9. The method as claimed in claim 1, wherein, among the regions tested, the frontal and occipital regions showed accuracies of 96.7% and 92%, respectively, after testing.
10. The method as claimed in claim 1, wherein the developed neural network classifier has demonstrated an average accuracy above 94.45%.
Figure 1: Flowchart of the method 100, showing steps 102 (collecting data from the DEAP database), 104 (pre-processing), 106 (feature extraction for happy and sad emotions), 108 (partitioning into training, validation, and testing data), 110 (neural network training with input feature vectors and target vectors), and 112 (validation and testing).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2021104767A AU2021104767A4 (en) | 2021-07-31 | 2021-07-31 | Method for classification of human emotions based on selected scalp region eeg patterns by a neural network |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2021104767A AU2021104767A4 (en) | 2021-07-31 | 2021-07-31 | Method for classification of human emotions based on selected scalp region eeg patterns by a neural network |
Publications (1)
Publication Number | Publication Date |
---|---|
AU2021104767A4 true AU2021104767A4 (en) | 2022-04-28 |
Family
ID=81259361
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
AU2021104767A Ceased AU2021104767A4 (en) | 2021-07-31 | 2021-07-31 | Method for classification of human emotions based on selected scalp region eeg patterns by a neural network |
Country Status (1)
Country | Link |
---|---|
AU (1) | AU2021104767A4 (en) |
Cited By (2)

Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116364096A | 2023-03-09 | 2023-06-30 | Zhejiang University | Electroencephalogram signal voice decoding method based on generative adversarial network |
CN116364096B | 2023-03-09 | 2023-11-28 | Zhejiang University | Electroencephalogram signal voice decoding method based on generative adversarial network |

2021-07-31: AU2021104767A granted as AU2021104767A4 (innovation patent), now ceased.
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FGI | Letters patent sealed or granted (innovation patent) | ||
MK22 | Patent ceased section 143a(d), or expired - non payment of renewal fee or expiry |