AU2021104767A4 - Method for classification of human emotions based on selected scalp region eeg patterns by a neural network - Google Patents

Method for classification of human emotions based on selected scalp region eeg patterns by a neural network

Info

Publication number
AU2021104767A4
Authority
AU
Australia
Prior art keywords
data
eeg
neural network
training
testing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU2021104767A
Inventor
Shashi Kumar G. S.
Niranjana Sampathila
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to AU2021104767A priority Critical patent/AU2021104767A4/en
Application granted granted Critical
Publication of AU2021104767A4 publication Critical patent/AU2021104767A4/en
Ceased legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/165 Evaluating the state of mind, e.g. depression, anxiety
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/316 Modalities, i.e. specific diagnostic methods
    • A61B 5/369 Electroencephalography [EEG]
    • A61B 5/372 Analysis of electroencephalograms
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, specially adapted to be attached to or worn on the body surface
    • A61B 5/6813 Specially adapted to be attached to a specific body part
    • A61B 5/6814 Head
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/7267 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems, involving training the classification device

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Veterinary Medicine (AREA)
  • Surgery (AREA)
  • Psychiatry (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Artificial Intelligence (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Hospice & Palliative Care (AREA)
  • Educational Technology (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physiology (AREA)
  • Signal Processing (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The present disclosure relates to a method for classification of human emotions from EEG signals by a neural network approach. The method comprises: collecting data from an online data source, the Database for Emotion Analysis using Physiological signals (DEAP); pre-processing the data to improve the signal-to-noise ratio of the EEG data; extracting features for both happy and sad emotions from selected electrode channels of the EEG data; dividing the input data comprising the extracted features into three partitions, namely training data, validation data, and testing data; training a neural network with the training data wherein, during the training phase, the neural network is provided with a set of input feature vectors along with respective target vectors to distinguish output classes; and validating the trained network using the validation data before testing it with the testing data.

Description

METHOD FOR CLASSIFICATION OF HUMAN EMOTIONS BASED ON SELECTED SCALP REGION EEG PATTERNS BY A NEURAL NETWORK

FIELD OF THE INVENTION
The present disclosure relates to a method for classification of human emotions from EEG signals by a neural network approach.
BACKGROUND OF THE INVENTION
Emotion assessment is typically accomplished by examining the user's emotional displays and/or physiological markers. A person's emotional expression is the visible verbal and nonverbal conduct used to transmit emotion. Emotion evaluation studies frequently examine a person's facial expressions, words, and gestures to determine their emotional state. Humans communicate mostly through words, but they can also use other channels, such as body language and facial expressions, to express their emotions effectively. Recently, much work has been conducted on recognizing emotional information using physiological signals such as blood volume pulse, skin temperature, Electromyogram (EMG), Electrocardiogram (ECG), Galvanic Skin Resistance (GSR), and Electroencephalogram (EEG). Among previous studies of brain physiological signals, however, only a few on human emotion detection have been documented.
Electroencephalography (EEG) is a safe and effective signal acquisition technique that records the brain's behavioral activity. EEG has attracted a great deal of interest since it directly reflects changes in the brain in reaction to a stimulus, making it a good method for identifying changes in emotional status.
In order to make the above-mentioned existing solutions more efficient, a method for classification of human emotions from EEG signals by a neural network approach is provided.
SUMMARY OF THE INVENTION
The present disclosure relates to a method for classification of human emotions from EEG signals by a neural network approach. In this disclosure, features extracted from EEG signals, reflecting physiological changes in the participants, were used to identify emotions. To capture the properties of the EEG signals for discriminating two emotions, happy and sad, the mean of the signal, higher-order crossings (HOC), and Hjorth features, as well as the entropy and band power of each band, were extracted. The role of distinct brain areas in emotion classification was investigated using feature vectors derived from different brain regions. A Backpropagation Neural Network (BPNN) based classifier was used to classify the feature vectors. The neural network classifier so built has an average accuracy of more than 94.45%.
In an embodiment, a method 100 for classification of human emotions from EEG signals by a neural network approach comprises the following steps: at step 102, collecting data from an online data source, the Database for Emotion Analysis using Physiological signals (DEAP); at step 104, pre-processing the data to improve the signal-to-noise ratio of the EEG data; at step 106, extracting features for both happy and sad emotions from selected electrode channels of the EEG data; at step 108, dividing the input data comprising the extracted features into three partitions, namely training data, validation data, and testing data; at step 110, training a neural network with the training data wherein, during the training phase, the neural network is provided with a set of input feature vectors along with respective target vectors to distinguish output classes; and at step 112, validating the trained network using the validation data before testing it with the testing data.
To further clarify advantages and features of the present disclosure, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which is illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings.
BRIEF DESCRIPTION OF FIGURES
These and other features, aspects, and advantages of the present disclosure will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
Figure 1 illustrates a method for classification of human emotions from EEG signals by a neural network approach in accordance with an embodiment of the present disclosure.
Figure 2 illustrates (a) a flow diagram for classification of emotion using EEG, (b) the valence-arousal dimensional model for basic emotions, and (c) the confusion matrix for the frontal region considering all subjects and the confusion matrix for the occipital region considering all subjects, in accordance with an embodiment of the present disclosure.
Further, skilled artisans will appreciate that elements in the drawings are illustrated for simplicity and may not have necessarily been drawn to scale. For example, the flow charts illustrate the method in terms of the most prominent steps involved to help to improve understanding of aspects of the present disclosure. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having benefit of the description herein.
DETAILED DESCRIPTION
For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiment illustrated in the drawings, and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended; such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein, are contemplated as would normally occur to one skilled in the art to which the invention relates.
It will be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the invention and are not intended to be restrictive thereof.
Reference throughout this specification to "an aspect", "another aspect" or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, appearances of the phrase "in an embodiment", "in another embodiment" and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such process or method. Similarly, one or more devices or sub-systems or elements or structures or components proceeded by "comprises...a" does not, without more constraints, preclude the existence of other devices or other sub-systems or other elements or other structures or other components or additional devices or additional sub-systems or additional elements or additional structures or additional components.
Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The system, methods, and examples provided herein are illustrative only and not intended to be limiting.
Embodiments of the present disclosure will be described below in detail with reference to the accompanying drawings.
Referring to Figure 1, a method for classification of human emotions from EEG signals by a neural network approach is illustrated in accordance with an embodiment of the present disclosure. The method 100 comprises the following steps: at step 102, collecting data from an online data source, the Database for Emotion Analysis using Physiological signals (DEAP); at step 104, pre-processing the data to improve the signal-to-noise ratio of the EEG data; at step 106, extracting features for both happy and sad emotions from selected electrode channels of the EEG data; at step 108, dividing the input data comprising the extracted features into three partitions, namely training data, validation data, and testing data; at step 110, training a neural network with the training data wherein, during the training phase, the neural network is provided with a set of input feature vectors along with respective target vectors to distinguish output classes; and at step 112, validating the trained network using the validation data before testing it with the testing data.
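By way of illustration only, steps 102 to 112 may be sketched in Python as below; the subject file s01.dat, the valence-threshold labeling of trials as happy or sad, and the preprocess() and extract_features() helpers are assumptions for illustration and are not part of the claimed method.

```python
# Minimal sketch of steps 102-112, assuming the publicly distributed DEAP
# preprocessed pickle files and hypothetical preprocess()/extract_features()
# helpers defined elsewhere.
import pickle
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

with open("s01.dat", "rb") as f:                 # step 102: collect DEAP data
    subject = pickle.load(f, encoding="latin1")
eeg = subject["data"][:, :32, :]                 # 40 trials x 32 EEG channels x samples
y = (subject["labels"][:, 0] > 5).astype(int)    # assumed: valence > 5 -> happy, else sad

# steps 104-106: pre-process each trial and extract its feature vector
X = np.stack([extract_features(preprocess(trial)) for trial in eeg])

# step 108: three partitions - training, validation, and testing data
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.3, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

# step 110: train a backpropagation neural network on the training data
net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
net.fit(X_train, y_train)

# step 112: validate the trained network, then test it
print("validation accuracy:", net.score(X_val, y_val))
print("testing accuracy:", net.score(X_test, y_test))
```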
In an embodiment of the method, to elicit emotional reactions, effective stimuli were chosen: a Valence-Arousal plane (V-A plane) was divided into 4 quadrants, and 15 stimuli were selected for each quadrant from both manual and affective tags. Pre-processing comprises: common average referencing, down-sampling the EEG from 512 to 256 Hz, and high-pass filtering with a 2 Hz cut-off frequency using EEGLAB software; removing eye-blink artifacts from the signal using a blind source separation technique; recording a baseline of 5 s before each video, wherein the power of the baseline was calculated to nullify stimulus-unrelated variations in power over time; extracting the frequency power of trials and baselines in the 3-47 Hz range with Welch's method using windows of 256 samples, wherein the baseline power was then subtracted from the EEG signal power, yielding the change of power relative to the pre-stimulus period; and eliminating muscle and movement artifacts by further down-sampling the EEG signal to 128 Hz and applying a 4-45 Hz pass-band filter.
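A minimal sketch of this pre-processing chain follows, assuming SciPy in place of EEGLAB, a raw trial x of shape (channels, samples) recorded at 512 Hz, a 5 s baseline segment base of the same shape, and illustrative filter orders; the blind-source-separation removal of eye blinks is omitted here.

```python
# Sketch of the pre-processing chain; x and base are assumed (channels, samples)
# arrays at 512 Hz. Filter orders are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, welch, decimate

fs = 512
x = x - x.mean(axis=0)                       # common average referencing
x = decimate(x, 2, axis=-1)                  # down-sample 512 -> 256 Hz
base = decimate(base - base.mean(axis=0), 2, axis=-1)
fs = 256

b, a = butter(4, 2 / (fs / 2), btype="highpass")
x = filtfilt(b, a, x, axis=-1)               # 2 Hz high-pass cut-off
# (eye-blink removal by blind source separation, e.g. ICA, is omitted here)

# Welch's method with 256-sample windows over the 3-47 Hz range;
# baseline power is subtracted to nullify stimulus-unrelated variations
f, p_trial = welch(x, fs=fs, nperseg=256)
_, p_base = welch(base, fs=fs, nperseg=256)
sel = (f >= 3) & (f <= 47)
power_change = p_trial[..., sel] - p_base[..., sel]  # change vs pre-stimulus period

x = decimate(x, 2, axis=-1)                  # further down-sample to 128 Hz
fs = 128
b, a = butter(4, [4 / (fs / 2), 45 / (fs / 2)], btype="bandpass")
x = filtfilt(b, a, x, axis=-1)               # 4-45 Hz pass-band filter
```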
In another embodiment of the method, the electrode channels selected for emotion classification comprise two channels from each of the frontopolar, mid-frontal, occipital, mid-temporal, and parietal regions; the EEG signal was band-pass filtered to obtain the different brain rhythms from which features are extracted; for the BPNN classifier, the electrodes in a particular region were combined and features were extracted from them, the feature vectors representing each brain region then being split into training, validation, and testing data; each feature vector is composed of both time-domain and frequency-domain features, the important features extracted from EEG for emotion classification comprising: time-domain features: mean, Hjorth parameters, and higher-order crossings; and frequency-domain features: band power and entropy; and the EEG can be filtered with either Finite Impulse Response (FIR) or Infinite Impulse Response (IIR) digital filters, wherein FIR filters are known to have a linear phase response, while IIR filters are characterized by their nonlinear phase response.
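The time-domain and frequency-domain features named above may be sketched as follows for a single channel sig (a 1-D array) at sampling rate fs; the HOC order and the spectral-entropy estimator are assumptions, since the disclosure does not fix them.

```python
# Sketch of the per-channel features: mean, Hjorth parameters, higher-order
# crossings (HOC), band power, and entropy. Order/estimator choices are assumed.
import numpy as np
from scipy.signal import welch

def hjorth(sig):
    """Hjorth activity, mobility, and complexity of a 1-D signal."""
    d1, d2 = np.diff(sig), np.diff(np.diff(sig))
    activity = np.var(sig)
    mobility = np.sqrt(np.var(d1) / activity)
    complexity = np.sqrt(np.var(d2) / np.var(d1)) / mobility
    return activity, mobility, complexity

def hoc(sig, order=10):
    """Higher-order crossings: zero-crossing counts of iterated differences."""
    z, counts = sig - sig.mean(), []
    for _ in range(order):
        counts.append(int(np.sum(np.diff(np.sign(z)) != 0)))
        z = np.diff(z)
    return counts

def band_power_entropy(sig, fs, lo, hi):
    """Band power and spectral entropy of one frequency band."""
    f, p = welch(sig, fs=fs, nperseg=256)
    sel = (f >= lo) & (f <= hi)
    power = np.trapz(p[sel], f[sel])
    prob = p[sel] / p[sel].sum()
    entropy = -np.sum(prob * np.log2(prob + 1e-12))
    return power, entropy

# Example feature vector for one channel, using the alpha band (8-13 Hz)
features = [sig.mean(), *hjorth(sig), *hoc(sig), *band_power_entropy(sig, 128, 8, 13)]
```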
In yet another embodiment of the method, among the regions considered, the frontal and occipital regions showed accuracies of 96.7% and 92%, respectively, after testing, and the developed neural network classifier demonstrated an average accuracy above 94.45%.
Figure 2 illustrates (a) a flow diagram for classification of emotion using EEG, (b) the valence-arousal dimensional model for basic emotions, and (c) the confusion matrix for the frontal region considering all subjects and the confusion matrix for the occipital region considering all subjects, in accordance with an embodiment of the present disclosure.
The selection of an appropriate stimulus is a critical challenge in eliciting emotional responses. Four quadrants were created in the Valence-Arousal plane (V-A plane, Fig. 2b). From both manual and affective tagging, 15 stimuli were chosen for each quadrant. The 120 videos were then put through an online volunteer evaluation process, in which an average of 15 people scored each of the 120 videos for valence and arousal. The ratio of the mean to the standard deviation of each video's public rating was used to calculate the normalized score for valence and arousal. Then, for each quadrant in the normalized V-A space, the ten videos closest to the quadrant's extreme corners were chosen.
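A sketch of this selection rule is given below, assuming a ratings array of shape (120, raters, 2) holding each volunteer's valence and arousal score per video; taking the most extreme normalized scores as a quadrant corner is itself an assumption made for illustration.

```python
# Sketch of the normalized-score video selection; `ratings` is assumed to be
# a (120, n_raters, 2) array of per-video valence and arousal ratings.
import numpy as np

score = ratings.mean(axis=1) / ratings.std(axis=1)   # normalized V-A score per video

# e.g. the high-valence, high-arousal quadrant: take its extreme corner and
# keep the ten videos closest to it
corner = np.array([score[:, 0].max(), score[:, 1].max()])
dist = np.linalg.norm(score - corner, axis=1)
selected = np.argsort(dist)[:10]                     # indices of the chosen videos
```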
One video stimulus out of the 40 was chosen for emotional content analysis from each of quadrant-1 (high valence, high arousal) and quadrant-3 (low valence, low arousal) (Fig. 2b). The video stimulus in the first quadrant is expected to evoke a strong happy emotion, whereas the music video stimulus in the third quadrant is expected to elicit a strong negative emotion. The positive emotion is taken to be "happy," whereas the negative emotion is taken to be "sad."
Happy and sad emotions were classified using EEG data. The technique for distinguishing the aforementioned emotions is depicted in Figure 2a. EEG signals were used to extract features characterizing the first- and third-quadrant emotions (happy and sad, respectively). The mean, Hjorth parameters, HOC, band power, and entropy are among the features used to classify the "happy" and "sad" emotions. Electrodes from the various brain lobes were put together to compare how well each lobe performed in discriminating emotions. The frontal, parietal, temporal, and occipital lobes were involved.
Before delivering the features to the classifier, those derived from the electrodes representing a lobe were combined. When all of the electrodes were deemed to represent the complete brain region, the collected features from each electrode were likewise combined. The electrodes in a given location were grouped together, and features were extracted from them. Before feeding the feature vectors representing each brain region to the neural network classifier to identify happy and sad emotions, they were split into training, validation, and testing sets.
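As an illustration of this grouping, the sketch below maps two electrodes to each scalp region before combining their features; the channel indices follow the standard DEAP 32-channel order, and both the mapping and the per-channel channel_features() helper are assumptions.

```python
# Sketch of grouping electrodes by scalp region; eeg_trial is assumed to be a
# (32, n_samples) array in the standard DEAP channel order, and
# channel_features() a hypothetical per-channel feature extractor.
import numpy as np

CH_INDEX = {"Fp1": 0, "F3": 2, "T7": 7, "P3": 10, "O1": 13,
            "Fp2": 16, "F4": 19, "T8": 25, "P4": 28, "O2": 31}
REGIONS = {
    "frontopolar":  ("Fp1", "Fp2"),
    "mid-frontal":  ("F3", "F4"),
    "mid-temporal": ("T7", "T8"),
    "parietal":     ("P3", "P4"),
    "occipital":    ("O1", "O2"),
}

region_features = {
    region: np.concatenate([channel_features(eeg_trial[CH_INDEX[ch]]) for ch in chans])
    for region, chans in REGIONS.items()
}
```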
The data was partitioned four times, with a different ratio each time. The number of hidden layers was kept to a minimum. When all 32 subjects are taken into account, the overall average accuracy of the neural network classifier is determined, along with per-region accuracy details. The frontal and occipital regions, for example, had accuracies of 96.7% and 92%, respectively.
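A sketch of this repeated partitioning and of the accuracy and confusion-matrix reporting is shown below, reusing the feature matrix X and labels y from the earlier sketches; the four split ratios and the single small hidden layer are assumptions, since the disclosure does not list them explicitly.

```python
# Sketch of four partitionings with different ratios; X and y are the feature
# matrix and labels from the earlier sketches. The ratios are assumed values.
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import confusion_matrix

for test_size in (0.4, 0.3, 0.2, 0.1):
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=test_size,
                                              stratify=y, random_state=0)
    net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                        random_state=0).fit(X_tr, y_tr)
    print(f"test fraction {test_size}: accuracy {net.score(X_te, y_te):.3f}")
    print(confusion_matrix(y_te, net.predict(X_te)))  # cf. the matrices of Fig. 2c
```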
The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, orders of processes described herein may be changed and are not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts necessarily need to be performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples. Numerous variations, whether explicitly given in the specification or not, such as differences in structure, dimension, and use of material, are possible. The scope of embodiments is at least as broad as given by the following claims.
Benefits, other advantages, and solutions to problems have been described above with regard to specific embodiments. However, the benefits, advantages, solutions to problems, and any component(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature or component of any or all the claims.

Claims (10)

We Claim
1. A method for classification of human emotions from EEG signals by a neural network approach, the method comprising:
collecting data from an online data source, the Database for Emotion Analysis using Physiological signals (DEAP);
pre-processing the data to improve the signal-to-noise ratio of the EEG data;
extracting features for both happy and sad emotions from selected electrode channels of the EEG data;
dividing the input data comprising the extracted features into three partitions, namely training data, validation data, and testing data;
training a neural network with the training data wherein, during the training phase, the neural network is provided with a set of input feature vectors along with respective target vectors to distinguish output classes; and
validating the trained network using the validation data before testing it with the testing data.
2. The method as claimed in claim 1, wherein, for data collection, effective stimuli were chosen to elicit emotional reactions, wherein a Valence-Arousal plane (V-A plane) was divided into 4 quadrants and 15 stimuli were selected for each quadrant from both manual and affective tags.
3. The method as claimed in claim 1, wherein pre-processing comprises:
common average referencing, down-sampling the EEG from 512 to 256 Hz, and high-pass filtering with a 2 Hz cut-off frequency using EEGLAB software; removing eye-blink artifacts from the signal using a blind source separation technique; recording a baseline of 5 s before each video, wherein the power of the baseline was calculated to nullify stimulus-unrelated variations in power over time; extracting the frequency power of trials and baselines in the 3-47 Hz range with Welch's method using windows of 256 samples, wherein the baseline power was then subtracted from the EEG signal power, yielding the change of power relative to the pre-stimulus period; and eliminating muscle and movement artifacts by further down-sampling the EEG signal to 128 Hz and applying a 4-45 Hz pass-band filter.
4. The method as claimed in claim 1, wherein the electrode channels selected for emotion classification comprise two channels from each of the frontopolar, mid-frontal, occipital, mid-temporal, and parietal regions.
5. The method as claimed in claim 1, wherein the EEG signal was band-pass filtered to obtain the different brain rhythms from which features are extracted.
6. The method as claimed in claim 5, wherein a BPNN classifier is trained with feature vectors extracted from the EEG signals of selected scalp-region electrodes, and the feature vectors are further split into training, validation, and testing sets.
7. The method as claimed in claim 5, wherein each feature vector is composed of both time-domain and frequency-domain features, the important features extracted from EEG for emotion classification comprising: time-domain features: mean, Hjorth parameters, and higher-order crossings; and frequency-domain features: band power and entropy.
8. The method as claimed in claim 5, wherein the EEG can be filtered with either Finite Impulse Response (FIR) or Infinite Impulse Response (IIR) digital filters, wherein FIR filters are known to have a linear phase response, while IIR filters are characterized by their nonlinear phase response.
9. The method as claimed in claim 1, wherein, among the regions considered, the frontal and occipital regions showed accuracies of 96.7% and 92%, respectively, after testing.
10. The method as claimed in claim 1, wherein the developed neural network classifier has demonstrated an average accuracy above 94.45%.
Figure 1: collecting data from an online data source, the Database for Emotion Analysis using Physiological signals (DEAP) (102); pre-processing the data to improve the signal-to-noise ratio of the EEG data (104); extracting features for both happy and sad emotions from selected electrode channels of the EEG data (106); dividing the input data comprising the extracted features into three partitions, namely training data, validation data, and testing data (108); training a neural network with the training data wherein, during the training phase, the neural network is provided with a set of input feature vectors along with respective target vectors to distinguish output classes (110); and validating the trained network using the validation data before testing it with the testing data (112).
AU2021104767A 2021-07-31 2021-07-31 Method for classification of human emotions based on selected scalp region eeg patterns by a neural network Ceased AU2021104767A4 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2021104767A AU2021104767A4 (en) 2021-07-31 2021-07-31 Method for classification of human emotions based on selected scalp region eeg patterns by a neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU2021104767A AU2021104767A4 (en) 2021-07-31 2021-07-31 Method for classification of human emotions based on selected scalp region eeg patterns by a neural network

Publications (1)

Publication Number Publication Date
AU2021104767A4 true AU2021104767A4 (en) 2022-04-28

Family

ID=81259361

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2021104767A Ceased AU2021104767A4 (en) 2021-07-31 2021-07-31 Method for classification of human emotions based on selected scalp region eeg patterns by a neural network

Country Status (1)

Country Link
AU (1) AU2021104767A4 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116364096A (en) * 2023-03-09 2023-06-30 浙江大学 Electroencephalogram signal voice decoding method based on generation countermeasure network
CN116364096B (en) * 2023-03-09 2023-11-28 浙江大学 Electroencephalogram signal voice decoding method based on generation countermeasure network


Legal Events

Date Code Title Description
FGI Letters patent sealed or granted (innovation patent)
MK22 Patent ceased section 143a(d), or expired - non payment of renewal fee or expiry