CN111012340A - Emotion classification method based on multilayer perceptron - Google Patents
- Publication number: CN111012340A
- Authority
- CN
- China
- Prior art keywords
- entropy
- perceptron
- model
- classification
- differential
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/369—Electroencephalography [EEG]
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7203—Signal processing for noise prevention, reduction or removal
- A61B5/7235—Details of waveform analysis
- A61B5/7257—Details of waveform analysis characterised by using Fourier transforms
- A61B5/7267—Classification of physiological signals or data involving training the classification device
Abstract
The invention relates to an emotion classification method based on a multilayer perceptron. Its innovation is that the trained model can serve as a pre-training model and be fed back into the network, so that when data are updated the model can be adjusted in real time using the new data alone, without feeding all data through the network again for retraining. The accuracy of the existing model is thereby continuously optimized, i.e., the model learns in real time. Such a lifelong-learning model is significant for improving the adaptability and accuracy of the model.
Description
Technical Field
The invention relates to an emotion classification method based on electroencephalogram signals, and belongs to the technical field of digital signal processing.
Background
The electroencephalogram (EEG) signal is the overall reflection of the physiological activity of brain nerve cells at the surface of the cerebral cortex or scalp. EEG signals contain a large amount of physiological and disease information, and with the development of brain-computer interface (BCI) technology they are applied ever more widely in the medical and engineering fields. At present, emotion analysis methods based on EEG signals are receiving close attention.
In recent years, many related studies at home and abroad have demonstrated the feasibility of emotion recognition from EEG. Feng Liu et al. proposed an emotion recognition method based on sample entropy; Calibo et al. applied energy features combined with a neural network to recognize emotion. These earlier studies achieved emotion classification, and although the above methods reached some accuracy in their particular experiments, they also have significant disadvantages: the classification accuracy is low, and the models are not capable of transfer or lifelong learning; that is, an existing model cannot be trained further, cannot be adjusted in real time as data are acquired, and therefore cannot be continuously optimized.
Disclosure of Invention
The invention aims to: aiming at the defects in the prior art, provide an emotion classification method based on the multilayer perceptron, in which classification accuracy is improved through model optimization.
In order to achieve the above purposes, the technical scheme adopted by the invention is as follows:
The emotion classification method based on the multilayer perceptron comprises the following steps:
Step 1: denoise the acquired electroencephalogram (EEG) signals;
Step 2: apply the Fourier transform to the signals, converting them to frequency-domain analysis;
Step 3: extract features, namely calculate the differential entropy and power spectral density of five important frequency bands of the EEG signals;
Step 4: input the feature values into a multilayer perceptron to train a classification model;
Step 5: load the trained model into the classification algorithm to classify emotion in real time. Meanwhile, when new data are obtained, the originally trained model can serve as a pre-training model and be trained further together with the newly obtained data, so the performance of the classification model is improved with each data update, and the accuracy of the model is continuously optimized by adjusting the existing model in real time with the acquired data.
A further technical scheme is that the EEG signals are acquired from the prefrontal areas of the left and right hemispheres respectively. In the feature extraction process, the asymmetry of the EEG signals of the left and right brain during emotion is taken into account: the asymmetric-entropy features of the EEG signals are extracted, and these feature values are also input into the neural network.
In step 2, the formula

X(k) = Σ_{n=0}^{N−1} x(n) e^{−j2πnk/N},  k = 0, 1, …, N−1

is used to convert the analysis and processing of the signal based on the time domain into analysis of the frequency domain.
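As a sketch of step 2, the transform above can be computed with NumPy's real FFT. The 256 Hz sampling rate and the synthetic 10 Hz test signal are assumptions for illustration; the patent does not specify an implementation.

```python
import numpy as np

fs = 256                          # assumed sampling rate (Hz)
t = np.arange(fs) / fs            # one second of signal
x = np.sin(2 * np.pi * 10 * t)    # synthetic 10 Hz "EEG" component

X = np.fft.rfft(x)                       # X(k) = sum_n x(n) e^{-j*2*pi*n*k/N}
freqs = np.fft.rfftfreq(len(x), d=1/fs)  # frequency axis in Hz

peak = freqs[np.argmax(np.abs(X))]       # dominant frequency of the window
```

The frequency axis returned by `rfftfreq` is what the band limits of step 3 are applied to.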
Further, in step 3, the feature values are the differential entropy and power spectral density of each frequency band, extracted from the signals obtained by the Fourier transform.
Further, the specific steps of step 3 are:

The differential entropy of the five important frequency bands is calculated using the following derivation.

Represent the source X by its average information content. Divide the range of values of X into small intervals of width Δ; by the mean value theorem, each interval i always contains a point x_i for which the following holds:

p(x_i)Δ = ∫_{iΔ}^{(i+1)Δ} f(x)dx

Assign every point in interval i to x_i and substitute into the Shannon entropy formula for a discrete variable:

H_Δ = −Σ_i p(x_i)Δ ln[p(x_i)Δ] = −Σ_i p(x_i)Δ ln[p(x_i)] − Σ_i p(x_i)Δ lnΔ   (1)

As Δ approaches 0, Σ_i p(x_i)Δ tends to 1 and lnΔ tends to −∞, so the second term on the right-hand side of equation (1) diverges; the first term on the right-hand side of equation (1) is defined as the entropy of the continuous source, called the differential entropy, which can therefore be written as:

H(x) = −∫_X f(x)ln[f(x)]dx   (2)

where f(x) is the probability density function of the signal. Assuming the EEG data of the signal source follow the normal distribution N(μ, σ²), substituting into equation (2) and solving for the differential entropy gives:

H(x) = (1/2)ln(2πeσ²)   (3)

From equation (3), the differential entropy of x is obtained as soon as σ² is known; for the normal distribution N(μ, σ²), the variance is calculated as:

σ² = (1/N) Σ_{i=1}^{N} (x_i − μ)²   (4)

The spectral energy of a discrete signal is generally defined as E = Σ_k |X(k)|². Then, from the above equations, with the band variance taken as the average spectral power, the differential entropy of a certain frequency band is:

h = (1/2)ln(2πeE/N)   (5)

And the power spectral density can be represented by the formula:

P = E/N = (1/N) Σ_k |X(k)|²   (6)

By analyzing the band energy, i.e., the power spectral density, the emotional intensity of a person can be obtained; together with the other feature values, positive and negative emotions can then be classified by the multilayer perceptron.
Further, in step 5, the asymmetric entropy may also be input into the multilayer perceptron. It is calculated as follows:

1) Calculate the differential entropy of the three left-brain leads (FP1, F3, F7) in the β frequency band and of the three right-brain leads (FP2, F4, F8) in the β frequency band, respectively;

2) Calculate the differential entropy of the right-side lead divided by the difference between the differential entropies of the left-right symmetric leads, and the differential entropy of the right-side lead divided by their sum, as shown in formulas (8) and (9), where DE_L represents the differential entropy of the left-side lead and DE_R the differential entropy of the right-side lead;

3) Concatenate the calculated index1 and index2 as the asymmetric entropy DisEn, represented by equation (10):

DisEn = [index1, index2]   (10)
Furthermore, in step 4, the multilayer perceptron classifier adopts a transfer-learning and lifelong-learning architecture, so classification accuracy can be improved through further training. Mimicking biological neurons, the multilayer perceptron feeds the feature values into its input layer; each node of the hidden and output layers computes one linear transformation and applies a linear activation function, and classification accuracy is improved by adding hidden layers. A model is trained by inputting a training data set; the method is essentially equivalent to an ensemble of generalized linear regression models, so the classification task can be completed.
Further, step 5 is specifically implemented by reloading the classification model stored in step 4 and applying it to emotion classification of the acquired data; to reduce the influence of accidental factors, each analysis processes the EEG data within 1 s, i.e., one segment of data, so that real-time emotion classification is realized.
Compared with the prior art, the main contributions and characteristics of the invention are:
by changing the classification algorithm and applying transfer learning, the pre-training models are input into the network, and each model trained later is input into the network as a new pre-training model, so that the classification models can be updated in real time, namely, the model can be learned for the whole life. The classification accuracy is improved along with the increase of data samples, and all data do not need to be input into the network again for long-time retraining; compared with the traditional method, the method can continuously adjust the model in real time along with data acquisition and improve the accuracy of emotion classification, so that the classification method has more practical application value.
Drawings
The invention will be further described with reference to the accompanying drawings.
FIG. 1 is a flow chart of the present invention for obtaining a pre-training model.
FIG. 2 is a flow chart of classifier migration learning in the present invention.
Detailed Description
The following detailed description of specific embodiments of the invention is provided, but it should be understood that the scope of the invention is not limited to the specific embodiments.
This embodiment provides an emotion classification method based on a multilayer perceptron, which comprises the following steps:
the method comprises the following steps of 1, filtering noise of electroencephalogram signals through a ThinkGearASIC chip, obtaining amplified original electroencephalogram signals obtained by the chip and electroencephalogram signal waveband data such as α and the like output through wireless communication, and as a further technical scheme, collecting electroencephalogram signal data of left and right hemispheres of a brain respectively by adopting a dual-channel device, calculating asymmetric entropy characteristics of the electroencephalogram signal data, and using the electroencephalogram signal data as input of a classifier.
Step 2: carrying out Fourier transform on the acquired signals, and converting the signals into analysis on a frequency domain:
using the formula

X(k) = Σ_{n=0}^{N−1} x(n) e^{−j2πnk/N},  k = 0, 1, …, N−1

the analysis and processing of the signal based on the time domain is converted into analysis of the frequency domain.
Step 3: Extract features from the acquired EEG signals, calculating the differential entropy and power spectral density of the five important frequency bands respectively. The specific operations are as follows:
The differential entropy of the five important frequency bands is calculated using the following derivation.

Represent the source X by its average information content. Divide the range of values of X into small intervals of width Δ; by the mean value theorem, each interval i always contains a point x_i for which the following holds:

p(x_i)Δ = ∫_{iΔ}^{(i+1)Δ} f(x)dx

Assign every point in interval i to x_i and substitute into the Shannon entropy formula for a discrete variable:

H_Δ = −Σ_i p(x_i)Δ ln[p(x_i)Δ] = −Σ_i p(x_i)Δ ln[p(x_i)] − Σ_i p(x_i)Δ lnΔ   (1)

As Δ approaches 0, Σ_i p(x_i)Δ tends to 1 and lnΔ tends to −∞, so the second term on the right-hand side of equation (1) diverges; the first term on the right-hand side of equation (1) is defined as the entropy of the continuous source, called the differential entropy, which can therefore be written as:

H(x) = −∫_X f(x)ln[f(x)]dx   (2)

where f(x) is the probability density function of the signal. We assume that the EEG data of the signal source follow the normal distribution N(μ, σ²); substituting into equation (2) and solving for the differential entropy gives:

H(x) = (1/2)ln(2πeσ²)   (3)

From equation (3), the differential entropy of x is obtained as soon as σ² is known; for the normal distribution N(μ, σ²), the variance is calculated as:

σ² = (1/N) Σ_{i=1}^{N} (x_i − μ)²   (4)

The spectral energy of a discrete signal is generally defined as E = Σ_k |X(k)|². Then, from the above equations, with the band variance taken as the average spectral power, the differential entropy of a certain frequency band is:

h = (1/2)ln(2πeE/N)   (5)

The differential entropy feature therefore reduces the error caused by excessively large band-energy values in calculation and improves the accuracy of the feature.

And the power spectral density can be represented by the formula:

P = E/N = (1/N) Σ_k |X(k)|²   (6)

By analyzing the band energy, i.e., the power spectral density, the emotional intensity of a person can be obtained; together with the other feature values, positive and negative emotions can be classified by the multilayer perceptron.
As an optional optimization, the asymmetric entropy can also be input into the multilayer perceptron in this process. It is calculated as follows:

Calculate the differential entropy of the three left-brain leads (FP1, F3, F7) in the β frequency band and of the three right-brain leads (FP2, F4, F8) in the β frequency band, respectively.

Calculate the differential entropy of the right-side lead divided by the difference between the differential entropies of the left-right symmetric leads, and the differential entropy of the right-side lead divided by their sum, as shown in formulas (9) and (10), where DE_L represents the differential entropy of the left-side lead and DE_R the differential entropy of the right-side lead.

The calculated index1 and index2 are concatenated as the asymmetric entropy DisEn, represented by equation (11):

DisEn = [index1, index2]   (11)
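The asymmetric entropy can be sketched as follows. Because the formula images (9)-(11) are not reproduced in this text, the two indices below implement one literal reading of the description and are an assumption, not the authoritative definition.

```python
import numpy as np

def asymmetric_entropy(de_left, de_right):
    """Asymmetric entropy DisEn, assuming a literal reading of the
    description (the patent's formulas (9)-(11) are not reproduced).
    de_left / de_right: beta-band differential entropy of the left
    leads (FP1, F3, F7) and right leads (FP2, F4, F8)."""
    de_left = np.asarray(de_left, dtype=float)
    de_right = np.asarray(de_right, dtype=float)
    index1 = de_right / (de_left - de_right)   # right DE over left-right difference (assumed)
    index2 = de_right / (de_left + de_right)   # right DE over left-right sum (assumed)
    return np.concatenate([index1, index2])    # DisEn = [index1, index2]

dis_en = asymmetric_entropy([1.2, 1.0, 0.9], [1.4, 1.1, 0.8])
```

The six concatenated values are appended to the feature vector fed to the classifier.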
and 4, step 4: and inputting the characteristic values into a multi-layer perceptron, and training the model.
Mimicking biological neurons, the multilayer perceptron feeds the feature values into its input layer; each node of the hidden and output layers computes one linear transformation and applies a linear activation function, and classification accuracy is improved by adding hidden layers. A model is trained by inputting the training data set. The method is essentially equivalent to an ensemble of generalized linear regression models, so the classification task can be completed.
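A sketch of the training in step 4 using scikit-learn's MLPClassifier, an assumed implementation: the patent names no framework, and the layer width, the synthetic feature matrix, and the stand-in labels are illustrative. `activation="identity"` mirrors the linear activation the description mentions.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Assumed feature layout: 5 bands x (DE + PSD) = 10 features per window
X = rng.standard_normal((200, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # stand-in positive/negative labels

# One hidden layer with a linear ("identity") activation, per the description
clf = MLPClassifier(hidden_layer_sizes=(32,), activation="identity",
                    max_iter=500, random_state=0)
clf.fit(X, y)
acc = clf.score(X, y)   # training accuracy of the fitted classifier
```

With a linear activation the network is, as the description notes, an ensemble of linear models, which suffices for the linearly structured labels above.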
Step 5: Perform real-time emotion classification with the classification model trained in step 4.
The specific implementation is to reload the classification model stored in step 4 and apply it to emotion classification of the acquired data. To reduce the influence of accidental factors, each analysis processes the EEG data within 1 s, i.e., one segment of data. In this way, real-time classification of emotions is achieved.
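The 1 s windowed classification of step 5 can be sketched as below; `extract_features` and `predict` are hypothetical stand-ins for the feature extraction of step 3 and the model loaded from step 4.

```python
import numpy as np

def classify_stream(signal, fs, extract_features, predict):
    """Split an EEG stream into non-overlapping 1 s windows and label
    each window with the loaded classifier, as described in step 5."""
    labels = []
    for i in range(len(signal) // fs):
        window = signal[i * fs:(i + 1) * fs]          # exactly one second of data
        labels.append(predict(extract_features(window)))
    return labels

fs = 256
stream = np.random.default_rng(2).standard_normal(fs * 5)   # 5 s of synthetic EEG
# Hypothetical stand-ins: mean amplitude as the feature, its sign as the "emotion"
labels = classify_stream(stream, fs,
                         extract_features=lambda w: float(np.mean(w)),
                         predict=lambda f: "positive" if f > 0 else "negative")
```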
Step 6: Update the model.
Using a pre-training model, the adopted neural network directly reuses the corresponding structure and weights and is trained together with newly acquired data, thereby realizing transfer learning. Each time new data are acquired, one can choose whether to further update the model obtained from the last update: if yes, the model is adjusted; if not, the previous model is used to classify and predict emotion. In this way the model is updated continuously, lifelong learning is realized, and the classification model is continuously optimized.
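The update of step 6 can be sketched with scikit-learn's `partial_fit`, which continues training the existing weights on the newly acquired batch only instead of retraining from scratch. This is an assumed implementation; the patent names no framework, and the random features and labels are illustrative.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(3)
X0, y0 = rng.standard_normal((100, 10)), rng.integers(0, 2, 100)  # original data
X1, y1 = rng.standard_normal((20, 10)), rng.integers(0, 2, 20)    # newly acquired data

# Initial "pre-trained" model (step 4)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=200, random_state=0)
clf.fit(X0, y0)
w_before = clf.coefs_[0].copy()

# Model update (step 6): continue from the existing structure and weights,
# training on the new batch only
clf.partial_fit(X1, y1)
changed = not np.allclose(w_before, clf.coefs_[0])
```

The weights move after the incremental step, so the existing model is refined rather than rebuilt, which is the transfer/lifelong-learning behaviour the description claims.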
The innovation of the invention is that, by changing the classification algorithm and applying transfer learning, a pre-training model is input into the network, and each subsequently trained model is input into the network as the new pre-training model, so the classification model can be updated in real time; that is, the model is capable of lifelong learning. Classification accuracy improves as data samples grow, without feeding all data through the network again for lengthy retraining. Compared with traditional methods, this method continuously adjusts the model in real time as data are acquired and improves the accuracy of emotion classification, so the classification method has more practical application value.
The above description covers only specific embodiments of the present invention, but the scope of the invention is not limited thereto; any approximate changes or substitutions fall within the scope of the invention, which is therefore defined by the claims.
Claims (9)
1. An emotion classification method based on a multilayer perceptron, characterized by comprising the following steps:
Step 1: denoise the acquired electroencephalogram (EEG) signals;
Step 2: apply the Fourier transform to the signals, converting them to frequency-domain analysis;
Step 3: extract features, namely calculate the differential entropy and power spectral density of five important frequency bands of the EEG signals;
Step 4: input the feature values into a multilayer perceptron to train a classification model;
Step 5: load the trained model into the classification algorithm to classify emotion in real time, and complete further training of the model with new data acquired in real time.
2. The multi-layered perceptron-based emotion classification method of claim 1, characterized in that: in step 1, the extracted signals are EEG signals of the prefrontal area of the brain.
3. The multi-layered perceptron-based emotion classification method of claim 2, characterized in that: in step 1, a dual-channel device is adopted to acquire EEG data of the left and right hemispheres respectively, and the classification recognition rate is improved by calculating the asymmetric-entropy features and using them as classifier input.
5. The multi-layered perceptron-based emotion classification method of claim 1, characterized in that: in step 3, the feature values are the differential entropy and power spectral density of each frequency band, extracted from the signals obtained by the Fourier transform.
6. The multi-layered perceptron-based emotion classification method of claim 5, characterized in that: the specific steps of step 3 are as follows:

The differential entropy of the five important frequency bands is calculated using the following derivation.

Represent the source X by its average information content. Divide the range of values of X into small intervals of width Δ; by the mean value theorem, each interval i always contains a point x_i for which the following holds:

p(x_i)Δ = ∫_{iΔ}^{(i+1)Δ} f(x)dx

Assign every point in interval i to x_i and substitute into the Shannon entropy formula for a discrete variable:

H_Δ = −Σ_i p(x_i)Δ ln[p(x_i)Δ] = −Σ_i p(x_i)Δ ln[p(x_i)] − Σ_i p(x_i)Δ lnΔ   (1)

As Δ approaches 0, Σ_i p(x_i)Δ tends to 1 and lnΔ tends to −∞, so the second term on the right-hand side of equation (1) diverges; the first term on the right-hand side of equation (1) is defined as the entropy of the continuous source, called the differential entropy, which can therefore be written as:

H(x) = −∫_X f(x)ln[f(x)]dx   (2)

where f(x) is the probability density function of the signal. Assuming the EEG data of the signal source follow the normal distribution N(μ, σ²), substituting into equation (2) and solving for the differential entropy gives:

H(x) = (1/2)ln(2πeσ²)   (3)

From equation (3), the differential entropy of x is obtained as soon as σ² is known; for the normal distribution N(μ, σ²), the variance is calculated as:

σ² = (1/N) Σ_{i=1}^{N} (x_i − μ)²   (4)

The spectral energy of a discrete signal is generally defined as E = Σ_k |X(k)|². Then, from the above equations, with the band variance taken as the average spectral power, the differential entropy of a certain frequency band is:

h = (1/2)ln(2πeE/N)   (5)

And the power spectral density can be represented by the formula:

P = E/N = (1/N) Σ_k |X(k)|²   (6)

By analyzing the band energy, i.e., the power spectral density, the emotional intensity of a person can be obtained; together with the other feature values, positive and negative emotions can be classified by the multilayer perceptron.
7. The multi-layered perceptron-based emotion classification method of claim 5, characterized in that: in step 5, the asymmetric entropy may also be input into the multilayer perceptron; it is calculated as follows:

1) Calculate the differential entropy of the three left-brain leads (FP1, F3, F7) in the β frequency band and of the three right-brain leads (FP2, F4, F8) in the β frequency band, respectively;

2) Calculate the differential entropy of the right-side lead divided by the difference between the differential entropies of the left-right symmetric leads, and the differential entropy of the right-side lead divided by their sum, as shown in formulas (8) and (9), where DE_L represents the differential entropy of the left-side lead and DE_R the differential entropy of the right-side lead;

3) Concatenate the calculated index1 and index2 as the asymmetric entropy DisEn, represented by equation (10):

DisEn = [index1, index2]   (10)
8. The multi-layered perceptron-based emotion classification method of claim 1, characterized in that: in step 4, the multilayer perceptron classifier adopts a transfer-learning and lifelong-learning architecture, and classification accuracy can be improved through further training; mimicking biological neurons, the multilayer perceptron feeds the feature values into its input layer, each node of the hidden and output layers computes one linear transformation and applies a linear activation function, and classification accuracy is improved by adding hidden layers; a model is trained by inputting a training data set; the method is essentially equivalent to an ensemble of generalized linear regression models, so the classification task can be completed.
9. The multi-layered perceptron-based emotion classification method of claim 1, characterized in that: step 5 is specifically implemented by reloading the classification model stored in step 4 and applying it to emotion classification of the acquired data; to reduce the influence of accidental factors, each analysis processes the EEG data within 1 s, i.e., one segment of data, so that real-time emotion classification is realized.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202010013712.2A | 2020-01-07 | 2020-01-07 | Emotion classification method based on multilayer perceptron |

Applications Claiming Priority (1)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202010013712.2A | 2020-01-07 | 2020-01-07 | Emotion classification method based on multilayer perceptron |

Publications (1)

| Publication Number | Publication Date |
| --- | --- |
| CN111012340A | 2020-04-17 |
Family
ID=70202358

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN202010013712.2A (Pending) | Emotion classification method based on multilayer perceptron | 2020-01-07 | 2020-01-07 |

Country Status (1)

| Country | Link |
| --- | --- |
| CN | CN111012340A |
Cited By (4)

| Publication number | Priority date | Publication date | Assignee | Title |
| --- | --- | --- | --- | --- |
| CN111671445A | 2020-04-20 | 2020-09-18 | 广东食品药品职业学院 | Consciousness disturbance degree analysis method |
| CN112220479A | 2020-09-04 | 2021-01-15 | 陈婉婷 | Genetic algorithm-based examined individual emotion judgment method, device and equipment |
| CN113229818A | 2021-01-26 | 2021-08-10 | 南京航空航天大学 | Cross-subject personality prediction system based on electroencephalogram signals and transfer learning |
| CN114722950A | 2022-04-14 | 2022-07-08 | 武汉大学 | Multi-modal multivariate time sequence automatic classification method and device |
Citations (4)

| Publication number | Priority date | Publication date | Assignee | Title |
| --- | --- | --- | --- | --- |
| US6609024B1 | 1998-11-12 | 2003-08-19 | Electronics and Telecommunications Research Institute | Method of making a judgment on emotional positivity or negativity by detecting asymmetry of brain waves of left and right cerebral hemispheres |
| US20070060831A1 | 2005-09-12 | 2007-03-15 | Le Tan T T | Method and system for detecting and classifying the mental state of a subject |
| CN109984759A | 2019-03-15 | 2019-07-09 | 北京数字新思科技有限公司 | Method and device for acquiring individual emotional information |
| CN110443352A | 2019-07-12 | 2019-11-12 | 阿里巴巴集团控股有限公司 | Semi-automatic neural network tuning method based on transfer learning |
Non-Patent Citations (3)

| Title |
| --- |
| 柳长源 et al.: "Emotion feature extraction and classification based on EEG signals" (基于脑电信号的情绪特征提取与分类), 《传感技术学报》 |
| 聂聃 et al.: "A survey of emotion recognition based on EEG" (基于脑电的情绪识别研究综述), 《中国生物医学工程学报》 |
| 蒋长锦 et al.: "Fast Fourier Transform and Its C Programs" (《快速傅里叶变换及其C程序》), 中国科学技术大学出版社, 31 August 2004 |
Similar Documents
| Publication | Publication Date | Title |
| --- | --- | --- |
CN111012340A (en) | Emotion classification method based on multilayer perceptron | |
CN107844755B (en) | Electroencephalogram characteristic extraction and classification method combining DAE and CNN | |
CN114052735B (en) | Deep field self-adaption-based electroencephalogram emotion recognition method and system | |
CN110969108B (en) | Limb action recognition method based on autonomic motor imagery electroencephalogram | |
CN110353702A (en) | A kind of emotion identification method and system based on shallow-layer convolutional neural networks | |
Hemanth et al. | Brain signal based human emotion analysis by circular back propagation and Deep Kohonen Neural Networks | |
CN111709267B (en) | Electroencephalogram signal emotion recognition method of deep convolutional neural network | |
CN112869711B (en) | Automatic sleep staging and migration method based on deep neural network | |
CN113128552B (en) | Electroencephalogram emotion recognition method based on depth separable causal graph convolution network | |
CN112244873A (en) | Electroencephalogram time-space feature learning and emotion classification method based on hybrid neural network | |
CN113158964B (en) | Sleep stage method based on residual error learning and multi-granularity feature fusion | |
CN106955112A (en) | Brain wave Emotion recognition method based on Quantum wavelet neural networks model | |
CN109871831A (en) | A kind of emotion identification method and system | |
Cheng et al. | Emotion recognition algorithm based on convolution neural network | |
CN112465069A (en) | Electroencephalogram emotion classification method based on multi-scale convolution kernel CNN | |
CN109222966A (en) | A kind of EEG signals sensibility classification method based on variation self-encoding encoder | |
CN112001305B (en) | Feature optimization SSVEP asynchronous recognition method based on gradient lifting decision tree | |
CN113723557A (en) | Depression electroencephalogram classification system based on multiband time-space convolution network | |
CN113180659A (en) | Electroencephalogram emotion recognition system based on three-dimensional features and cavity full convolution network | |
CN114190944A (en) | Robust emotion recognition method based on electroencephalogram signals | |
CN114091529A (en) | Electroencephalogram emotion recognition method based on generation countermeasure network data enhancement | |
CN117786497A (en) | Brain electricity emotion recognition method and system based on colistin predation optimization algorithm | |
Ge et al. | Sleep stages classification using neural networks with multi-channel neural data | |
CN116628420A (en) | Brain wave signal processing method based on LSTM neural network element learning | |
Wang et al. | Automatic epileptic seizure detection based on EEG using a moth-flame optimization of one-dimensional convolutional neural networks |
Legal Events

| Date | Code | Title |
| --- | --- | --- |
| | PB01 | Publication |
| | SE01 | Entry into force of request for substantive examination |