CN113017628B - Consciousness and emotion recognition method and system integrating ERP components and nonlinear features - Google Patents

Consciousness and emotion recognition method and system integrating ERP components and nonlinear features

Info

Publication number
CN113017628B
CN113017628B (application CN202110156080.XA)
Authority
CN
China
Prior art keywords
features
erp
emotion
mmse
slope
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110156080.XA
Other languages
Chinese (zh)
Other versions
CN113017628A (en)
Inventor
张敏
郑向伟
嵇存
郑法
邹秀楠
胡斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhonghao Liying Technology (Beijing) Co.,Ltd.
Original Assignee
Shandong Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Normal University filed Critical Shandong Normal University
Priority to CN202110156080.XA priority Critical patent/CN113017628B/en
Publication of CN113017628A publication Critical patent/CN113017628A/en
Application granted granted Critical
Publication of CN113017628B publication Critical patent/CN113017628B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/165Evaluating the state of mind, e.g. depression, anxiety
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7203Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7225Details of analog processing, e.g. isolation amplifier, gain or sensitivity adjustment, filtering, baseline or drift compensation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Artificial Intelligence (AREA)
  • Psychiatry (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Physiology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Power Engineering (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention discloses a consciousness and emotion recognition method and system fusing event-related potential (ERP) components and nonlinear features. First, the original electroencephalogram (EEG) signal is preprocessed; the preprocessing steps include artifact removal, superposition averaging and channel selection. Second, an ERP feature matching and extraction method based on the Shapelet technique is provided, which automatically extracts ERP features such as N200, P300 and N300. Third, an integrated time-frequency analysis method combining variational mode decomposition (VMD) and wavelet packet decomposition (WPD) is adopted to obtain better time-frequency resolution and to capture the nonlinear characteristics of the EEG signal; the nonlinear feature MMSE (multi-scale sample entropy) is then extracted and fused with the ERP features to form an emotion feature vector. Finally, the fused feature vector is input into a random forest classifier to recognize conscious and unconscious emotion.

Description

Consciousness and emotion recognition method and system integrating ERP components and nonlinear features
Technical Field
The application relates to the technical field of emotion recognition, in particular to a method and a system for recognizing consciousness and emotion by fusing event-related potential (ERP) components and nonlinear features.
Background
The statements in this section merely provide background information related to the present disclosure and may not constitute prior art.
Emotional state strongly influences quality of life and even physical and mental health. Positive emotions make people hopeful about life and promote daily living, work and study, whereas negative emotions do the opposite and may even lead to serious mental illness or loss of life. With the development of brain and cognitive neuroscience, electroencephalogram (EEG) signals have gradually become one of the research hotspots in the field of human-computer interaction, because EEG signals are non-invasive, objective, difficult to disguise, of high temporal resolution and low cost, and can truly reflect effective information about human emotional states.
How to automatically extract features that reflect the essential characteristics of EEG signals and thereby improve the accuracy of emotion recognition is a primary concern of researchers and a long-standing research goal. Time-frequency analysis can simultaneously represent the instantaneous changes of a signal in the time and frequency domains, and is a powerful tool for processing non-stationary and nonlinear time series. In order to effectively identify different emotions, this disclosure combines a time-frequency analysis method with a traditional feature extraction method and proposes a conscious emotion recognition method that fuses ERP components with the nonlinear feature MMSE.
In the course of implementing the present disclosure, the inventors found that the following technical problems exist in the prior art:
with the development and refinement of feature extraction methods, a single feature extraction method can no longer meet current research needs, and some researchers have proposed combination strategies, for example combining a time-frequency-domain method with a nonlinear method to achieve a better recognition effect. Research in cognitive neuroscience has found that not all frequency bands are effective for emotion recognition, and that different EEG frequency bands are closely related to different brain activities. For the selection of EEG frequency bands, most researchers rely on the experience of predecessors. In the band-extraction stage, the Discrete Wavelet Transform (DWT) does not decompose high-frequency signals finely enough, so emotional information is easily lost. Wavelet Packet Decomposition (WPD) achieves a more detailed signal decomposition than DWT, and its resolution of the high-frequency part is better. Studies have shown that hybrid time-frequency analysis methods can effectively analyze physiological signals such as EEG, electrooculogram, electrocardiogram and galvanic skin response signals. Studies by Kabir and Das et al. indicate that integrating Empirical Mode Decomposition (EMD) with DWT can achieve higher classification accuracy than EMD or DWT alone. Inspired by this, the method of the present invention integrates the Variational Mode Decomposition (VMD) method with WPD to obtain higher time-frequency resolution and to better capture the nonlinear brain characteristics reflected in EEG signals.
Psychological research shows that ERP components such as N200, P300, N300 and LPC differ significantly between conscious and unconscious processing of emotion, yet these ERP components have not been used as features for emotion recognition; they are usually identified and analyzed only by psychology professionals, which is time-consuming and labor-intensive, so automatic ERP component recognition and extraction is particularly important. Therefore, an automatic ERP feature matching and extraction method based on the Shapelet technique is provided.
In summary, it is important to provide a new feature extraction method for recognizing conscious and unconscious emotion.
Disclosure of Invention
In order to overcome the defects of the prior art, the application provides a consciousness and emotion recognition method and system fusing ERP components and nonlinear features. First, the original electroencephalogram signal is preprocessed; the preprocessing steps include artifact removal, superposition averaging and channel selection. Second, an ERP feature matching and extraction method based on the Shapelet technique is provided, which automatically extracts ERP features such as N200, P300 and N300. Third, an integrated time-frequency analysis method combining variational mode decomposition (VMD) and wavelet packet decomposition (WPD) is adopted to obtain better time-frequency resolution and to capture the nonlinear characteristics of the EEG signal; the nonlinear feature MMSE is then extracted and fused with the ERP features to form an emotion feature vector. Finally, the fused feature vector is input into a random forest classifier to recognize conscious and unconscious emotion.
In a first aspect, the application provides a consciousness and emotion recognition method fusing ERP components and nonlinear features;
the consciousness and emotion recognition method fusing ERP components and nonlinear features comprises the following steps:
acquiring an original electroencephalogram signal, and preprocessing the original electroencephalogram signal;
extracting ERP characteristics from the preprocessed electroencephalogram signals;
extracting multi-scale sample entropy (MMSE) features from the preprocessed electroencephalogram signals;
fusing the ERP features and the multi-scale sample entropy MMSE features to obtain fused features;
and inputting the fusion features into the trained classifier, and outputting emotion recognition results.
In a second aspect, the present application provides a consciousness and emotion recognition system fusing ERP components and nonlinear features;
the consciousness and emotion recognition system fusing ERP components and nonlinear features comprises:
an acquisition module configured to: acquiring an original electroencephalogram signal, and preprocessing the original electroencephalogram signal;
an ERP feature extraction module configured to: extracting ERP characteristics from the preprocessed electroencephalogram signals;
a nonlinear feature extraction module configured to: extracting multi-scale sample entropy (MMSE) features from the preprocessed electroencephalogram signals;
a feature fusion module configured to: fusing the ERP features and the multi-scale sample entropy MMSE features to obtain fused features;
an emotion recognition module configured to: and inputting the fusion features into the trained classifier, and outputting emotion recognition results.
In a third aspect, the present application further provides an electronic device, including: one or more processors, one or more memories, and one or more computer programs; wherein a processor is connected to the memory, the one or more computer programs are stored in the memory, and when the electronic device is running, the processor executes the one or more computer programs stored in the memory, so as to make the electronic device execute the method according to the first aspect.
In a fourth aspect, the present application also provides a computer-readable storage medium for storing computer instructions which, when executed by a processor, perform the method of the first aspect.
In a fifth aspect, the present application also provides a computer program (product) comprising a computer program for implementing the method of any of the preceding first aspects when run on one or more processors.
Compared with the prior art, the beneficial effect of this application is:
the method disclosed by the invention comprises the following five parts: the system comprises a data preprocessing part, an ERP feature extraction part, a nonlinear feature MMSE extraction part, a feature vector fusion part and a classifier classification and prediction part. By analyzing that not all EEG bands are suitable for emotion recognition studies, the present disclosure extracts the EEG bands most relevant to emotion, i.e., the β and γ bands, and reconstructs them into a new emotion band. Aiming at the problems that the manual extraction of ERP features is time-consuming, labor-consuming and difficult to extract, an automatic ERP feature matching and extracting method based on a Shapelet technology is provided, and the ERP features such as N200, P300 and N300 which are related to consciousness and unconscious emotion are automatically extracted; aiming at the integrated time-frequency analysis method, nonlinear features can be captured better to achieve better recognition effect, the VMD-WPD-based nonlinear feature extraction method is provided, namely, the nonlinear feature MMSE is extracted on the basis of the VMD-WPD.
Artifacts are removed from the original EEG signals according to artifact-rejection criteria, superposition averaging is then performed, and channels related to consciousness and emotion are selected to obtain the preprocessed EEG data for each subject. Shapelet processing is performed on the preprocessed EEG data to obtain ERP features related to conscious and unconscious emotion, such as N200, P300 and N300. For extraction of the nonlinear feature MMSE, the preprocessed EEG data are first decomposed by VMD to obtain variational mode components (VMFs); the VMFs are then decomposed and reconstructed by wavelet packet decomposition to obtain the emotion band VMFβ+γ, and the MMSE of this band is calculated. The feature vector fusion part fuses the ERP features and the nonlinear feature MMSE to form a new feature vector for conscious emotion recognition. Finally, the feature vector is sent to a classifier for classification and prediction, thereby recognizing conscious and unconscious emotional states.
Advantages of additional aspects of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the application and, together with the description, serve to explain the application and are not intended to limit the application.
Fig. 1 is a flowchart of a conscious emotion recognition method based on combination of ERP components and nonlinear features MMSE in an embodiment of the present disclosure;
fig. 2 is a schematic diagram of matching and extracting ERP features in the embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a WPD binary tree with a sampling frequency of 1000Hz in an embodiment of the present disclosure;
figure 4 is a moving average coarse-grained process diagram of MMSE in an embodiment of the disclosure.
Detailed Description
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present application. As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should further be understood that the terms "comprises" and "comprising", and any variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The embodiments and features of the embodiments of the present invention may be combined with each other without conflict.
Example one
The embodiment provides a consciousness and emotion recognition method fusing ERP components and nonlinear features;
as shown in fig. 1, the consciousness and emotion recognition method fusing ERP components and nonlinear features comprises:
S101: acquiring an original electroencephalogram signal, and preprocessing the original electroencephalogram signal;
S102: extracting ERP features from the preprocessed electroencephalogram signals;
S103: extracting multi-scale sample entropy (MMSE) features from the preprocessed electroencephalogram signals;
S104: fusing the ERP features and the multi-scale sample entropy MMSE features to obtain fused features;
S105: inputting the fused features into the trained classifier, and outputting the emotion recognition result.
As one or more embodiments, the S101: acquiring an original electroencephalogram signal, and preprocessing the original electroencephalogram signal; the method specifically comprises the following steps:
removing artifacts from the acquired original electroencephalogram signals, then superposing and averaging, finally carrying out channel selection of conscious emotion recognition, selecting channels related to conscious emotion, and finally obtaining preprocessed electroencephalogram signals.
The channels related to consciousness and emotion include F3, FZ, F4, FC3, FCZ, FC4, C3, CZ and C4, where F denotes the frontal lobe (Frontal), C the central region (Central), and Z zero, i.e. the midline between the left and right hemispheres.
The electrode naming follows the international 10-20 electrode placement system, where F denotes the frontal lobe (Frontal), FP the prefrontal region (Frontal pole), T the temporal lobe (Temporal), O the occipital lobe (Occipital), P the parietal lobe (Parietal), C the central region (Central), and Z zero, i.e. the midline; FCZ, for example, lies on the midline between the frontal and central regions. Odd-numbered electrodes lie over the left hemisphere and even-numbered electrodes over the right hemisphere.
In step 101 of this embodiment, the present disclosure uses self-acquired consciousness and unconscious emotion data sets, and the data preprocessing mainly includes artifact removal, superposition averaging, and channel selection.
The artifact-rejection criterion is the proportion of artifact-contaminated signal relative to the total recording length of a subject; in general, if it exceeds 20%, the subject's data are considered invalid.
Superposition averaging is then applied to the screened EEG data of each subject, and the ERPs under the various conditions are averaged according to the emotion type and consciousness level of the stimuli. Superposition averaging cancels out most residual artifacts, making the data cleaner and yielding more useful information.
In terms of channel selection, using all channels introduces redundant information and increases computational complexity, because not all channels are related to emotion and cognition; the relevant activity is concentrated in specific regions, so full-channel computation only adds cost.
The N2 and N3 components are mainly distributed over the frontal and central regions, while P3 is mainly distributed over the central, frontal and parietal regions. Combining this with research on the psychological mechanisms of facial expression perception, the invention selects data from 18 channels, namely F3, FZ, F4, FC3, FCZ, FC4, C3, CZ, C4, CP3, CPZ, CP4, P3, PZ, P4, PO3, POZ and PO4.
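As an illustration of this preprocessing pipeline, the following Python sketch (a hypothetical implementation for illustration only; the patent does not prescribe code, and the array layout, the function name `preprocess` and the handling of the 20% threshold are assumptions) applies artifact rejection, condition-wise superposition averaging and selection of the 18 channels listed above:

```python
import numpy as np

# Channels named in this embodiment (10-20 system labels).
SELECTED_CHANNELS = ["F3", "FZ", "F4", "FC3", "FCZ", "FC4", "C3", "CZ", "C4",
                     "CP3", "CPZ", "CP4", "P3", "PZ", "P4", "PO3", "POZ", "PO4"]

def preprocess(epochs, channel_names, artifact_mask, condition_labels,
               reject_ratio=0.20):
    """Hypothetical preprocessing sketch.

    epochs:           array (n_trials, n_channels, n_samples) of segmented EEG
    channel_names:    list of channel labels matching axis 1 of `epochs`
    artifact_mask:    boolean array (n_trials,) marking artifact-contaminated trials
    condition_labels: array (n_trials,) coding emotion type x consciousness level
    """
    # 1) Artifact rejection: discard the subject if >20% of trials are contaminated.
    if artifact_mask.mean() > reject_ratio:
        raise ValueError("subject rejected: more than 20% of the data is artifactual")
    epochs = epochs[~artifact_mask]
    condition_labels = condition_labels[~artifact_mask]

    # 2) Superposition averaging: average trials within each condition to obtain ERPs.
    erps = {cond: epochs[condition_labels == cond].mean(axis=0)
            for cond in np.unique(condition_labels)}

    # 3) Channel selection: keep only the consciousness/emotion-related channels.
    idx = [channel_names.index(ch) for ch in SELECTED_CHANNELS if ch in channel_names]
    return {cond: erp[idx] for cond, erp in erps.items()}
```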
As one or more embodiments, the S102: extracting ERP characteristics from the preprocessed electroencephalogram signals; the method specifically comprises the following steps:
and performing ERP feature matching and extraction on the preprocessed electroencephalogram signals based on a Shapelet algorithm, and extracting a plurality of ERP features.
The extracted ERP features include N200, P300 and N300, where N denotes a negative-going wave (N200 is a negative wave occurring around 200 ms) and, similarly, P denotes a positive-going wave (P300 is a positive wave occurring around 300 ms).
All the extracted ERP features form a feature vector x1.
Further, the S102: extracting ERP characteristics from the preprocessed electroencephalogram signals; the detailed steps comprise:
S1020: manually extracting ERP features, and taking the manually extracted ERP features as known features;
S1021: generating candidate subsequences T with length equal to that of the known feature;
S1022: successively calculating the slope between any two points of the known feature Sk and storing the slopes in the set S'k;
S1023: successively calculating the slopes of the candidate subsequence T and storing them in the set T';
S1024: calculating the absolute value of the difference between corresponding slopes in S'k and T', and judging whether each absolute value is smaller than the user-defined similarity threshold ε; if so, executing the next step; if not, taking twice the absolute value of the slope difference as a penalty value;
S1025: finding all subsequences similar to the known feature according to the sum of the slope differences, wherein the subsequence with the minimum sum of slope differences is the most similar and is marked and output;
S1026: storing the most similar subsequence in x1.
In step S102 of this embodiment, ERP features are extracted using the Shapelet technique to form the feature vector x1. ERP components such as N200, P300, N300 and LPC differ significantly between conscious and unconscious processing of emotion, but these ERP components have not previously been used as features for emotion recognition. Therefore, the invention provides a feature matching and extraction algorithm based on the Shapelet technique, which realizes automatic identification and extraction of ERP components. The inputs to the algorithm are the preprocessed EEG signal S, the known feature subsequences Sk, and a user-defined similarity threshold ε.
The function of the algorithm is to retrieve unknown features that are similar to a specified shape within a specified time period. In other words, an unknown P300 is matched and output based on a known P300, where the known feature subsequences refer to the N200, P300 and N300 components manually extracted from the ERP. These subsequences are used to match and extract the unknown corresponding ERP components. The criterion for manually extracting the known ERP subsequences is the time window in which the corresponding ERP component occurs. In order to keep the dimension of the feature vector consistent, the length of each ERP feature window is uniformly set to 50 data points. Fig. 2 shows a schematic diagram of the matching and extraction process for the N300 feature.
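To make the matching procedure concrete, the following numpy sketch implements the slope-comparison idea of steps S1020-S1026; the function name, the use of slopes between consecutive points, and the returned values are illustrative assumptions rather than the patent's exact algorithm:

```python
import numpy as np

def match_erp_component(signal, known_feature, epsilon):
    """Slide a window the length of `known_feature` over `signal` and return the
    most similar subsequence according to the slope-difference score
    (smaller score = more similar)."""
    L = len(known_feature)
    known_slopes = np.diff(known_feature)          # slopes of the known feature (S'k)

    best_score, best_start = np.inf, 0
    for start in range(len(signal) - L + 1):
        candidate = signal[start:start + L]        # candidate subsequence T
        cand_slopes = np.diff(candidate)           # slopes of T (T')
        diff = np.abs(known_slopes - cand_slopes)  # |slope difference|
        # Slope differences at or above the similarity threshold are doubled as a penalty.
        score = np.where(diff < epsilon, diff, 2.0 * diff).sum()
        if score < best_score:
            best_score, best_start = score, start

    return signal[best_start:best_start + L], best_start, best_score

# Usage (illustrative): known_p300 is a manually extracted 50-point P300 template,
# erp_trace is one channel of the averaged ERP; x1 collects the matched features.
# matched, start, score = match_erp_component(erp_trace, known_p300, epsilon=0.5)
# x1 = np.concatenate([x1, matched])
```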
As one or more embodiments, the S103: extracting multi-scale sample entropy (MMSE) features from the preprocessed electroencephalogram signals; the method comprises the following steps:
The preprocessed electroencephalogram signals are subjected to variational mode decomposition and to wavelet packet decomposition and reconstruction to obtain reconstructed signals based on the β and γ frequency bands, and the multi-scale sample entropy MMSE of the reconstructed signals is calculated.
Further, the step S103: extracting multi-scale sample entropy MMSE features from the preprocessed electroencephalogram signals; the detailed steps comprise:
decomposing the preprocessed electroencephalogram signals by using a variational modal decomposition VMD to obtain components VMF of different levels;
decomposing each component VMF by using wavelet packet decomposition WPD, and reconstructing the decomposition result to obtain the emotion frequency band VMFβ+γ;
calculating the multi-scale sample entropy MMSE feature of each emotion band VMFβ+γ;
storing the multi-scale sample entropy MMSE features in x2.
Further, calculating the multi-scale sample entropy MMSE feature of each emotion band VMFβ+γ specifically comprises the following steps:
(1) Let Sθ denote the moving-average time series with scale factor θ; its generation process is shown in Fig. 4. Sθ is defined as follows:
Sθj = (1/θ)(sj + sj+1 + … + sj+θ-1), 1 ≤ j ≤ N − θ + 1 (1)
where s1, s2, …, sN is the original time series of length N.
(2) Calculate the sample entropy of the moving-average time series Sθ with delay time δ = θ; the result is named MMSE. The calculation formula is as follows:
MMSE(s,m,θ,r)=SampleEn(Sθ,m,δ=θ,r) (2)
where SampleEn is a computational function of sample entropy, m represents the embedding dimension, and r represents the similarity tolerance.
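A minimal Python sketch of equations (1) and (2) is given below; the choice of tolerance r as a fraction of the signal's standard deviation and the brute-force sample entropy implementation are assumptions for illustration:

```python
import numpy as np

def moving_average_series(s, theta):
    """Equation (1): moving-average coarse-grained series S^theta (length N - theta + 1)."""
    s = np.asarray(s, dtype=float)
    kernel = np.ones(theta) / theta
    return np.convolve(s, kernel, mode="valid")

def sample_entropy(x, m, delay, r):
    """Brute-force sample entropy with embedding dimension m, delay `delay`,
    tolerance r, and Chebyshev distance between templates."""
    x = np.asarray(x, dtype=float)
    n_templates = len(x) - m * delay          # templates that extend to length m+1
    if n_templates <= 1:
        return np.nan
    emb_m  = np.array([x[i:i + m * delay:delay]       for i in range(n_templates)])
    emb_m1 = np.array([x[i:i + (m + 1) * delay:delay] for i in range(n_templates)])

    def count_matches(emb):
        count = 0
        for i in range(len(emb) - 1):
            d = np.max(np.abs(emb[i + 1:] - emb[i]), axis=1)   # Chebyshev distance
            count += np.sum(d <= r)
        return count

    B = count_matches(emb_m)      # matches of length m
    A = count_matches(emb_m1)     # matches of length m+1
    return -np.log(A / B) if A > 0 and B > 0 else np.nan

def mmse(s, m, theta, r_factor=0.15):
    """Equation (2): MMSE(s, m, theta, r) = SampleEn(S^theta, m, delta=theta, r)."""
    s_theta = moving_average_series(s, theta)
    r = r_factor * np.std(s)      # tolerance as a fraction of the signal's std (assumption)
    return sample_entropy(s_theta, m, delay=theta, r=r)
```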
In step S103 of the present embodiment, the EEG frequency bands most effective for emotion recognition are β and γ frequency bands, which is consistent with the neuropsychology study about emotion frequency bands.
In the WPD process, the input signal is decomposed into a low-frequency part and a high-frequency part; each part is then further decomposed into its own low-frequency and high-frequency parts, and the process is repeated to obtain a complete binary wavelet packet tree.
This disclosure uses a five-level wavelet packet decomposition; a schematic diagram of the binary wavelet packet tree is shown in Fig. 3. When performing WPD, the db4 wavelet basis is chosen because its shape is close to that of normal EEG signals, which yields more accurate decomposition results.
Because an integrated time-frequency analysis method can better capture the nonlinear characteristics of EEG signals, a frequency-band reconstruction method based on VMD-WPD is provided: the β and γ frequency bands are extracted and reconstructed into the emotion band VMFβ+γ, and the nonlinear feature MMSE is then extracted on this basis. The inputs are the preprocessed electroencephalogram signal S, the number K of VMF components, the number of WPD decomposition layers i, and the scale factor θ.
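A sketch of this VMD-WPD band reconstruction in Python is shown below, assuming the third-party vmdpy package for VMD and PyWavelets for the five-level db4 WPD; the VMD parameters, the node-selection rule (keeping level-5 sub-bands whose nominal frequency range overlaps 13-45 Hz) and the summation of band-limited VMFs are illustrative assumptions:

```python
import numpy as np
import pywt                      # PyWavelets
from vmdpy import VMD            # assumption: the vmdpy package is installed

def emotion_band_vmf(signal, fs=1000, K=4, level=5, band=(13.0, 45.0)):
    """Decompose one EEG channel with VMD, keep the WPD sub-bands of each VMF
    that overlap the beta+gamma range, and reconstruct the emotion band."""
    # 1) VMD: alpha, tau, DC, init, tol are typical illustrative settings.
    alpha, tau, DC, init, tol = 2000, 0.0, 0, 1, 1e-7
    vmfs, _, _ = VMD(signal, alpha, tau, K, DC, init, tol)   # shape ~ (K, n_samples)

    sub_width = fs / 2.0 / (2 ** level)     # nominal width of each level-5 sub-band
    reconstructed = []
    for vmf in vmfs:
        wp = pywt.WaveletPacket(data=vmf, wavelet="db4", mode="symmetric", maxlevel=level)
        new_wp = pywt.WaveletPacket(data=None, wavelet="db4", mode="symmetric", maxlevel=level)
        # Frequency-ordered leaf nodes: node k roughly covers [k, k+1] * sub_width Hz.
        for k, node in enumerate(wp.get_level(level, order="freq")):
            lo, hi = k * sub_width, (k + 1) * sub_width
            if hi > band[0] and lo < band[1]:          # sub-band overlaps beta+gamma
                new_wp[node.path] = node.data
        reconstructed.append(new_wp.reconstruct(update=False))
    # Sum the band-limited VMFs to form the emotion band VMF_{beta+gamma}.
    return np.sum(reconstructed, axis=0)
```

Note that at a 1000 Hz sampling rate each level-5 sub-band is about 15.6 Hz wide, so the overlap rule above is only a coarse approximation of the β+γ range.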
As one or more embodiments, the S104: fusing the ERP features and the multi-scale sample entropy MMSE features to obtain fused features; the method specifically comprises the following steps:
the feature vector x1And the feature vector x2Performing feature fusion to form a new emotion feature vector x, namely x ═ x (x)1,x2)。
As one or more embodiments, the S105: inputting the fusion features into a trained classifier, and outputting emotion recognition results; the method specifically comprises the following steps:
and sending the emotion characteristic vector into a trained random forest classifier to perform classification and identification of conscious emotion and unconscious emotion states.
Further, the training of the trained random forest classifier comprises the following steps:
constructing a random forest classifier;
constructing a training set, wherein the training set comprises emotion feature vectors of known emotion recognition results;
and inputting the training set into a random forest classifier, and training the random forest classifier to obtain the trained random forest classifier.
In step S105 of this embodiment, the feature vector x is sent to a random forest classifier to identify conscious emotion and unconscious emotion. Since conventional classifiers are prone to overfitting, which can reduce classification accuracy, many researchers have improved accuracy by combining classifiers. In this context, the random forest classifier has emerged; it is one of the most important Bagging-based ensemble learning methods. A number of studies have shown that random forests are among the more popular classifiers in the field of emotion recognition.
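A short sketch of the fusion and classification stage using scikit-learn's RandomForestClassifier is given below; the variable names, hyper-parameters and hold-out split are assumptions, not settings prescribed by the patent:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# X1: ERP features (Shapelet-matched N200/P300/N300 segments), one row per sample.
# X2: MMSE features computed on the VMF_{beta+gamma} emotion band.
# y : labels coding conscious vs. unconscious emotional states.
def fuse_and_classify(X1, X2, y):
    X = np.concatenate([X1, X2], axis=1)          # feature fusion: x = (x1, x2)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, stratify=y, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0)  # Bagging-based ensemble
    clf.fit(X_train, y_train)
    y_pred = clf.predict(X_test)
    return clf, accuracy_score(y_test, y_pred)
```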
Example two
The embodiment provides a consciousness and emotion recognition system fusing ERP components and nonlinear features;
the consciousness and emotion recognition system fusing ERP components and nonlinear features comprises:
an acquisition module configured to: acquiring an original electroencephalogram signal, and preprocessing the original electroencephalogram signal;
an ERP feature extraction module configured to: extracting ERP characteristics from the preprocessed electroencephalogram signals;
a nonlinear feature extraction module configured to: extracting multi-scale sample entropy (MMSE) features from the preprocessed electroencephalogram signals;
a feature fusion module configured to: fusing the ERP features and the multi-scale sample entropy MMSE features to obtain fused features;
an emotion recognition module configured to: and inputting the fusion features into the trained classifier, and outputting emotion recognition results.
It should be noted here that the above acquisition module, ERP feature extraction module, nonlinear feature extraction module, feature fusion module and emotion recognition module correspond to steps S101 to S105 of the first embodiment; these modules implement the same examples and application scenarios as the corresponding steps, but are not limited to the contents disclosed in the first embodiment. It should be noted that the above modules, as part of a system, may be implemented in a computer system such as a set of computer-executable instructions.
In the foregoing embodiments, the description of each embodiment has an emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions in other embodiments.
The proposed system can be implemented in other ways. For example, the above-described system embodiments are merely illustrative, and for example, the division of the above-described modules is merely a logical division, and in actual implementation, there may be other divisions, for example, multiple modules may be combined or integrated into another system, or some features may be omitted, or not executed.
EXAMPLE III
The present embodiment also provides an electronic device, including: one or more processors, one or more memories, and one or more computer programs; wherein, a processor is connected with the memory, the one or more computer programs are stored in the memory, and when the electronic device runs, the processor executes the one or more computer programs stored in the memory, so as to make the electronic device execute the method according to the first embodiment.
It should be understood that in this embodiment, the processor may be a central processing unit (CPU); the processor may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, and so on. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory may include both read-only memory and random access memory, and may provide instructions and data to the processor, and a portion of the memory may also include non-volatile random access memory. For example, the memory may also store device type information.
In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or instructions in the form of software.
The method in the first embodiment may be directly implemented by a hardware processor, or by a combination of hardware and software modules in the processor. The software modules may be located in RAM, flash memory, ROM, PROM or EPROM, registers, or other storage media well known in the art. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the method in combination with its hardware. To avoid repetition, details are not described here.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
Example four
The present embodiments also provide a computer-readable storage medium for storing computer instructions, which when executed by a processor, perform the method of the first embodiment.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (9)

1. A consciousness and emotion recognition method fusing ERP components and nonlinear features, characterized by comprising the following steps:
acquiring an original electroencephalogram signal, and preprocessing the original electroencephalogram signal;
extracting ERP characteristics from the preprocessed electroencephalogram signals; the detailed steps comprise:
manually extracting ERP features, and taking the manually extracted ERP features as known features;
generating candidate subsequences T with length equal to that of the known feature;
successively calculating the slope between any two points of the known feature Sk and storing the slopes in the set S'k;
successively calculating the slopes of the candidate subsequence T and storing them in the set T';
calculating the absolute value of the difference between corresponding slopes in S'k and T', and judging whether each absolute value is smaller than the user-defined similarity threshold ε; if so, executing the next step; if not, taking twice the absolute value of the slope difference as a penalty value;
finding all subsequences similar to the known feature according to the sum of the slope differences, wherein the subsequence with the minimum sum of slope differences is the most similar and is marked and output;
storing the most similar subsequence in x1;
Extracting multi-scale sample entropy (MMSE) features from the preprocessed electroencephalogram signals;
fusing the ERP features and the multi-scale sample entropy MMSE features to obtain fused features;
and inputting the fusion features into the trained classifier, and outputting emotion recognition results.
2. The consciousness and emotion recognition method fusing ERP components and nonlinear features according to claim 1, wherein the original electroencephalogram signal is acquired and preprocessed; specifically comprising the following steps:
removing artifacts from the acquired original electroencephalogram signals, then superposing and averaging, finally carrying out channel selection of conscious emotion recognition, selecting channels related to conscious emotion, and finally obtaining preprocessed electroencephalogram signals.
3. The method of claim 1, wherein the ERP features are extracted from the preprocessed electroencephalogram signal; the method specifically comprises the following steps:
and performing ERP feature matching and extraction on the preprocessed electroencephalogram signals based on a Shapelet algorithm, and extracting a plurality of ERP features.
4. The consciousness and emotion recognition method fusing ERP components and nonlinear features according to claim 1, wherein the multi-scale sample entropy MMSE features are extracted from the preprocessed electroencephalogram signals; the method comprises the following steps:
and carrying out variation mode decomposition and wavelet packet decomposition and reconstruction on the preprocessed electroencephalogram signals to obtain reconstructed signals based on beta and gamma frequency bands, and calculating multi-scale sample entropy MMSE of the reconstructed signals.
5. The consciousness and emotion recognition method fusing ERP components and nonlinear features according to claim 1, wherein the multi-scale sample entropy MMSE features are extracted from the preprocessed electroencephalogram signals; the detailed steps comprise:
decomposing the preprocessed electroencephalogram signals by using a variational modal decomposition VMD to obtain components VMF of different levels;
decomposing each component VMF by using wavelet packet decomposition WPD, and reconstructing the decomposition result to obtain the emotion frequency band VMFβ+γ;
calculating the multi-scale sample entropy MMSE feature of each emotion band VMFβ+γ;
storing the multi-scale sample entropy MMSE features in x2.
6. The consciousness and emotion recognition method fusing ERP components and nonlinear features according to claim 1, wherein the fused features are input into the trained classifier and the emotion recognition result is output; the method specifically comprises the following steps:
and sending the emotion characteristic vector into a trained random forest classifier to perform classification and identification of conscious emotion and unconscious emotion states.
7. A consciousness and emotion recognition system fusing ERP components and nonlinear features, characterized by comprising:
an acquisition module configured to: acquiring an original electroencephalogram signal, and preprocessing the original electroencephalogram signal;
an ERP feature extraction module configured to: extracting ERP characteristics from the preprocessed electroencephalogram signals; the detailed steps comprise:
manually extracting ERP features, and taking the manually extracted ERP features as known features;
generating candidate subsequences T with length equal to that of the known feature;
successively calculating the slope between any two points of the known feature Sk and storing the slopes in the set S'k;
successively calculating the slopes of the candidate subsequence T and storing them in the set T';
calculating the absolute value of the difference between corresponding slopes in S'k and T', and judging whether each absolute value is smaller than the user-defined similarity threshold ε; if so, executing the next step; if not, taking twice the absolute value of the slope difference as a penalty value;
finding all subsequences similar to the known feature according to the sum of the slope differences, wherein the subsequence with the minimum sum of slope differences is the most similar and is marked and output;
storing the most similar subsequence in x1;
A nonlinear feature extraction module configured to: extracting multi-scale sample entropy (MMSE) features from the preprocessed electroencephalogram signals;
a feature fusion module configured to: fusing the ERP features and the multi-scale sample entropy MMSE features to obtain fused features;
an emotion recognition module configured to: and inputting the fusion features into the trained classifier, and outputting emotion recognition results.
8. An electronic device, comprising: one or more processors, one or more memories, and one or more computer programs; wherein a processor is connected to the memory, the one or more computer programs being stored in the memory, the processor executing the one or more computer programs stored in the memory when the electronic device is running, to cause the electronic device to perform the method of any of the preceding claims 1-6.
9. A computer-readable storage medium storing computer instructions which, when executed by a processor, perform the method of any one of claims 1 to 6.
CN202110156080.XA 2021-02-04 2021-02-04 Consciousness and emotion recognition method and system integrating ERP components and nonlinear features Active CN113017628B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110156080.XA CN113017628B (en) 2021-02-04 2021-02-04 Consciousness and emotion recognition method and system integrating ERP components and nonlinear features

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110156080.XA CN113017628B (en) 2021-02-04 2021-02-04 Consciousness and emotion recognition method and system integrating ERP components and nonlinear features

Publications (2)

Publication Number Publication Date
CN113017628A CN113017628A (en) 2021-06-25
CN113017628B (en) 2022-06-10

Family

ID=76459993

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110156080.XA Active CN113017628B (en) 2021-02-04 2021-02-04 Consciousness and emotion recognition method and system integrating ERP components and nonlinear features

Country Status (1)

Country Link
CN (1) CN113017628B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115192040B (en) * 2022-07-18 2023-08-11 天津大学 Electroencephalogram emotion recognition method and device based on poincare graph and second-order difference graph

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103230276A (en) * 2013-02-01 2013-08-07 上海中医药大学附属岳阳中西医结合医院 Device and method for quantitatively evaluating and recording human body subjective feeling
CN103577562A (en) * 2013-10-24 2014-02-12 河海大学 Multi-measurement time series similarity analysis method
CN104020845A (en) * 2014-03-27 2014-09-03 浙江大学 Acceleration transducer placement-unrelated movement recognition method based on shapelet characteristic
CN107871140A (en) * 2017-11-07 2018-04-03 哈尔滨工程大学 One kind is based on slope elasticity method for measuring similarity
CN108959704A (en) * 2018-05-28 2018-12-07 华北电力大学 A kind of rewards and punishments weight type simulation sequence similarity analysis method considering metamorphosis
CN110032585A (en) * 2019-04-02 2019-07-19 北京科技大学 A kind of time series bilayer symbolism method and device
CN110781945A (en) * 2019-10-22 2020-02-11 太原理工大学 Electroencephalogram signal emotion recognition method and system integrating multiple features
CN111310570A (en) * 2020-01-16 2020-06-19 山东师范大学 Electroencephalogram signal emotion recognition method and system based on VMD and WPD
CN111766473A (en) * 2020-06-30 2020-10-13 济南轨道交通集团有限公司 Power distribution network single-phase earth fault positioning method and system based on slope distance

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8392255B2 (en) * 2007-08-29 2013-03-05 The Nielsen Company (Us), Llc Content based selection and meta tagging of advertisement breaks
US20150045700A1 (en) * 2013-08-09 2015-02-12 University Of Washington Through Its Center For Commercialization Patient activity monitoring systems and associated methods
CN105342596A (en) * 2015-12-15 2016-02-24 深圳市珑骧智能科技有限公司 Method and system capable of increasing heart rate detection accuracy
CN110585590A (en) * 2019-09-24 2019-12-20 中国人民解放军陆军军医大学第一附属医院 Transcranial direct current stimulation device and data processing method
CN111134667B (en) * 2020-01-19 2024-01-26 中国人民解放军战略支援部队信息工程大学 Time migration emotion recognition method and system based on electroencephalogram signals

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103230276A (en) * 2013-02-01 2013-08-07 上海中医药大学附属岳阳中西医结合医院 Device and method for quantitatively evaluating and recording human body subjective feeling
CN103577562A (en) * 2013-10-24 2014-02-12 河海大学 Multi-measurement time series similarity analysis method
CN104020845A (en) * 2014-03-27 2014-09-03 浙江大学 Acceleration transducer placement-unrelated movement recognition method based on shapelet characteristic
CN107871140A (en) * 2017-11-07 2018-04-03 哈尔滨工程大学 One kind is based on slope elasticity method for measuring similarity
CN108959704A (en) * 2018-05-28 2018-12-07 华北电力大学 A kind of rewards and punishments weight type simulation sequence similarity analysis method considering metamorphosis
CN110032585A (en) * 2019-04-02 2019-07-19 北京科技大学 A kind of time series bilayer symbolism method and device
CN110781945A (en) * 2019-10-22 2020-02-11 太原理工大学 Electroencephalogram signal emotion recognition method and system integrating multiple features
CN111310570A (en) * 2020-01-16 2020-06-19 山东师范大学 Electroencephalogram signal emotion recognition method and system based on VMD and WPD
CN111766473A (en) * 2020-06-30 2020-10-13 济南轨道交通集团有限公司 Power distribution network single-phase earth fault positioning method and system based on slope distance

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Time series shapelets: a novel technique that allows accurate, interpretable and fast classification; Lexiang Ye, Eamonn Keogh; Data Mining and Knowledge Discovery; 2011-12-31; full text *
A shapelet classification method based on trend feature representation; Yan Xinming, Meng Fanrong, Yan Qiuyan; Journal of Computer Applications; 2017-12-31; Vol. 37, No. 8; full text *

Also Published As

Publication number Publication date
CN113017628A (en) 2021-06-25

Similar Documents

Publication Publication Date Title
Sharma et al. DepHNN: a novel hybrid neural network for electroencephalogram (EEG)-based screening of depression
Akbari et al. Schizophrenia recognition based on the phase space dynamic of EEG signals and graphical features
Brihadiswaran et al. EEG-based processing and classification methodologies for autism spectrum disorder: A review
Islam et al. A wavelet-based artifact reduction from scalp EEG for epileptic seizure detection
CN111310570B (en) Electroencephalogram signal emotion recognition method and system based on VMD and WPD
Hosni et al. Classification of EEG signals using different feature extraction techniques for mental-task BCI
Agarwal et al. Classification of alcoholic and non-alcoholic EEG signals based on sliding-SSA and independent component analysis
García-Martínez et al. Recognition of emotional states from EEG signals with nonlinear regularity-and predictability-based entropy metrics
CN113017628B (en) Consciousness and emotion recognition method and system integrating ERP components and nonlinear features
Baghdadi et al. Robust feature learning method for epileptic seizures prediction based on long-term EEG signals
Ghosh et al. Exploration of face-perceptual ability by EEG induced deep learning algorithm
KR et al. A multi-dimensional hybrid CNN-BiLSTM framework for epileptic seizure detection using electroencephalogram signal scrutiny
Hassan et al. Review of EEG Signals Classification Using Machine Learning and Deep-Learning Techniques
Nirabi et al. Machine Learning-Based Stress Level Detection from EEG Signals
Farokhah et al. Cross-subject channel selection using modified relief and simplified CNN-based deep learning for EEG-based emotion recognition
Jacob et al. EEG entropies as estimators for the diagnosis of encephalopathy
Torabi et al. Semantic category-based classification using nonlinear features and wavelet coefficients of brain signals
Wang et al. An epileptic EEG detection method based on data augmentation and lightweight neural network
Patil et al. Classification of human emotions using multiclass support vector machine
Jemal et al. An effective deep neural network architecture for cross-subject epileptic seizure detection in EEG data
Kinney-Lang et al. Elucidating age-specific patterns from background electroencephalogram pediatric datasets via PARAFAC
Jo et al. Channel-Aware Self-Supervised Learning for EEG-based BCI
CN114742107A (en) Method for identifying perception signal in information service and related equipment
Nagarajan et al. Pictorial Information Retrieval from EEG using Generative Adversarial Networks
Nancy et al. A brain EEG classification system for the mild cognitive impairment analysis

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20240315

Address after: 401329 No. 99, Xinfeng Avenue, Jinfeng Town, Gaoxin District, Jiulongpo District, Chongqing

Patentee after: Chongqing Science City Intellectual Property Operation Center Co.,Ltd.

Country or region after: China

Address before: 250014 No. 88, Wenhua East Road, Lixia District, Shandong, Ji'nan

Patentee before: SHANDONG NORMAL University

Country or region before: China

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20240819

Address after: Room 344, 3rd Floor, Building 6, No. 1 Courtyard, Shangdi 10th Street, Haidian District, Beijing 100080

Patentee after: Zhonghao Liying Technology (Beijing) Co.,Ltd.

Country or region after: China

Address before: 401329 No. 99, Xinfeng Avenue, Jinfeng Town, Gaoxin District, Jiulongpo District, Chongqing

Patentee before: Chongqing Science City Intellectual Property Operation Center Co.,Ltd.

Country or region before: China