CN110946576A - Visual evoked potential emotion recognition method based on width learning - Google Patents

Visual evoked potential emotion recognition method based on width learning

Info

Publication number
CN110946576A
CN110946576A (application CN201911411761.5A)
Authority
CN
China
Prior art keywords
emotion
pictures
electroencephalogram
testee
frequency
Prior art date
Legal status: Pending
Application number
CN201911411761.5A
Other languages
Chinese (zh)
Inventor
秦学斌 (Qin Xuebin)
王卓 (Wang Zhuo)
纪晨晨 (Ji Chenchen)
杨培娇 (Yang Peijiao)
李明桥 (Li Mingqiao)
申昱瞳 (Shen Yutong)
胡佳琛 (Hu Jiachen)
汪梅 (Wang Mei)
王湃 (Wang Pai)
Current Assignee
Xian University of Science and Technology
Original Assignee
Xian University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Xian University of Science and Technology
Priority to CN201911411761.5A
Publication of CN110946576A
Legal status: Pending


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/316 Modalities, i.e. specific diagnostic methods
    • A61B 5/369 Electroencephalography [EEG]
    • A61B 5/377 Electroencephalography [EEG] using evoked responses
    • A61B 5/378 Visual stimuli
    • A61B 5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/165 Evaluating the state of mind, e.g. depression, anxiety
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/725 Details of waveform analysis using specific filters therefor, e.g. Kalman or adaptive filters
    • A61B 5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Psychiatry (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Artificial Intelligence (AREA)
  • Psychology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Physiology (AREA)
  • Evolutionary Computation (AREA)
  • Mathematical Physics (AREA)
  • Fuzzy Systems (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Social Psychology (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention belongs to electroencephalogram (EEG) feature classification in the field of biometric recognition, and specifically relates to a visual evoked potential emotion recognition method based on width learning. The method comprises five steps: performing a suitability evaluation of the International Affective Picture System (IAPS); acquiring EEG signals generated by visual induction; preprocessing the data by band-pass filtering and related methods; extracting multiple features from the EEG signals with different methods; and classifying the EEG signals based on width learning. The designed experimental method evaluates the suitability of the IAPS, improving the accuracy of the experiment; preprocessing by band-pass filtering reduces fluctuation and differences between EEG signals recorded at different times; multi-feature extraction by the power spectral density method improves the robustness and efficiency of EEG emotion classification; and classifying emotion by the width learning method keeps the result from falling into local optima and saves computation cost.

Description

Visual evoked potential emotion recognition method based on width learning
Technical Field
The invention belongs to electroencephalogram (EEG) feature classification in the field of biometric recognition, and specifically relates to a visual evoked potential emotion recognition method based on width learning.
Background
Emotion is a reaction produced when a person is stimulated by the outside world; it is a general term for a series of subjective cognitive experiences, and a psychological and physiological state arising from the combination of feelings, thoughts, and behaviors. The stimuli include visual, auditory, olfactory, and other modalities. Emotion is not a single entity in the psychological world: it comprises both psychological and physiological responses, and it reflects a person's own needs and subjective attitudes. Emotion is accompanied by external expressions, which include facial expressions, posture, intonation, and sensory feedback. In essence, emotion is a psychological phenomenon mediated by the subject's wishes and needs, and it has three components: distinctive physiological arousal, subjective experience, and external expression. Meeting the needs and wishes of the subject produces positive emotion; failing to meet them produces negative emotion.
A brain-computer interface is a direct communication and control channel established between the human brain and a computer or other electronic device, through which a person can express ideas or operate devices directly with the brain, without language or movement. It can effectively enhance the ability of severely physically disabled patients to communicate with the outside world or control their external environment, improving their quality of life. Brain-computer interface technology is a cross-disciplinary technology involving neuroscience, signal detection, signal processing, pattern recognition, and other fields.
Research on emotion recognition is ongoing, but EEG-based emotion classification still faces many problems: how to evaluate the International Affective Picture System (IAPS) in picture-induced emotion experiments so that the experimental results are more accurate; how to effectively mitigate fluctuations and differences between subjects and between EEG signals recorded at different times; how to improve the robustness and efficiency of EEG emotion classification; and which algorithm to use for emotion classification so that time cost is saved as much as possible without significantly affecting the classification result.
Disclosure of Invention
In order to overcome these problems, the invention provides a visual evoked potential emotion recognition method based on width learning.
In order to achieve this purpose, the invention adopts the following technical scheme: a visual evoked potential emotion recognition method based on width learning, characterized by comprising the following specific steps:
Step one: performing a suitability evaluation of the International Affective Picture System (IAPS);
Step two: acquiring electroencephalogram signals generated by visual induction;
Step three: preprocessing the data by band-pass filtering and related methods;
Step four: extracting multiple features from the electroencephalogram signals with different methods;
Step five: classifying the electroencephalogram signals based on width learning.
The first step comprises the following steps:
2.1 Select 64 pictures from the IAPS and divide them into eight levels by valence (degree of pleasure); the higher the level, the higher the valence. Each level contains 8 pictures. A higher valence score represents a more positive emotion; the valence scale ranges from 1 to 9.
2.2 Minimize external interference with the subject. Each subject sits 1 m in front of the PC. The subjects comprise five males and five females aged 22 to 26, all informed of the purpose and procedure of the experiment. The experimental flow is as follows:
(1) Emotion preparation: a "ready" sign appears on the screen, prompting the subject to get ready; the preparation lasts 3 seconds;
(2) Picture induction: the subject views the displayed picture for 6 seconds and fully experiences the emotion the picture is meant to express;
(3) Emotion evaluation: after finishing each picture, the subject briefly evaluates and scores the suitability of the eight-level pictures selected for the experiment;
(4) Calming stage: after recalling the emotion of the induction stage, the subject has 10 seconds to calm down, and the procedure continues until all pictures have been shown;
2.3 After scoring is finished, the individual valence scores for each picture are averaged to obtain the picture's mean valence score.
The specific process of step two is as follows: the subject puts on the EEG cap; after the experimenter has calibrated the EEG acquisition equipment, there is a 30-second pause so that the subject can settle into the best physical and mental state before the experiment begins. During EEG acquisition, trials are organized in sessions; each session contains 3 pictures of the same type, and a 20-second relaxation period is added before the stimulus pictures are changed for the next session. The specific steps of the acquisition experiment for EEG signals produced by visual induction are:
3.1 Emotion preparation: a "ready" sign appears on the screen, prompting the subject to get ready for 3 seconds;
3.2 Picture induction: the subject views the displayed picture for 6 seconds while the computer records the emotional EEG signals; during this period the subject avoids moving as much as possible, is fully immersed in the picture content, and avoids blinking, reducing interference in the EEG signals;
3.3 Relaxation phase: the subject has 8 seconds to recover his or her emotional state for the next trial;
3.4 Replace the pictures and repeat the above steps until the experiment is finished.
In step three, the electroencephalogram signals obtained in step two are preprocessed to reduce artifact interference and improve the final classification recognition rate. The preprocessing includes reducing the sampling rate and band-pass filtering of different frequency bands. First, a band-pass filter extracts the signal between 0.5 Hz and 50 Hz; a Butterworth band-pass filter is used for the filtering.
The Butterworth low-pass filter can be expressed by the following squared-amplitude-versus-frequency equation:

|H(ω)|² = 1 / (1 + (ω/ω_c)^{2n})

where n is the order of the filter, ω_c is the cutoff frequency (the frequency at which the amplitude drops to -3 dB), ω_p is the passband edge frequency, and |H(ω_p)|² = 1/(1+ε²) is the value at the passband edge.
Since |H(ω)|² = H(s)H*(s) = H(s)H(-s) evaluated at the points s = jω of the complex plane, analytic continuation gives:

H(s)H(-s) = 1 / (1 + (-s²/ω_c²)^n)

The poles of this function are equidistantly distributed on a circle of radius ω_c:

s_k = ω_c · e^{jπ(2k+n+1)/(2n)},  k = 0, 1, ..., n-1

Therefore,

H(s) = ω_c^n / ∏_{k=0}^{n-1} (s - s_k)

The amplitude-frequency relationship of an nth-order Butterworth low-pass filter can be expressed as:

G²(ω) = |H(jω)|² = G₀² / (1 + (ω/ω_c)^{2n})

where G denotes the gain of the filter, H the transfer function, j the imaginary unit, n the filter order, ω the angular frequency of the signal in rad/s, and ω_c the cutoff frequency (at which the amplitude drops by 3 dB).
Setting the cutoff frequency ω_c = 1, the formula normalizes to:

|H(jω)|² = 1 / (1 + ω^{2n})
The specific process of step four is as follows:
The spectral density method is used for multi-feature extraction from the EEG signal. In the periodogram method, assume the random signal x(n) has N points and the Fourier transform of x(n) is X(e^{jω}); the power spectral density computed by the periodogram method is then:

P̂(e^{jω}) = (1/N) · |X(e^{jω})|²

The Welch method divides the data into M segments of L points each, allowing adjacent segments to overlap, and takes the mean of the M segment power spectral densities as the power spectral density of the original signal. First, the spectral estimate of each segment is obtained:

P̂_i(e^{jω}) = (1/(L·U)) · |Σ_{n=0}^{L-1} x_i(n) d(n) e^{-jωn}|²,  i = 1, ..., M

where d(n) is the window function; the Welch method can use a Hamming window, a Hanning window, or another non-rectangular window, which reduces part of the side-lobe leakage. U is a normalization factor given by:

U = (1/L) · Σ_{n=0}^{L-1} d²(n)

Finally, the M segment power spectra are obtained, and the power spectrum estimate of the original signal is their mean:

P̂(e^{jω}) = (1/M) · Σ_{i=1}^{M} P̂_i(e^{jω})

The preprocessed EEG signals are analyzed by this power spectral density method, and the frequency-domain features of the EEG signals are extracted.
The specific process of step five is as follows:
The input samples of the width learning method undergo one linear transformation, and the feature expressions are mapped onto the feature plane to form feature nodes; the feature nodes then undergo the nonlinear transformation of an activation function to generate enhancement nodes. The feature nodes and enhancement nodes are concatenated as the actual input signal of the system and linearly output through a connection matrix; the width learning method obtains this output connection matrix directly by the ridge-regression generalized inverse.
Let X ∈ R^{N×M} be the given input data, where N is the number of input samples and M the feature dimension of each sample vector. Assuming b feature nodes, the features on the feature plane obtained from the width structure are:

Z_{N×b} = X_{N×M} · W_e^{M×b}

where W_e is the optimal input weight matrix obtained by sparse autoencoding.
If d enhancement nodes are generated, the enhancement layer features can be expressed as:

H_{N×d} = φ(Z_{N×b} · W_h^{b×d} + β_h^{N×d})

where W_h and β_h denote a random matrix and a bias, respectively, and φ(·) is a selectable nonlinear activation function. Taking the combined matrix formed by concatenating the feature nodes and enhancement nodes as the actual input of the system, and assuming the output matrix is Y ∈ R^{N×Q}, the width model is obtained from:

Y_{N×Q} = A_{N×(b+d)} · W_{(b+d)×Q} = [Z_{N×b} | H_{N×d}] · W_{(b+d)×Q}

where A is the actual input matrix of the BLS (broad learning system) and W the output connection weight matrix; W is computed through the ridge-regression approximation of the pseudoinverse A⁺:

W = A⁺ · Y,  A⁺ = lim_{λ→0} (λI + AᵀA)^{-1} Aᵀ

The feature extraction result of step four is used as the input of the width learning system, and the output of the width learning system is obtained through the above algorithm.
The electroencephalogram cap adopts an Emotiv Epoc wireless portable electroencephalograph.
Compared with the prior art, the invention has the following beneficial effects: the designed experimental method performs a suitability evaluation of the International Affective Picture System (IAPS), increasing the accuracy of the experiment; preprocessing the data by band-pass filtering and related methods reduces fluctuation and differences between EEG signals recorded at different times; multi-feature extraction by the power spectral density method improves the robustness and efficiency of EEG emotion classification; and classifying emotion by the width learning method keeps the result from falling into local optima and saves computation cost.
Drawings
FIG. 1 is a flow chart of the width-learning-based visual evoked potential emotion recognition method of the present invention;
FIG. 2 is a raw plot of negative emotion induction according to the present invention;
FIG. 3 is a raw plot of neutral emotion induction according to the present invention;
FIG. 4 is a raw plot of positive emotion induction according to the present invention;
FIG. 5 shows the processing and analysis results of the raw 4-8 Hz positive-emotion-induced EEG signal at channel F3;
FIG. 6 shows the processing and analysis results of the raw 4-8 Hz neutral-emotion-induced EEG signal at channel F3;
FIG. 7 shows the processing and analysis results of the raw 4-8 Hz negative-emotion-induced EEG signal at channel F3;
FIG. 8 is a schematic structural diagram of a width learning method according to the present invention;
In the figure: 1 - output layer; 2 - feature layer; 3 - enhancement layer.
Detailed Description
To further explain the technical means and effects adopted by the invention to achieve its intended purpose, the embodiments, structures, features, and effects of the invention are described in detail below with reference to the accompanying drawings and preferred embodiments.
A visual evoked potential emotion recognition method based on width learning comprises the following specific steps:
Step one: performing a suitability evaluation of the International Affective Picture System (IAPS);
Step two: acquiring electroencephalogram signals generated by visual induction;
Step three: preprocessing the data by band-pass filtering and related methods;
Step four: extracting multiple features from the electroencephalogram signals with different methods;
Step five: classifying the electroencephalogram signals based on width learning.
Step one: suitability evaluation of the International Affective Picture System (IAPS)
The invention is based on visual induction experiments with emotion pictures. To increase credibility, the internationally used emotion picture library, the International Affective Picture System (IAPS), is adopted. Because human emotion is subtle, complex, and variable, all emotion research faces the problem of standardizing stimulus material. The IAPS is a quantitatively evaluated picture system designed to solve this problem, and a suitability evaluation is generally performed before using the picture library.
64 pictures are selected from the IAPS and divided into eight levels by valence (degree of pleasure); the higher the level, the higher the valence. Each level contains 8 pictures. A higher valence score represents a more positive emotion; the valence scale ranges from 1 to 9.
The subjects were tested in a sound-insulated, well-lit laboratory at 23-25 °C to ensure comfort. Throughout the experiment the environment was kept quiet, with as little electromagnetic interference as possible. Each subject sat in a comfortable chair with a backrest 1 m from the PC, with the whole body relaxed and free of muscle tension or movement, so that external interference with the subject was minimized. The experiment involved ten students, five male and five female, aged 22 to 26, with normal vision, good health, and right-handedness. Before the experiment, each subject was confirmed to be in good spirits and informed of the purpose and procedure of the experiment.
Experimental procedure
(1) Emotion preparation. A "ready" sign appears on the screen, prompting the subject to get ready; the preparation lasts 3 seconds.
(2) Picture induction. The subject views the displayed picture for 6 seconds and fully experiences the emotion the picture is meant to express.
(3) Emotion evaluation. After finishing each picture, the subject briefly evaluates and scores the suitability of the eight-level pictures selected for the experiment.
(4) Calming stage. After recalling the emotion of the picture induction stage, the subject has 10 seconds to recover his or her emotional state; the experiment then continues until all pictures have been played.
After the scoring process is finished, each picture has a set of individual valence scores; these are averaged to obtain the picture's mean valence score.
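As a concrete illustration of this scoring step, here is a minimal sketch of the mean-valence computation and of the positive/neutral/negative split it feeds (described under step two below); the ratings array and the numeric cut-offs are hypothetical, since the patent fixes neither.

```python
import numpy as np

# Hypothetical ratings: 10 subjects x 64 pictures, valence scored on the 1-9 scale.
ratings = np.random.uniform(1.0, 9.0, size=(10, 64))
mean_valence = ratings.mean(axis=0)  # mean valence score per picture

# Illustrative thresholds only (the patent names high/flat/low valence
# but does not give numeric boundaries).
category = np.where(mean_valence >= 6.0, "positive",
                    np.where(mean_valence <= 4.0, "negative", "neutral"))
```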
Step two: acquiring electroencephalogram signals generated by visual induction
To ensure that the stimulus pictures correctly induce the corresponding emotional EEG in the subject, a stimulus picture classification standard is set, with three categories: positive, neutral, and negative. First, the subjects' self-evaluations yield individual valence scores for each picture, which are averaged to obtain the picture's mean score. A classification standard is then set according to the relationship between emotion categories and valence: pictures with high valence are designated positive; pictures with flat (middle) valence are designated neutral; and pictures with low valence are designated negative. The picture categories are divided according to this standard, and pictures whose mean scores do not meet the standard are removed, ensuring the validity and credibility of the stimulus pictures.
Because EEG signals are very weak, the experiments are carried out in a relatively closed, quiet environment to reduce noise and increase the reliability of the acquired signals. Before the experiment begins, the subjects are told the points to note, for example that limb movement, emotional stress, and similar factors should be avoided as much as possible to prevent acquisition errors. After the subject puts on the EEG cap and the experimenter calibrates the EEG acquisition equipment, the procedure pauses for 30 seconds so that the subject can settle into the best physical and mental state, and the experiment then begins. Because frequently switching between different kinds of emotion pictures in a short time fails to induce the emotional EEG properly, acquisition is organized in sessions: each session contains 3 pictures of the same type, and a 20-second relaxation period is added before the stimulus pictures are replaced for the next session.
The specific acquisition steps are as follows:
(1) Emotion preparation. A "ready" sign appears on the screen, prompting the subject to get ready; the preparation lasts 3 seconds.
(2) Picture induction. The subject views the displayed picture for 6 seconds while the computer records the emotional EEG signals. During this period the subject avoids moving as much as possible, is fully immersed in the picture content, and avoids blinking, reducing interference in the EEG signals.
(3) Relaxation stage. The subject has 8 seconds to calm his or her emotions for the next trial.
The above steps are repeated until the experiment is complete; FIGS. 2-4 show the raw plots of the experimental results.
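The timing of this protocol can be made concrete with a short script. The sketch below follows the session structure described above (3 pictures of the same type per session; 3 s preparation, 6 s viewing, and 8 s relaxation per picture; 20 s rest before the stimulus type changes); the `show` routine and the picture file names are hypothetical placeholders for the actual stimulus software.

```python
import time

def show(message: str) -> None:
    # Placeholder for the actual stimulus-presentation call.
    print(message)

def run_session(pictures: list[str]) -> None:
    """One session: 3 pictures of the same emotion type."""
    for pic in pictures:
        show("ready")   # emotion preparation
        time.sleep(3)
        show(pic)       # picture induction; the EEG is recorded here
        time.sleep(6)
        show("relax")   # relaxation phase
        time.sleep(8)

# Hypothetical stimulus lists: one session (3 pictures) per emotion type.
sessions = [
    ["pos_1.jpg", "pos_2.jpg", "pos_3.jpg"],
    ["neu_1.jpg", "neu_2.jpg", "neu_3.jpg"],
    ["neg_1.jpg", "neg_2.jpg", "neg_3.jpg"],
]
for block in sessions:
    run_session(block)
    time.sleep(20)      # 20 s rest before the stimulus type changes
```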
Step three: preprocessing the data by band-pass filtering and related methods
The EEG signals obtained in step two are preprocessed to reduce artifact interference and improve the final classification recognition rate. The preprocessing includes reducing the sampling rate and band-pass filtering of different frequency bands. Because the EEG signals are affected by ambient noise and power-line interference during acquisition, the noise in the raw signal must be filtered out first. The generally accepted EEG frequency content is concentrated mainly below 50 Hz, so a band-pass filter is used to extract the signal between 0.5 Hz and 50 Hz. A Butterworth band-pass filter is used for the filtering.
The Butterworth filter is characterized by a frequency response that is maximally flat in the passband, without ripple, and falls gradually to zero in the stopband. On a Bode plot of log amplitude against angular frequency, starting from a certain boundary angular frequency, the amplitude decreases gradually with increasing angular frequency, tending to negative infinity.
The Butterworth low-pass filter can be expressed by the following squared-amplitude-versus-frequency equation:

|H(ω)|² = 1 / (1 + (ω/ω_c)^{2n})

where n is the order of the filter, ω_c is the cutoff frequency (the frequency at which the amplitude drops to -3 dB), ω_p is the passband edge frequency, and |H(ω_p)|² = 1/(1+ε²) is the value at the passband edge.
Since |H(ω)|² = H(s)H*(s) = H(s)H(-s) evaluated at the points s = jω of the complex plane, analytic continuation gives:

H(s)H(-s) = 1 / (1 + (-s²/ω_c²)^n)

The poles of this function are equidistantly distributed on a circle of radius ω_c:

s_k = ω_c · e^{jπ(2k+n+1)/(2n)},  k = 0, 1, ..., n-1

Therefore,

H(s) = ω_c^n / ∏_{k=0}^{n-1} (s - s_k)

The amplitude-frequency relationship of an nth-order Butterworth low-pass filter can be expressed as:

G²(ω) = |H(jω)|² = G₀² / (1 + (ω/ω_c)^{2n})

where G denotes the gain of the filter, H the transfer function, j the imaginary unit, n the filter order, ω the angular frequency of the signal in rad/s, and ω_c the cutoff frequency (at which the amplitude drops by 3 dB).
Setting the cutoff frequency ω_c = 1, the formula normalizes to:

|H(jω)|² = 1 / (1 + ω^{2n})
experimental analysis forehead and occipital lobe areas are most active in the emotional induction experiment process, and the important analysis is performed on the area F of the important lead area, because the number of leads is large, if the analysis complexity of each channel is large, the result of the processing analysis of the original signal represented by the area F3 at the frequency of 4-8HZ is shown in fig. 4-6.
Step four: extracting multiple features from the electroencephalogram signals with different methods
Current methods can build end-to-end classification models from the raw time-domain signal, but EEG signals are random, non-stationary, and nonlinear. In pattern recognition tasks it is therefore still appropriate to select suitable features as input according to the task requirements. EEG features mainly comprise time-domain, frequency-domain, and time-frequency-domain features. Time-domain features mainly refer to statistics such as the mean, variance, and correlation analysis; they are intuitive, easy to understand, and usable for waveform analysis and recognition. Frequency-domain features are mainly based on spectral estimation in the frequency domain, such as power spectral estimation and autoregressive coefficients, of which the most common is the power spectral density. Time-frequency-domain features analyze the signal jointly from the time and frequency domains; because EEG signals are non-stationary, time-frequency features can often extract key information more effectively, and common examples include wavelet coefficients and entropy measures from nonlinear dynamics.
Spectral density method: because EEG signals carry different meanings in different frequency bands, the most intuitive approach is to analyze them by frequency band. The computation of the power spectral density is based on the Fourier transform; common methods include the periodogram method and the Welch method.
Assume the random signal x(n) has N points and the Fourier transform of x(n) is X(e^{jω}); the power spectral density computed by the periodogram method is then:

P̂(e^{jω}) = (1/N) · |X(e^{jω})|²

The Welch method is a modified form of the periodogram method and is one of the most common and widely applied methods at present. It divides the data into M segments of L points each, allowing adjacent segments to overlap, and takes the mean of the M segment power spectral densities as the power spectral density of the original signal. First, the spectral estimate of each segment is calculated:

P̂_i(e^{jω}) = (1/(L·U)) · |Σ_{n=0}^{L-1} x_i(n) d(n) e^{-jωn}|²,  i = 1, ..., M

where d(n) is the window function; the Welch method can use a Hamming window, a Hanning window, or another non-rectangular window, which reduces part of the side-lobe leakage. U is a normalization factor given by:

U = (1/L) · Σ_{n=0}^{L-1} d²(n)

Finally, the M segment power spectra are obtained, and the power spectrum estimate of the original signal is their mean:

P̂(e^{jω}) = (1/M) · Σ_{i=1}^{M} P̂_i(e^{jω})

The preprocessed EEG signals are analyzed by this power spectral density method, and their frequency-domain features are extracted.
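A minimal sketch of this feature extraction with SciPy's Welch estimator follows; the segment length, 50% overlap, and the 4-8 Hz band (matching the F3 analysis of FIGS. 5-7) are illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch

FS = 128.0  # assumed sampling rate in Hz

def band_power(eeg: np.ndarray, f_lo: float = 4.0, f_hi: float = 8.0) -> np.ndarray:
    """Per-channel band power from a Welch PSD (eeg: channels x samples)."""
    # Hamming window, L = 256-point segments, 50% overlap; averaging the
    # segment periodograms gives one PSD per channel.
    freqs, psd = welch(eeg, fs=FS, window="hamming", nperseg=256, noverlap=128)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    # Integrate the PSD over the band to get one feature per channel.
    return np.trapz(psd[:, mask], freqs[mask], axis=-1)

# Example: 4-8 Hz power for 10 s of simulated 14-channel EEG.
features = band_power(np.random.randn(14, int(10 * FS)))
```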
Step five: electroencephalogram signal classification based on width learning
Width learning system: traditional neural networks such as the BP network suffer from long back-propagation training times and a tendency to fall into local optima, so the classification performance of such networks is often strongly affected by the initialization region.
The width learning system is an incremental learning algorithm based on the random vector functional-link (RVFL) flat network structure; its structure is shown in FIG. 8 and comprises an output layer (1), a feature layer (2), and an enhancement layer (3). Compared with the traditional RVFL structure, the input weight matrix of the width learning system is not randomly generated; it is encoded by sparse autoencoding, and the optimal weights are selected during decoding. After the input samples of the width learning method undergo one linear transformation, the feature expressions are mapped onto the feature plane to form feature nodes, and the feature nodes then undergo the nonlinear transformation of an activation function to generate enhancement nodes. The feature nodes and enhancement nodes are concatenated as the actual input signal of the system and linearly output through a connection matrix. As in the RVFL, given that the classic BP algorithm has high time cost and easily falls into local optima, the width learning method solves for the output connection matrix directly by the ridge-regression generalized inverse.
Let X ∈ R^{N×M} be the given input data, where N is the number of input samples and M the feature dimension of each sample vector. Assuming b feature nodes, the features on the feature plane obtained from the width structure are:

Z_{N×b} = X_{N×M} · W_e^{M×b}

where W_e is the optimal input weight matrix obtained by sparse autoencoding.
If d enhancement nodes are generated, the enhancement layer features can be expressed as:

H_{N×d} = φ(Z_{N×b} · W_h^{b×d} + β_h^{N×d})

where W_h and β_h denote a random matrix and a bias, respectively, and φ(·) is a selectable nonlinear activation function. Taking the combined matrix formed by concatenating the feature nodes and enhancement nodes as the actual input of the system, and assuming the output matrix is Y ∈ R^{N×Q}, the width model is obtained from:

Y_{N×Q} = A_{N×(b+d)} · W_{(b+d)×Q} = [Z_{N×b} | H_{N×d}] · W_{(b+d)×Q}

where A is the actual input matrix of the BLS and W the output connection weight matrix; W is computed through the ridge-regression approximation of the pseudoinverse A⁺:

W = A⁺ · Y,  A⁺ = lim_{λ→0} (λI + AᵀA)^{-1} Aᵀ

The feature extraction result of step four is used as the input of the width learning system, and the output of the width learning system is obtained through the above algorithm.
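To make the width-learning computation concrete, here is a minimal NumPy sketch of the training and prediction steps above. For simplicity it draws W_e randomly instead of learning it by sparse autoencoding, and the node counts, tanh activation, and ridge parameter λ are illustrative assumptions.

```python
import numpy as np

def bls_train(X, Y, b=60, d=100, lam=1e-3, seed=0):
    """Width/broad learning: feature nodes + enhancement nodes + ridge output."""
    rng = np.random.default_rng(seed)
    M = X.shape[1]
    We = rng.standard_normal((M, b))    # stand-in for the sparse-autoencoder W_e
    Z = X @ We                          # feature nodes: Z = X · W_e
    Wh = rng.standard_normal((b, d))    # random matrix W_h
    beta = rng.standard_normal((1, d))  # bias beta_h
    H = np.tanh(Z @ Wh + beta)          # enhancement nodes: H = phi(Z·W_h + beta_h)
    A = np.hstack([Z, H])               # actual input: A = [Z | H]
    # Ridge-regression solution of Y = A·W:  W = (lam·I + AᵀA)⁻¹ Aᵀ Y
    W = np.linalg.solve(lam * np.eye(b + d) + A.T @ A, A.T @ Y)
    return We, Wh, beta, W

def bls_predict(X, We, Wh, beta, W):
    Z = X @ We
    A = np.hstack([Z, np.tanh(Z @ Wh + beta)])
    return A @ W                        # class scores; argmax gives the label

# Example: 200 feature vectors (e.g. band powers), 3 one-hot emotion classes.
X = np.random.randn(200, 14)
Y = np.eye(3)[np.random.randint(0, 3, 200)]
pred = bls_predict(X, *bls_train(X, Y)).argmax(axis=1)
```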

Claims (7)

1. A visual evoked potential emotion recognition method based on width learning, characterized by comprising the following specific steps:
Step one: performing a suitability evaluation of the International Affective Picture System (IAPS);
Step two: acquiring electroencephalogram signals generated by visual induction;
Step three: preprocessing the data by a band-pass filtering method;
Step four: extracting multiple features from the electroencephalogram signals with different methods;
Step five: classifying the electroencephalogram signals based on width learning.
2. The width-learning-based visual evoked potential emotion recognition method according to claim 1, wherein step one comprises the following steps:
2.1 Select 64 pictures from the IAPS and divide them into eight levels by valence (degree of pleasure); the higher the level, the higher the valence; each level contains 8 pictures; a higher valence score represents a more positive emotion, and the valence scale ranges from 1 to 9;
2.2 Minimize external interference with the subject; each subject sits 1 m in front of the PC; the subjects comprise five males and five females aged 22 to 26, all informed of the purpose and procedure of the experiment; the experimental flow is as follows:
(1) Emotion preparation: a "ready" sign appears on the screen, prompting the subject to get ready; the preparation lasts 3 seconds;
(2) Picture induction: the subject views the displayed picture for 6 seconds and fully experiences the emotion the picture is meant to express;
(3) Emotion evaluation: after finishing each picture, the subject briefly evaluates and scores the suitability of the eight-level pictures selected for the experiment;
(4) Calming stage: after recalling the emotion of the induction stage, the subject has 10 seconds to calm down, and the procedure continues until all pictures have been shown;
2.3 After scoring is finished, the individual valence scores for each picture are averaged to obtain the picture's mean valence score.
3. The width-learning-based visual evoked potential emotion recognition method according to claim 1, wherein the specific process of step two is as follows: the subject puts on the EEG cap; after the experimenter has calibrated the EEG acquisition equipment, there is a 30-second pause so that the subject can settle into the best physical and mental state before the experiment begins; during EEG acquisition, trials are organized in sessions, each session containing 3 pictures of the same type, with a 20-second relaxation period added before the stimulus pictures are changed for the next session; the specific steps of the acquisition experiment for EEG signals produced by visual induction are:
3.1 Emotion preparation: a "ready" sign appears on the screen, prompting the subject to get ready for 3 seconds;
3.2 Picture induction: the subject views the displayed picture for 6 seconds while the computer records the emotional EEG signals; during this period the subject avoids moving as much as possible, is fully immersed in the picture content, and avoids blinking, reducing interference in the EEG signals;
3.3 Relaxation phase: the subject has 8 seconds to recover his or her emotional state for the next trial;
3.4 Replace the pictures and repeat the above steps until the experiment is finished.
4. The width-learning-based visual evoked potential emotion recognition method according to claim 1, wherein in step three the electroencephalogram signals obtained in step two are preprocessed; the preprocessing includes reducing the sampling rate and band-pass filtering of different frequency bands; first, a band-pass filter extracts the signal between 0.5 Hz and 50 Hz, and a Butterworth band-pass filter is used for the filtering;
the Butterworth low-pass filter can be expressed by the following squared-amplitude-versus-frequency equation:

|H(ω)|² = 1 / (1 + (ω/ω_c)^{2n})

where n is the order of the filter, ω_c is the cutoff frequency (the frequency at which the amplitude drops to -3 dB), ω_p is the passband edge frequency, and |H(ω_p)|² = 1/(1+ε²) is the value at the passband edge;
since |H(ω)|² = H(s)H*(s) = H(s)H(-s) evaluated at the points s = jω of the complex plane, analytic continuation gives:

H(s)H(-s) = 1 / (1 + (-s²/ω_c²)^n)

the poles of this function are equidistantly distributed on a circle of radius ω_c:

s_k = ω_c · e^{jπ(2k+n+1)/(2n)},  k = 0, 1, ..., n-1

therefore,

H(s) = ω_c^n / ∏_{k=0}^{n-1} (s - s_k)

the amplitude-frequency relationship of an nth-order Butterworth low-pass filter can be expressed as:

G²(ω) = |H(jω)|² = G₀² / (1 + (ω/ω_c)^{2n})

where G denotes the gain of the filter, H the transfer function, j the imaginary unit, n the filter order, ω the angular frequency of the signal in rad/s, and ω_c the cutoff frequency (at which the amplitude drops by 3 dB);
setting the cutoff frequency ω_c = 1, the formula normalizes to:

|H(jω)|² = 1 / (1 + ω^{2n}).
5. The width-learning-based visual evoked potential emotion recognition method according to claim 1, wherein the specific process of step four is as follows:
the spectral density method is used for multi-feature extraction from the electroencephalogram signal; in the periodogram method, assume the random signal x(n) has N points and the Fourier transform of x(n) is X(e^{jω}); the power spectral density computed by the periodogram method is then:

P̂(e^{jω}) = (1/N) · |X(e^{jω})|²

the Welch method is used for multi-feature extraction from the electroencephalogram signals; it divides the data into M segments of L points each, allowing adjacent segments to overlap, and takes the mean of the M segment power spectral densities as the power spectral density of the original signal; first, the spectral estimate of each segment is calculated:

P̂_i(e^{jω}) = (1/(L·U)) · |Σ_{n=0}^{L-1} x_i(n) d(n) e^{-jωn}|²,  i = 1, ..., M

where d(n) is the window function; the Welch method can use a Hamming window, a Hanning window, or another non-rectangular window, which reduces part of the side-lobe leakage; U is a normalization factor given by:

U = (1/L) · Σ_{n=0}^{L-1} d²(n)

finally, the power spectrum estimate of the original signal is the mean of the M segment power spectra:

P̂(e^{jω}) = (1/M) · Σ_{i=1}^{M} P̂_i(e^{jω})

the preprocessed electroencephalogram signals are analyzed by this power spectral density method, and their frequency-domain features are extracted.
6. The width-learning-based visual evoked potential emotion recognition method according to claim 1, wherein the specific process of step five is as follows:
after one linear transformation, the feature expressions are mapped onto the feature plane to form feature nodes; the feature nodes then undergo the nonlinear transformation of an activation function to generate enhancement nodes; the feature nodes and enhancement nodes are concatenated as the actual input signal of the system and linearly output through a connection matrix, and the width learning method solves for the output connection matrix directly by the ridge-regression generalized inverse;
let X ∈ R^{N×M} be the given input data, where N is the number of input samples and M the feature dimension of each sample vector; assuming b feature nodes, the features on the feature plane obtained from the width structure are:

Z_{N×b} = X_{N×M} · W_e^{M×b}

where W_e is the optimal input weight matrix obtained by sparse autoencoding;
if d enhancement nodes are generated, the enhancement layer features can be expressed as:

H_{N×d} = φ(Z_{N×b} · W_h^{b×d} + β_h^{N×d})

where W_h and β_h denote a random matrix and a bias, respectively, and φ(·) is a selectable nonlinear activation function; taking the combined matrix formed by concatenating the feature nodes and enhancement nodes as the actual input of the system, and assuming the output matrix is Y ∈ R^{N×Q}, the width model is obtained from:

Y_{N×Q} = A_{N×(b+d)} · W_{(b+d)×Q} = [Z_{N×b} | H_{N×d}] · W_{(b+d)×Q}

where A is the actual input matrix of the BLS and W the output connection weight matrix; W is computed through the ridge-regression approximation of the pseudoinverse A⁺:

W = A⁺ · Y,  A⁺ = lim_{λ→0} (λI + AᵀA)^{-1} Aᵀ

the feature extraction result of step four is used as the input of the width learning system, and the output of the width learning system is obtained through the above algorithm.
7. The width-learning-based visual evoked potential emotion recognition method according to claim 3, wherein the electroencephalogram cap is an Emotiv Epoc wireless portable electroencephalograph.
CN201911411761.5A 2019-12-31 2019-12-31 Visual evoked potential emotion recognition method based on width learning Pending CN110946576A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911411761.5A CN110946576A (en) 2019-12-31 2019-12-31 Visual evoked potential emotion recognition method based on width learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911411761.5A CN110946576A (en) 2019-12-31 2019-12-31 Visual evoked potential emotion recognition method based on width learning

Publications (1)

Publication Number Publication Date
CN110946576A true CN110946576A (en) 2020-04-03

Family

ID=69985292

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911411761.5A Pending CN110946576A (en) 2019-12-31 2019-12-31 Visual evoked potential emotion recognition method based on width learning

Country Status (1)

Country Link
CN (1) CN110946576A (en)



Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101706842A (en) * 2009-08-25 2010-05-12 浙江大学 Method for creating Chinese emotion picture system
CN102347742A (en) * 2011-06-14 2012-02-08 深圳市共进电子股份有限公司 1kFtADSL (1kFt Asymmetrical Digital Subscriber Loop) signal attenuation circuit
CN102715911A (en) * 2012-06-15 2012-10-10 天津大学 Brain electric features based emotional state recognition method
KR20190128978A (en) * 2018-05-09 2019-11-19 한국과학기술원 Method for estimating human emotions using deep psychological affect network and system therefor
CN109685071A (en) * 2018-11-30 2019-04-26 杭州电子科技大学 Brain electricity classification method based on the study of common space pattern feature width
CN109871831A (en) * 2019-03-18 2019-06-11 太原理工大学 A kind of emotion identification method and system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
杨福生 (Yang Fusheng): "Random Signal Analysis" (《随机信号分析》), Tsinghua University Press, 31 July 1990 *
贾晨 (Jia Chen) et al.: "Multimodal information fusion based on the width learning method" (《基于宽度学习方法的多模态信息融合》), CAAI Transactions on Intelligent Systems (《智能系统学报》) *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112656427B (en) * 2020-11-26 2023-03-24 山西大学 Electroencephalogram emotion recognition method based on dimension model
CN112656427A (en) * 2020-11-26 2021-04-16 山西大学 Electroencephalogram emotion recognition method based on dimension model
CN112546391A (en) * 2020-12-04 2021-03-26 中国科学院深圳先进技术研究院 Method for determining emotional processing tendency and related product
CN112596273A (en) * 2020-12-30 2021-04-02 华南理工大学 Intelligent color-changing glasses with electroencephalogram emotion assessment and adjustment functions and control method
CN112754502A (en) * 2021-01-12 2021-05-07 曲阜师范大学 Automatic music switching method based on electroencephalogram signals
CN112836593A (en) * 2021-01-15 2021-05-25 西北大学 Emotion recognition method and system fusing prior and automatic electroencephalogram characteristics
CN113011493A (en) * 2021-03-18 2021-06-22 华南理工大学 Electroencephalogram emotion classification method, device, medium and equipment based on multi-kernel width learning
CN113576478A (en) * 2021-04-23 2021-11-02 西安交通大学 Electroencephalogram signal-based image emotion classification method, system and device
CN113359541A (en) * 2021-05-19 2021-09-07 杭州师范大学 Multi-sensory-mode continuous attention monitoring system and method
CN113591653A (en) * 2021-07-22 2021-11-02 中南大学 Incremental zinc flotation working condition discrimination method based on width learning system
CN114403877A (en) * 2022-01-21 2022-04-29 中山大学 Multi-physiological-signal emotion quantitative evaluation method based on two-dimensional continuous model
CN116369949A (en) * 2023-06-06 2023-07-04 南昌航空大学 Electroencephalogram signal grading emotion recognition method, electroencephalogram signal grading emotion recognition system, electronic equipment and medium
CN116369949B (en) * 2023-06-06 2023-09-15 南昌航空大学 Electroencephalogram signal grading emotion recognition method, electroencephalogram signal grading emotion recognition system, electronic equipment and medium
CN116894485A (en) * 2023-07-11 2023-10-17 广东工业大学 Incremental width learning system and method based on privacy protection

Similar Documents

Publication Publication Date Title
CN110946576A (en) Visual evoked potential emotion recognition method based on width learning
CN110765920B (en) Motor imagery classification method based on convolutional neural network
CN107024987B (en) Real-time human brain attention testing and training system based on EEG
CN109224242B (en) Psychological relaxation system and method based on VR interaction
CN110969108B (en) Limb action recognition method based on autonomic motor imagery electroencephalogram
Ren et al. Off-line and on-line stress detection through processing of the pupil diameter signal
Jerritta et al. Electrocardiogram-based emotion recognition system using empirical mode decomposition and discrete Fourier transform.
CN109784023B (en) Steady-state vision-evoked electroencephalogram identity recognition method and system based on deep learning
JP2019528104A (en) In-ear sensing system and method for monitoring biological signals
CN103584872A (en) Psychological stress assessment method based on multi-physiological-parameter integration
Secerbegovic et al. Mental workload vs. stress differentiation using single-channel EEG
Tong et al. Emotion recognition based on photoplethysmogram and electroencephalogram
CN114781465B (en) rPPG-based non-contact fatigue detection system and method
CN109871831B (en) Emotion recognition method and system
Baghdadi et al. Dasps: a database for anxious states based on a psychological stimulation
CN113143208B (en) Pain sensitivity assessment system and method based on multidimensional measurement
CN111000556A (en) Emotion recognition method based on deep fuzzy forest
CN115640827B (en) Intelligent closed-loop feedback network method and system for processing electrical stimulation data
CN110811648A (en) Depression tendency evaluation system based on residual convolutional neural network
CN108143412B (en) Control method, device and system for electroencephalogram emotion analysis of children
Pavlov et al. Recognition of electroencephalographic patterns related to human movements or mental intentions with multiresolution analysis
CN113723557A (en) Depression electroencephalogram classification system based on multiband time-space convolution network
CN117883082A (en) Abnormal emotion recognition method, system, equipment and medium
CN113576498A (en) Visual and auditory aesthetic evaluation method and system based on electroencephalogram signals
Sindhu et al. Emotion driven mood enhancing multimedia recommendation system using physiological signal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200403