CN113591598A - Brain-computer interface cross-load linear discrimination method based on correlation analysis - Google Patents

Brain-computer interface cross-load linear discrimination method based on correlation analysis Download PDF

Info

Publication number
CN113591598A
CN113591598A (Application CN202110769354.2A)
Authority
CN
China
Prior art keywords
sample
load
electroencephalogram
data
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110769354.2A
Other languages
Chinese (zh)
Other versions
CN113591598B (en
Inventor
李梦凡
左皓鑫
伍煜伟
廖文喆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hebei University of Technology
Original Assignee
Hebei University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hebei University of Technology filed Critical Hebei University of Technology
Priority to CN202110769354.2A priority Critical patent/CN113591598B/en
Publication of CN113591598A publication Critical patent/CN113591598A/en
Application granted granted Critical
Publication of CN113591598B publication Critical patent/CN113591598B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00: Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/08: Feature extraction
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/24: Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316: Modalities, i.e. specific diagnostic methods
    • A61B5/369: Electroencephalography [EEG]
    • A61B5/372: Analysis of electroencephalograms
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/24: Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316: Modalities, i.e. specific diagnostic methods
    • A61B5/369: Electroencephalography [EEG]
    • A61B5/377: Electroencephalography [EEG] using evoked responses
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/24: Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316: Modalities, i.e. specific diagnostic methods
    • A61B5/369: Electroencephalography [EEG]
    • A61B5/377: Electroencephalography [EEG] using evoked responses
    • A61B5/378: Visual stimuli
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235: Details of waveform analysis
    • A61B5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7271: Specific aspects of physiological measurement analysis
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/21: Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214: Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/22: Matching criteria, e.g. proximity measures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/24: Classification techniques
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00: Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/12: Classification; Matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Psychiatry (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Evolutionary Computation (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Psychology (AREA)
  • Physiology (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention provides a brain-computer interface cross-load linear discrimination method based on correlation analysis, comprising the following steps: acquiring electroencephalogram (EEG) data under different mental-load states; training several classifiers on the acquired EEG data using the Fisher criterion; feeding the EEG sample to be classified into these classifiers to obtain several linear discrimination values; computing the average Pearson correlation coefficient between the sample to be classified and the EEG data of each load state, and converting these coefficients into weight coefficients; and forming a weighted sum that is used, together with the Fisher criterion and a voter, to decide the class of the EEG sample. By designing Fisher-criterion linear discriminant classifiers for different mental-load states and converting Pearson correlation coefficients into weights, the method obtains weight coefficients relating the sample under test to the training samples, improves the final discriminant, and achieves adaptive classification of an individual's cross-load EEG data.

Description

Brain-computer interface cross-load linear discrimination method based on correlation analysis
Technical Field
The invention belongs to the field of neural engineering in biomedical engineering, and particularly relates to a brain-computer interface cross-load linear discrimination method based on correlation analysis.
Background
Mental load is an important factor influencing work performance: performance degrades as mental load increases, and in safety-critical applications such as brain-computer interfaces an excessive mental load can have serious consequences. Regarding classifier fusion, the general pattern-classification pipeline consists of data acquisition, preprocessing, feature extraction, classifier design and training, and classification decision. Fusion can take place at the data level, the feature level, or the final decision level; as shown in Fig. 1, raw EEG data can undergo classifier-level fusion during preprocessing, feature extraction, and final classification. Data-level fusion requires collecting different physiological signals, so its applicability in a brain-computer interface is limited; feature-level fusion requires multiple feature types, whereas brain-computer interfaces mostly rely on a single feature such as event-related potentials or steady-state visual evoked potentials. With so few feature types available, most research is therefore carried out at the level of the fusion algorithm.
A fusion classification algorithm operating across training and test sets plays a key role in building an adaptive brain-computer interface system and directly affects its performance; the core problem is to maintain recognition accuracy when the user's brain is in different mental-load states and to recognize the control command the user intends to output under load. Yang (Research on spontaneous EEG brain-computer interface technology and EEG signal recognition methods [D]. Shanghai Jiao Tong University, 2007) applied genetic-algorithm-based adaptive feature selection for feature fusion in an adaptive brain-computer interface, and Xu et al. (Virtual reality emotion classification system based on flexible fabric EEG electrodes [D]. South China University, 2020) applied a deep transfer-learning algorithm to EEG recorded under different emotions, reaching an average classification accuracy of 69.1%. Adaptive recognition of EEG data under different mental loads with the traditional linear discriminant classification method, however, remains unexplored.
The traditional linear classifier based on the Fisher criterion is widely used because it classifies EEG signals well and is easy to train and apply. However, under high mental load the user's event-related potential amplitude decreases and the test data diverge from the training data, so classification performance across an individual's load states degrades. Adaptive linear discriminant methods that modify the final linear discriminant according to data similarity have not yet been used, and improvements to the classification accuracy of linear discriminant methods at the decision-fusion level are rare.
Disclosure of Invention
In view of the above, the present invention aims to provide a brain-computer interface cross-load linear discrimination method based on correlation analysis, so as to address the degraded classification of an individual's cross-load states caused by the reduced event-related potential amplitude under high mental load and by the difference between training data and test data.
In order to achieve the purpose, the technical scheme of the invention is realized as follows:
a brain-computer interface cross-load linear discrimination method based on correlation analysis comprises the following steps:
s1, acquiring electroencephalogram data under mental loads in different states;
s2, training a plurality of groups of classifiers by using Fisher criterion on the acquired electroencephalogram data;
s3, inputting the electroencephalogram samples to be classified into a plurality of groups of classifiers to obtain a plurality of groups of linear discrimination values;
s4, calculating average Pearson correlation coefficients of the electroencephalogram samples to be classified and the electroencephalogram data under different states respectively, and converting the Pearson correlation coefficients into weight coefficients through a weight conversion formula;
and S5, weighting and summing the weight coefficient and the linear discrimination value, wherein the weighted summation value is used for judging the type of the electroencephalogram sample to be classified through a Fisher criterion and a voter.
Further, the Fisher criterion in step S2 is as follows:
f_f(Y) = r_L·F_L(Y) + r_M·F_M(Y) + r_H·F_H(Y), where F_L(Y) = ω_L^T·Y + ω_L0, F_M(Y) = ω_M^T·Y + ω_M0, F_H(Y) = ω_H^T·Y + ω_H0

Y is judged as Target if f_f(Y) > 0, and as Nontarget otherwise,

wherein Y denotes the sample under test, Target denotes a sample judged to be a target, Nontarget denotes a sample judged to be a non-target, and f_f(Y) denotes the final discrimination value; ω_L is the projection direction trained from the low-load data X_l, ω_M is the projection direction trained from the medium-load data X_m, and ω_H is the projection direction trained from the high-load data X_h; ω_L0 is the classifier threshold trained from the low-load data, ω_M0 is the classifier threshold trained from the medium-load data, and ω_H0 is the classifier threshold trained from the high-load data; r_L is the Pearson correlation weight coefficient computed between the input sample under test and the low-load training-set samples, r_M is the Pearson correlation weight coefficient computed between the input sample under test and the medium-load training-set samples, and r_H is the Pearson correlation weight coefficient computed between the input sample under test and the high-load training-set samples; F_L(Y)·r_L, F_M(Y)·r_M and F_H(Y)·r_H are the corresponding weighted discrimination values of the sample under test.
Further, the average pearson correlation coefficient calculated in step S4 is calculated by the following formula:
ρ_X,Y = E[(X - μ_X)(Y - μ_Y)] / (σ_X·σ_Y)

where ρ_X,Y is the correlation coefficient, σ_X is the standard deviation of X, σ_Y is the standard deviation of Y, X denotes the EEG data of a given mental-load state, Y denotes the sample under test, μ_X denotes the mean of X, and μ_Y denotes the mean of Y.
Further, the weight conversion formula in step S4 is as follows:
r_Y = ρ_Y / (ρ_l + ρ_m + ρ_h)

wherein r_Y denotes the converted weight of the corresponding sample Y, ρ_Y denotes the average Pearson coefficient between the corresponding sample Y and the training samples, ρ_l is the average Pearson correlation coefficient with the subject's EEG data collected in the visual evoked event-related potential task under low mental load, ρ_m is the average Pearson correlation coefficient with the subject's EEG data collected in the visual evoked event-related potential task under medium mental load, and ρ_h is the average Pearson correlation coefficient with the subject's EEG data collected in the visual evoked event-related potential task under high mental load.
Further, in step S5 the weighted sum is used to decide the class of the EEG sample to be classified through the linear discrimination formula and a voter, the specific process being as follows: first, all samples within each repetition of the BCI are classified; let there be n stimuli per repetition and m repetitions per trial;

class_i^j = 1 if w_i^j = max(w_1^j, ..., w_n^j), and class_i^j = 0 otherwise,

wherein class_i^j is the class of the i-th stimulus in the j-th repetition and w_i^j is the discrimination value obtained by the classifier for that stimulus sample; only one stimulus per repetition may be judged as Target, so only the stimulus with the highest discrimination value is assigned class 1, i.e. Target; secondly, the Target vote count total_value_i and the discrimination-value sum total_w_i of each stimulus over the m repetitions are accumulated:

total_value_i = Σ_{j=1..m} class_i^j

total_w_i = Σ_{j=1..m} w_i^j

and the class category_i of the i-th stimulus of a single trial is decided: if the Target vote count of this stimulus is the highest among the n stimuli, the stimulus is judged as Target; if other stimuli tie with the i-th stimulus for the highest vote count, their discrimination-value sums are compared, and the i-th stimulus is judged as Target if its sum is higher than the others'; otherwise it is judged as Nontarget.
Compared with the prior art, the brain-computer interface cross-load linear discrimination method based on correlation analysis has the following beneficial effects:
(1) the invention relates to a brain-computer interface cross-load linear discrimination method based on correlation analysis
(2) The invention relates to a brain-computer interface cross-load linear discrimination method based on correlation analysis
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate an embodiment of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a schematic diagram of a conventional fusion classifier algorithm hierarchy according to an embodiment of the present invention;
fig. 2 is an algorithm flowchart of a brain-computer interface cross-load linear discrimination method based on correlation analysis according to an embodiment of the present invention.
FIG. 3 is a Pearson correlation determination table according to an embodiment of the present invention;
FIG. 4 is a comparison graph of the classification accuracy of the method and the single-state data classifier for the sample to be tested according to the embodiment of the present invention;
FIG. 5 is a graph comparing the information transmission rate results of the single-state data classifier on the samples to be tested according to the embodiment of the present invention.
Detailed Description
It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict.
The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
A brain-computer interface cross-load linear discrimination method based on correlation analysis comprises the following steps:
s1, acquiring electroencephalogram data under mental loads in different states;
s2, training a plurality of groups of classifiers by using Fisher criterion on the acquired electroencephalogram data;
s3, inputting the electroencephalogram samples to be classified into a plurality of groups of classifiers to obtain a plurality of groups of linear discrimination values;
s4, calculating average Pearson correlation coefficients of the electroencephalogram samples to be classified and the electroencephalogram data under different states respectively, and converting the Pearson correlation coefficients into weight coefficients through a weight conversion formula;
and S5, weighting and summing the weight coefficient and the linear discrimination value, wherein the weighted summation value is used for judging the type of the electroencephalogram sample to be classified through a Fisher criterion and a voter.
Further, the Fisher criterion in step S2 is as follows:
f_f(Y) = r_L·F_L(Y) + r_M·F_M(Y) + r_H·F_H(Y), where F_L(Y) = ω_L^T·Y + ω_L0, F_M(Y) = ω_M^T·Y + ω_M0, F_H(Y) = ω_H^T·Y + ω_H0

Y is judged as Target if f_f(Y) > 0, and as Nontarget otherwise,

wherein Y denotes the sample under test, Target denotes a sample judged to be a target, Nontarget denotes a sample judged to be a non-target, and f_f(Y) denotes the final discrimination value; ω_L is the projection direction trained from the low-load data X_l, ω_M is the projection direction trained from the medium-load data X_m, and ω_H is the projection direction trained from the high-load data X_h; ω_L0 is the classifier threshold trained from the low-load data, ω_M0 is the classifier threshold trained from the medium-load data, and ω_H0 is the classifier threshold trained from the high-load data; r_L is the Pearson correlation weight coefficient computed between the input sample under test and the low-load training-set samples, r_M is the Pearson correlation weight coefficient computed between the input sample under test and the medium-load training-set samples, and r_H is the Pearson correlation weight coefficient computed between the input sample under test and the high-load training-set samples; F_L(Y)·r_L, F_M(Y)·r_M and F_H(Y)·r_H are the corresponding weighted discrimination values of the sample under test.
Further, the average pearson correlation coefficient calculated in step S4 is calculated by the following formula:
ρ_X,Y = E[(X - μ_X)(Y - μ_Y)] / (σ_X·σ_Y)

where ρ_X,Y is the correlation coefficient, σ_X is the standard deviation of X, σ_Y is the standard deviation of Y, X denotes the EEG data of a given mental-load state, Y denotes the sample under test, μ_X denotes the mean of X, and μ_Y denotes the mean of Y.
Further, the weight conversion formula in step S4 is as follows:
r_Y = ρ_Y / (ρ_l + ρ_m + ρ_h)

wherein r_Y denotes the converted weight of the corresponding sample Y, ρ_Y denotes the average Pearson coefficient between the corresponding sample Y and the training samples, ρ_l is the average Pearson correlation coefficient with the subject's EEG data collected in the visual evoked event-related potential task under low mental load, ρ_m is the average Pearson correlation coefficient with the subject's EEG data collected in the visual evoked event-related potential task under medium mental load, and ρ_h is the average Pearson correlation coefficient with the subject's EEG data collected in the visual evoked event-related potential task under high mental load.
In step S5, the weighted sum is used to decide the class of the EEG sample to be classified through the linear discrimination formula and a voter, the specific process being as follows: first, all samples within each repetition of the BCI are classified; let there be n stimuli per repetition and m repetitions per trial;

class_i^j = 1 if w_i^j = max(w_1^j, ..., w_n^j), and class_i^j = 0 otherwise,

wherein class_i^j is the class of the i-th stimulus in the j-th repetition and w_i^j is the discrimination value obtained by the classifier for that stimulus sample; only one stimulus per repetition may be judged as Target, so only the stimulus with the highest discrimination value is assigned class 1, i.e. Target; secondly, the Target vote count total_value_i and the discrimination-value sum total_w_i of each stimulus over the m repetitions are accumulated:

total_value_i = Σ_{j=1..m} class_i^j

total_w_i = Σ_{j=1..m} w_i^j

and the class category_i of the i-th stimulus of a single trial is decided: if the Target vote count of this stimulus is the highest among the n stimuli, the stimulus is judged as Target; if other stimuli tie with the i-th stimulus for the highest vote count, their discrimination-value sums are compared, and the i-th stimulus is judged as Target if its sum is higher than the others'; otherwise it is judged as Nontarget.
After the Pearson correlation coefficients between the test samples and the training samples have been computed, each test sample's correlation coefficient is calculated against all training-set samples; the average of these coefficients is then converted into a weight coefficient, which serves as the weight in the final discriminant. The algorithm flow is shown in Fig. 2. First, the classifier parameters are trained with three groups of different data; then, when a test sample is input, its Pearson correlation coefficients with the training samples are computed, the corresponding averages are converted into weight coefficients, and the final discriminant is formed by weighted summation of the values output by each state classifier for the test sample. The remaining steps are identical to the linear discriminant method based on the Fisher criterion.
Linear discriminant classifier based on the Fisher criterion: the principle is to obtain a model of two classes from the training data, i.e. a projection plane or axis for the two classes, projecting high-dimensional data onto a low-dimensional space to reduce its complexity; after projection, samples of the same class should cluster together as tightly as possible while samples of different classes are separated as far as possible, giving the best classification.
The general form of the linear discriminant method based on the Fisher criterion can be expressed as equation 2:
f(x) = W^T·X + w_0   (2)
after the improvement:
Figure BDA0003152167790000081
Figure BDA0003152167790000082
wherein Y represents a sample to be detected, Target represents a judged Target sample, and Nontarget represents a judged non-Target sample.
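As an illustration only, the following Python sketch shows how one per-load Fisher linear discriminant classifier of the kind described above could be trained; the function name, the regularization term and the use of NumPy are assumptions, not part of the patent.

```python
import numpy as np

def train_fisher_lda(X_target, X_nontarget, reg=1e-6):
    """Train one Fisher linear discriminant (projection direction w and bias w0).

    X_target    : (n_target, n_features) feature vectors of target ERP epochs
    X_nontarget : (n_nontarget, n_features) feature vectors of non-target epochs
    reg         : small ridge term added to the within-class scatter (assumption)
    """
    mu_t = X_target.mean(axis=0)
    mu_n = X_nontarget.mean(axis=0)

    # Within-class scatter matrix S_w (sum of the two class scatter matrices).
    S_w = np.cov(X_target, rowvar=False) * (len(X_target) - 1) \
        + np.cov(X_nontarget, rowvar=False) * (len(X_nontarget) - 1)
    S_w += reg * np.eye(S_w.shape[0])

    # Fisher projection direction and a bias placing the boundary midway
    # between the projected class means, so w @ x + w0 > 0 suggests Target.
    w = np.linalg.solve(S_w, mu_t - mu_n)
    w0 = -0.5 * w @ (mu_t + mu_n)
    return w, w0
```

Under this sketch, each of the three load states (low, medium and high) would be trained on its own target and non-target epochs, giving the parameter pairs (ω_L, ω_L0), (ω_M, ω_M0) and (ω_H, ω_H0) that appear in equation (3).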
A weight is assigned to each of the three state-specific classifiers; the correlation coefficient is introduced because the relation between the test sample and the training data of each classifier affects the final discrimination value.
There are many applications of the pearson correlation coefficient in determining the similarity between two sets of samples. The correlation coefficient reflects the linear relationship between the variables and the direction of the correlation.
The Pearson correlation coefficient between two variables is defined as their covariance divided by the product of their standard deviations, as shown in equation (4):

ρ_X,Y = cov(X, Y) / (σ_X·σ_Y) = E[(X - μ_X)(Y - μ_Y)] / (σ_X·σ_Y)   (4)

The above equation defines the correlation coefficient of two variable samples X and Y, where ρ_X,Y takes values in [-1, 1]. The larger the absolute value of the correlation coefficient, the higher the correlation between X and Y: a coefficient of 1 indicates a perfect positive linear correlation, a coefficient of -1 a perfect negative linear correlation, and a value of 0 indicates that the two sample variables X and Y are uncorrelated. The ranges of coefficient values used to judge the strength of the correlation between two samples are shown in Fig. 3.
The weight conversion formula is given in equation (5):

r_Y = ρ_Y / (ρ_l + ρ_m + ρ_h)   (5)

wherein r_Y denotes the converted weight of the corresponding sample Y, and ρ_Y denotes the average Pearson coefficient between the corresponding sample Y and the training samples.
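A minimal sketch of how these correlation weights could be computed, assuming the test epoch and every training epoch are flattened to equal-length vectors; the function names and the flattening step are assumptions.

```python
import numpy as np

def mean_pearson(y, X_train):
    """Average Pearson correlation between test epoch y and every epoch in X_train.

    y       : (n_features,) flattened EEG epoch to be classified
    X_train : (n_epochs, n_features) flattened training epochs of one load state
    """
    return np.mean([np.corrcoef(y, x)[0, 1] for x in X_train])

def correlation_weights(y, X_low, X_mid, X_high):
    """Convert the three average correlations into weights r_L, r_M, r_H by
    normalizing them to sum to 1 (equation (5), as reconstructed here)."""
    rho = np.array([mean_pearson(y, X_low),
                    mean_pearson(y, X_mid),
                    mean_pearson(y, X_high)])
    return rho / rho.sum()
```

The weights then multiply the three classifier outputs in equation (3), so the load state whose training data are most similar to the current sample receives the largest weight.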
A voter is used for event-related potential classification. First, all samples within each repetition are classified; let there be n stimuli per repetition and m repetitions per trial:

class_i^j = 1 if w_i^j = max(w_1^j, ..., w_n^j), and class_i^j = 0 otherwise,

where class_i^j is the class of the i-th stimulus in the j-th repetition and w_i^j is the discrimination value obtained by the classifier for that stimulus sample. Because only one stimulus per repetition needs to be judged as Target, only the stimulus with the highest discrimination value is assigned class 1, i.e. Target. Secondly, the Target vote count total_value_i and the discrimination-value sum total_w_i of each stimulus over the m repetitions are accumulated:

total_value_i = Σ_{j=1..m} class_i^j

total_w_i = Σ_{j=1..m} w_i^j

The class category_i of the i-th stimulus of a single trial is then decided: if the Target vote count of this stimulus is the highest among the n stimuli, the stimulus is judged as Target; if other stimuli tie with the i-th stimulus for the highest vote count, their discrimination-value sums are compared, and the i-th stimulus is judged as Target if its sum is higher than the others'; otherwise it is judged as Nontarget.
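The voting step just described could be sketched as follows; the array layout (repetitions by stimuli) and the function name are assumptions.

```python
import numpy as np

def vote(discrim_values):
    """Decide the target stimulus of one trial from classifier outputs.

    discrim_values : (m_repetitions, n_stimuli) array of discrimination values
                     f_f(Y) for every stimulus in every repetition.
    Returns the index of the stimulus judged as Target.
    """
    # One Target vote per repetition: the stimulus with the highest value.
    votes_per_rep = np.argmax(discrim_values, axis=1)
    total_value = np.bincount(votes_per_rep, minlength=discrim_values.shape[1])

    # Sum of discrimination values per stimulus, used only to break ties.
    total_w = discrim_values.sum(axis=0)

    best = np.flatnonzero(total_value == total_value.max())
    return best[np.argmax(total_w[best])] if len(best) > 1 else best[0]
```

The tie-breaking on total_w mirrors the rule stated above: equal vote counts are resolved in favour of the stimulus with the larger sum of discrimination values.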
To summarize:
1) A discrimination-level classifier fusion method is selected at the algorithm-fusion level for a brain-computer interface based on event-related potentials;
2) Features are extracted from the EEG data acquired in the different states, and the parameters of several classifiers are trained according to the Fisher criterion, so that the EEG sample Y to be classified yields several linear discrimination values F_L, F_M, F_H after passing through these classifiers;
3) The average Pearson correlation coefficients between the EEG sample Y to be classified and the EEG data of the different states are computed and then converted into the weight coefficients r_l, r_m, r_h according to equation (5);
4) The final discriminant is formed: the values output by each state classifier for the test sample are weighted and summed, and the class of the final sample output is discriminated from the weighted sum;
5) The final classification decision is based on equation (3) and the voter: for an input sample Y, the value of f_f(Y) determines whether Y belongs to the Target class or the Nontarget class.
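Putting steps 1) to 5) together, a minimal end-to-end sketch might look as follows; it reuses the hypothetical helpers train_fisher_lda, correlation_weights and vote from the earlier sketches, and the data shapes are assumptions.

```python
import numpy as np

def classify_trial(trial_epochs, classifiers, X_low, X_mid, X_high):
    """Classify one trial of a cross-load ERP brain-computer interface.

    trial_epochs : (m_repetitions, n_stimuli, n_features) flattened EEG epochs
    classifiers  : dict with keys 'L', 'M', 'H', each a (w, w0) pair from
                   train_fisher_lda trained on one load state
    X_low, X_mid, X_high : training epochs of each load state (for the weights)
    Returns the index of the stimulus judged as Target.
    """
    m, n, _ = trial_epochs.shape
    f_final = np.zeros((m, n))
    for j in range(m):
        for i in range(n):
            y = trial_epochs[j, i]
            r_L, r_M, r_H = correlation_weights(y, X_low, X_mid, X_high)
            # Weighted sum of the three state classifiers (equation (3)).
            f_final[j, i] = sum(r * (w @ y + w0)
                                for r, (w, w0) in zip((r_L, r_M, r_H),
                                                      (classifiers['L'],
                                                       classifiers['M'],
                                                       classifiers['H'])))
    return vote(f_final)
```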
The specific embodiment is as follows:
EEG data X_L recorded from the subject during the visual evoked event-related potential task under low mental load, EEG data X_M recorded during the task under medium mental load, EEG data X_H recorded during the task under high mental load, and EEG data Y to be classified, recorded during the task in an unknown state, are acquired.
The acquired EEG data of the different mental-load states are classified through the linear discrimination formula:

F_L(Y) = ω_L^T·Y + ω_L0, F_M(Y) = ω_M^T·Y + ω_M0, F_H(Y) = ω_H^T·Y + ω_H0

From the data X_L, X_M and X_H of the different mental-load states, the projection directions ω_L, ω_M, ω_H and the thresholds ω_L0, ω_M0, ω_H0 are computed, and the three classification results obtained when the unknown sample Y under test is input into the L, M and H classifiers defined by these three sets of parameters are calculated.
Secondly, the average Pearson correlation coefficients ρ_l, ρ_m and ρ_h between the unknown sample Y and the three groups X_L, X_M and X_H are calculated and then converted into the weight coefficients r_l, r_m and r_h.

The Pearson correlation coefficient formula is:

ρ_X,Y = E[(X - μ_X)(Y - μ_Y)] / (σ_X·σ_Y)

The weight conversion formula is:

r_Y = ρ_Y / (ρ_l + ρ_m + ρ_h)
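As an illustration of how the pieces of this embodiment fit together before the final voting and comparison step, the hypothetical sketches given earlier (train_fisher_lda, correlation_weights, vote, classify_trial) could be wired up roughly as follows; the epoch length, the numbers of stimuli and repetitions, and the synthetic data are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n_feat = 64  # assumed flattened epoch length

def fake_epochs(n):
    # Synthetic stand-ins for target/non-target epochs of one load state.
    return rng.normal(size=(n, n_feat))

classifiers = {
    state: train_fisher_lda(fake_epochs(60), fake_epochs(300))
    for state in ('L', 'M', 'H')
}

X_low, X_mid, X_high = fake_epochs(360), fake_epochs(360), fake_epochs(360)
trial = rng.normal(size=(7, 6, n_feat))   # 7 repetitions, 6 stimuli (assumed)

target_index = classify_trial(trial, classifiers, X_low, X_mid, X_high)
print("stimulus judged as Target:", target_index)
```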
Finally, the samples under test are classified according to the linear discrimination formula and the voter, giving the Pearson-correlation-coefficient classifier (PCC classifier), whose classification results and information transfer rates are compared with those of the L, M and H classifiers trained on the three single-state data sets.
For each data sample, a linear discrimination value is computed from the extracted features while the Pearson correlation coefficients with the training samples are computed, and the final linear discrimination value is obtained by weighting the final discriminant. A voter is used during superposition, so the classification accuracy improves as the number of superpositions increases. As shown in Fig. 4, the classification accuracy reaches about 80.28% after 7 superpositions; across the data of all states, the linear discrimination method based on the Pearson correlation coefficient achieves the highest result at the first superposition and, in the subsequent superposition calculations, reaches a classification accuracy similar to that obtained when training and testing use data from the same mental-load state.
Fig. 4 shows the average classification accuracy, over the data sets of all load states, of the classifier trained with the proposed algorithm and of the classifiers trained with the three single-load-state data sets. The results show that accuracy increases with the number of superpositions; after a certain number of superpositions, the classification accuracy for the subject's N200 and P300 potentials reaches 81.39%. This demonstrates that the algorithm can identify the evoked N200 and P300 potentials and that, at the first superposition, its recognition rate is better than that of the other classifiers.
Fig. 5 shows the average information transfer rate, over all state data, of the classifier trained with the proposed algorithm and of the classifiers trained with the other three load-state data sets. The results show that the information transfer rate decreases as the number of superpositions increases; at the first superposition, the information transfer rate for the subject's N200 and P300 potentials reaches 28.36 bit/min. This demonstrates that the algorithm can identify the evoked N200 and P300 potentials and that its classification performance is improved compared with a single-state classifier. After the third superposition, however, the information transfer rate of the method is similar to that of the classifiers trained on low-, medium- and high-load EEG data, with an average difference of 0.89 bit/min, indicating that when the user operates the brain-computer interface for too long, the advantage of the Pearson-correlation-coefficient classifier over the single-state classifiers on the user's EEG data becomes very small.
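The patent does not state how the information transfer rate is computed; a common choice for ERP-based brain-computer interfaces is the Wolpaw formula, sketched below under that assumption (the example numbers are arbitrary and do not reproduce the figures above).

```python
import math

def wolpaw_itr(accuracy, n_classes, trial_seconds):
    """Information transfer rate in bit/min (standard Wolpaw formula).

    accuracy      : classification accuracy P in (0, 1]
    n_classes     : number of selectable stimuli N
    trial_seconds : time needed for one selection
    """
    p, n = accuracy, n_classes
    bits = math.log2(n)
    if 0 < p < 1:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * 60.0 / trial_seconds

# Purely illustrative call with assumed values (6 stimuli, 4 s per selection).
print(round(wolpaw_itr(0.80, 6, 4.0), 2), "bit/min")
```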
When classifying multi-state data, the classifier based on the Pearson correlation coefficient achieves performance close to that of a conventional classifier trained and tested on the same data set, clearly improves classification performance, and outperforms classifiers trained on a single state's data.
The goal of a multi-state data-fusion classifier is to fully exploit the strengths that each state-specific classifier shows on its own data, so as to achieve performance superior to any single-state classifier. The correlation-coefficient algorithm, in which the weight coefficients change with the input sample under test, embodies this principle: an appropriate similarity model between the training sets and the sample under test is established, and the degree of similarity is used to compute a changing weight parameter for each state classifier, from which the final decision is obtained.
The algorithm also has the advantage that the weight coefficients between different samples under test and the training samples are cheap to compute: the optimal weight of each state classifier in the final decision is determined by the similarity between that classifier's whole training set and the sample under test, rather than by the quality of any single training sample or by the classifier's classification performance on individual samples; the weight coefficients depend only on the similarity between samples.
The classification ability of each trained state classifier on different samples under test is therefore fully exploited, and the adaptability and accuracy of the integrated weight-coefficient model based on correlation coefficients are improved. The final classification decision adapts to the capability of each state classifier, i.e. the optimal weights are adjusted according to the similarity between samples, and the decision weight of data from the same state is increased, realizing an adaptive brain-computer interface. Using the Pearson correlation coefficient to determine the degree of similarity between data is simple, can improve the recognition accuracy of the brain-computer interface in multi-task, high-load states, supports optimal human-machine task allocation and human-machine cooperation, improves the operating performance and safety of the human-machine system, and improves adaptive classification in an individual's cross-load states, making the classification decision more practical and more general than one based on single-state data.
Those of ordinary skill in the art will appreciate that the elements and method steps of the examples described in connection with the embodiments disclosed herein may be embodied in electronic hardware, computer software, or combinations of both, and that the components and steps of the examples have been described in a functional general in the foregoing description for the purpose of clearly illustrating the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the present invention, and they should be construed as being included in the following claims and description.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (5)

1. A brain-computer interface cross-load linear discrimination method based on correlation analysis is characterized by comprising the following steps:
s1, acquiring electroencephalogram data under mental loads in different states;
s2, training a plurality of groups of classifiers by using Fisher criterion on the acquired electroencephalogram data;
s3, inputting the electroencephalogram samples to be classified into a plurality of groups of classifiers to obtain a plurality of groups of linear discrimination values;
s4, calculating average Pearson correlation coefficients of the electroencephalogram samples to be classified and the electroencephalogram data under different states respectively, and converting the Pearson correlation coefficients into weight coefficients through a weight conversion formula;
and S5, weighting and summing the weight coefficient and the linear discrimination value, wherein the weighted summation value is used for judging the type of the electroencephalogram sample to be classified through a Fisher criterion and a voter.
2. The method for linear discrimination across loads of brain-computer interface based on correlation analysis as claimed in claim 1, wherein the Fisher criterion in step S2 is as follows:
f_f(Y) = r_L·F_L(Y) + r_M·F_M(Y) + r_H·F_H(Y), where F_L(Y) = ω_L^T·Y + ω_L0, F_M(Y) = ω_M^T·Y + ω_M0, F_H(Y) = ω_H^T·Y + ω_H0

Y is judged as Target if f_f(Y) > 0, and as Nontarget otherwise,

wherein Y denotes the sample under test, Target denotes a sample judged to be a target, Nontarget denotes a sample judged to be a non-target, and f_f(Y) denotes the final discrimination value; ω_L is the projection direction trained from the low-load data X_l, ω_M is the projection direction trained from the medium-load data X_m, and ω_H is the projection direction trained from the high-load data X_h; ω_L0 is the classifier threshold trained from the low-load data, ω_M0 is the classifier threshold trained from the medium-load data, and ω_H0 is the classifier threshold trained from the high-load data; r_L is the Pearson correlation weight coefficient computed between the input sample under test and the low-load training-set samples, r_M is the Pearson correlation weight coefficient computed between the input sample under test and the medium-load training-set samples, and r_H is the Pearson correlation weight coefficient computed between the input sample under test and the high-load training-set samples; F_L(Y)·r_L, F_M(Y)·r_M and F_H(Y)·r_H are the corresponding weighted discrimination values of the sample under test.
3. The brain-computer interface load-crossing linear discrimination method based on correlation analysis according to claim 1, wherein the average pearson correlation coefficient calculated in step S4 is calculated by the following formula:
ρ_X,Y = E[(X - μ_X)(Y - μ_Y)] / (σ_X·σ_Y)

where ρ_X,Y is the correlation coefficient, σ_X is the standard deviation of X, σ_Y is the standard deviation of Y, X denotes the EEG data of a given mental-load state, Y denotes the sample under test, μ_X denotes the mean of X, and μ_Y denotes the mean of Y.
4. The brain-computer interface load-crossing linear discrimination method based on correlation analysis as claimed in claim 1, wherein the weight conversion formula in step S4 is as follows:
r_Y = ρ_Y / (ρ_l + ρ_m + ρ_h)

wherein r_Y denotes the converted weight of the corresponding sample Y, ρ_Y denotes the average Pearson coefficient between the corresponding sample Y and the training samples, ρ_l is the average Pearson correlation coefficient with the subject's EEG data collected in the visual evoked event-related potential task under low mental load, ρ_m is the average Pearson correlation coefficient with the subject's EEG data collected in the visual evoked event-related potential task under medium mental load, and ρ_h is the average Pearson correlation coefficient with the subject's EEG data collected in the visual evoked event-related potential task under high mental load.
5. The brain-computer interface cross-load linear discrimination method based on correlation analysis as claimed in claim 1, wherein the weighted sum in step S5 is used to decide the class of the EEG sample to be classified through a linear discrimination formula and a voter, the specific process being as follows: first, all samples within each repetition of the BCI are classified; let there be n stimuli per repetition and m repetitions per trial;

class_i^j = 1 if w_i^j = max(w_1^j, ..., w_n^j), and class_i^j = 0 otherwise,

wherein class_i^j is the class of the i-th stimulus in the j-th repetition and w_i^j is the discrimination value obtained by the classifier for that stimulus sample; only one stimulus per repetition may be judged as Target, so only the stimulus with the highest discrimination value is assigned class 1, i.e. Target; secondly, the Target vote count total_value_i and the discrimination-value sum total_w_i of each stimulus over the m repetitions are accumulated:

total_value_i = Σ_{j=1..m} class_i^j

total_w_i = Σ_{j=1..m} w_i^j

and the class category_i of the i-th stimulus of a single trial is decided: if the Target vote count of this stimulus is the highest among the n stimuli, the stimulus is judged as Target; if other stimuli tie with the i-th stimulus for the highest vote count, their discrimination-value sums are compared, and the i-th stimulus is judged as Target if its sum is higher than the others'; otherwise it is judged as Nontarget.
CN202110769354.2A 2021-07-07 2021-07-07 Correlation analysis-based brain-computer interface cross-load linear discrimination method Active CN113591598B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110769354.2A CN113591598B (en) 2021-07-07 2021-07-07 Correlation analysis-based brain-computer interface cross-load linear discrimination method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110769354.2A CN113591598B (en) 2021-07-07 2021-07-07 Correlation analysis-based brain-computer interface cross-load linear discrimination method

Publications (2)

Publication Number Publication Date
CN113591598A true CN113591598A (en) 2021-11-02
CN113591598B CN113591598B (en) 2024-07-12

Family

ID=78246218

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110769354.2A Active CN113591598B (en) 2021-07-07 2021-07-07 Correlation analysis-based brain-computer interface cross-load linear discrimination method

Country Status (1)

Country Link
CN (1) CN113591598B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115982576A (en) * 2023-03-17 2023-04-18 石家庄科林电气股份有限公司 Malignant load identification method and device and electric energy meter

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108470182A (en) * 2018-01-23 2018-08-31 天津大学 A kind of brain-computer interface method enhanced for asymmetric brain electrical feature with identification
WO2020042511A1 (en) * 2018-08-28 2020-03-05 天津大学 Motion potential brain-machine interface encoding and decoding method based on spatial filtering and template matching
WO2020253965A1 (en) * 2019-06-20 2020-12-24 Toyota Motor Europe Control device, system and method for determining perceptual load of a visual and dynamic driving scene in real time

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108470182A (en) * 2018-01-23 2018-08-31 天津大学 A kind of brain-computer interface method enhanced for asymmetric brain electrical feature with identification
WO2020042511A1 (en) * 2018-08-28 2020-03-05 天津大学 Motion potential brain-machine interface encoding and decoding method based on spatial filtering and template matching
WO2020253965A1 (en) * 2019-06-20 2020-12-24 Toyota Motor Europe Control device, system and method for determining perceptual load of a visual and dynamic driving scene in real time

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Wang Jiang; Xu Guizhi; Wang Lei; Zhang Huiyuan: "Feature extraction of a brain-computer interface system based on a multi-channel adaptive autoregressive model", Chinese Journal of Tissue Engineering Research and Clinical Rehabilitation, no. 48, 26 November 2011 (2011-11-26) *
Tan Xuemin; Guo Chao: "Classification of motor imagery EEG signals with semi-supervised learning", Computer Engineering and Applications, no. 03, 31 December 2020 (2020-12-31) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115982576A (en) * 2023-03-17 2023-04-18 石家庄科林电气股份有限公司 Malignant load identification method and device and electric energy meter
CN115982576B (en) * 2023-03-17 2023-06-20 石家庄科林电气股份有限公司 Malignant load identification method and device and electric energy meter

Also Published As

Publication number Publication date
CN113591598B (en) 2024-07-12

Similar Documents

Publication Publication Date Title
Coyle et al. A time-series prediction approach for feature extraction in a brain-computer interface
CN110333783B (en) Irrelevant gesture processing method and system for robust electromyography control
CN111265212A (en) Motor imagery electroencephalogram signal classification method and closed-loop training test interaction system
Doroz et al. Online signature verification modeled by stability oriented reference signatures
WO2020042511A1 (en) Motion potential brain-machine interface encoding and decoding method based on spatial filtering and template matching
Rejer Genetic algorithms for feature selection for brain–computer interface
Mohammadi et al. Discrimination of depression levels using machine learning methods on EEG signals
Liu et al. Improved GMM with parameter initialization for unsupervised adaptation of brain–computer interface
Wojcikiewicz et al. Stationary common spatial patterns: towards robust classification of non-stationary eeg signals
KR100479338B1 (en) Apparatus for verifying an online signature using of transform technique and method teherefor
CN113591598B (en) Correlation analysis-based brain-computer interface cross-load linear discrimination method
Jain et al. Biometrics systems: anatomy of performance
CN108874137B (en) General model for gesture action intention detection based on electroencephalogram signals
Deepthi et al. An intelligent Alzheimer’s disease prediction using convolutional neural network (CNN)
Bablani et al. Lie detection using fuzzy ensemble approach with novel defuzzification method for classification of EEG signals
CN112698720B (en) Movement imagery identification method based on mutual information feature extraction and multi-person fusion
CN114384999A (en) User irrelevant myoelectricity gesture recognition system based on self-adaptive learning
Yang et al. EEG classification for BCI based on CSP and SVM-GA
Jian-Feng Comparison of different classifiers for biometric system based on EEG signals
Hamedi et al. Imagined speech decoding from EEG: The winner of 3rd Iranian BCI competition (iBCIC2020)
CN114721514A (en) Geometric model selection intention distinguishing method and system based on electroencephalogram signals
Han et al. Feature set extraction algorithm based on soft computing techniques and its application to EMG pattern classification
Mikut et al. Takagi--Sugeno--Kang Fuzzy Classifiers for a Special Class of Time-Varying Systems
CN112733727A (en) Electroencephalogram consciousness dynamic classification method based on linear analysis and feature decision fusion
Soetedjo et al. Maintaining High Accuracy General P300 Speller Using the Language Modeling and Dynamic Stopping

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant