CN112890827B - Electroencephalogram identification method and system based on graph convolution and gated recurrent unit - Google Patents


Info

Publication number
CN112890827B
CN112890827B (Application CN202110048818.0A)
Authority
CN
China
Prior art keywords
electroencephalogram
data
matrix
electroencephalogram data
electro
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110048818.0A
Other languages
Chinese (zh)
Other versions
CN112890827A (en)
Inventor
彭德光
朱楚洪
孙健
唐贤伦
高崚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Zhaokun Intelligent Medical Technology Co ltd
Original Assignee
Chongqing Zhaokun Intelligent Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Zhaokun Intelligent Medical Technology Co ltd filed Critical Chongqing Zhaokun Intelligent Medical Technology Co ltd
Priority to CN202110048818.0A priority Critical patent/CN112890827B/en
Publication of CN112890827A publication Critical patent/CN112890827A/en
Application granted granted Critical
Publication of CN112890827B publication Critical patent/CN112890827B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7203: Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
    • A61B 5/7225: Details of analog processing, e.g. isolation amplifier, gain or sensitivity adjustment, filtering, baseline or drift compensation
    • A61B 5/7235: Details of waveform analysis
    • A61B 5/7253: Details of waveform analysis characterised by using transforms
    • A61B 5/726: Details of waveform analysis characterised by using transforms using Wavelet transforms
    • A61B 5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems

Abstract

The invention provides an electroencephalogram identification method and system based on graph convolution and a gated recurrent unit, which comprise the following steps: acquiring electroencephalogram data and preprocessing it, the preprocessing comprising filtering, reference correction, mean removal, electro-oculogram artifact removal, and normalization; inputting the preprocessed electroencephalogram data into a plurality of graph convolution networks for relatively independent feature extraction, stacking the features extracted by the graph convolution networks into a feature matrix, inputting the feature matrix into a gated recurrent unit for classification and identification, and outputting an identification result. The invention can effectively improve the recognition rate of electroencephalogram signals.

Description

Electroencephalogram identification method and system based on graph convolution and gated recurrent unit
Technical Field
The invention relates to the field of intelligent biomedicine, in particular to an electroencephalogram identification method and system based on graph convolution and a gated recurrent unit.
Background
Brain-computer interface (BCI) is a human-computer interaction technology that enables the human brain to communicate directly with computers or other electronic devices. BCI has important research significance and great development potential in fields such as neurorehabilitation, biomedicine, and intelligent robotics. BCI can restore normal brain function by inducing activity-dependent brain plasticity, support the diagnosis of epilepsy, and allow the brain to directly control robots, prostheses, and the like, offering disabled people the possibility of recovering normal activity.
The core of BCI is the identification of electroencephalogram (EEG) signals. EEG signals can generally be classified into visual evoked potentials (VEP), event-related potentials (ERP), slow cortical potentials (SCP), event-related synchronization (ERS), event-related desynchronization (ERD), and so on. In EEG identification, a specific class is usually recognized for diagnosis or control, and several types of EEG can also be combined to realize more diverse control.
ERS and ERD are produced by unilateral limb movement or imagined unilateral limb movement: ERS appears in the ipsilateral brain area and ERD on the contralateral side. The invention introduces an electroencephalogram identification method combining a graph convolution network (GCN) and a gated recurrent unit (GRU), which is a classification and identification method based on ERS and ERD.
EEG identification mainly comprises three stages: data preprocessing, feature extraction, and classification. Data preprocessing includes selecting useful channels, baseline calibration, re-referencing, filtering, artifact removal, normalization, and so on, and mainly serves to extract useful features better and to remove electro-oculogram interference. Traditional EEG feature extraction mainly comprises time-domain, frequency-domain, and time-frequency methods. Power spectrum estimation, commonly used in frequency-domain methods, converts the EEG into a relationship between power and frequency, as in the AR model; common time-domain methods include analysis of variance, waveform identification, peak detection, and the like; time-frequency methods include the wavelet transform and so on. Early EEG classification methods include linear classifiers such as linear discriminant analysis and the support vector machine, artificial neural networks such as the multi-layer perceptron, nonlinear Bayesian classifiers such as the quadratic Bayesian classifier, and nearest-neighbor classifiers such as the k-nearest-neighbor algorithm and the Mahalanobis distance. All of these early classification methods have difficulty performing well on EEG, which has a low signal-to-noise ratio, non-stationarity, and high randomness. With the rapid development of deep learning in recent years its application fields keep expanding, and more and more researchers apply deep learning algorithms to the BCI field, where their strong ability to process nonlinear data gives them better performance.
In extracting EEG features, traditional feature extraction methods derive features that humans can interpret (such as mean, variance, and amplitude) while discarding some features that are closer to the nature of the EEG. Although common deep learning algorithms do not have this problem, they still cannot effectively extract the essential characteristics of the electroencephalogram signal, so the classification accuracy remains low. In addition, both traditional EEG identification methods and trained neural network classifiers are strongly disturbed by the electro-oculogram (EOG), whose amplitude is several times that of the EEG, which makes the classification result inaccurate.
Disclosure of Invention
In view of the problems in the prior art, the invention provides an electroencephalogram identification method and system based on graph convolution and a gated recurrent unit, and mainly addresses the problem that the electroencephalogram recognition rate of the prior art is not high.
In order to achieve the above and other objects, the present invention adopts the following technical solutions.
An electroencephalogram identification method based on graph convolution and a gated recurrent unit comprises the following steps:
acquiring electroencephalogram data, and preprocessing the electroencephalogram data, wherein the preprocessing comprises the following steps: filtering, correcting the reference, removing the mean value, eliminating the electro-oculogram artifact, and normalizing;
inputting the preprocessed electroencephalogram data into a plurality of graph convolution networks for relatively independent feature extraction, stacking the features extracted by the graph convolution networks into a feature matrix, inputting the feature matrix into a gated recurrent unit for classification and identification, and outputting an identification result.
Optionally, the rejecting electro-oculogram artifacts comprises:
performing wavelet packet decomposition on electroencephalogram data for multiple times to obtain multi-order high-frequency components and low-frequency components of the electroencephalogram data;
screening out high-frequency components and low-frequency components of corresponding frequency bands according to signal frequency bands of the electro-oculogram, and reconstructing an electro-oculogram signal based on the screened components;
and taking the reconstructed electro-ocular signal and the electroencephalogram data as the input of an independent component analysis algorithm, acquiring an independent component in the electroencephalogram data, and subtracting the independent component from the electroencephalogram data.
Optionally, the step of acquiring independent components in the electroencephalogram data includes:
preprocessing data input into an independent component analysis algorithm to obtain an observation matrix, wherein the preprocessing comprises centralization and whitening;
initializing a mixed matrix, calculating initial independent components through the mixed matrix, updating the mixed matrix to carry out multiple iterations, comparing independent components obtained by each iteration, and selecting a matrix with the maximum non-Gaussian characteristic of each independent component as the final output independent component estimation.
Optionally, the electroencephalogram data with the electro-oculogram artifacts removed are normalized to the range [-1, 1], and the data are labeled and then divided into a training set and a test set.
Optionally, the normalization calculation is expressed as:
X' = 2(X - X_min)/(X_max - X_min) - 1
wherein X represents the data before normalization, X_min and X_max respectively represent the minimum and maximum values of the channel data where X is located, and X' represents the normalized data.
Optionally, the step of performing feature extraction on each graph convolution network includes:
creating graph data according to input electroencephalogram data, wherein the electroencephalogram data of one channel corresponds to one node in the graph data, and edges connecting the nodes are constructed according to the correlation among the nodes so as to obtain an adjacency matrix;
and generating a Laplace matrix according to the adjacency matrix, aggregating same-dimension characteristics of adjacent nodes according to the Laplace matrix, and performing nonlinear transformation on the aggregated characteristics to obtain characteristics of the electroencephalogram data.
Optionally, the correlation between the nodes is calculated as:
r_ck = Σ_n X(n)·Y(n + k) / sqrt( Σ_n X(n)^2 · Σ_n Y(n)^2 )
wherein X and Y respectively represent the electroencephalogram data of different channels, k represents the length of the single-channel sequence, and N is the length of the sequence after single-channel zero padding.
Optionally, the graph convolution network feature extraction process is expressed as:
H_{i+1} = leaky_relu(L·H_i·W_G)
wherein H_i is the feature matrix of the i-th layer, W_G is the nonlinear transformation weight matrix, L is the Laplacian matrix, and leaky_relu is the ReLU activation function with leakage.
Optionally, a loss function of the gated recurrent unit is constructed for network training, the loss function being expressed as:
Loss = -[ŷ·log(y) + (1 - ŷ)·log(1 - y)]
wherein ŷ is the data label and y is the network output.
An electroencephalogram identification system based on graph convolution and a gated recurrent unit, comprising:
the electroencephalogram data processing module is used for acquiring electroencephalogram data and preprocessing the electroencephalogram data, wherein the preprocessing comprises the following steps: filtering, correcting the reference, removing the mean value, eliminating the electro-oculogram artifact, and normalizing;
and the electroencephalogram identification module is used for inputting the preprocessed electroencephalogram data into a plurality of graph convolution networks for relatively independent feature extraction, stacking the features extracted by the graph convolution networks into a feature matrix, inputting the feature matrix into the gated recurrent unit for classification and identification, and outputting an identification result.
As described above, the electroencephalogram identification method and system based on graph convolution and a gated recurrent unit have the following beneficial effects.
The electro-oculogram is removed accurately and effectively, greatly reducing its influence on the classification of electroencephalogram signals; the combination of the constructed graph convolution and gated recurrent unit can effectively improve the accuracy of electroencephalogram identification.
Drawings
FIG. 1 is a flowchart of an electroencephalogram identification method based on graph convolution and a gated recurrent unit in an embodiment of the present invention.
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It should be noted that the features in the following embodiments and examples may be combined with each other without conflict.
It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present invention, and the components related to the present invention are only shown in the drawings rather than drawn according to the number, shape and size of the components in actual implementation, and the type, quantity and proportion of the components in actual implementation may be changed freely, and the layout of the components may be more complicated.
Referring to FIG. 1, the present invention provides an electroencephalogram identification method based on graph convolution and a gated recurrent unit, comprising the following steps.
Step S01: acquiring electroencephalogram data. The electroencephalogram data can be collected with a 32-channel Emotiv EEG acquisition device, with the electrodes placed according to the international standard electrode system and the CMS and DRL reference electrodes placed at the forehead; the sampling frequency is 128 Hz, and the acquired EEG signals are transmitted to a computer wirelessly. The subject rests for a certain time according to the instructions, then keeps still for two seconds to provide baseline calibration data, and then continuously imagines left-hand (or right-hand) movement, with the same action lasting 12 seconds; after a rest the acquisition continues, and half of the trials are collected for the left hand and half for the right hand.
Step S02: data preprocessing. The EEG signals collected by the acquisition equipment are often accompanied by various kinds of noise and DC interference. Because ERD and ERS are expressed in the alpha (8-13 Hz) and beta (13-30 Hz) rhythms, the EEG signal is band-pass filtered to 8-30 Hz; a sixth-order Butterworth band-pass filter can be designed with the signal module of the SciPy library. Meanwhile, during EEG acquisition the amplitude and waveform often change considerably over time and a potential drift appears, so the acquired data are reference-corrected by subtracting the mean value of the EEG over the preceding calibration period. In addition, because the amplitude of the electro-oculogram (EOG) artifact generated by blinking or eye movement is several times that of the EEG and would greatly interfere with model training and testing, the EOG artifact is removed with WTICA (wavelet transform combined with independent component analysis).
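As a minimal sketch of this step (assuming SciPy as the signal-processing library referred to above, the 128 Hz sampling rate from step S01, and hypothetical function names), the band-pass filtering and baseline removal could look like:

```python
import numpy as np
from scipy import signal

FS = 128.0  # sampling frequency of the acquisition device (Hz)

def bandpass_8_30(eeg, fs=FS, order=6):
    """Zero-phase 8-30 Hz Butterworth band-pass, applied channel by channel.
    eeg: array of shape (n_channels, n_samples)."""
    b, a = signal.butter(order, [8.0, 30.0], btype='bandpass', fs=fs)
    return signal.filtfilt(b, a, eeg, axis=1)

def remove_baseline(eeg, baseline_samples):
    """Subtract the per-channel mean of the calibration segment."""
    baseline_mean = eeg[:, :baseline_samples].mean(axis=1, keepdims=True)
    return eeg - baseline_mean

# usage sketch: 2 s of baseline at 128 Hz precede the motor-imagery data
# eeg = remove_baseline(bandpass_8_30(raw_eeg), baseline_samples=int(2 * FS))
```

filtfilt runs the filter forward and backward, so the pass band stays at 8-30 Hz without introducing phase distortion.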
The WTICA artifact-removal process comprises the following steps:
Decompose the electroencephalogram signal by exploiting the ability of the wavelet packet to split a signal repeatedly into high-frequency and low-frequency components.
Let A and D represent the low-frequency and high-frequency signals after the first decomposition, respectively, and let S represent the initial signal; then S can be represented by equation (1):
S = A + D = AA + AD + DA + DD = ...   (1)
where AA denotes the low-frequency part of the second decomposition of the low-frequency part of the first decomposition, AD denotes the high-frequency part of the second decomposition of the low-frequency part of the first decomposition, and DA, DD, and so on follow by analogy.
According to the frequency band of the EOG signal to be extracted (0-10 Hz), the wavelet packet decomposition coefficients corresponding to that band are selected for reconstruction.
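The patent does not name a wavelet library; the sketch below assumes PyWavelets (pywt), with a db4 mother wavelet and a 5-level decomposition as illustrative choices, and keeps only the terminal wavelet-packet nodes whose bands lie roughly below 10 Hz to rebuild the EOG reference for one channel:

```python
import numpy as np
import pywt

def reconstruct_eog_reference(x, fs=128.0, wavelet='db4', level=5, cutoff_hz=10.0):
    """Rebuild a rough EOG reference from one EEG channel by keeping only the
    wavelet-packet terminal nodes whose (approximate) band lies below cutoff_hz."""
    wp = pywt.WaveletPacket(data=x, wavelet=wavelet, mode='symmetric', maxlevel=level)
    band_width = (fs / 2.0) / (2 ** level)     # approximate width of each terminal band
    kept = pywt.WaveletPacket(data=None, wavelet=wavelet, mode='symmetric')
    for i, node in enumerate(wp.get_level(level, order='freq')):
        band_high = (i + 1) * band_width       # approximate upper edge of this band
        if band_high <= cutoff_hz:             # keep only the ~0-10 Hz content
            kept[node.path] = node.data
    eog_ref = kept.reconstruct(update=False)
    return eog_ref[:len(x)]                    # trim padding introduced by the transform
```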
Further, the reconstructed signal and the original signal of each channel are taken as the input of ICA (independent component analysis). The EOG signal reconstructed by wavelet packet decomposition is not exact, so it is used as the observed EOG signal under mixing and combined with the original channel data to form the ICA input: the observation matrix X. Then X can be represented as:
X = AS
where A is the mixing matrix and S contains the independent components of the mixed signal.
The aim of ICA is to find the mixing matrix A. Let W = A^(-1); multiplying X by W gives WX = WAS = S, so S can be solved.
The ICA steps are as follows. The input data are first centered and whitened. Centering subtracts a constant from each value in the data so that the mean of every dimension moves to zero; it is equivalent to translating the data so that the center of all data after translation is 0. Whitening maps the variance of the data into a specific range: the new coordinates of the data are computed first and their variance is then normalized; the whitened data have low feature correlation and equal variance per feature, which effectively removes redundant information from the data.
The matrix W is then randomly initialized. Since S = WX, the W matrix is updated over multiple iterations, yielding several corresponding S matrices; the non-Gaussianity of each vector of S is computed, and the matrix with the greatest non-Gaussianity is chosen to obtain the independent component estimate S. The non-Gaussianity of a vector can be judged from deviation statistics of the mean/covariance of its elements with respect to the vector.
After the EOG signal is extracted from S, the EOG artifact is subtracted from the original signal of each channel to obtain the artifact-free EEG.
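A minimal sketch of the ICA stage, assuming scikit-learn's FastICA as the implementation (the patent only requires a generic ICA with centering, whitening, and non-Gaussianity maximization) and selecting the EOG component by its correlation with the wavelet-packet reference, which is an illustrative selection rule rather than the patent's stated criterion:

```python
import numpy as np
from sklearn.decomposition import FastICA

def remove_eog_ica(eeg, eog_ref, random_state=0):
    """eeg: (n_channels, n_samples); eog_ref: (n_samples,) reference from the
    wavelet-packet reconstruction. Returns the EEG with the EOG component removed."""
    # observation matrix: original channels plus the reconstructed EOG observation
    X = np.vstack([eeg, eog_ref[None, :]])            # (n_channels + 1, n_samples)
    ica = FastICA(n_components=X.shape[0], random_state=random_state)
    S = ica.fit_transform(X.T)                        # sources: (n_samples, n_components)
    # pick the source most correlated with the EOG reference
    corrs = [abs(np.corrcoef(S[:, i], eog_ref)[0, 1]) for i in range(S.shape[1])]
    S_clean = S.copy()
    S_clean[:, int(np.argmax(corrs))] = 0.0           # zero out the EOG component
    X_clean = ica.inverse_transform(S_clean).T        # back to channel space
    return X_clean[:eeg.shape[0], :]                  # drop the appended reference row
```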
Finally, considering that EEG has both positive and negative polarity, normalizing the data to [0, 1] would lose a large amount of important information in the negative-value region of the EEG; therefore the EEG data are normalized to the interval [-1, 1] using equation (2), and the data are then labeled and divided into a training set and a test set. For the labels, imagined left-hand movement is labeled 0 and imagined right-hand movement is labeled 1.
X' = 2(X - X_min)/(X_max - X_min) - 1   (2)
where X represents the data before normalization, X_min and X_max respectively represent the minimum and maximum values of the channel data where X is located, and X' represents the normalized data.
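A short numpy sketch of the per-channel normalization of equation (2); the explicit affine form below is the standard min-max map onto [-1, 1] implied by the text:

```python
import numpy as np

def normalize_minus1_1(eeg):
    """Per-channel min-max normalization to [-1, 1].
    eeg: (n_channels, n_samples); returns an array of the same shape."""
    x_min = eeg.min(axis=1, keepdims=True)
    x_max = eeg.max(axis=1, keepdims=True)
    return 2.0 * (eeg - x_min) / (x_max - x_min) - 1.0
```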
Step S03: extract EEG features with the GCN. Extracting EEG features with the graph convolution network (GCN) proceeds as follows:
Graph data are created from the EEG signal. A graph comprises nodes N and edges E; each node carries its own features, and the edges represent the relationships between nodes. The matrix H holds each node and all the features it contains (H ∈ R^(n×m), where n is the number of nodes and m is the number of features per node), and the relationships between nodes are expressed by an adjacency matrix A (A ∈ R^(n×n)). Each channel can be taken as a node, with the signal it contains as that node's features, to establish the feature matrix H; then the correlation coefficient r_ck is computed according to the cross-correlation formula (3) and a threshold is set: when r_ck is greater than the threshold the two nodes are considered related and the edge is 1, otherwise the edge is 0, thereby establishing the adjacency matrix A.
r_ck = Σ_n X(n)·Y(n + k) / sqrt( Σ_n X(n)^2 · Σ_n Y(n)^2 )   (3)
where X and Y respectively represent the electroencephalogram data of different channels, k represents the length of the single-channel sequence, and N is the length of the sequence after single-channel zero padding.
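Because the exact form of formula (3) is only partly recoverable, the sketch below uses the maximum absolute normalized cross-correlation over all lags of the two channel sequences, with a hypothetical threshold of 0.3, to build the adjacency matrix:

```python
import numpy as np

def xcorr_coeff(x, y):
    """Maximum absolute normalized cross-correlation over all lags
    (one plausible reading of formula (3))."""
    num = np.correlate(x, y, mode='full')               # cross-correlation at every lag
    den = np.sqrt(np.sum(x ** 2) * np.sum(y ** 2))
    return np.max(np.abs(num)) / den

def build_adjacency(eeg, threshold=0.3):
    """eeg: (n_channels, n_samples). Edge = 1 where two channels are correlated."""
    n = eeg.shape[0]
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            if xcorr_coeff(eeg[i], eeg[j]) > threshold:
                A[i, j] = A[j, i] = 1.0
    return A
```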
The Laplacian matrix L is generated from the adjacency matrix A. First, to avoid losing the self-features of A, let
Ã = A + I
where I is the identity matrix. To avoid changing the distribution of the features during matrix multiplication, Ã is normalized to generate the Laplacian matrix
L = D̃^(-1/2) Ã D̃^(-1/2)
where D̃ is a diagonal matrix with
D̃_ii = Σ_j Ã_ij.
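A small numpy sketch of this normalization, i.e. the self-loop-augmented, symmetrically normalized propagation matrix reconstructed above:

```python
import numpy as np

def normalized_laplacian(A):
    """L = D̃^(-1/2) (A + I) D̃^(-1/2): the graph operator used by the GCN layers."""
    A_tilde = A + np.eye(A.shape[0])          # add self-loops so A keeps its own features
    d = A_tilde.sum(axis=1)                   # degrees of the augmented graph
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_tilde @ D_inv_sqrt
```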
and extracting features by GCN forward propagation. The feature extraction process of the GCN can be represented by the following formula:
H_{i+1} = leaky_relu(L·H_i·W_G)   (4)
where H_i is the feature matrix of the i-th layer and W_G is the nonlinear transformation weight matrix.
The extraction process of the above formula can be considered as two steps:
1. aggregate the same-dimension features of neighboring nodes: H̄_i = L·H_i;
2. apply a nonlinear transformation to the aggregated features: H_{i+1} = leaky_relu(H̄_i·W_G);
where leaky_relu is the ReLU activation function with leakage.
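A minimal numpy sketch of one graph-convolution layer according to equation (4); the negative slope of the leaky ReLU is an illustrative choice that the patent does not fix:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """Leaky ReLU activation (slope alpha on the negative side)."""
    return np.where(x > 0, x, alpha * x)

def gcn_layer(H, L, W_G):
    """One graph-convolution layer, equation (4)."""
    H_agg = L @ H                    # step 1: aggregate same-dimension neighbor features
    return leaky_relu(H_agg @ W_G)   # step 2: nonlinear transformation of the aggregate
```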
Step S04: the features extracted by the GCNs are stacked and input to the GRU for classification and identification.
EEG features are extracted with multiple GCNs and the extracted features are stacked into a matrix. A single GCN outputs a feature matrix H_out ∈ R^(n×n_f) (n is the number of channels and n_f the number of features extracted per channel); horizontally stacking the outputs of all GCNs gives the feature matrix X ∈ R^(n×m), with m = a × n_f, where a is the number of GCNs. Taking X as the input of the gated recurrent unit (GRU), the GRU propagation process is as follows:
r_t = σ(W_r·[h_{t-1}, x_t])   (5)
z_t = σ(W_z·[h_{t-1}, x_t])   (6)
h̃_t = tanh(W_h·[r_t ⊙ h_{t-1}, x_t])   (7)
h_t = (1 - z_t) ⊙ h_{t-1} + z_t ⊙ h̃_t   (8)
y = σ(W_o·h_{t=m})   (9)
where r_t, z_t, h̃_t, and h_t respectively denote the reset gate, the update gate, the candidate set, and the state at time t, and y is the final output. The update gate controls how much state information from the previous time step is carried into the current state, and the reset gate controls how much of the previous state is written into the current candidate set.
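A numpy sketch that steps through equations (5)-(9) as reconstructed above for one stacked feature matrix X; bias terms are omitted and all shapes are illustrative:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_forward(X, W_r, W_z, W_h, W_o, hidden_size):
    """X: (m, input_size), one row per time step of the stacked GCN features.
    W_r, W_z, W_h: (hidden_size, hidden_size + input_size); W_o: (1, hidden_size)."""
    h = np.zeros(hidden_size)
    for x_t in X:
        hx = np.concatenate([h, x_t])                         # [h_{t-1}, x_t]
        r = sigmoid(W_r @ hx)                                 # (5) reset gate
        z = sigmoid(W_z @ hx)                                 # (6) update gate
        h_cand = np.tanh(W_h @ np.concatenate([r * h, x_t]))  # (7) candidate set
        h = (1.0 - z) * h + z * h_cand                        # (8) new state
    return sigmoid(W_o @ h)                                   # (9) final output y
```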
The model training process is as follows:
and (4) sending the data preprocessed in the step (S02) to GCN for feature extraction, horizontally stacking the features extracted by each GCN and then sending the features into GRU, wherein the GRU outputs a classification result, and the number of nodes in an output layer can be 1 due to two classifications. Taking a cross entropy loss function as a loss function
Figure BDA0002898451890000083
Figure BDA0002898451890000084
Is a data tag.
The parameters to be trained in the model are W_o, W_h, W_z, W_r, and W_G.
The gradient of the loss with respect to each of these parameters is obtained by backpropagating the loss through the GRU and the graph convolution layers.
the parameter update may be updated using an Adam optimizer according to the gradient.
Step S05: verify the network performance on the test set. The EEG test-set data from step S02 are input into the trained GCN-GRU network for feature extraction and classification; if the output y ≥ 0.5 the trial is predicted as imagined right-hand movement, otherwise as imagined left-hand movement, and the accuracy of classifying imagined left-hand and right-hand movements is computed.
This embodiment also provides an electroencephalogram identification system based on graph convolution and a gated recurrent unit, which is used to execute the electroencephalogram identification method based on graph convolution and a gated recurrent unit described in the method embodiment. Since the technical principle of the system embodiment is similar to that of the method embodiment, repeated description of the same technical details is omitted.
In one embodiment, an electroencephalogram identification system based on graph convolution and a gated recurrent unit comprises:
the electroencephalogram data processing module is used for acquiring electroencephalogram data and preprocessing the electroencephalogram data, wherein the preprocessing comprises the following steps: filtering, correcting the reference, removing the mean value, eliminating the electro-oculogram artifact, and normalizing;
and the electroencephalogram identification module is used for inputting the preprocessed electroencephalogram data into a plurality of graph convolution networks for relatively independent feature extraction, stacking the features extracted by the graph convolution networks into a feature matrix, inputting the feature matrix into the gated recurrent unit for classification and identification, and outputting the identification result.
In summary, in the electroencephalogram identification method and system based on graph convolution and a gated recurrent unit, EOG artifacts are removed by the WTICA method, and feature extraction and classification are performed by combining GCN and GRU, which greatly reduces the influence of EOG on the classifier and addresses the difficulty of processing the EOG in EEG signals in real time. The strong graph-data processing capability of the GCN is exploited by selecting multiple GCNs to extract electroencephalogram features, so that the useful information of the EEG is fully extracted; the GRU is selected as the feature classifier, which avoids gradient vanishing to a certain extent and lets the feature information extracted by the different GCNs contribute equally; and the model combining GCN and GRU can effectively improve the identification accuracy of electroencephalogram signals. The invention therefore effectively overcomes various defects in the prior art and has high industrial utilization value.
The foregoing embodiments merely illustrate the principles and utility of the present invention and are not intended to limit it. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by those of ordinary skill in the art without departing from the spirit and technical ideas disclosed by the present invention shall still be covered by the claims of the present invention.

Claims (8)

1. An electroencephalogram identification method based on graph convolution and a gated recurrent unit, characterized by comprising the following steps:
acquiring electroencephalogram data, and preprocessing the electroencephalogram data, wherein the preprocessing comprises the following steps: filtering, correcting the reference, removing the mean value, eliminating the electro-oculogram artifact, and normalizing; the elimination of the electro-oculogram artifact comprises the following steps: performing wavelet packet decomposition on electroencephalogram data for multiple times to obtain multi-order high-frequency components and low-frequency components of the electroencephalogram data; screening out high-frequency components and low-frequency components of corresponding frequency bands according to signal frequency bands of the electro-oculogram, and reconstructing an electro-oculogram signal based on the screened components; taking the reconstructed electro-oculogram signal and the electroencephalogram data as the input of an independent component analysis algorithm, acquiring an independent component in the electroencephalogram data, and subtracting the independent component from the electroencephalogram data; wherein the step of obtaining the independent components in the electroencephalogram data comprises: preprocessing data input into an independent component analysis algorithm to obtain an observation matrix, wherein the preprocessing comprises centralization and whitening; initializing a mixed matrix, calculating initial independent components through the mixed matrix, updating the mixed matrix to carry out multiple iterations, comparing independent components obtained by each iteration, and selecting a matrix with the maximum non-Gaussian characteristic of each independent component as the independent component estimation of final output;
inputting the preprocessed electroencephalogram data into a plurality of graph convolution networks for feature extraction respectively, wherein the feature extraction comprises the following steps: creating graph data according to input electroencephalogram data, wherein the electroencephalogram data of one channel corresponds to one node in the graph data, and edges connecting the nodes are constructed according to the correlation among the nodes so as to obtain an adjacency matrix; generating a Laplacian matrix according to the adjacency matrix, aggregating the same-dimension features of adjacent nodes according to the Laplacian matrix, and performing a nonlinear transformation on the aggregated features to obtain the features of the electroencephalogram data; and stacking the features extracted by each graph convolution network into a feature matrix, inputting the feature matrix into a gated recurrent unit for classification and identification, and outputting an identification result.
2. The method of claim 1, wherein the step of obtaining independent components in the brain electrical data comprises:
preprocessing data input into an independent component analysis algorithm to obtain an observation matrix, wherein the preprocessing comprises centralization and whitening;
initializing a mixed matrix, calculating initial independent components through the mixed matrix, updating the mixed matrix to carry out multiple iterations, comparing independent components obtained by each iteration, and selecting a matrix with the maximum non-Gaussian characteristic of each independent component as the final output independent component estimation.
3. The electroencephalogram identification method based on graph convolution and a gated recurrent unit according to claim 1, wherein the electroencephalogram data with the electro-oculogram artifacts removed are normalized to the range [-1, 1] and are divided into a training set and a test set after being labeled.
4. The electroencephalogram identification method based on graph convolution and a gated recurrent unit according to claim 3, wherein the normalization is expressed as:
X' = 2(X - X_min)/(X_max - X_min) - 1
wherein X represents the data before normalization, X_min and X_max respectively represent the minimum and maximum values of the channel data where X is located, and X' represents the normalized data.
5. The electroencephalogram identification method based on graph convolution and a gated recurrent unit according to claim 1, wherein the correlation between the nodes is calculated as:
r_ck = Σ_n X(n)·Y(n + k) / sqrt( Σ_n X(n)^2 · Σ_n Y(n)^2 )
wherein X and Y respectively represent the electroencephalogram data of different channels, k represents the length of the single-channel sequence, and N is the length of the sequence after single-channel zero padding.
6. The electroencephalogram identification method based on graph convolution and a gated recurrent unit according to claim 1, wherein the graph convolution network feature extraction process is expressed as:
H_{i+1} = leaky_relu(L·H_i·W_G)
wherein H_i is the feature matrix of the i-th layer, W_G is the nonlinear transformation weight matrix, L is the Laplacian matrix, and leaky_relu is the ReLU activation function with leakage.
7. The electroencephalogram identification method based on graph convolution and a gated recurrent unit according to claim 1, wherein a loss function of the gated recurrent unit is constructed for network training, the loss function being expressed as:
Loss = -[ŷ·log(y) + (1 - ŷ)·log(1 - y)]
wherein ŷ is the data label and y is the network output.
8. An electroencephalogram identification system based on graph convolution and a gated recurrent unit, comprising:
the electroencephalogram data processing module is used for acquiring electroencephalogram data and preprocessing the electroencephalogram data, wherein the preprocessing comprises the following steps: filtering, correcting the reference, removing the mean value, eliminating the electro-oculogram artifact, and normalizing; the elimination of the electro-oculogram artifact comprises the following steps: performing wavelet packet decomposition on electroencephalogram data for multiple times to obtain multi-order high-frequency components and low-frequency components of the electroencephalogram data; screening out high-frequency components and low-frequency components of corresponding frequency bands according to signal frequency bands of the electro-oculogram, and reconstructing an electro-oculogram signal based on the screened components; taking the reconstructed electro-oculogram signal and the electroencephalogram data as the input of an independent component analysis algorithm, acquiring an independent component in the electroencephalogram data, and subtracting the independent component from the electroencephalogram data;
the electroencephalogram identification module is used for inputting the preprocessed electroencephalogram data into a plurality of graph convolution networks for feature extraction respectively, comprising: creating graph data according to the input electroencephalogram data, wherein the electroencephalogram data of one channel corresponds to one node in the graph data, and edges connecting the nodes are constructed according to the correlation among the nodes so as to obtain an adjacency matrix; generating a Laplacian matrix according to the adjacency matrix, aggregating the same-dimension features of adjacent nodes according to the Laplacian matrix, and performing a nonlinear transformation on the aggregated features to obtain the features of the electroencephalogram data; and
stacking the features extracted by each graph convolution network into a feature matrix, inputting the feature matrix into a gated recurrent unit for classification and identification, and outputting an identification result.
CN202110048818.0A 2021-01-14 2021-01-14 Electroencephalogram identification method and system based on graph convolution and gated recurrent unit Active CN112890827B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110048818.0A CN112890827B (en) 2021-01-14 2021-01-14 Electroencephalogram identification method and system based on graph convolution and gated recurrent unit

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110048818.0A CN112890827B (en) 2021-01-14 2021-01-14 Electroencephalogram identification method and system based on graph convolution and gated recurrent unit

Publications (2)

Publication Number Publication Date
CN112890827A CN112890827A (en) 2021-06-04
CN112890827B (en) 2022-09-20

Family

ID=76113094

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110048818.0A Active CN112890827B (en) 2021-01-14 2021-01-14 Electroencephalogram identification method and system based on graph convolution and gated recurrent unit

Country Status (1)

Country Link
CN (1) CN112890827B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113476056B (en) * 2021-06-25 2024-03-15 西北工业大学 Motor imagery electroencephalogram signal classification method based on frequency domain graph convolution neural network
CN114969934B (en) * 2022-05-31 2023-09-05 湖南工商大学 Stay cable damage degree identification method and model construction method
CN115116607B (en) * 2022-08-30 2022-12-13 之江实验室 Brain disease prediction system based on resting state magnetic resonance transfer learning

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107961007A (en) * 2018-01-05 2018-04-27 重庆邮电大学 A kind of electroencephalogramrecognition recognition method of combination convolutional neural networks and long memory network in short-term
CN109820525A (en) * 2019-01-23 2019-05-31 五邑大学 A kind of driving fatigue recognition methods based on CNN-LSTM deep learning model
WO2020248008A1 (en) * 2019-06-14 2020-12-17 The University Of Adelaide A method and system for classifying sleep related brain activity
CN110399857B (en) * 2019-08-01 2023-05-23 西安邮电大学 Electroencephalogram emotion recognition method based on graph convolution neural network
CN111544256A (en) * 2020-04-30 2020-08-18 天津大学 Brain-controlled intelligent full limb rehabilitation method based on graph convolution and transfer learning
CN111657935B (en) * 2020-05-11 2021-10-01 浙江大学 Epilepsia electroencephalogram recognition system based on hierarchical graph convolutional neural network, terminal and storage medium
CN111950455B (en) * 2020-08-12 2022-03-22 重庆邮电大学 Motion imagery electroencephalogram characteristic identification method based on LFFCNN-GRU algorithm model

Also Published As

Publication number Publication date
CN112890827A (en) 2021-06-04

Similar Documents

Publication Publication Date Title
CN112890827B (en) Electroencephalogram identification method and system based on graph convolution and gated recurrent unit
Mullen et al. Real-time neuroimaging and cognitive monitoring using wearable dry EEG
Luo et al. Dynamic frequency feature selection based approach for classification of motor imageries
Zhou et al. Removal of EMG and ECG artifacts from EEG based on wavelet transform and ICA
Bentlemsan et al. Random forest and filter bank common spatial patterns for EEG-based motor imagery classification
Bascil et al. Spectral feature extraction of EEG signals and pattern recognition during mental tasks of 2-D cursor movements for BCI using SVM and ANN
Darvishi et al. Brain-computer interface analysis using continuous wavelet transform and adaptive neuro-fuzzy classifier
CN110175510B (en) Multi-mode motor imagery identification method based on brain function network characteristics
Serdar Bascil et al. Multi-channel EEG signal feature extraction and pattern recognition on horizontal mental imagination task of 1-D cursor movement for brain computer interface
CN111310656A (en) Single motor imagery electroencephalogram signal identification method based on multi-linear principal component analysis
Sheoran et al. Methods of denoising of electroencephalogram signal: A review
Abolghasemi et al. EEG–fMRI: dictionary learning for removal of ballistocardiogram artifact from EEG
Sarin et al. Automated ocular artifacts identification and removal from EEG data using hybrid machine learning methods
Mathe et al. Intelligent approach for artifacts removal from EEG signal using heuristic-based convolutional neural network
Geng et al. A fusion algorithm for EEG signal processing based on motor imagery brain-computer interface
Ghonchi et al. Spatio-temporal deep learning for EEG-fNIRS brain computer interface
Zou et al. A supervised independent component analysis algorithm for motion imagery-based brain computer interface
Sherwani et al. Wavelet based feature extraction for classification of motor imagery signals
Assi et al. Kmeans-ICA based automatic method for ocular artifacts removal in a motorimagery classification
Loza et al. Transient model of EEG using Gini Index-based matching pursuit
Yu et al. The research of sEMG movement pattern classification based on multiple fused wavelet function
Guo et al. EEG signal analysis based on fixed-value shift compression algorithm
Pereira et al. Factor analysis for finding invariant neural descriptors of human emotions
Oveisi EEG signal classification using nonlinear independent component analysis
Mourad Automatic correction of short‐duration artefacts in single‐channel EEG recording: a group‐sparse signal denoising algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant