CN114431878A - Electroencephalogram sleep staging method based on multi-scale attention residual network

Electroencephalogram sleep staging method based on multi-scale attention residual network

Info

Publication number: CN114431878A
Application number: CN202011206485.1A
Authority: CN (China)
Filing date: 2020-11-02
Publication date: 2022-05-06
Legal status: Pending
Prior art keywords: electroencephalogram, channel, sleep, scale, attention
Inventors: 柳长源 (Liu Changyuan), 孙雨涵 (Sun Yuhan), 赵鑫雨 (Zhao Xinyu)
Assignee (original and current): Harbin University of Science and Technology
Other languages: Chinese (zh)

Classifications

    • A61B: Diagnosis; Surgery; Identification (A: Human necessities; A61: Medical or veterinary science; Hygiene)
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/4806 Sleep evaluation
    • A61B5/4809 Sleep detection, i.e. determining whether a subject is asleep or not
    • A61B5/4812 Detecting sleep stages or cycles
    • A61B5/4815 Sleep quality
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267 Classification involving training the classification device


Abstract

The invention provides an electroencephalogram sleep staging method based on a multi-scale attention residual network. An attention model and a SeLU activation function are added to the residual module of the original ResNet to strengthen the EEG sleep features most relevant to sleep classification. Convolution kernels of different sizes are then used in parallel at the same spatial position to obtain multi-scale feature outputs, so that sleep features are extracted from the EEG signal at multiple scales, while residual learning keeps the network from degrading. Compared with conventional EEG sleep staging methods, the proposed method classifies more stages, identifies them more accurately, and produces more stable output.

Description

Electroencephalogram sleep staging method based on multi-scale attention residual network
Technical Field
The invention relates to the technical field of electroencephalogram (EEG) signal processing, and in particular to an EEG sleep staging method based on a multi-scale attention residual network.
Background Art
Sleep is a physiological process that everyone experiences every day; roughly one third of a person's life is spent asleep. Many basic daily activities, such as learning, attention, and cognition, depend closely on good sleep quality. High-quality sleep promotes children's growth and development, strengthens the body's immunity, and protects mental health. Long-term sleep deprivation and insomnia cause a range of physical harms, such as endocrine disorders, obesity, and accelerated aging, along with reduced immunity, weakened judgment, and memory decline; in severe cases they can even lead to death.
Sleep disorders are a social problem that cannot be ignored and have become a worldwide issue, so research in sleep medicine brooks no delay. Sleep staging is a key link in sleep medicine, and effective evaluation of sleep quality is an important basis for diagnosing sleep disorders. At present sleep staging is done mainly by hand, which is time-consuming, labor-intensive, and error-prone; research into automatic sleep staging has followed. Automatic staging not only frees medical staff from heavy, monotonous EEG analysis so that they can pursue deeper sleep research, but also improves staging efficiency and accuracy, providing a more reliable guarantee for overcoming sleep-disorder diseases; it therefore has important social significance and practical value.
Disclosure of Invention
In view of this, the present invention aims to provide an electroencephalogram sleep staging method based on a multi-scale attention residual network, mainly applied to aiding disease diagnosis, sleep staging research, and the like. The method effectively stages sleep from EEG signals, avoids the low classification efficiency caused by large numbers of redundant feature parameters, improves the generalization ability and computational efficiency of the model, reduces the need for manually labeled samples, greatly lowers the probability of manual misjudgment, and provides a new line of thought for sleep staging research.
In order to achieve the purpose, the invention provides the following technical scheme:
In the method, an attention model and a SeLU activation function are added to the residual module of the original ResNet so as to strengthen the EEG sleep features most relevant to sleep classification. Convolution kernels of different sizes are then used in parallel at the same spatial position to obtain multi-scale feature outputs, so that sleep features are extracted from the EEG signal at multiple scales, while residual learning keeps the network from degrading. The specific steps are as follows:
S1, acquiring electroencephalogram signals of the subject, and extracting multi-lead electroencephalogram signals;
S2, preprocessing the electroencephalogram signals;
S3, constructing and training a multi-scale attention residual network model;
S4, carrying out sleep electroencephalogram staging by using the multi-scale attention residual network classifier;
the S22 process includes the steps of:
taking a new sample new N generated in S1 as an example, determining a security class sample N [ i ] belonging to S1 by a K-proximity algorithm;
randomly selecting a security sample N [ i ], randomly selecting 600 from 3000 dimensionalities of the sample for modification, and recombining the security sample N [ i ] with the remaining 2400 dimensionalities of the sample to generate a new sample new N belonging to S1;
if the data N [ i ] [ m ] to be modified (mth dimension of the security class sample N [ i ]), the data of mth dimension where N [ i ] is the k adjacent points of the security class is determined to be represented by S [ j ] m (j ═ 1,2,3, …, k), and the value of mth dimension of new N is randomly determined from N [ i ] [ m ], S1m, S2m …, and Skm.
The feature matrix is constructed by processing the original 30 s, 1-dimensional sleep signal as follows:
the sleep EEG signal is cut sequentially into 30 sub-segments spanning 1 s each, and the 30 sub-segments are stacked in order into a 30 × 100 time-domain feature information matrix. Under this rule each row of the feature matrix contains 100 signal points, adjacent points within a row are 0.01 s apart, and adjacent points within a column of 30 points are 1 s apart.
The multi-scale attention residual network model constructed in step S3 comprises the following levels:
an improved Residual Channel Attention Module (RCAM) is added to the original residual network. The first layer is a standard convolutional layer with a 1 × 15 kernel, followed by 1 × 3 max pooling after the first-layer convolution. Within the RCAM, convolution kernels of sizes 1 × 3, 1 × 5 and 1 × 7 are used in parallel; each kernel size produces its own scale of output, and kernels of the same size share parameters.
In the network at each scale, the numbers of convolution kernels in the RCAM are set to 64, 128 and 256. To normalize the data batches and speed up training, a Batch Normalization layer is added after every convolutional layer on every channel. Global average pooling is adopted instead of local max or average pooling, so that each feature map yields one corresponding value; the output of each scale after global average pooling is therefore 1 × 256, merging the outputs of the three channels gives 3 × 256, and a fully connected layer is applied last.
The constructed channel feature attention unit is inserted into a traditional residual network as follows:
a Channel Attention Unit (CAU) is constructed. First, the CAU computes from the input data X a weight vector W corresponding to the channels; the length of W equals the number of channels of X, and the size of each element of W indicates how strongly the data on that channel correlate with the key information.
Then W is combined by weighting with each channel feature of the input to obtain the weighted output data X1, where the different colors on X1 represent the contributions of the corresponding channel features and the input data X are the channel features output by the previous layer:
X ∈ R^(N×C)
where N is the length of the EEG feature vector on each channel and C is the number of channels. The output data X1 ∈ R^(N×C) are computed as
X1 = W ⊗ X
where ⊗ denotes element-wise multiplication.
The internal mechanism of the channel feature attention unit is as follows:
first, the CAU aggregates the input data along the time direction, processing it with both max pooling and average pooling to obtain the pooled feature vectors Xmax and Xavg. A single-hidden-layer neural network (MLP) is then applied to Xavg and Xmax, yielding two new feature vectors whose elements are summed term by term to give the required channel feature weight vector. The numbers of neurons in the MLP's input and output layers equal the number of input feature channels; the MLP serves mainly to model the contributions of the feature information of the two inputs Xavg and Xmax. The CAU is computed as
M(X) = δ(MLP(AvgPool(X)) + MLP(MaxPool(X)))
     = δ(W1(W0(Xavg)) + W1(W0(Xmax)))
where δ is the Sigmoid activation function, which limits the output to between 0 and 1; W0 ∈ R^(C/r×C) and W1 ∈ R^(C×C/r) are the weights from the input layer to the hidden layer and from the hidden layer to the output layer, respectively (r being the channel reduction ratio); and Xavg ∈ R^(1×C) and Xmax ∈ R^(1×C) are the feature vectors obtained after average pooling and max pooling.
The improved attention residual module is as follows:
first, the feature channel attention unit CAU is added to the traditional residual module so that the module can learn the interrelations and degrees of association among the features of different channels, which plays a key role in distinguishing the different sleep stages. Then the ReLU activation function in the traditional residual module is replaced by the Scaled Exponential Linear Unit (SeLU) activation function. Although ReLU is computationally simple, it has a limitation: when the input is below 0 the output is exactly 0, so gradients easily vanish. To solve this, the SeLU activation function is introduced into the residual module; for inputs below 0 its output is not forced to 0, which avoids dead neurons, and because its negative-side slope is gentle it still inherits ReLU's advantage of one-sided suppression:
SeLU(x) = λ·x,           for x > 0
SeLU(x) = λ·α·(e^x − 1), for x ≤ 0
where α ≈ 1.6733 and λ ≈ 1.0507. The new module is named the Residual Channel Attention Module (RCAM).
The invention has the beneficial effects that: in the electroencephalogram sleep staging method based on the multi-scale attention residual network, an attention model and a SeLU activation function are added to the residual module of the original ResNet, strengthening the EEG sleep features most relevant to sleep classification. Convolution kernels of different sizes are then used in parallel at the same spatial position to obtain multi-scale feature outputs, so that sleep features are extracted from the EEG signal at multiple scales, while residual learning keeps the network from degrading. Compared with conventional EEG sleep staging methods, the method classifies more stages, identifies them more accurately, and produces more stable output.
Drawings
In order to make the object, technical scheme and beneficial effects of the invention clearer, the following drawings are provided for explanation:
FIG. 1 is the overall flow chart of the method;
FIG. 2 shows the process of constructing a sample's time-domain feature information matrix;
FIG. 3 is the overall flow chart of the CAU;
FIG. 4 is the internal mechanism diagram of the CAU;
FIG. 5 is the overall structure diagram of the RCAM;
FIG. 6 is the structure diagram of the multi-scale attention residual network.
Detailed Description
Preferred embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
FIG. 1 is the flow chart of the method. As shown in the figure, the electroencephalogram sleep staging method based on the multi-scale attention residual network provided by the invention comprises the following steps:
S1, acquiring electroencephalogram signals of the subject, and extracting multi-lead electroencephalogram signals;
S2, preprocessing the electroencephalogram signals;
S3, constructing and training a multi-scale attention residual network model;
S4, carrying out sleep electroencephalogram staging by using the multi-scale attention residual network classifier;
the S22 process includes the steps of:
as shown in fig. 2, taking a new sample new N generated in S1 as an example, the security class sample N [ i ] belonging to S1 is determined by the K-proximity algorithm;
randomly selecting a security sample N [ i ], randomly selecting 600 from 3000 dimensionalities of the sample for modification, and recombining the security sample N [ i ] with the remaining 2400 dimensionalities of the sample to generate a new sample new N belonging to S1;
if the data N [ i ] [ m ] to be modified (mth dimension of the security class sample N [ i ]), and if it is determined that N [ i ] is data of mth dimension of k neighboring points of the security class, represented by S [ j ] m (j is 1,2,3, …, k), the mth dimension value of newN is randomly determined from N [ i ] [ m ], S1m, S2m …, and Skm.
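For concreteness, this dimension-mixing rule can be sketched in a few lines of numpy. This is a minimal illustration of the step described above rather than the patented implementation; the function name generate_new_sample and the choice of k = 5 neighbors in the usage lines are assumptions.

import numpy as np

def generate_new_sample(safe_sample, neighbor_samples, n_modify=600, rng=None):
    """Synthesize one new sample from a safe-class sample N[i] and its k
    same-class nearest neighbors: n_modify of the 3000 dimensions are
    replaced by a value drawn at random from {N[i][m], S[1][m], ..., S[k][m]};
    the remaining dimensions are kept unchanged."""
    rng = rng if rng is not None else np.random.default_rng()
    new_sample = safe_sample.copy()                        # start from N[i]
    dims = rng.choice(safe_sample.size, size=n_modify, replace=False)
    for m in dims:
        # candidates: N[i][m] plus the m-th dimension of each neighbor S[j]
        candidates = np.concatenate(([safe_sample[m]], neighbor_samples[:, m]))
        new_sample[m] = rng.choice(candidates)
    return new_sample

# usage (hypothetical data): a 3000-dim safe sample and its k = 5 neighbors
safe = np.random.randn(3000)
neighbors = np.random.randn(5, 3000)
new_n = generate_new_sample(safe, neighbors)               # newN, shape (3000,)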
The feature matrix is constructed by processing the original 30 s, 1-dimensional sleep signal as follows:
the sleep EEG signal is cut sequentially into 30 sub-segments spanning 1 s each, and the 30 sub-segments are stacked in order into a 30 × 100 time-domain feature information matrix. Under this rule each row of the feature matrix contains 100 signal points, adjacent points within a row are 0.01 s apart, and adjacent points within a column of 30 points are 1 s apart.
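As a minimal numpy sketch of this reshaping, assuming a 100 Hz sampling rate (so that a 30 s epoch contains the 3000 points implied by the 30 × 100 matrix and by the 3000 dimensions used in the S22 step; the function name is illustrative):

import numpy as np

def build_feature_matrix(epoch_1d, fs=100, n_segments=30):
    """Reshape one 30 s, 1-D sleep EEG epoch into the 30 x 100 time-domain
    feature matrix: each row is one 1 s sub-segment of 100 points, so
    neighboring points in a row are 0.01 s apart and neighboring points
    in a column are 1 s apart."""
    points_per_segment = fs * 1                    # 100 points per 1 s segment
    assert epoch_1d.size == n_segments * points_per_segment   # 3000 points
    return epoch_1d.reshape(n_segments, points_per_segment)

epoch = np.random.randn(3000)                      # stand-in for one 30 s epoch
matrix = build_feature_matrix(epoch)               # shape (30, 100)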
The constructed channel feature attention unit is inserted into a traditional residual network as follows:
FIG. 3 is the overall flow chart of the CAU. First, the CAU computes from the input data X a weight vector W corresponding to the channels; the length of W equals the number of channels of X, and the size of each element of W indicates how strongly the data on that channel correlate with the key information.
Then W is combined by weighting with each channel feature of the input to obtain the weighted output data X1, where the different colors on X1 represent the contributions of the corresponding channel features and the input data X are the channel features output by the previous layer:
X ∈ R^(N×C)
where N is the length of the EEG feature vector on each channel and C is the number of channels. The output data X1 ∈ R^(N×C) are computed as
X1 = W ⊗ X
where ⊗ denotes element-wise multiplication.
The internal mechanism of the channel feature attention unit is as follows:
FIG. 4 is the internal mechanism diagram of the CAU. First, the CAU aggregates the input data along the time direction, processing it with both max pooling and average pooling to obtain the pooled feature vectors Xmax and Xavg. A single-hidden-layer neural network (MLP) is then applied to Xavg and Xmax, yielding two new feature vectors whose elements are summed term by term to give the required channel feature weight vector. The numbers of neurons in the MLP's input and output layers equal the number of input feature channels; the MLP serves mainly to model the contributions of the feature information of the two inputs Xavg and Xmax. The CAU is computed as
M(X) = δ(MLP(AvgPool(X)) + MLP(MaxPool(X)))
     = δ(W1(W0(Xavg)) + W1(W0(Xmax)))
where δ is the Sigmoid activation function, which limits the output to between 0 and 1; W0 ∈ R^(C/r×C) and W1 ∈ R^(C×C/r) are the weights from the input layer to the hidden layer and from the hidden layer to the output layer, respectively (r being the channel reduction ratio); and Xavg ∈ R^(1×C) and Xmax ∈ R^(1×C) are the feature vectors obtained after average pooling and max pooling.
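Read literally, the CAU formula can be sketched in PyTorch as follows. This is an illustrative reading of the description, not the authors' code; the class name, the reduction ratio r = 16, the ReLU between the two MLP layers, and the (batch, channels, length) input layout are assumptions.

import torch
import torch.nn as nn

class ChannelAttentionUnit(nn.Module):
    """CAU sketch: average- and max-pool the input over the time axis,
    pass both pooled vectors through a shared two-layer MLP (W0: C -> C/r,
    W1: C/r -> C), sum, squash with Sigmoid to get per-channel weights W,
    and return X1 = W (element-wise) X."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),    # W0
            nn.ReLU(),
            nn.Linear(channels // reduction, channels),    # W1
        )
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        # x: (batch, C, N), i.e. C feature channels of length N
        x_avg = x.mean(dim=2)                              # average pooling
        x_max = x.amax(dim=2)                              # max pooling
        w = self.sigmoid(self.mlp(x_avg) + self.mlp(x_max))    # (batch, C)
        return x * w.unsqueeze(2)                          # weight each channel

cau = ChannelAttentionUnit(channels=64)                    # usage example
out = cau(torch.randn(8, 64, 128))                         # same shape as input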
The improved attention residual module is as follows:
FIG. 5 is the overall structure diagram of the RCAM. First, the feature channel attention unit CAU is added to the traditional residual module so that the module can learn the interrelations and degrees of association among the features of different channels, which plays a key role in distinguishing the different sleep stages. Then the ReLU activation function in the traditional residual module is replaced by the Scaled Exponential Linear Unit (SeLU) activation function. Although ReLU is computationally simple, it has a limitation: when the input is below 0 the output is exactly 0, so gradients easily vanish. To solve this, the SeLU activation function is introduced into the residual module; for inputs below 0 its output is not forced to 0, which avoids dead neurons, and because its negative-side slope is gentle it still inherits ReLU's advantage of one-sided suppression:
SeLU(x) = λ·x,           for x > 0
SeLU(x) = λ·α·(e^x − 1), for x ≤ 0
where α ≈ 1.6733 and λ ≈ 1.0507. The new module is named the Residual Channel Attention Module (RCAM).
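A PyTorch sketch of this improved residual block, reusing the ChannelAttentionUnit class from the previous sketch, might look as follows. The two-convolution body and the placement of the CAU before the shortcut addition are assumptions, since the text describes the module only at the level of FIG. 5.

import torch.nn as nn

class ResidualChannelAttentionModule(nn.Module):
    """RCAM sketch: convolution + BatchNorm with SeLU in place of ReLU
    (alpha ~ 1.6733 and lambda ~ 1.0507 are built into nn.SELU), a CAU
    weighting the residual branch, and the usual identity shortcut.
    kernel_size is a parameter so the same block can be instantiated at
    the 1x3, 1x5 and 1x7 scales."""
    def __init__(self, channels, kernel_size=3):
        super().__init__()
        pad = kernel_size // 2                    # keep the sequence length
        self.body = nn.Sequential(
            nn.Conv1d(channels, channels, kernel_size, padding=pad),
            nn.BatchNorm1d(channels),
            nn.SELU(),
            nn.Conv1d(channels, channels, kernel_size, padding=pad),
            nn.BatchNorm1d(channels),
        )
        self.cau = ChannelAttentionUnit(channels)     # from the sketch above
        self.act = nn.SELU()

    def forward(self, x):
        out = self.cau(self.body(x))              # attention-weighted residual
        return self.act(out + x)                  # shortcut connection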
The multi-scale attention residual network model constructed in step S3 comprises the following levels:
FIG. 6 shows the structure of the multi-scale attention residual network. An improved Residual Channel Attention Module (RCAM) is added to the original residual network. The first layer is a standard convolutional layer with a 1 × 15 kernel, followed by 1 × 3 max pooling after the first-layer convolution. Within the RCAM, convolution kernels of sizes 1 × 3, 1 × 5 and 1 × 7 are used in parallel; each kernel size produces its own scale of output, and kernels of the same size share parameters.
In the network at each scale, the numbers of convolution kernels in the RCAM are set to 64, 128 and 256. To normalize the data batches and speed up training, a Batch Normalization layer is added after every convolutional layer on every channel. Global average pooling is adopted instead of local max or average pooling, so that each feature map yields one corresponding value; the output of each scale after global average pooling is therefore 1 × 256, merging the outputs of the three channels gives 3 × 256, and a fully connected layer is applied last.
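Putting the pieces together, the network of FIG. 6 might be assembled as below, reusing the two classes sketched earlier. The 1 × 15 stem with 1 × 3 max pooling, the three parallel branches with kernel sizes 3/5/7 and widths 64/128/256, the per-branch global average pooling, the 3 × 256 concatenation, and the final fully connected layer follow the description above; the 1 × 1 convolutions used to widen the channels between RCAM stages are an assumption, as the text does not say how the widths change.

import torch
import torch.nn as nn

class MultiScaleAttentionResnet(nn.Module):
    """MAResnet sketch: shared stem, three RCAM branches at different
    kernel sizes, global average pooling per branch, concatenation to a
    3 x 256 feature, and a fully connected layer over 5 sleep stages."""
    def __init__(self, n_classes=5):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv1d(1, 64, kernel_size=15, padding=7),   # 1 x 15 convolution
            nn.BatchNorm1d(64),
            nn.SELU(),
            nn.MaxPool1d(3),                               # 1 x 3 max pooling
        )
        def branch(k):                                     # one scale (1 x k)
            return nn.Sequential(
                ResidualChannelAttentionModule(64, k),
                nn.Conv1d(64, 128, 1), ResidualChannelAttentionModule(128, k),
                nn.Conv1d(128, 256, 1), ResidualChannelAttentionModule(256, k),
                nn.AdaptiveAvgPool1d(1),                   # global average pooling
            )
        self.branches = nn.ModuleList(branch(k) for k in (3, 5, 7))
        self.fc = nn.Linear(3 * 256, n_classes)            # merged 3 x 256 input

    def forward(self, x):                                  # x: (batch, 1, 3000)
        h = self.stem(x)
        feats = [b(h).flatten(1) for b in self.branches]   # three (batch, 256)
        return self.fc(torch.cat(feats, dim=1))            # (batch, n_classes)

model = MultiScaleAttentionResnet()
logits = model(torch.randn(4, 1, 3000))                    # 4 epochs -> (4, 5)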
Table 1 compares the staging results of the prior art with those of the present invention (the table is reproduced as a figure in the original filing; its key figures are quoted below).
In the course of this research, the data in the SLEEP-EDF database were mainly used to study and analyze the EEG signals of the five sleep stages. Aiming at the low accuracy and small number of categories of existing EEG sleep staging, and at the shortcomings of the traditional ResNet in handling EEG sleep features from different channels, a multi-scale attention residual network (MAResnet) model is proposed: an attention model and a SeLU activation function are added to the residual module of the original ResNet to strengthen the EEG sleep features most relevant to sleep classification, and convolution kernels of different sizes are then used in parallel at the same spatial position to obtain multi-scale feature outputs, so that sleep features are extracted at multiple scales. The experimental results show that the original ResNet recognizes the five sleep stages with an average rate of 67.5%, while the MAResnet model reaches 85.2%, which demonstrates the effectiveness of the method.

Claims (9)

1. An electroencephalogram sleep staging method based on a multi-scale attention residual network, characterized by comprising the following steps:
S1, acquiring electroencephalogram signals of the subject, and extracting multi-lead electroencephalogram signals;
S2, preprocessing the electroencephalogram signals;
S3, constructing and training a multi-scale attention residual network model;
and S4, carrying out sleep electroencephalogram staging by using the multi-scale attention residual network classifier.
2. The electroencephalogram sleep staging method based on a multi-scale attention residual network according to claim 1, wherein the data preprocessing in step S2 comprises one or more of the following processes:
S21, removing noise and artifacts from the electroencephalogram signals with a 2nd-order Butterworth band-pass filter;
S22, improving MSMOTE to alleviate the class imbalance of the data set;
and S23, constructing the feature matrix.
3. The electroencephalogram sleep staging method based on a multi-scale attention residual network according to claim 2, wherein the S22 process comprises the following steps:
1) taking the generation of a new sample newN belonging to class S1 as an example, determining the safe-class samples N[i] of class S1 by the K-nearest-neighbor algorithm;
2) randomly selecting a safe sample N[i], randomly choosing 600 of its 3000 dimensions for modification, and recombining them with the remaining 2400 dimensions of N[i] to generate a new sample newN belonging to class S1;
3) for the data N[i][m] to be modified (the m-th dimension of safe-class sample N[i]), letting S[j][m] (j = 1, 2, 3, ..., k) denote the m-th dimension of the k same-class neighbors of N[i], the m-th dimension of newN is determined randomly from {N[i][m], S[1][m], S[2][m], ..., S[k][m]}.
4. The electroencephalogram sleep staging method based on a multi-scale attention residual network according to claim 2, wherein constructing the feature matrix processes the original 30 s, 1-dimensional sleep signal as follows:
the sleep EEG signal is cut sequentially into 30 sub-segments spanning 1 s each, and the 30 sub-segments are stacked in order into a 30 × 100 time-domain feature information matrix, the rule being that each row of the feature matrix contains 100 signal points, adjacent points within a row are 0.01 s apart, and adjacent points within a column of 30 points are 1 s apart.
5. The electroencephalogram sleep staging method based on a multi-scale attention residual network according to claim 1, wherein the multi-scale attention residual network model constructed in step S3 comprises the following levels:
an improved Residual Channel Attention Module (RCAM) is added to the original residual network, the first layer using a standard convolutional layer with a 1 × 15 kernel, followed by 1 × 3 max pooling after the first-layer convolution.
6. The method according to claim 5, wherein convolution kernels of sizes 1 × 3, 1 × 5 and 1 × 7 are used in parallel in the RCAM, each kernel size having a corresponding scale of output and kernels of the same size sharing parameters;
in the network at each scale, the numbers of convolution kernels in the RCAM are set to 64, 128 and 256; to normalize the data batches and accelerate training, a Batch Normalization layer is added after each convolutional layer on each channel; global average pooling is adopted so that, compared with local max pooling and average pooling, each feature map yields one corresponding value; the output dimension of each scale after the final global average pooling is therefore 1 × 256, the merged output of the three channels is 3 × 256, and a fully connected operation is performed last.
7. The electroencephalogram sleep staging method based on a multi-scale attention residual network according to claim 5, wherein the constructed channel feature attention unit is inserted into a traditional residual network:
a Channel Attention Unit (CAU) is constructed; first, the CAU computes from the input data X a weight vector W corresponding to the channels, the length of W being the same as the number of channels of X, and the size of each element of W representing the degree of correlation between the data on each channel and the key information;
then W is combined by weighting with each channel feature of the input data to obtain the weighted output data X1, the different colors on X1 representing the contributions of the corresponding channel features, and the input data X being the channel features output by the previous layer:
X ∈ R^(N×C)
where N represents the length of the EEG feature vector on each channel and C represents the number of channels; the output data X1 ∈ R^(N×C) are computed as
X1 = W ⊗ X
where ⊗ denotes element-wise multiplication.
8. The electroencephalogram sleep staging method based on a multi-scale attention residual network according to claim 5 or 6, wherein the internal mechanism of the channel feature attention unit is as follows:
first, the CAU aggregates the information of the input data along the time direction, processing the input with max pooling and average pooling to obtain the pooled feature vectors Xmax and Xavg; a single-hidden-layer neural network (MLP) is applied to Xavg and Xmax to obtain two new feature vectors, whose elements are summed term by term to give the required channel feature weight vector; the numbers of neurons in the MLP's input and output layers equal the number of input feature channels, the MLP serving mainly to model the contributions of the feature information of the two inputs Xavg and Xmax; the CAU is computed as:
M(X) = δ(MLP(AvgPool(X)) + MLP(MaxPool(X))) = δ(W1(W0(Xavg)) + W1(W0(Xmax)))
where δ is the Sigmoid activation function, which limits the output to between 0 and 1; W0 ∈ R^(C/r×C) and W1 ∈ R^(C×C/r) are the weights from the input layer to the hidden layer and from the hidden layer to the output layer, respectively (r being the channel reduction ratio); and Xavg ∈ R^(1×C) and Xmax ∈ R^(1×C) represent the feature vectors obtained after average pooling and max pooling.
9. The electroencephalogram sleep staging method based on a multi-scale attention residual network according to claim 5, wherein the improved attention residual module is as follows:
first, the feature channel attention unit CAU is added to the traditional residual module so that the module can learn the interrelations and degrees of association of the features of different channels, which plays a key role in distinguishing the different sleep stages; then the ReLU activation function in the traditional residual module is replaced by the Scaled Exponential Linear Unit (SeLU) activation function; although ReLU is computationally simple, it has the limitation that the output is directly 0 when the input is below 0, so the gradient easily vanishes; to solve this, the SeLU activation function is introduced into the residual module: when its input is below 0 the output is not forced to 0, avoiding dead neurons, and because its negative-side slope is gentle it also inherits ReLU's advantage of one-sided suppression, as given by:
SeLU(x) = λ·x for x > 0, and SeLU(x) = λ·α·(e^x − 1) for x ≤ 0
where α ≈ 1.6733 and λ ≈ 1.0507; the new module is named the Residual Channel Attention Module.
CN202011206485.1A 2020-11-02 2020-11-02 Electroencephalogram sleep staging method based on multi-scale attention residual network Pending CN114431878A (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202011206485.1A | 2020-11-02 | 2020-11-02 | Electroencephalogram sleep staging method based on multi-scale attention residual network

Publications (1)

Publication Number | Publication Date
CN114431878A | 2022-05-06

Family

ID=81360833

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202011206485.1A | Electroencephalogram sleep staging method based on multi-scale attention residual network | 2020-11-02 | 2020-11-02

Country Status (1)

Country | Link
CN | CN114431878A (en)


Patent Citations (5)

Publication number | Priority date | Publication date | Title
CN104766098A * | 2015-04-30 | 2015-07-08 | Construction method for a classifier
CN107067143A * | 2016-12-30 | 2017-08-18 | Equipment safety grade classification method
CN109846477A * | 2019-01-29 | 2019-06-07 | EEG classification method based on frequency-band attention residual network
US20200337625A1 * | 2019-04-24 | 2020-10-29 | System and method for brain modelling
CN110897639A * | 2020-01-02 | 2020-03-24 | Electroencephalogram sleep staging method based on deep convolutional neural network

* Cited by examiner, † Cited by third party

Non-Patent Citations (2)

LIU Changyuan et al., "EEG emotion recognition based on MAResnet", Chinese Journal of Scientific Instrument, vol. 41, no. 7, pages 2-4 *
LUO Senlin et al., "Automatic sleep staging method based on CNN-BiLSTM", Transactions of Beijing Institute of Technology, vol. 40, no. 7, page 1 *

* Cited by examiner

Cited By (3)

Publication number | Priority date | Publication date | Assignee | Title
CN115530847A * | 2022-09-30 | 2022-12-30 | Harbin University of Science and Technology | Electroencephalogram signal automatic sleep staging method based on multi-scale attention
CN117494013A * | 2023-12-29 | 2024-02-02 | Nanfang Hospital, Southern Medical University | Multi-scale weight-sharing convolutional neural network and EEG emotion recognition method thereof
CN117494013B * | 2023-12-29 | 2024-04-16 | Nanfang Hospital, Southern Medical University | Multi-scale weight-sharing convolutional neural network and EEG emotion recognition method thereof

* Cited by examiner, † Cited by third party


Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination