CN116712035A - Sleep stage method and system based on CNN-PSO-BiLSTM

Sleep stage method and system based on CNN-PSO-BiLSTM

Info

Publication number
CN116712035A
Authority
CN
China
Prior art keywords
particle
sleep
dimension
model
hidden layer
Prior art date
Legal status
Pending
Application number
CN202310631776.2A
Other languages
Chinese (zh)
Inventor
赵丹 (Zhao Dan)
王涛 (Wang Tao)
Current Assignee
Suzhou Haishen Joint Medical Devices Co ltd
Original Assignee
Suzhou Haishen Joint Medical Devices Co ltd
Priority date
Filing date
Publication date
Application filed by Suzhou Haishen Joint Medical Devices Co ltd
Priority to CN202310631776.2A
Publication of CN116712035A
Current legal status: Pending


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/316 Modalities, i.e. specific diagnostic methods
    • A61B 5/369 Electroencephalography [EEG]
    • A61B 5/372 Analysis of electroencephalograms
    • A61B 5/48 Other medical applications
    • A61B 5/4806 Sleep evaluation
    • A61B 5/4812 Detecting sleep stages or cycles
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/7267 Classification of physiological signals or data involving training the classification device
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/10 Pre-processing; Data cleansing
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/213 Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G06N 3/08 Learning methods
    • G06F 2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F 2218/02 Preprocessing
    • G06F 2218/08 Feature extraction

Abstract

The invention relates to a sleep staging method and system based on CNN-PSO-BiLSTM. Electroencephalogram (EEG) signals of historical subjects are collected during sleep and preprocessed; the EEG signal of each historical subject is divided into a plurality of segments and labeled; spatial feature data are extracted from each segment; the parameters of each particle in a particle swarm optimization (PSO) model are initialized. A BiLSTM neural network model is constructed from the learning rate η and the number n of hidden-layer neurons, trained with the spatial feature data of part of the historical subjects as input and the corresponding sleep stage labels as output, and validated with the spatial feature data of the remaining historical subjects to obtain the accuracy of the staging results. If the change in accuracy between successive iterations is smaller than a set threshold, an automatic sleep staging neural network model is constructed from the optimal η and the optimal n; the EEG signal of a subject under test is then collected, preprocessed, and substituted into the trained automatic sleep staging model to obtain the sleep stages.

Description

Sleep stage method and system based on CNN-PSO-BiLSTM
Technical Field
The invention relates to the technical field of sleep staging, and in particular to a sleep staging method and system based on CNN-PSO-BiLSTM.
Background
According to the current sleep staging standard of the American Academy of Sleep Medicine (AASM), sleep is classified into the wake phase, the non-rapid eye movement phase (NREM) and the rapid eye movement phase (REM), where NREM is further divided into NREM1, NREM2 and NREM3 as sleep deepens. Sleep is therefore divided into five stages: wake, N1, N2, N3 and REM.
Sleep staging is an effective method for diagnosing sleep disorders and monitoring sleep quality, and has important clinical significance. Traditional sleep staging is typically performed by expert physicians who manually analyze the collected polysomnography data according to sleep classification criteria. This manual approach involves a heavy workload and low staging efficiency; in addition, the results are subjective and prone to misjudgment.
Automatic sleep staging can improve the efficiency of sleep staging in medical research. In general, automatic interpretation of sleep stages involves two steps: feature extraction and sleep stage classification. Sleep features are extracted using time-domain analysis, frequency-domain analysis and similar methods to capture the main changes during sleep.
For the feature extraction problem, a CNN makes it possible to extract features automatically. Pooling operations reduce overfitting while keeping the feature scale unchanged. Thanks to its convolutional and pooling layers, a CNN overcomes the shortcomings of a conventional fully connected neural network. The convolutional layers of a CNN can be regarded as a feature extractor that extracts and classifies features simultaneously. In addition, because of back-propagation, the classification result can be fed back when a CNN is used for automatic sleep staging, so that better features can be extracted.
CNN models are commonly used to recognize two-dimensional images, but their use is not limited to two- or three-dimensional recognition tasks. A 1D CNN has the same properties as other CNN models; the only difference is that its convolution kernels are one-dimensional, which suits one-dimensional input data such as biomedical signals.
From early classification methods based on expert knowledge rules to current machine learning methods, many sleep stage classifiers have been studied, such as random-forest sleep staging, support-vector-machine-based decision tree sleep staging models, and integrated support-vector-machine automatic sleep staging models combined with principal component analysis. Among the classifiers tried, deep learning neural networks show better performance in processing time-series signals. Because the sleep EEG is a continuous time-series signal, adjacent segments are correlated, and taking this correlation into account allows sleep stages to be judged more accurately. Therefore, to better fit the problem of automatic sleep staging based on EEG neurophysiological signals, the structure of the neural network needs further development.
In the sleep staging task, sleep is a continuous process, so the features of adjacent sleep epochs differ little and each epoch is easily confused with its neighbors. Meanwhile, many hyperparameters must be tuned when training a neural network model; only the input and output dimensions can be determined directly, while other parameters, including the number of hidden-layer neurons, the learning rate, the number of training iterations and the sliding-window size, are very important for sleep stage recognition but cannot be calculated directly. Current training practice mostly tunes these parameters manually from experience, which consumes considerable resources and time and cannot guarantee that the resulting hyperparameter combination is optimal.
Disclosure of Invention
Aiming at the technical problems and defects in the prior art, the invention provides a sleep staging method and system based on CNN-PSO-BiLSTM.
The invention solves the above technical problems through the following technical solutions:
The invention provides a sleep staging method based on CNN-PSO-BiLSTM, characterized by comprising the following steps:
S1, acquiring electroencephalogram (EEG) signals of a plurality of historical subjects over a set sleep time period at a set acquisition frequency, and preprocessing the EEG signal of each historical subject to obtain preprocessed EEG signals;
S2, continuously dividing the EEG signal of each historical subject into a plurality of segments according to a set time window, taking the EEG signal of each segment as an EEG sleep sample, and labeling each EEG sleep sample with a sleep stage label;
S3, using a 1D CNN as a feature extractor to extract corresponding spatial feature data from each EEG sleep sample, the spatial feature data comprising time-domain feature data, frequency-domain feature data and nonlinear feature data;
S4, initializing the parameters of each particle in a particle swarm optimization model and determining the global optimal position of the particles, the particle parameters comprising the particle velocity in each dimension, the particle position in each dimension, a particle parameter serving as the learning rate η and a particle parameter serving as the number n of hidden-layer neurons;
S5, constructing a BiLSTM neural network model from the learning rate η particle parameter and the hidden-layer neuron number n particle parameter, taking the spatial feature data of part of the historical subjects as input and the corresponding sleep stage labels as output, and training the constructed BiLSTM neural network model;
S6, taking the spatial feature data of the remaining historical subjects as input, performing model validation on the trained BiLSTM neural network model to obtain the sleep stage corresponding to each EEG sleep sample of the remaining historical subjects, comparing the predicted sleep stages with the corresponding sleep stage labels, and using the resulting staging accuracy PR as the fitness value of each particle;
S7, judging whether the change in the fitness value between validations is smaller than a set threshold; if so, proceeding to step S8, otherwise proceeding to step S9;
S8, returning the optimal learning rate η particle parameter and the optimal hidden-layer neuron number n particle parameter from the particle swarm optimization model, constructing a PSO-BiLSTM neural network automatic sleep staging model based on these optimal parameters, and proceeding to step S10;
S9, taking the learning rate η and the hidden-layer neuron number n of the BiLSTM neural network model as the two particle parameters to be optimized by the particle swarm optimization model, updating each particle parameter in the particle swarm optimization model, determining the global optimal position of the particles, obtaining the learning rate η and the hidden-layer neuron number n corresponding to the (t+1)-th iteration, feeding them into the BiLSTM neural network model, and returning to step S5, where t ≥ 1;
S10, acquiring the EEG signal of a subject under test, preprocessing it to obtain a preprocessed EEG signal, and substituting the preprocessed EEG signal into the trained PSO-BiLSTM neural network automatic sleep staging model to obtain the sleep stage corresponding to the EEG signal.
The invention also provides a sleep staging system based on CNN-PSO-BiLSTM, characterized by comprising an acquisition module, a segmentation and labeling module, a feature extraction module, an initialization module, a model training module, a model validation module, a judgment module, a model construction module, an optimization module and a prediction module;
the acquisition module is configured to acquire electroencephalogram (EEG) signals of a plurality of historical subjects over a set sleep time period at a set acquisition frequency, and to preprocess the EEG signal of each historical subject to obtain preprocessed EEG signals;
the segmentation and labeling module is configured to continuously divide the EEG signal of each historical subject into a plurality of segments according to a set time window, take the EEG signal of each segment as an EEG sleep sample, and label each EEG sleep sample with a sleep stage label;
the feature extraction module is configured to use a 1D CNN as a feature extractor to extract corresponding spatial feature data from each EEG sleep sample, the spatial feature data comprising time-domain feature data, frequency-domain feature data and nonlinear feature data;
the initialization module is configured to initialize the parameters of each particle in the particle swarm optimization model and determine the global optimal position of the particles, the particle parameters comprising the particle velocity in each dimension, the particle position in each dimension, a particle parameter serving as the learning rate η and a particle parameter serving as the number n of hidden-layer neurons;
the model training module is configured to construct a BiLSTM neural network model from the learning rate η particle parameter and the hidden-layer neuron number n particle parameter, take the spatial feature data of part of the historical subjects as input and the corresponding sleep stage labels as output, and train the constructed BiLSTM neural network model;
the model validation module is configured to take the spatial feature data of the remaining historical subjects as input, perform model validation on the trained BiLSTM neural network model to obtain the sleep stage corresponding to each EEG sleep sample of the remaining historical subjects, compare the predicted sleep stages with the corresponding sleep stage labels, and use the resulting staging accuracy PR as the fitness value of each particle;
the judgment module is configured to judge whether the change in the fitness value between validations is smaller than a set threshold, and to call the model construction module if so and the optimization module otherwise;
the model construction module is configured to return the optimal learning rate η particle parameter and the optimal hidden-layer neuron number n particle parameter from the particle swarm optimization model, and to construct the PSO-BiLSTM neural network automatic sleep staging model based on these optimal parameters;
the optimization module is configured to take the learning rate η and the hidden-layer neuron number n of the BiLSTM neural network model as the two particle parameters to be optimized by the particle swarm optimization model, update each particle parameter in the particle swarm optimization model, determine the global optimal position of the particles, obtain the learning rate η and the hidden-layer neuron number n corresponding to the (t+1)-th iteration, feed them into the BiLSTM neural network model, and call the model training module, where t ≥ 1;
the prediction module is configured to acquire the EEG signal of a subject under test, preprocess it to obtain a preprocessed EEG signal, and substitute the preprocessed EEG signal into the trained PSO-BiLSTM neural network automatic sleep staging model to predict the sleep stage corresponding to the EEG signal.
The positive and progressive effects of the invention are as follows:
The invention provides an accurate, efficient and robust automatic sleep staging method that replaces manual work and automatically classifies the five sleep stages, thereby reducing the manual workload, ensuring the objectivity of the classification results, and supporting the subsequent diagnosis of sleep disorders and monitoring of sleep quality.
Drawings
FIG. 1 is a flow chart of a sleep stage method based on CNN-PSO-BiLSTM according to a preferred embodiment of the present invention.
FIG. 2 is a block diagram of a sleep stage system based on CNN-PSO-BiLSTM in accordance with a preferred embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
As shown in fig. 1, the present embodiment provides a sleep staging method based on CNN-PSO-BiLSTM, which includes the following steps:
step 101, acquiring electroencephalogram signals of a plurality of history testers in a sleep stage set time period at an acquisition frequency, and preprocessing the electroencephalogram signals corresponding to each history tester to obtain preprocessed electroencephalogram signals.
For example: and acquiring the electroencephalogram signals of 100 historical testees in the sleep stage of 1 hour period at the acquisition frequency of 100Hz, and carrying out preprocessing such as noise reduction and filtering on the electroencephalogram signals of the 100 historical testees to obtain preprocessed electroencephalogram signals.
Step 102, continuously dividing the EEG signal of each historical subject into a plurality of segments according to a set time window, taking the EEG signal of each segment as an EEG sleep sample, and labeling each EEG sleep sample with a sleep stage label.
For example: with a set time window of 30 s, the EEG signal of each historical subject is divided into 120 EEG segments, each segment serving as one EEG sleep sample, and each EEG sleep sample is labeled with a sleep stage label, so that each historical subject corresponds to 120 EEG sleep samples and their sleep stage labels. The 120 EEG sleep samples of each of the 100 historical subjects serve as input samples, and the corresponding sleep stage labels serve as output samples.
Step 103, using a 1D CNN as a feature extractor to extract corresponding spatial feature data from each EEG sleep sample, the spatial feature data comprising time-domain feature data, frequency-domain feature data and nonlinear feature data.
For example: the 100 historical subjects yield 12000 EEG sleep samples in total, and corresponding spatial feature data are extracted from each EEG sleep sample.
Step 104, initializing the parameters of each particle in the particle swarm optimization model and determining the global optimal position of the particles, the particle parameters comprising the particle velocity in each dimension, the particle position in each dimension, a particle parameter serving as the learning rate η and a particle parameter serving as the number n of hidden-layer neurons; this initialization is taken as the 1st iteration.
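A minimal sketch of this initialization, assuming S = 20 particles and illustrative bounds on the learning rate, the hidden-layer neuron count and the per-dimension velocities (none of these numbers are given in the text). The first column of each array is the learning rate dimension and the second is the neuron-count dimension.

```python
import numpy as np

rng = np.random.default_rng(0)

S = 20                                    # swarm size (illustrative)
eta_min, eta_max = 1e-4, 1e-1             # assumed bounds on the learning rate
n_min, n_max = 16, 256                    # assumed bounds on hidden-layer neurons
v1_max = 0.1 * (eta_max - eta_min)        # velocity limit, dimension 1
v2_max = 0.1 * (n_max - n_min)            # velocity limit, dimension 2

positions = np.column_stack([
    rng.uniform(eta_min, eta_max, S),     # dimension 1: learning rate eta
    rng.uniform(n_min, n_max, S),         # dimension 2: neuron count n
])
velocities = np.column_stack([
    rng.uniform(-v1_max, v1_max, S),
    rng.uniform(-v2_max, v2_max, S),
])
personal_best = positions.copy()          # best position seen by each particle
personal_best_fit = np.full(S, -np.inf)   # fitness = validation accuracy PR
global_best = positions[0].copy()         # refreshed after the first fitness evaluation
global_best_fit = -np.inf
```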
Step 105, constructing a BiLSTM neural network model from the learning rate η particle parameter and the hidden-layer neuron number n particle parameter, taking the spatial feature data of part of the historical subjects as input and the corresponding sleep stage labels as output, and training the constructed BiLSTM neural network model.
For example: the sample data of 80 historical subjects are used for model training; the spatial feature data corresponding to 80 × 120 = 9600 EEG sleep samples serve as input and the corresponding sleep stage labels as output, and the constructed BiLSTM neural network model is trained. This training is the 1st iteration.
Step 106, taking the spatial feature data of the remaining historical subjects as input, performing model validation on the trained BiLSTM neural network model to obtain the sleep stage corresponding to each EEG sleep sample of the remaining historical subjects, and comparing the predicted sleep stages with the corresponding sleep stage labels; the resulting staging accuracy PR serves as the fitness value of each particle.
For example: the sample data of the remaining 20 historical subjects are used for model validation; the spatial feature data corresponding to 20 × 120 = 2400 EEG sleep samples serve as input, and the trained BiLSTM neural network model is validated to obtain the sleep stage of each EEG sleep sample of the remaining historical subjects. The predicted sleep stages are compared with the corresponding sleep stage labels and the number of correctly staged samples is counted; this number divided by 2400 is the staging accuracy PR, which is used as the fitness value of each particle.
Step 107, judging whether the change in the fitness value between validations is smaller than a set threshold; if so, proceeding to step 108, otherwise proceeding to step 109.
Step 108, returning the optimal learning rate η particle parameter and the optimal hidden-layer neuron number n particle parameter from the particle swarm optimization model, constructing the PSO-BiLSTM neural network automatic sleep staging model based on these optimal parameters, and proceeding to step 110.
Step 109, taking the learning rate η and the hidden-layer neuron number n of the BiLSTM neural network model as the two particle parameters to be optimized by the particle swarm optimization model, updating each particle parameter in the particle swarm optimization model, determining the global optimal position of the particles, obtaining the learning rate η and the hidden-layer neuron number n corresponding to the (t+1)-th iteration, feeding them into the BiLSTM neural network model, and returning to step 105, where t ≥ 1.
Wherein the learning rate η is updated as follows:

$$v_{j,1}^{t+1} = w\,v_{j,1}^{t} + c_1 r_1\big(p_{j,1}^{t} - x_{j,1}^{t}\big) + c_2 r_2\big(g_{1}^{t} - x_{j,1}^{t}\big),\qquad \eta_{j}^{t+1} = x_{j,1}^{t+1} = x_{j,1}^{t} + v_{j,1}^{t+1}$$

where t denotes the t-th iteration; j ∈ {1, ..., S} indexes the particles, S being the swarm size; $v_{j,1}^{t}$ is the velocity of the j-th particle at the t-th iteration in the first dimension of the particle space; $\eta_{j}^{t} = x_{j,1}^{t}$ is the learning rate (position) of the j-th particle at the t-th iteration in the first dimension; w is a constant in [0, 1] called the inertia weight; $c_1$ and $c_2$ are acceleration constants, generally $c_1 = c_2 \in [0, 4]$; $p_{j,1}^{t}$ is the best position found so far by the j-th particle in the first dimension; $g_{1}^{t}$ is the best position found by the whole swarm in the first dimension at the t-th iteration; and $r_1$, $r_2$ are drawn from the uniform distribution U(0, 1) to add a random component to the velocity update and diversify the search.

The learning rate η in the particle swarm is not allowed to move randomly without bounds and is limited to a predefined range:

$$v_{1,\min} \le v_{j,1}^{t+1} \le v_{1,\max},\qquad \eta_{\min} \le \eta_{j}^{t+1} \le \eta_{\max}$$

where $v_{1,\min}$ and $v_{1,\max}$ bound $v_1$ and denote the minimum and maximum velocity of a particle in the first dimension of the particle space, and $\eta_{\min}$ and $\eta_{\max}$ bound η and denote the minimum and maximum learning rate of a particle in the first dimension of the particle space.
The number n of hidden-layer neurons is updated analogously in the second dimension:

$$v_{j,2}^{t+1} = w\,v_{j,2}^{t} + c_1 r_1\big(p_{j,2}^{t} - x_{j,2}^{t}\big) + c_2 r_2\big(g_{2}^{t} - x_{j,2}^{t}\big),\qquad n_{j}^{t+1} = x_{j,2}^{t+1} = x_{j,2}^{t} + v_{j,2}^{t+1}$$

where $v_{j,2}^{t}$ is the velocity of the j-th particle at the t-th iteration in the second dimension of the particle space, $n_{j}^{t} = x_{j,2}^{t}$ is the hidden-layer neuron count (position) of the j-th particle at the t-th iteration in the second dimension, $p_{j,2}^{t}$ is the best position found so far by the j-th particle in the second dimension, and $g_{2}^{t}$ is the best position found by the whole swarm in the second dimension at the t-th iteration.

The number n of hidden-layer neurons in the particle swarm is likewise not allowed to move randomly without bounds and is limited to a predefined range:

$$v_{2,\min} \le v_{j,2}^{t+1} \le v_{2,\max},\qquad n_{\min} \le n_{j}^{t+1} \le n_{\max}$$

where $v_{2,\min}$ and $v_{2,\max}$ bound $v_2$ and denote the minimum and maximum velocity of a particle in the second dimension of the particle space, and $n_{\min}$ and $n_{\max}$ bound n and denote the minimum and maximum hidden-layer neuron count of a particle in the second dimension of the particle space.
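Putting the two update rules and their bounds together, one PSO iteration over the (η, n) particles can be sketched as follows; the inertia weight, acceleration constants and bounds are illustrative values within the ranges stated above, and the per-particle best positions follow the standard PSO formulation.

```python
import numpy as np

def pso_step(positions, velocities, personal_best, global_best,
             w=0.7, c1=1.5, c2=1.5,
             v_max=(0.01, 24.0),
             pos_min=(1e-4, 16), pos_max=(1e-1, 256),
             rng=np.random.default_rng()):
    """One PSO iteration over particles whose columns are (eta, n).

    w in [0, 1] and c1 = c2 in [0, 4] follow the ranges given in the text;
    the velocity and position bounds per dimension are illustrative.
    """
    S, D = positions.shape
    r1, r2 = rng.random((S, D)), rng.random((S, D))
    velocities = (w * velocities
                  + c1 * r1 * (personal_best - positions)
                  + c2 * r2 * (global_best - positions))
    velocities = np.clip(velocities, -np.asarray(v_max), np.asarray(v_max))
    positions = np.clip(positions + velocities,
                        np.asarray(pos_min), np.asarray(pos_max))
    positions[:, 1] = np.round(positions[:, 1])   # the neuron count must be an integer
    # personal_best and global_best are refreshed elsewhere, after the new
    # positions have been evaluated with the validation-accuracy fitness.
    return positions, velocities
```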
Step 110, acquiring the EEG signal of a subject under test, preprocessing it to obtain a preprocessed EEG signal, and substituting the preprocessed EEG signal into the trained PSO-BiLSTM neural network automatic sleep staging model to obtain the sleep stage corresponding to the EEG signal.
As shown in fig. 2, the present embodiment further provides a sleep staging system based on CNN-PSO-BiLSTM, which includes an acquisition module 1, a segmentation and labeling module 2, a feature extraction module 3, an initialization module 4, a model training module 5, a model validation module 6, a judgment module 7, a model construction module 8, an optimization module 9 and a prediction module 10.
The acquisition module 1 is configured to acquire electroencephalogram (EEG) signals of a plurality of historical subjects over a set sleep time period at a set acquisition frequency, and to preprocess the EEG signal of each historical subject to obtain preprocessed EEG signals.
The segmentation and labeling module 2 is configured to continuously divide the EEG signal of each historical subject into a plurality of segments according to a set time window, take the EEG signal of each segment as an EEG sleep sample, and label each EEG sleep sample with a sleep stage label.
The feature extraction module 3 is configured to use a 1D CNN as a feature extractor to extract corresponding spatial feature data from each EEG sleep sample, the spatial feature data comprising time-domain feature data, frequency-domain feature data and nonlinear feature data.
The initialization module 4 is configured to initialize the parameters of each particle in the particle swarm optimization model and determine the global optimal position of the particles, the particle parameters comprising the particle velocity in each dimension, the particle position in each dimension, a particle parameter serving as the learning rate η and a particle parameter serving as the number n of hidden-layer neurons.
The model training module 5 is configured to construct a BiLSTM neural network model from the learning rate η particle parameter and the hidden-layer neuron number n particle parameter, take the spatial feature data of part of the historical subjects as input and the corresponding sleep stage labels as output, and train the constructed BiLSTM neural network model.
The model validation module 6 is configured to take the spatial feature data of the remaining historical subjects as input, perform model validation on the trained BiLSTM neural network model to obtain the sleep stage corresponding to each EEG sleep sample of the remaining historical subjects, compare the predicted sleep stages with the corresponding sleep stage labels, and use the resulting staging accuracy PR as the fitness value of each particle.
The judgment module 7 is configured to judge whether the change in the fitness value between validations is smaller than a set threshold, and to call the model construction module 8 if so and the optimization module 9 otherwise.
The model construction module 8 is configured to return the optimal learning rate η particle parameter and the optimal hidden-layer neuron number n particle parameter from the particle swarm optimization model, and to construct the PSO-BiLSTM neural network automatic sleep staging model based on these optimal parameters.
The optimization module 9 is configured to take the learning rate η and the hidden-layer neuron number n of the BiLSTM neural network model as the two particle parameters to be optimized by the particle swarm optimization model, update each particle parameter in the particle swarm optimization model, determine the global optimal position of the particles, obtain the learning rate η and the hidden-layer neuron number n corresponding to the (t+1)-th iteration, feed them into the BiLSTM neural network model, and call the model training module, where t ≥ 1.
The prediction module 10 is configured to acquire the EEG signal of a subject under test, preprocess it to obtain a preprocessed EEG signal, and substitute the preprocessed EEG signal into the trained PSO-BiLSTM neural network automatic sleep staging model to predict the sleep stage corresponding to the EEG signal.
According to the invention, a 1D convolutional neural network is combined with a PSO-BiLSTM network: the high-dimensional feature vectors extracted by the convolutional neural network are fed in temporal order into the PSO-BiLSTM, and a neural network model capable of capturing the temporal dependence of the EEG data is trained. This exploits the strength of the convolutional neural network in local feature extraction, while the particle-swarm-optimized bidirectional long short-term memory network (BiLSTM) contributes automatic hyperparameter tuning and long-range global temporal features, and the resulting structure improves sleep classification accuracy.
While specific embodiments of the invention have been described above, it will be appreciated by those skilled in the art that these are by way of example only, and the scope of the invention is defined by the appended claims. Various changes and modifications to these embodiments may be made by those skilled in the art without departing from the principles and spirit of the invention, but such changes and modifications fall within the scope of the invention.

Claims (8)

1. A sleep staging method based on CNN-PSO-BiLSTM, characterized by comprising the following steps:
S1, acquiring electroencephalogram (EEG) signals of a plurality of historical subjects over a set sleep time period at a set acquisition frequency, and preprocessing the EEG signal of each historical subject to obtain preprocessed EEG signals;
S2, continuously dividing the EEG signal of each historical subject into a plurality of segments according to a set time window, taking the EEG signal of each segment as an EEG sleep sample, and labeling each EEG sleep sample with a sleep stage label;
S3, using a 1D CNN as a feature extractor to extract corresponding spatial feature data from each EEG sleep sample, the spatial feature data comprising time-domain feature data, frequency-domain feature data and nonlinear feature data;
S4, initializing the parameters of each particle in a particle swarm optimization model and determining the global optimal position of the particles, the particle parameters comprising the particle velocity in each dimension, the particle position in each dimension, a particle parameter serving as the learning rate η and a particle parameter serving as the number n of hidden-layer neurons;
S5, constructing a BiLSTM neural network model from the learning rate η particle parameter and the hidden-layer neuron number n particle parameter, taking the spatial feature data of part of the historical subjects as input and the corresponding sleep stage labels as output, and training the constructed BiLSTM neural network model;
S6, taking the spatial feature data of the remaining historical subjects as input, performing model validation on the trained BiLSTM neural network model to obtain the sleep stage corresponding to each EEG sleep sample of the remaining historical subjects, comparing the predicted sleep stages with the corresponding sleep stage labels, and using the resulting staging accuracy PR as the fitness value of each particle;
S7, judging whether the change in the fitness value between validations is smaller than a set threshold; if so, proceeding to step S8, otherwise proceeding to step S9;
S8, returning the optimal learning rate η particle parameter and the optimal hidden-layer neuron number n particle parameter from the particle swarm optimization model, constructing a PSO-BiLSTM neural network automatic sleep staging model based on these optimal parameters, and proceeding to step S10;
S9, taking the learning rate η and the hidden-layer neuron number n of the BiLSTM neural network model as the two particle parameters to be optimized by the particle swarm optimization model, updating each particle parameter in the particle swarm optimization model, determining the global optimal position of the particles, obtaining the learning rate η and the hidden-layer neuron number n corresponding to the (t+1)-th iteration, feeding them into the BiLSTM neural network model, and returning to step S5, where t ≥ 1;
S10, acquiring the EEG signal of a subject under test, preprocessing it to obtain a preprocessed EEG signal, and substituting the preprocessed EEG signal into the trained PSO-BiLSTM neural network automatic sleep staging model to obtain the sleep stage corresponding to the EEG signal.
2. The sleep staging method based on CNN-PSO-BiLSTM according to claim 1, wherein the initialization in step S4 is taken as the 1st iteration, and in step S9 the learning rate η is updated as follows:

$$v_{j,1}^{t+1} = w\,v_{j,1}^{t} + c_1 r_1\big(p_{j,1}^{t} - x_{j,1}^{t}\big) + c_2 r_2\big(g_{1}^{t} - x_{j,1}^{t}\big),\qquad \eta_{j}^{t+1} = x_{j,1}^{t+1} = x_{j,1}^{t} + v_{j,1}^{t+1}$$

where t denotes the t-th iteration; j ∈ {1, ..., S} indexes the particles, S being the swarm size; $v_{j,1}^{t}$ is the velocity of the j-th particle at the t-th iteration in the first dimension of the particle space; $\eta_{j}^{t} = x_{j,1}^{t}$ is the learning rate (position) of the j-th particle at the t-th iteration in the first dimension; w is a constant in [0, 1] called the inertia weight; $c_1$ and $c_2$ are acceleration constants, generally $c_1 = c_2 \in [0, 4]$; $p_{j,1}^{t}$ is the best position found so far by the j-th particle in the first dimension; $g_{1}^{t}$ is the best position found by the whole swarm in the first dimension at the t-th iteration; and $r_1$, $r_2$ are drawn from the uniform distribution U(0, 1) to add a random component to the velocity update and diversify the search;

and the number n of hidden-layer neurons is updated as follows:

$$v_{j,2}^{t+1} = w\,v_{j,2}^{t} + c_1 r_1\big(p_{j,2}^{t} - x_{j,2}^{t}\big) + c_2 r_2\big(g_{2}^{t} - x_{j,2}^{t}\big),\qquad n_{j}^{t+1} = x_{j,2}^{t+1} = x_{j,2}^{t} + v_{j,2}^{t+1}$$

where $v_{j,2}^{t}$ is the velocity of the j-th particle at the t-th iteration in the second dimension of the particle space, $n_{j}^{t} = x_{j,2}^{t}$ is the hidden-layer neuron count (position) of the j-th particle at the t-th iteration in the second dimension, $p_{j,2}^{t}$ is the best position found so far by the j-th particle in the second dimension, and $g_{2}^{t}$ is the best position found by the whole swarm in the second dimension at the t-th iteration.
3. The sleep staging method based on CNN-PSO-BiLSTM according to claim 2, wherein the learning rate η in the particle swarm is not allowed to move randomly without bounds and is limited to a predefined range:

$$v_{1,\min} \le v_{j,1}^{t+1} \le v_{1,\max},\qquad \eta_{\min} \le \eta_{j}^{t+1} \le \eta_{\max}$$

where $v_{1,\min}$ and $v_{1,\max}$ bound $v_1$ and denote the minimum and maximum velocity of a particle in the first dimension of the particle space, and $\eta_{\min}$ and $\eta_{\max}$ bound η and denote the minimum and maximum learning rate of a particle in the first dimension of the particle space.
4. The sleep staging method based on CNN-PSO-BiLSTM according to claim 2, wherein the number n of hidden-layer neurons in the particle swarm is not allowed to move randomly without bounds and is limited to a predefined range:

$$v_{2,\min} \le v_{j,2}^{t+1} \le v_{2,\max},\qquad n_{\min} \le n_{j}^{t+1} \le n_{\max}$$

where $v_{2,\min}$ and $v_{2,\max}$ bound $v_2$ and denote the minimum and maximum velocity of a particle in the second dimension of the particle space, and $n_{\min}$ and $n_{\max}$ bound n and denote the minimum and maximum hidden-layer neuron count of a particle in the second dimension of the particle space.
5. A sleep staging system based on CNN-PSO-BiLSTM, characterized by comprising an acquisition module, a segmentation and labeling module, a feature extraction module, an initialization module, a model training module, a model validation module, a judgment module, a model construction module, an optimization module and a prediction module;
the acquisition module is configured to acquire electroencephalogram (EEG) signals of a plurality of historical subjects over a set sleep time period at a set acquisition frequency, and to preprocess the EEG signal of each historical subject to obtain preprocessed EEG signals;
the segmentation and labeling module is configured to continuously divide the EEG signal of each historical subject into a plurality of segments according to a set time window, take the EEG signal of each segment as an EEG sleep sample, and label each EEG sleep sample with a sleep stage label;
the feature extraction module is configured to use a 1D CNN as a feature extractor to extract corresponding spatial feature data from each EEG sleep sample, the spatial feature data comprising time-domain feature data, frequency-domain feature data and nonlinear feature data;
the initialization module is configured to initialize the parameters of each particle in the particle swarm optimization model and determine the global optimal position of the particles, the particle parameters comprising the particle velocity in each dimension, the particle position in each dimension, a particle parameter serving as the learning rate η and a particle parameter serving as the number n of hidden-layer neurons;
the model training module is configured to construct a BiLSTM neural network model from the learning rate η particle parameter and the hidden-layer neuron number n particle parameter, take the spatial feature data of part of the historical subjects as input and the corresponding sleep stage labels as output, and train the constructed BiLSTM neural network model;
the model validation module is configured to take the spatial feature data of the remaining historical subjects as input, perform model validation on the trained BiLSTM neural network model to obtain the sleep stage corresponding to each EEG sleep sample of the remaining historical subjects, compare the predicted sleep stages with the corresponding sleep stage labels, and use the resulting staging accuracy PR as the fitness value of each particle;
the judgment module is configured to judge whether the change in the fitness value between validations is smaller than a set threshold, and to call the model construction module if so and the optimization module otherwise;
the model construction module is configured to return the optimal learning rate η particle parameter and the optimal hidden-layer neuron number n particle parameter from the particle swarm optimization model, and to construct the PSO-BiLSTM neural network automatic sleep staging model based on these optimal parameters;
the optimization module is configured to take the learning rate η and the hidden-layer neuron number n of the BiLSTM neural network model as the two particle parameters to be optimized by the particle swarm optimization model, update each particle parameter in the particle swarm optimization model, determine the global optimal position of the particles, obtain the learning rate η and the hidden-layer neuron number n corresponding to the (t+1)-th iteration, feed them into the BiLSTM neural network model, and call the model training module, where t ≥ 1;
the prediction module is configured to acquire the EEG signal of a subject under test, preprocess it to obtain a preprocessed EEG signal, and substitute the preprocessed EEG signal into the trained PSO-BiLSTM neural network automatic sleep staging model to predict the sleep stage corresponding to the EEG signal.
6. The sleep staging system based on CNN-PSO-BiLSTM according to claim 5, wherein the initialization performed by the initialization module is taken as the 1st iteration, and in the optimization module the learning rate η is updated as follows:

$$v_{j,1}^{t+1} = w\,v_{j,1}^{t} + c_1 r_1\big(p_{j,1}^{t} - x_{j,1}^{t}\big) + c_2 r_2\big(g_{1}^{t} - x_{j,1}^{t}\big),\qquad \eta_{j}^{t+1} = x_{j,1}^{t+1} = x_{j,1}^{t} + v_{j,1}^{t+1}$$

where t denotes the t-th iteration; j ∈ {1, ..., S} indexes the particles, S being the swarm size; $v_{j,1}^{t}$ is the velocity of the j-th particle at the t-th iteration in the first dimension of the particle space; $\eta_{j}^{t} = x_{j,1}^{t}$ is the learning rate (position) of the j-th particle at the t-th iteration in the first dimension; w is a constant in [0, 1] called the inertia weight; $c_1$ and $c_2$ are acceleration constants, generally $c_1 = c_2 \in [0, 4]$; $p_{j,1}^{t}$ is the best position found so far by the j-th particle in the first dimension; $g_{1}^{t}$ is the best position found by the whole swarm in the first dimension at the t-th iteration; and $r_1$, $r_2$ are drawn from the uniform distribution U(0, 1) to add a random component to the velocity update and diversify the search;

and the number n of hidden-layer neurons is updated as follows:

$$v_{j,2}^{t+1} = w\,v_{j,2}^{t} + c_1 r_1\big(p_{j,2}^{t} - x_{j,2}^{t}\big) + c_2 r_2\big(g_{2}^{t} - x_{j,2}^{t}\big),\qquad n_{j}^{t+1} = x_{j,2}^{t+1} = x_{j,2}^{t} + v_{j,2}^{t+1}$$

where $v_{j,2}^{t}$ is the velocity of the j-th particle at the t-th iteration in the second dimension of the particle space, $n_{j}^{t} = x_{j,2}^{t}$ is the hidden-layer neuron count (position) of the j-th particle at the t-th iteration in the second dimension, $p_{j,2}^{t}$ is the best position found so far by the j-th particle in the second dimension, and $g_{2}^{t}$ is the best position found by the whole swarm in the second dimension at the t-th iteration.
7. The sleep staging system based on CNN-PSO-BiLSTM according to claim 6, wherein the learning rate η in the particle swarm is not allowed to move randomly without bounds and is limited to a predefined range:

$$v_{1,\min} \le v_{j,1}^{t+1} \le v_{1,\max},\qquad \eta_{\min} \le \eta_{j}^{t+1} \le \eta_{\max}$$

where $v_{1,\min}$ and $v_{1,\max}$ bound $v_1$ and denote the minimum and maximum velocity of a particle in the first dimension of the particle space, and $\eta_{\min}$ and $\eta_{\max}$ bound η and denote the minimum and maximum learning rate of a particle in the first dimension of the particle space.
8. The sleep staging system based on CNN-PSO-BiLSTM according to claim 6, wherein the number n of hidden-layer neurons in the particle swarm is not allowed to move randomly without bounds and is limited to a predefined range:

$$v_{2,\min} \le v_{j,2}^{t+1} \le v_{2,\max},\qquad n_{\min} \le n_{j}^{t+1} \le n_{\max}$$

where $v_{2,\min}$ and $v_{2,\max}$ bound $v_2$ and denote the minimum and maximum velocity of a particle in the second dimension of the particle space, and $n_{\min}$ and $n_{\max}$ bound n and denote the minimum and maximum hidden-layer neuron count of a particle in the second dimension of the particle space.
CN202310631776.2A 2023-05-31 2023-05-31 Sleep stage method and system based on CNN-PSO-BiLSTM Pending CN116712035A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310631776.2A CN116712035A (en) 2023-05-31 2023-05-31 Sleep stage method and system based on CNN-PSO-BiLSTM

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310631776.2A CN116712035A (en) 2023-05-31 2023-05-31 Sleep stage method and system based on CNN-PSO-BiLSTM

Publications (1)

Publication Number Publication Date
CN116712035A true CN116712035A (en) 2023-09-08

Family

ID=87870768

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310631776.2A Pending CN116712035A (en) 2023-05-31 2023-05-31 Sleep stage method and system based on CNN-PSO-BiLSTM

Country Status (1)

Country Link
CN (1) CN116712035A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111426816A (en) * 2020-04-10 2020-07-17 昆明理工大学 Method for predicting concentration of dissolved gas in transformer oil based on PSO-L STM
CN112438738A (en) * 2019-09-03 2021-03-05 西安慧脑智能科技有限公司 Sleep stage dividing method and device based on single-channel electroencephalogram signal and storage medium
CN114041753A (en) * 2021-11-16 2022-02-15 上海市第六人民医院 Sleep staging method and device, computer equipment and storage medium
CN116055175A (en) * 2023-01-12 2023-05-02 燕山大学 Intrusion detection method for optimizing neural network by combining symmetric uncertainty and super parameters


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
卢伊虹 (Lu Yihong): "基于CNN-BiLSTM的自动睡眠分期算法" [Automatic sleep staging algorithm based on CNN-BiLSTM], 《计算机系统应用》 [Computer Systems & Applications], vol. 31, no. 4, 14 April 2022 (2022-04-14), pages 180-187 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination