CN111671423B - EEG signal representation method, classification method, visualization method and medium - Google Patents


Info

Publication number
CN111671423B
Authority
CN
China
Prior art keywords
layer
signal
signals
gray
channel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010559396.9A
Other languages
Chinese (zh)
Other versions
CN111671423A (en)
Inventor
张军鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan University
Original Assignee
Sichuan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan University filed Critical Sichuan University
Priority to CN202010559396.9A priority Critical patent/CN111671423B/en
Publication of CN111671423A publication Critical patent/CN111671423A/en
Application granted granted Critical
Publication of CN111671423B publication Critical patent/CN111671423B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device

Abstract

The invention relates to the technical field of EEG signal analysis and identification, and discloses an EEG signal representation method, a classification method, a visualization method and a medium. The EEG signals of a plurality of channels to be represented are preprocessed, the preprocessed EEG signal of each channel is converted from the time domain to the frequency domain, and a gray-scale map corresponding to the multi-channel EEG signals is obtained from the power of each channel at different frequency values. The invention is thus able to convert complex, seemingly disordered EEG signals into a gray-scale map, thereby fully embodying, in the form of a gray-scale map, the information and differences relating to schizophrenia and depression that are implicit in the EEG signal.

Description

EEG signal representation method, classification method, visualization method and medium
Technical Field
The invention relates to the technical field of electroencephalogram (EEG) signal analysis and identification, and in particular to an EEG signal representation method, a classification method, a visualization method and a medium.
Background
Electrophysiological markers record brain activity through electrophysiological indices; being non-invasive and low-cost, they are an ideal means of distinguishing schizophrenia from depression. At present, EEG signal representation methods are mainly based on frequency-band power features and time-point features. However, considering that the clinical manifestations of patients with schizophrenia and depression are similar and that their EEG signals are complex, a representation method of the EEG signal better suited to distinguishing schizophrenia from depression is needed in order to fully represent the information and differences relating to the two disorders that are implicit in the EEG signal.
Disclosure of Invention
The invention aims to provide a method for representing the EEG signal that is better suited to distinguishing schizophrenia from depression and that can fully reflect the information and differences relating to schizophrenia and depression that are implicit in the EEG signal.
In order to achieve the purpose, the invention provides the following technical scheme:
a method of representation of an EEG signal, comprising:
preprocessing EEG signals of a plurality of channels to be represented; converting the preprocessed EEG signal of each channel from the time domain to the frequency domain, and obtaining gray-scale maps corresponding to the EEG signals of the channels to be represented according to the power data of the EEG signal of each channel at different frequency values.
According to a specific embodiment, in the method for representing an EEG signal of the present invention, the preprocessing comprises: filtering the EEG signals of the plurality of channels to be represented to obtain the signal of each channel in the corresponding frequency range; and performing an artifact removal operation on the signal of each channel in the corresponding frequency range.
According to a specific embodiment, the method for representing EEG signals of the present invention uses an independent component analysis method to perform artifact removal on the signals of each channel in the corresponding frequency range.
According to a specific embodiment, the EEG signal representation method of the present invention uses fast fourier transform to transform the EEG signals of each channel after the preprocessing from time domain to frequency domain.
According to a specific embodiment, in the EEG signal representation method of the present invention, the manner of obtaining the gray scale map is:
$p' = \dfrac{p - p_{\min}}{p_{\max} - p_{\min}}$

$i = p' \times 255$

wherein $p_{\max}$ and $p_{\min}$ respectively represent the maximum power and the minimum power of the EEG signal of each channel, $p$ is the power at the corresponding frequency of each channel, $p'$ is the power after normalization of $p$, and $i$ is the gray value of the pixel point of the gray-scale map corresponding to $p$.
Based on the same inventive concept, the invention also provides a method for classifying EEG signals, which comprises the following steps:
collecting EEG signals of a plurality of channels, and representing the collected EEG signals of the plurality of channels using the above EEG signal representation method to obtain corresponding gray-scale maps;
and inputting the gray scale map into a pre-trained convolutional neural network for classification and identification to obtain a classification result.
According to a specific embodiment, the method for classifying EEG signals of the present invention trains the convolutional neural network using five-fold cross validation. Moreover, the sample gray-scale map is processed by an image processing means including at least one of translation, rotation, flipping, and scaling to increase the amount of training samples.
Based on the same inventive concept, the invention also provides a method for visualizing the EEG signal, which comprises the following steps:
obtaining thermodynamic diagrams (heat maps) corresponding to the EEG signals of the plurality of channels according to the gradient of the feature map output by the last convolutional layer of the convolutional neural network adopted in the above EEG signal classification method; and superposing the gray-scale map and the thermodynamic map corresponding to the EEG signals of the channels to visually display the EEG signal.
Based on the same inventive concept, the present invention also provides a computer readable storage medium having one or more programs stored thereon, characterized in that the one or more programs, when executed by one or more processors, implement the method of representing an EEG signal or the method of classifying an EEG signal of the present invention.
In summary, compared with the prior art, the invention has the beneficial effects that:
1. The EEG signal representation method of the invention preprocesses the EEG signals of a plurality of channels to be represented, converts the preprocessed EEG signal of each channel from the time domain to the frequency domain, and obtains gray-scale maps corresponding to the EEG signals of the plurality of channels to be represented according to the power data of the EEG signal of each channel at different frequency values. The invention is thus able to convert complex, seemingly disordered EEG signals into a gray-scale map, thereby fully embodying, in the form of a gray-scale map, the information and differences relating to schizophrenia and depression that are implicit in the EEG signal.
2. The EEG signal classification method of the invention collects the EEG signals of a plurality of channels, represents them as the corresponding gray-scale maps, and inputs the gray-scale maps into a pre-trained convolutional neural network for classification and recognition to obtain a classification result. The invention therefore spares medical staff from manually selecting sample features as a diagnostic basis, reduces their workload, and improves sample processing efficiency.
3. The EEG signal visualization method of the invention obtains thermodynamic diagrams corresponding to the EEG signals of a plurality of channels according to the gradient of the feature map output by the last convolutional layer of the convolutional neural network adopted in the EEG signal classification method, and superposes the gray-scale map and the thermodynamic map corresponding to the EEG signals of the channels to visually display the EEG signal. The invention can therefore present more intuitively the information and differences relating to schizophrenia and depression that are implicit in the EEG signal, which is convenient for medical staff to use as a diagnostic basis.
Description of the drawings:
FIG. 1 is a schematic diagram of the acquisition of 16-lead EEG signals according to the present invention;
FIG. 2 is a flow chart of a method of representing an EEG signal in accordance with the present invention;
FIG. 3 is a graph of an EEG signal as it is converted from the time domain to the frequency domain in accordance with the present invention;
FIG. 4 is a schematic illustration of a gray scale image of the present invention;
FIG. 5 is a flow chart of a method of classifying EEG signals according to the present invention;
FIG. 6 is a schematic diagram of the training process and visualization process of the convolutional neural network of the present invention;
FIG. 7 is a flow chart of a method of visualizing an EEG signal in accordance with the present invention;
FIG. 8 is a schematic diagram of a final rendered image of the visualization method of the present invention.
Detailed Description
The invention is described in further detail below with reference to the figures and the embodiments. It should be understood that the scope of the above-described subject matter is not limited to the following examples, and any techniques implemented based on the disclosure of the present invention are within the scope of the present invention.
As shown in fig. 1, data acquisition is performed with the patient kept calm; the patient's 16-lead electroencephalogram signals are recorded in successive eyes-open and eyes-closed states, each state being maintained for 5-10 seconds.
As shown in fig. 2, the method for representing an EEG signal of the present invention includes:
preprocessing the EEG signals of the 16 channels to be represented, specifically using EEGLAB, where the preprocessing includes: filtering the EEG signals of the plurality of channels to be represented to obtain the signal of each channel in the corresponding frequency range; and performing an artifact removal operation on the signal of each channel in the corresponding frequency range. The filtering operation aims to retain the 1-50 Hz signal, and independent component analysis is used to remove artifacts from the signal of each channel in the corresponding frequency range, the artifacts to be removed including electro-oculogram, eye movement, head movement and the like.
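The following is a minimal sketch of this preprocessing step. It assumes MNE-Python as a stand-in for the EEGLAB workflow named in the text; the file name, component count, and excluded ICA components are illustrative assumptions rather than values from the original.

```python
# Preprocessing sketch: band-pass filtering to 1-50 Hz and ICA-based artifact
# removal, using MNE-Python as an assumed substitute for EEGLAB.
import mne

raw = mne.io.read_raw_edf("subject01_16ch.edf", preload=True)  # hypothetical file

# Keep only the 1-50 Hz band, as described in the text.
raw.filter(l_freq=1.0, h_freq=50.0)

# Independent component analysis to remove ocular / movement artifacts.
ica = mne.preprocessing.ICA(n_components=16, random_state=0)
ica.fit(raw)
# Components to exclude would normally be chosen by inspection (or automatic
# EOG detection); the indices below are placeholders.
ica.exclude = [0, 1]
raw_clean = ica.apply(raw.copy())
```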
Specifically, the preprocessed EEG signals of the 16 channels are converted from the time domain to the frequency domain using the fast Fourier transform; the result is shown in fig. 3, where signal samples of schizophrenia and depression in the eyes-closed state have been converted into the frequency domain. It can be seen that the power of all electrodes of both the schizophrenia and the depression signal samples shows an obvious peak at about 10 Hz.
After converting the EEG signals of the 16 channels from the time domain to the frequency domain, the 1-50 Hz data in the frequency domain are exported into a matrix with 50 rows and 16 columns, where the rows represent frequency, the columns represent the lead number, and the data in the matrix represent the power value at the corresponding frequency for that lead. The gray-scale map corresponding to the EEG signals of the 16 channels to be represented is then obtained from the power data of the EEG signal of each channel at different frequency values; the result is shown in fig. 4, which includes the gray-scale maps corresponding to 10 signal samples.
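As a sketch of this step under stated assumptions, the snippet below converts each channel to the frequency domain with an FFT and assembles the 50-row by 16-column power matrix; the function and variable names are assumptions, and the nearest-bin lookup is one plausible way of reading off integer frequencies.

```python
# Sketch: FFT per channel, then a 50 x 16 power matrix (rows = 1..50 Hz,
# columns = leads). `data` is assumed to be a (16, n_samples) array at `fs` Hz.
import numpy as np

def power_matrix(data: np.ndarray, fs: float) -> np.ndarray:
    n_channels, n_samples = data.shape
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)
    spectrum = np.fft.rfft(data, axis=1)
    power = np.abs(spectrum) ** 2 / n_samples  # periodogram-style power

    matrix = np.zeros((50, n_channels))
    for f in range(1, 51):                      # 1..50 Hz, one row per frequency
        idx = np.argmin(np.abs(freqs - f))      # nearest FFT bin
        matrix[f - 1, :] = power[:, idx]
    return matrix                               # rows = frequency, columns = lead
```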
In practice, the method for representing an EEG signal according to the present invention obtains a gray scale map by:
$p' = \dfrac{p - p_{\min}}{p_{\max} - p_{\min}}$

$i = p' \times 255$

wherein $p_{\max}$ and $p_{\min}$ respectively represent the maximum power and the minimum power of the EEG signal of each channel, $p$ is the power at the corresponding frequency of each channel, $p'$ is the power after normalization of $p$, and $i$ is the gray value of the pixel point of the gray-scale map corresponding to $p$.
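A minimal NumPy sketch of this per-channel min-max normalization and 8-bit gray-value mapping follows; the function name is an assumption.

```python
# Sketch: map the 50 x 16 power matrix to an 8-bit gray-scale image using
# p' = (p - p_min) / (p_max - p_min) and i = p' * 255, per channel (column).
import numpy as np

def to_grayscale(power: np.ndarray) -> np.ndarray:
    p_min = power.min(axis=0, keepdims=True)   # per-channel minimum power
    p_max = power.max(axis=0, keepdims=True)   # per-channel maximum power
    p_norm = (power - p_min) / (p_max - p_min)
    return (p_norm * 255).astype(np.uint8)     # gray values of the image
```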
As shown in fig. 5, the method for classifying EEG signals of the present invention includes:
EEG signals are acquired for multiple channels, as shown in fig. 1.
Then, the collected EEG signals of the plurality of channels are represented using the EEG signal representation method of the invention to obtain the corresponding gray-scale maps.
And finally, inputting the obtained gray-scale image into a pre-trained convolutional neural network for classification and identification to obtain a classification result.
Specifically, the EEG signal classification method of the invention trains the convolutional neural network using five-fold cross-validation, that is, the training data is divided into five parts; in each run, four parts are used for training and the remaining part is used to validate the model's accuracy. In this way the model is trained independently five times, and the average of the five validation accuracies is taken as the final validation accuracy of the model.
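The snippet below is a sketch of this five-fold procedure using scikit-learn; the arrays `images` and `labels`, the `build_model` factory, and the epoch count are assumptions, and the model is assumed to expose Keras-style `fit`/`evaluate` methods.

```python
# Sketch of five-fold cross-validation: train on four folds, validate on the
# fifth, and average the five validation accuracies.
import numpy as np
from sklearn.model_selection import KFold

def cross_validate(images, labels, build_model, epochs=50):
    accuracies = []
    for train_idx, val_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(images):
        model = build_model()
        model.fit(images[train_idx], labels[train_idx], epochs=epochs, verbose=0)
        _, acc = model.evaluate(images[val_idx], labels[val_idx], verbose=0)
        accuracies.append(acc)
    return float(np.mean(accuracies))  # final validation accuracy of the model
```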
During training, 200 signal samples of schizophrenia and 200 signal samples of depression are converted into the corresponding gray-scale maps using the EEG signal representation method of the present invention, giving a total of 400 gray-scale maps annotated with the corresponding classification labels. For example, in the gray-scale maps of 10 signal samples shown in fig. 4, the numbers 0 and 1 below each map are the classification labels, where 0 denotes schizophrenia and 1 denotes depression.
Then, an 8-layer shallow convolutional neural network is constructed; the specific structure is shown in table 1. The entire convolutional network contains five different types of layers, specifically: a convolution layer, a pooling layer, a flatten layer, a dropout layer, and a dense layer.
Table 1: detailed table of convolutional neural network construction
(Table 1 is reproduced as an image in the original publication; per the claims, the network comprises a first convolution layer, a pooling layer, a second and a third convolution layer, a flatten layer, a dropout layer, and first and second dense layers.)
Specifically, the convolution layer: this layer consists of a number of kernels. Each kernel is a matrix that convolves the input data, with a stride of 1 for each convolution. Each kernel convolution yields a feature map.
Pooling layer: this layer down-samples the data so as to reduce the dimensionality of the convolution layer output, effectively reducing the amount of computation and preventing over-fitting. The max-pooling layer selects the largest value in each 2 × 2 window to pass to the next layer.
Flatten layer: this layer spreads the output data of the preceding layer into a single column.
Dropout layer: this layer temporarily deactivates part of the neurons during training in order to reduce over-fitting. After many trials, the dropout rate of the model was set to 0.5.
Dense layer: this layer associates all previously extracted features and maps them into the output space.
During training, the flow is shown in fig. 6. The convolutional neural network of the invention uses two activation functions, namely ReLU and Sigmoid. The ReLU function increases the nonlinearity of the model, so that the model learns both linear and nonlinear transformations of the input data and is more robust to noise in the input data. The ReLU function is calculated as:
$\mathrm{ReLU}(x) = \max(0, x)$
The activation function of the last layer is the Sigmoid function, which computes the probability that the model outputs a given class, where $y_c$ denotes the output of the model's last layer for class c and $p$ denotes the probability of outputting class c. The Sigmoid function is calculated as:
$p = \dfrac{1}{1 + e^{-y_c}}$
After the convolutional neural network is constructed, the model is trained. The samples are divided into a training set of 320 and a validation set of 80, and after a number of trials the training parameters are set as follows: the optimizer is RMSProp with the learning rate set to 10^-4, the batch of samples fed to the model each time is 32, and the loss function is binary cross-entropy.
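The following Keras sketch assembles a network consistent with the layer types and hyper-parameters stated above (convolution, pooling, convolution, convolution, flatten, dropout 0.5, two dense layers; RMSProp at 10^-4, binary cross-entropy, batch size 32). Since Table 1 is only available as an image, the filter counts, kernel sizes, and dense-layer width are assumptions.

```python
# Sketch of the 8-layer shallow CNN described in the text; filter counts and
# kernel sizes are illustrative assumptions.
from tensorflow import keras
from tensorflow.keras import layers

def build_model(input_shape=(50, 16, 1)):
    model = keras.Sequential([
        layers.Conv2D(16, (3, 3), strides=1, padding="same",
                      activation="relu", input_shape=input_shape),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(32, (3, 3), strides=1, padding="same", activation="relu"),
        layers.Conv2D(32, (3, 3), strides=1, padding="same", activation="relu"),
        layers.Flatten(),
        layers.Dropout(0.5),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # probability of class 1 (depression)
    ])
    model.compile(optimizer=keras.optimizers.RMSprop(learning_rate=1e-4),
                  loss="binary_crossentropy", metrics=["accuracy"])
    return model

# Example usage (arrays are assumed):
# model = build_model()
# model.fit(train_images, train_labels, batch_size=32, epochs=50,
#           validation_data=(val_images, val_labels))
```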
In addition, the classification result for the model is generally expressed by a confusion matrix, which is specifically shown in table 2. The confusion matrix is an m x m number table, where m represents the number of classes, the rows of the list represent the true sample classes, and the columns represent the predicted sample classes. For example, for a binary classification problem, the confusion matrix is 2 × 2, and instead of merely calculating the accuracy of the model, we can obtain the indexes of sensitivity, specificity, precision, and recycle, so the indexes of sensitivity, specificity, F1-score, and accuracy are often used to comprehensively consider the model. The specific calculation formula is as follows:
$\text{Sensitivity} = \dfrac{TP}{TP + FN}$

$\text{Specificity} = \dfrac{TN}{TN + FP}$

$\text{F1-score} = \dfrac{2\,TP}{2\,TP + FP + FN}$

$\text{Accuracy} = \dfrac{TP + TN}{TP + TN + FP + FN}$
in the formula, TP represents true positive, FP represents false positive, TN represents true negative, and FN represents false negative.
Table 2: detailed table of convolutional neural network construction
                      Predicted positive    Predicted negative
Actual positive              TP                     FN
Actual negative              FP                     TN
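As a small worked sketch, the function below computes the four indices above from the confusion-matrix counts; the function name is an assumption.

```python
# Sketch: evaluation indices from binary confusion-matrix counts (TP, FP, TN, FN).
def evaluation_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    sensitivity = tp / (tp + fn)                       # recall for the positive class
    specificity = tn / (tn + fp)
    precision = tp / (tp + fp)
    f1_score = 2 * precision * sensitivity / (precision + sensitivity)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return {"sensitivity": sensitivity, "specificity": specificity,
            "f1_score": f1_score, "accuracy": accuracy}
```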
In order to increase the amount of samples for convolutional neural network training and avoid over-fitting, the invention also processes the sample gray-scale maps with image processing operations comprising at least one of translation, rotation, flipping and scaling to generate additional plausible images, so that the convolutional neural network learns more local information from the training samples.
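A minimal sketch of this augmentation with Keras' ImageDataGenerator follows; the specific shift, rotation, and zoom ranges are assumptions, not values given in the text.

```python
# Sketch: augment the sample gray-scale maps with translation, rotation,
# flipping, and scaling. Ranges are illustrative assumptions.
from tensorflow.keras.preprocessing.image import ImageDataGenerator

augmenter = ImageDataGenerator(
    width_shift_range=0.1,    # translation
    height_shift_range=0.1,
    rotation_range=10,        # rotation (degrees)
    horizontal_flip=True,     # flipping
    zoom_range=0.1,           # scaling
)
# Example usage (arrays are assumed):
# augmented_batches = augmenter.flow(train_images, train_labels, batch_size=32)
```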
Although the EEG signal representation method provided by the invention can fully represent, in the form of gray-scale maps, the information and differences relating to schizophrenia and depression that are implicit in the EEG signal, the gray-scale maps are difficult to distinguish by the naked eye and therefore difficult to use directly as a diagnostic basis.
Therefore, the present invention further provides a method for visualizing the EEG signal, as shown in fig. 7, which includes:
obtaining thermodynamic diagrams corresponding to the EEG signals of the plurality of channels according to the gradient of the feature map output by the last convolutional layer of the convolutional neural network adopted in the EEG signal classification method; the gray-scale map and the thermodynamic map corresponding to the EEG signals of the channels are superimposed to obtain the final presented image, so as to visually display the EEG signal, as shown in fig. 8.
As shown in fig. 6, the method of the present invention visualizes the convolutional neural network using Grad-CAM (Gradient-weighted Class Activation Mapping) in order to find the basis on which the convolutional neural network makes its final classification decision; in other words, the resulting visualization can also be used directly as a diagnostic basis by medical staff.
To obtain the CAM for the class of interest C, the gradient of the score for class C (i.e., the value before it enters the softmax layer) with respect to the feature maps output by the last convolutional layer is first calculated. This gradient can be understood as the importance of each feature map output by the convolutional layer to the model's final decision for class C, i.e., the weight of that feature map for class C:

$\alpha_k^c = \dfrac{1}{Z}\sum_i \sum_j \dfrac{\partial y^c}{\partial A_{ij}^k}$

where $y^c$ is the score corresponding to the class of interest C (i.e., the value before entering the softmax layer), $A_{ij}^k$ is the pixel at position (i, j) of the k-th feature map, $Z$ is the number of pixels in the feature map, and $\alpha_k^c$ is the weight of the k-th feature map for class C.

Each weight $\alpha_k^c$ is then multiplied by its corresponding feature map $A^k$, the products are summed, and the positive part is taken to obtain the Grad-CAM for class C:

$L^c_{\text{Grad-CAM}} = \mathrm{ReLU}\left(\sum_k \alpha_k^c A^k\right)$
The ReLU function extracts only the pixels that have a positive influence on class C, so that when the CAM is superimposed on the original image it is not contaminated by pixel points related to other classes.
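The following is a sketch of the Grad-CAM computation and of superimposing the resulting heat map on the gray-scale map, using TensorFlow's GradientTape on a trained Keras model. The layer name "last_conv", the resize step, and the colormap overlay are assumptions; the weighting and ReLU steps follow the formulas above.

```python
# Sketch: Grad-CAM for a trained Keras model and overlay on the gray-scale map.
import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt

def grad_cam(model, image, conv_layer_name="last_conv"):
    # Model that outputs both the last convolutional feature maps and the score.
    grad_model = tf.keras.Model(model.inputs,
                                [model.get_layer(conv_layer_name).output, model.output])
    with tf.GradientTape() as tape:
        conv_maps, score = grad_model(image[np.newaxis, ...])  # add batch axis
    grads = tape.gradient(score, conv_maps)                    # d y_c / d A^k
    weights = tf.reduce_mean(grads, axis=(1, 2))               # alpha_k^c (spatial mean)
    cam = tf.nn.relu(tf.reduce_sum(weights[:, None, None, :] * conv_maps, axis=-1))
    cam = cam[0] / (tf.reduce_max(cam) + 1e-8)                 # normalize to [0, 1]
    return cam.numpy()

def overlay(gray_image, cam, alpha=0.4):
    cam_resized = tf.image.resize(cam[..., None], gray_image.shape[:2]).numpy()[..., 0]
    plt.imshow(gray_image, cmap="gray")
    plt.imshow(cam_resized, cmap="jet", alpha=alpha)  # heat map over the gray map
    plt.axis("off")
    plt.show()
```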
In addition, the present invention also provides a readable storage medium, such as a ROM device, a removable hard disk, a USB disk, or an optical disk, on which one or more programs are stored and executed by one or more processors. When executed by the processor(s), the program in the memory implements the EEG signal representation method, the EEG signal classification method, or the EEG signal visualization method of the invention.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (8)

1. A method for visualizing an EEG signal, comprising:
preprocessing EEG signals of a plurality of channels to be represented; converting the preprocessed EEG signal of each channel from the time domain to the frequency domain, and obtaining gray-scale maps corresponding to the EEG signals of the plurality of channels to be represented according to the power data of the EEG signal of each channel at different frequency values;
inputting the gray-scale map into a pre-trained convolutional neural network, and obtaining thermodynamic diagrams corresponding to the EEG signals of the plurality of channels according to the gradient of the feature map output by the last convolutional layer of the convolutional neural network; superposing the gray-scale map and the thermodynamic map corresponding to the EEG signal of each channel to visually display the EEG signal;
the convolutional neural network comprises five different types of layers, namely a convolution layer, a pooling layer, a flatten layer, a dropout layer and a dense layer; the convolution layer is used for convolving input data; the pooling layer is used for down-sampling the data; the flatten layer is used for spreading the output data of the preceding layer into a column; the dropout layer is used for temporarily deactivating part of the neurons during training so as to reduce over-fitting; the dense layer is used for associating all the extracted features and mapping them into an output space;
the convolutional neural network specifically includes: a first convolution layer, a pooling layer, a second convolution layer, a third convolution layer, a flatten layer, a dropout layer, a first dense layer and a second dense layer; moreover, the first dense layer adopts a ReLU function, and the second dense layer adopts a Sigmoid function.
2. The method for visualizing an EEG signal according to claim 1, wherein the preprocessing comprises: filtering the EEG signals of the plurality of channels to be represented to obtain the signal of each channel in the corresponding frequency range; and performing an artifact removal operation on the signal of each channel in the corresponding frequency range.
3. The method for visualizing an EEG signal according to claim 2, wherein the artifact removal operation is performed on the signal of each channel in the corresponding frequency range using independent component analysis.
4. The method for visualizing an EEG signal according to claim 2, wherein the preprocessed EEG signals of each channel are transformed from the time domain to the frequency domain using a fast Fourier transform.
5. The method for visualizing an EEG signal according to claim 1, wherein said gray-scale map is obtained by:

$p' = \dfrac{p - p_{\min}}{p_{\max} - p_{\min}}$, $i = p' \times 255$

wherein $p_{\max}$ and $p_{\min}$ respectively represent the maximum power and the minimum power of the EEG signal of each channel, $p$ is the power at the corresponding frequency of each channel, $p'$ is the power after normalization of $p$, and $i$ is the gray value of the pixel point of the gray-scale map corresponding to $p$.
6. The method for visualizing an EEG signal according to claim 1, wherein the convolutional neural network is trained using a five-fold cross-validation method.
7. The method for visualizing an EEG signal according to claim 6, wherein the sample gray-scale map is processed using image processing means comprising at least one of translation, rotation, flipping and scaling to increase the amount of training samples.
8. A computer readable storage medium having one or more programs stored thereon, which when executed by one or more processors implement the method for visualizing an EEG signal as claimed in any one of claims 1 to 7.
CN202010559396.9A 2020-06-18 2020-06-18 EEG signal representation method, classification method, visualization method and medium Active CN111671423B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010559396.9A CN111671423B (en) 2020-06-18 2020-06-18 EEG signal representation method, classification method, visualization method and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010559396.9A CN111671423B (en) 2020-06-18 2020-06-18 EEG signal representation method, classification method, visualization method and medium

Publications (2)

Publication Number Publication Date
CN111671423A CN111671423A (en) 2020-09-18
CN111671423B (en) 2022-02-18

Family

ID=72436451

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010559396.9A Active CN111671423B (en) 2020-06-18 2020-06-18 EEG signal representation method, classification method, visualization method and medium

Country Status (1)

Country Link
CN (1) CN111671423B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112560919A (en) * 2020-12-07 2021-03-26 杭州智瑞思科技有限公司 Man-machine asynchronous recognition method based on one-dimensional interpretable convolutional neural network
CN113397564A (en) * 2021-07-22 2021-09-17 北京脑陆科技有限公司 Depression identification method, device, terminal and medium based on image processing
CN113768519B (en) * 2021-09-16 2023-12-26 天津大学 Method for analyzing consciousness level of patient based on deep learning and resting state electroencephalogram data

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005036455A2 (en) * 2003-10-08 2005-04-21 Case Western Reserve University Method for displaying spectral trends in complex signals
CN105147248A (en) * 2015-07-30 2015-12-16 华南理工大学 Physiological information-based depressive disorder evaluation system and evaluation method thereof
CN109272048A (en) * 2018-09-30 2019-01-25 北京工业大学 A kind of mode identification method based on depth convolutional neural networks
CN109299751A (en) * 2018-11-26 2019-02-01 南开大学 The SSVEP brain electricity classification method of convolutional Neural model based on the enhancing of EMD data
CN110338820A (en) * 2019-06-13 2019-10-18 四川大学 A kind of depression and schizophrenia recognition methods
CN110543832A (en) * 2019-08-13 2019-12-06 同济大学 Electroencephalogram data classification method based on random forest and convolutional neural network

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Cosimo Ieracitano et al., "A Convolutional Neural Network approach for Classification of Dementia Stages based on 2D-Spectral Representation of EEG recordings," Neurocomputing, vol. 323, 5 Jan. 2019, pp. 1-20. *
Cosimo Ieracitano et al., "A Convolutional Neural Network approach for Classification of Dementia Stages based on 2D-Spectral Representation of EEG recordings," Neurocomputing, vol. 323, 2019, pp. 96-107. *

Also Published As

Publication number Publication date
CN111671423A (en) 2020-09-18

Similar Documents

Publication Publication Date Title
Khatamino et al. A deep learning-CNN based system for medical diagnosis: An application on Parkinson’s disease handwriting drawings
Impedovo et al. Dynamic handwriting analysis for the assessment of neurodegenerative diseases: a pattern recognition perspective
CN111671423B (en) EEG signal representation method, classification method, visualization method and medium
Müller et al. EEG/ERP-based biomarker/neuroalgorithms in adults with ADHD: Development, reliability, and application in clinical practice
Qiao et al. Ternary-task convolutional bidirectional neural turing machine for assessment of EEG-based cognitive workload
CN114970599A (en) Identification method and identification device for attention defect associated electroencephalogram signals and storage medium
CN112185493A (en) Personality preference diagnosis device and project recommendation system based on same
CN113012163A (en) Retina blood vessel segmentation method, equipment and storage medium based on multi-scale attention network
CN113974627B (en) Emotion recognition method based on brain-computer generated confrontation
Ahire et al. A comprehensive review of machine learning approaches for dyslexia diagnosis
Uyulan Development of LSTM&CNN based hybrid deep learning model to classify motor imagery tasks
Huang et al. Detection of Diseases Using Machine Learning Image Recognition Technology in Artificial Intelligence
CN113869382A (en) Semi-supervised learning epilepsia electroencephalogram signal identification method based on domain embedding probability
CN111772629A (en) Brain cognitive skill transplantation method
CN115736920A (en) Depression state identification method and system based on bimodal fusion
CN113921131A (en) Method for recognizing handwriting of neurodegenerative patient through digital writing
Mesbahi et al. Automatic segmentation of medical images using convolutional neural networks
Perez-Gay et al. Category learning can alter perception and its neural correlate
Liu et al. Facial expression awareness based on multi-scale permutation entropy of EEG
CN114595731B (en) Semantic segmentation method of nonlinear medical sensor data based on continuous learning
Cheng et al. Sleep stage recognition from EEG using a distributed multi-channel decision-making system
Aldhyani et al. Modeling and diagnosis Parkinson disease by using hand drawing: deep learning model
Liu et al. A learnable front-end based efficient channel attention network for heart sound classification
Aynali Noise reduction of eeg signals using autoencoders built upon gru based rnn layers
Sasidhar et al. Dyslexia Discernment in children based on Handwriting images using Residual Neural Network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant