CN111882036A - Convolutional neural network training method, electroencephalogram signal identification method, device and medium


Info

Publication number
CN111882036A
CN111882036A (application number CN202010710647.9A)
Authority
CN
China
Prior art keywords
neural network
convolutional neural
electroencephalogram
layer
electroencephalogram signal
Prior art date
Legal status
Granted
Application number
CN202010710647.9A
Other languages
Chinese (zh)
Other versions
CN111882036B (en)
Inventor
王力
黄伟键
刘彦俊
颜振雄
王友康
Current Assignee
Guangzhou University
Original Assignee
Guangzhou University
Priority date
Filing date
Publication date
Application filed by Guangzhou University filed Critical Guangzhou University
Priority to CN202010710647.9A
Publication of CN111882036A
Application granted
Publication of CN111882036B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/045 - Combinations of networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015 - Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00 - Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/12 - Classification; Matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Software Systems (AREA)
  • Biophysics (AREA)
  • Dermatology (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • Human Computer Interaction (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention discloses a convolutional neural network training method, an electroencephalogram signal identification method, a device, and a medium. The convolutional neural network trained by the method is a hybrid network with multiple inputs, multiple convolution scales, and multiple convolution types; the multi-input convolutional layers and the convolution kernel sizes are carefully designed, giving high identification accuracy. The training set used to train the network is obtained by expanding the acquired electroencephalogram signals through time domain and frequency domain data enhancement, which increases the amount of training data, reduces overfitting, copes effectively with noise interference in the signals, and improves the recognition effect. The invention is widely applicable in the technical field of signal processing.

Description

Convolutional neural network training method, electroencephalogram signal identification method, device and medium
Technical Field
The invention relates to the technical field of signal processing, and in particular to a convolutional neural network training method, an electroencephalogram signal identification method, a device, and a medium.
Background
A brain-computer interface converts brain activity into computer control instructions so as to control external equipment, and can be widely applied in fields such as medicine and industrial control. Because the electroencephalogram signal is non-invasive and has high temporal resolution, it is commonly used as the signal source of a brain-computer interface. Applying the electroencephalogram signal to a brain-computer interface involves identifying the signal, that is, recognizing its type or characteristics so that it can be converted into a computer control instruction. However, the electroencephalogram signal is also non-stationary, non-linear, and random; its characteristics change over time, so it is easily disturbed by noise, which hinders both its identification and the application of brain-computer interfaces.
Disclosure of Invention
In view of at least one of the above technical problems, it is an object of the present invention to provide a convolutional neural network training method, an electroencephalogram signal recognition method, an apparatus, and a medium.
In one aspect, an embodiment of the present invention includes a convolutional neural network training method, including:
performing a plurality of acquisition processes; each acquisition process is respectively used for acquiring an electroencephalogram signal;
performing time domain data enhancement and frequency domain data enhancement on the electroencephalogram signal;
and training the convolutional neural network by using the electroencephalogram signals subjected to the time domain data enhancement and the frequency domain data enhancement.
Further, the acquisition process includes:
acquiring electroencephalogram signals generated by a subject during motor imagery through C3, Cz and C4 channels;
classifying the electroencephalogram signals into left-hand motor imagery electroencephalogram signals or right-hand motor imagery electroencephalogram signals;
carrying out classification marking on the electroencephalogram signals;
classifying the electroencephalogram signals into a training set or a testing set; the training set is used for training the convolutional neural network, and the test set is used for testing the convolutional neural network.
Further, the convolutional neural network training method further comprises the following steps:
carrying out abnormal value screening on the electroencephalogram signals;
and carrying out first band-pass filtering on the electroencephalogram signals.
Further, the time domain data enhancement comprises:
decomposing the electroencephalogram signal into data segments in a time domain;
carrying out the data segment exchange among the electroencephalogram signals acquired in each acquisition process; the swapped data segments have the same time domain position;
and carrying out second band-pass filtering on the electroencephalogram signal.
Further, the frequency domain data enhancement comprises:
performing third band-pass filtering on the electroencephalogram signals enhanced by the time domain data; the third band-pass filtering has a plurality of pass bands and yields the frequency components of the electroencephalogram signal;
exchanging the frequency components between the electroencephalogram signals acquired in each acquisition process; the exchanged frequency components have the same frequency domain position.
Further, when the intensity of the electroencephalogram signal reaches a preset threshold value, the time domain data enhancement is stopped;
and when all the electroencephalogram signals are subjected to the time domain data enhancement at least once, stopping the time domain data enhancement.
Further, the convolutional neural network comprises an input layer, a time convolutional layer, a depth convolutional layer, a first pooling layer, a separable convolutional layer, a second pooling layer, a full-connection layer and an output layer which are connected in sequence;
the input layer is used for receiving the electroencephalogram signals;
the time convolution layer is used for extracting time characteristic information from the electroencephalogram signals;
the depth convolution layer is used for extracting spatial characteristic information from the electroencephalogram signal;
the separable convolutional layer is used for extracting frequency characteristic information from the electroencephalogram signal;
the first pooling layer and the second pooling layer are used for compressing and simplifying the temporal feature information, the spatial feature information and the frequency feature information;
the full connection layer is used for fusing the output results of the second pooling layer;
and the output layer is used for carrying out classified output according to the fusion result of the full connection layer.
In another aspect, an embodiment of the present invention also includes an electroencephalogram signal identification method, comprising the following steps:
acquiring an electroencephalogram signal to be processed;
inputting the electroencephalogram signal to be processed into a convolutional neural network; the convolutional neural network is trained by the training method of the foregoing embodiment;
acquiring an output result of the convolutional neural network; the output result of the convolutional neural network comprises the type of the electroencephalogram signal.
In another aspect, an embodiment of the present invention further includes a computer apparatus, including a memory and a processor, where the memory is used to store at least one program, and the processor is used to load the at least one program to perform the method of the embodiment.
In another aspect, the present invention also includes a storage medium having stored therein processor-executable instructions, which when executed by a processor, are configured to perform the method of the embodiments.
The invention has the beneficial effects that: the convolutional neural network trained in the embodiment is a hybrid network with multiple inputs, multiple convolution scales, and multiple convolution types; the multi-input convolutional layers and the convolution kernel sizes are carefully designed, giving high recognition accuracy. The training set used to train the network is obtained by expanding the acquired electroencephalogram signals through time domain and frequency domain data enhancement, which increases the amount of training data, reduces overfitting, copes effectively with noise interference in the signals, and improves the recognition effect.
Drawings
FIG. 1 is a flow chart of a convolutional neural network training method in an embodiment;
FIG. 2 is a schematic diagram of the stimulation of a subject to generate an electroencephalogram signal according to an embodiment;
FIG. 3 is a schematic diagram of the electrode distribution of the EEG signal acquisition instrument used in the embodiment;
FIG. 4 is a schematic diagram of the working timing sequence of acquiring electroencephalogram signals in the embodiment;
FIG. 5 is a schematic diagram of time domain data enhancement in an embodiment;
fig. 6 is a schematic diagram of frequency domain data enhancement in an embodiment.
Detailed Description
In this embodiment, referring to fig. 1, the convolutional neural network training method includes the following steps:
P1, executing a plurality of acquisition processes, each acquisition process acquiring one electroencephalogram signal;
P2, performing time domain data enhancement and frequency domain data enhancement on the electroencephalogram signals;
P3, training the convolutional neural network with the electroencephalogram signals subjected to time domain data enhancement and frequency domain data enhancement.
In this embodiment, each time the acquisition process of step P1 is executed, the subject is asked to imagine a specific kind of movement so that the subject's brain generates an electroencephalogram signal, which is collected by an electroencephalogram signal acquisition instrument covering channels C3, Cz, and C4. The same kind of movement may be required of the subject in every acquisition, which reduces the interference caused by mixing different kinds of movement. Each execution of the acquisition process instructs the subject to perform motor imagery once and collects one electroencephalogram signal, so a plurality of electroencephalogram signals are obtained after the acquisition process is executed a plurality of times.
Specifically, in this embodiment, the step P1 specifically includes the following sub-steps:
P101, acquiring electroencephalogram signals generated by a subject during motor imagery through the C3, Cz, and C4 channels;
P102, classifying the electroencephalogram signals into left-hand or right-hand motor imagery electroencephalogram signals;
P103, applying classification marks to the electroencephalogram signals;
P104, assigning the electroencephalogram signals to a training set or a test set; the training set is used for training the convolutional neural network, and the test set is used for testing it.
In this embodiment, steps P101-P104 may be performed in each acquisition process. When step P101 is executed, the display device shown in fig. 2 presents image and sound prompts, including left-hand and right-hand imaginary-movement prompts, to stimulate the subject to perform motor imagery and generate an electroencephalogram signal. The electrode distribution of the electroencephalogram signal acquisition instrument is shown in fig. 3; when step P101 is executed, the channels C3, Cz, and C4 in fig. 3 are used to acquire the electroencephalogram signals.
When step P101 is executed, referring to fig. 4, a timer is started after the subject is instructed to perform motor imagery. Seconds 0-3 are a preparation phase, in which the subject gets ready; seconds 3-7 are an imagination phase, in which the subject performs motor imagery; seconds 7-8 are an idle period, in which the system can execute steps P102-P104. Each acquisition process in step P1 therefore takes 8 seconds in total.
In this embodiment, steps P102 and P103 are executed to mark each acquired electroencephalogram signal as a left-hand or a right-hand motor imagery electroencephalogram signal. When step P104 is executed, the electroencephalogram signals are divided in the following manner:
1. if the acquisition process is executed n times, n electroencephalogram signals numbered 1, 2, 3, ..., n are obtained; the signal numbered 1 is taken as the test set, and the other signals as the training set;
2. the signal numbered 2 is taken as the test set, and the other signals as the training set;
......
n. the signal numbered n is taken as the test set, and the other signals as the training set.
Through the above process, n groups of test sets and training sets can be obtained, wherein the training sets are used for training the convolutional neural network, and the test sets are used for testing the convolutional neural network. In this embodiment, for the electroencephalogram signals in the training set, the time domain data enhancement and the frequency domain data enhancement in step P2 are performed and then used for training the convolutional neural network.
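This splitting scheme is a leave-one-trial-out rotation. The following Python sketch illustrates it; the list names signals and labels are illustrative assumptions, not part of the disclosure:

    def leave_one_out_splits(signals, labels):
        # Yield n (train, test) folds; fold i holds out the trial numbered i + 1.
        n = len(signals)
        for i in range(n):
            test_set = ([signals[i]], [labels[i]])
            train_set = ([s for j, s in enumerate(signals) if j != i],
                         [y for j, y in enumerate(labels) if j != i])
            yield train_set, test_set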
In this embodiment, the electroencephalogram signal obtained in step P1 may also be preprocessed. The pretreatment process comprises the following steps:
A1. carrying out abnormal value screening on the electroencephalogram signals;
A2. and carrying out first band-pass filtering on the electroencephalogram signals.
When step A1 is executed, given that each acquisition process lasts 8 seconds in total, the electroencephalogram signal collected in each acquisition can be truncated: only the portion from second 3.5 to second 7 of each signal is retained, and the other portions are deleted. In terms of noise distribution, the 3.5-7 s portion contains relatively little noise, while the other portions are noisier and therefore contain more abnormal values, so this truncation screens out the abnormal values.
When step A2 is executed, a first band-pass filtering with a passband of 2 Hz to 35 Hz may be applied to the electroencephalogram signal acquired in each acquisition process. In this embodiment, after the first band-pass filtering, the signal from each acquisition process is arranged into a 3 × 875 format.
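A minimal Python sketch of steps A1-A2, assuming a 250 Hz sampling rate (chosen so that the retained 3.5 s window gives 3.5 × 250 = 875 samples, matching the 3 × 875 format) and an illustrative 4th-order Butterworth filter:

    from scipy.signal import butter, filtfilt

    FS = 250  # assumed sampling rate in Hz

    def preprocess_trial(trial):
        # trial: array of shape (3, 8 * FS), channels C3, Cz, C4 over 8 seconds.
        kept = trial[:, int(3.5 * FS):7 * FS]               # keep seconds 3.5-7 -> (3, 875)
        b, a = butter(4, [2, 35], btype="bandpass", fs=FS)  # first band-pass, 2-35 Hz
        return filtfilt(b, a, kept, axis=1)                 # zero-phase filtering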
In this embodiment, the time domain data enhancement in step P2 includes:
P201, decomposing the electroencephalogram signals into data segments in the time domain;
P202, exchanging the data segments among the electroencephalogram signals acquired in the acquisition processes, the exchanged data segments having the same time domain position;
P203, performing second band-pass filtering on the electroencephalogram signals.
The principle of steps P201-P203 is shown in fig. 5. In this embodiment, electroencephalogram signals 1, 2, and 3 are acquired through three acquisition processes, and each is divided into three data segments in the time domain. When the signal format is 3 × 875, the segment sizes are 3 × 291, 3 × 292, and 3 × 292. Referring to fig. 5, the exchanged data segments have the same time domain position; for example, the last segment of signal 1 is exchanged with the last segment of signal 2, and the middle segment of signal 2 is exchanged with the middle segment of signal 3. After the exchange, a second band-pass filtering with a passband of 2 Hz-35 Hz is applied to signals 1, 2, and 3.
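A minimal sketch of this segment exchange; the boundaries follow the 3 × 291, 3 × 292, 3 × 292 split stated above, and the second band-pass filter can then be applied exactly as in the preprocessing sketch:

    SEGMENTS = [(0, 291), (291, 583), (583, 875)]  # 3 x 291, 3 x 292, 3 x 292

    def swap_time_segment(sig_a, sig_b, seg_idx):
        # Exchange one same-position time segment between two (3, 875) signals.
        a, b = sig_a.copy(), sig_b.copy()
        lo, hi = SEGMENTS[seg_idx]
        a[:, lo:hi], b[:, lo:hi] = sig_b[:, lo:hi], sig_a[:, lo:hi]
        return a, b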
Each execution of steps P201-P203 expands the electroencephalogram data. In this embodiment, steps P201-P203 are repeated until the data volume grows to 3 times the original, after which they are no longer executed.
In this embodiment, after steps P201-P203 have been performed one or more times, frequency domain data enhancement is performed on the electroencephalogram data that has undergone time domain data enhancement.
In this embodiment, the frequency domain data enhancement in step P2 includes:
P204, performing third band-pass filtering on the electroencephalogram signals subjected to the time domain data enhancement; the third band-pass filtering has a plurality of pass bands and yields the frequency components of the electroencephalogram signal;
P205, exchanging the frequency components between the electroencephalogram signals acquired in each acquisition process; the exchanged frequency components have the same frequency domain position.
The principle of steps P204-P205 is shown in fig. 6. In this embodiment, electroencephalogram signals 1, 2, and 3 are acquired through three acquisition processes. Third band-pass filtering is performed on two of them; as shown in fig. 6, signals 1 and 2 are each filtered with the third band-pass filter, which has three pass bands: 4-7 Hz (theta rhythm), 8-13 Hz (mu rhythm), and 13-32 Hz (beta rhythm). Filtering signal 1 yields its frequency components at 4-7 Hz, 8-13 Hz, and 13-32 Hz; filtering signal 2 yields its components in the same bands.
When step P205 is executed, the exchanged frequency components have the same frequency domain position; for example, referring to fig. 6, the 13-32 Hz component of signal 1 is exchanged with the 13-32 Hz component of signal 2, completing the frequency enhancement between signals 1 and 2.
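One illustrative reading of this exchange in Python: each signal's band component is isolated with the third band-pass filter, and a signal is rebuilt by subtracting its own component and adding its partner's. The reconstruction by subtraction and the filter order are assumptions; the text only states that same-position components are exchanged:

    from scipy.signal import butter, filtfilt

    BANDS = [(4, 7), (8, 13), (13, 32)]  # theta, mu, beta passbands of the third filter
    FS = 250                             # assumed sampling rate, as before

    def band_component(sig, band):
        b, a = butter(4, band, btype="bandpass", fs=FS)
        return filtfilt(b, a, sig, axis=1)

    def swap_band(sig_a, sig_b, band_idx):
        # Exchange one same-position frequency component between two signals.
        comp_a = band_component(sig_a, BANDS[band_idx])
        comp_b = band_component(sig_b, BANDS[band_idx])
        return sig_a - comp_a + comp_b, sig_b - comp_b + comp_a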
In this embodiment, steps P204-P205 may be repeated several times; each repetition pairs two different electroencephalogram signals and exchanges their frequency components, and the steps stop once every pair of signals has been matched and exchanged. For example, with collected signals 1, 2, and 3, frequency components are exchanged by pairing signal 1 with signal 2, signal 1 with signal 3, and signal 2 with signal 3.
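The pairwise pass can be sketched as follows, reusing swap_band and BANDS from the previous sketch; trials names the assumed list of time-domain-enhanced signals:

    from itertools import combinations

    augmented = []
    for sig_a, sig_b in combinations(trials, 2):   # pairs 1-2, 1-3, 2-3, ...
        for band_idx in range(len(BANDS)):
            augmented.extend(swap_band(sig_a, sig_b, band_idx))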
The time domain data enhancement and the frequency domain data enhancement realized by executing the steps P201-P205 can expand the data volume on the basis of the original acquired electroencephalogram signals, thereby being beneficial to the training of the convolutional neural network.
In this embodiment, the input constructed from each acquisition process has a size of 3 × 3 × 875, i.e., three copies of the 3 × 875 signal, one for each input branch, and the structure of the convolutional neural network used is shown in Table 1.
TABLE 1
[Table 1, showing the layer-by-layer structure of the convolutional neural network, appears as an image (Figure BDA0002596407130000061) in the original document.]
In this embodiment, the convolutional neural network includes an input layer, a time convolutional layer, a depth convolutional layer, a first pooling layer, a separable convolutional layer, a second pooling layer, a fully-connected layer, and an output layer, which are connected in sequence.
In this embodiment, when the convolutional neural network is trained using the electroencephalogram signal, the convolutional neural network receives the input electroencephalogram signal and transmits the electroencephalogram signal to the time convolution layer.
The time convolution layer has 3 different convolution kernels, one per parallel branch, of sizes 1 × 85, 1 × 65, and 1 × 45, with a convolution step size of 1 × 3 and an output space dimension of 10. Its output is 3 parallel feature maps of sizes 3 × 264 × 10, 3 × 271 × 10, and 3 × 277 × 10, respectively, containing the time feature information of the electroencephalogram signals.
The feature maps output by the time convolution layer are input to the depth convolution layer, which has 3 identical convolution kernels of size 3 × 1, a convolution step size of 1 × 1, and an output space dimension of 3. Its output is 3 parallel feature maps of sizes 1 × 264 × 30, 1 × 271 × 30, and 1 × 277 × 30, respectively, containing the spatial feature information of the electroencephalogram signal.
The feature maps output by the depth convolution layer are input to the first pooling layer, which has 3 identical pooling windows of size 1 × 6, a step size of 1 × 3, and an output space dimension of 1. Its output is 3 parallel feature maps of sizes 1 × 87 × 30, 1 × 89 × 30, and 1 × 91 × 30, respectively. The first pooling layer compresses and simplifies the feature maps output by the depth convolution layer.
The feature maps output by the first pooling layer are input to the separable convolution layer, which has 3 identical convolution kernels of size 1 × 8, a convolution step size of 1 × 1, and an output space dimension of 30. Its output is 3 parallel feature maps of sizes 1 × 87 × 30, 1 × 89 × 30, and 1 × 91 × 30, respectively, containing the frequency feature information of the electroencephalogram signal.
The feature maps output by the separable convolution layer are input to the second pooling layer, which has 3 identical pooling windows of size 1 × 6, a step size of 1 × 3, and an output space dimension of 1. Its output is 3 parallel feature maps of sizes 1 × 28 × 30, 1 × 28 × 30, and 1 × 29 × 30, respectively. The second pooling layer compresses and simplifies the feature maps output by the separable convolution layer.
The output signals of the second pooling layer are fused through the full connection layer, and the identification result is then obtained through the 2-class output of the softmax layer.
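The following Keras sketch reproduces the branch dimensions walked through above. It is a sketch under stated assumptions, not the patented implementation: the kernel sizes, strides, filter counts, and resulting feature-map sizes follow the description, while the ELU activations, average pooling, and the 'same' padding of the separable layer are assumptions where the text is silent:

    import tensorflow as tf
    from tensorflow.keras import layers

    def build_branch(inp, temporal_kernel):
        x = layers.Conv2D(10, (1, temporal_kernel), strides=(1, 3))(inp)             # time convolution
        x = layers.DepthwiseConv2D((3, 1), depth_multiplier=3, activation="elu")(x)  # depth (spatial) convolution
        x = layers.AveragePooling2D((1, 6), strides=(1, 3))(x)                       # first pooling
        x = layers.SeparableConv2D(30, (1, 8), padding="same", activation="elu")(x)  # separable (frequency) convolution
        x = layers.AveragePooling2D((1, 6), strides=(1, 3))(x)                       # second pooling
        return layers.Flatten()(x)

    def build_model():
        inputs = [tf.keras.Input(shape=(3, 875, 1)) for _ in range(3)]  # three copies of the 3 x 875 signal
        branches = [build_branch(i, k) for i, k in zip(inputs, (85, 65, 45))]
        fused = layers.Concatenate()(branches)              # full-connection fusion of the branches
        out = layers.Dense(2, activation="softmax")(fused)  # 2-class output layer
        return tf.keras.Model(inputs, out)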
In this embodiment, a label may be set for the electroencephalogram data, and the end of training may be determined by computing the distance between the recognition result output by the convolutional neural network and the label of the input data; for example, training ends when this distance falls below a preset threshold or the number of training rounds reaches a predetermined value.
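A usage sketch of such a stopping rule; the 200-round cap, the 0.05 loss threshold, and the random placeholder data are illustrative assumptions:

    import numpy as np

    x = np.random.randn(60, 3, 875, 1).astype("float32")  # placeholder augmented training set
    y = np.random.randint(0, 2, size=60)                  # placeholder left/right class marks
    model = build_model()                                 # from the sketch above
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    for _ in range(200):                                  # assumed cap on training rounds
        hist = model.fit([x, x, x], y, epochs=1, verbose=0)
        if hist.history["loss"][-1] < 0.05:               # assumed closeness threshold
            break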
In this embodiment, the convolutional neural network used is a hybrid network with multiple inputs, multiple convolution scales, and multiple convolution types; the multi-input convolutional layers and the convolution kernel sizes are carefully designed, giving high recognition accuracy. The training set used to train the network is obtained by expanding the acquired electroencephalogram signals through time domain and frequency domain data enhancement, which increases the amount of training data, reduces overfitting, copes effectively with noise interference in the signals, and improves the recognition effect.
In this embodiment, the method for recognizing an electroencephalogram signal based on a convolutional neural network trained by the above training method may include the following steps:
S1, acquiring an electroencephalogram signal to be processed;
S2, inputting the electroencephalogram signal to be processed into the convolutional neural network;
S3, obtaining the output result of the convolutional neural network.
If the electroencephalogram signals in the training set were labeled by type when the convolutional neural network was trained, then after steps S1-S3 are executed, the type of the electroencephalogram signal to be processed can be determined from the output of the convolutional neural network.
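For example, a single trial can be classified with the trained network from the sketches above; the raw recording and the class-index mapping here are placeholders:

    import numpy as np

    raw = np.random.randn(3, 8 * 250)                 # placeholder 8 s recording at the assumed 250 Hz
    trial = preprocess_trial(raw)                     # -> (3, 875), see the preprocessing sketch
    x = trial[np.newaxis, :, :, np.newaxis].astype("float32")
    probs = model.predict([x, x, x])                  # one copy of the trial per input branch
    print("left-hand" if probs.argmax() == 0 else "right-hand")  # assumed label order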
The convolutional neural network trained through steps P1-P3 has high recognition accuracy, shows no obvious overfitting, can effectively cope with noise interference in the electroencephalogram signals to be processed, and improves the recognition effect.
In this embodiment, a computer device includes a memory and a processor, where the memory is used to store at least one program, and the processor is used to load the at least one program to execute a convolutional neural network training method or an electroencephalogram signal recognition method in the embodiment, so as to achieve the same technical effects as those described in the embodiment.
In this embodiment, a storage medium stores therein processor-executable instructions, which when executed by a processor, are configured to perform a convolutional neural network training method or an electroencephalogram signal recognition method in the embodiment, and achieve the same technical effects as those described in the embodiment.
It should be noted that, unless otherwise specified, when a feature is referred to as being "fixed" or "connected" to another feature, it may be directly fixed or connected to the other feature or indirectly fixed or connected to the other feature. Furthermore, the descriptions of upper, lower, left, right, etc. used in the present disclosure are only relative to the mutual positional relationship of the constituent parts of the present disclosure in the drawings. As used in this disclosure, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. In addition, unless defined otherwise, all technical and scientific terms used in this example have the same meaning as commonly understood by one of ordinary skill in the art. The terminology used in the description of the embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this embodiment, the term "and/or" includes any combination of one or more of the associated listed items.
It will be understood that, although the terms first, second, third, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element of the same type from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of the present disclosure. The use of any and all examples, or exemplary language ("e.g.," such as "or the like") provided with this embodiment is intended merely to better illuminate embodiments of the invention and does not pose a limitation on the scope of the invention unless otherwise claimed.
It should be recognized that embodiments of the present invention can be realized and implemented by computer hardware, a combination of hardware and software, or by computer instructions stored in a non-transitory computer readable memory. The methods may be implemented in a computer program using standard programming techniques, including a non-transitory computer-readable storage medium configured with the computer program, where the storage medium so configured causes a computer to operate in a specific and predefined manner, according to the methods and figures described in the detailed description. Each program may be implemented in a high level procedural or object oriented programming language to communicate with a computer system. However, the program(s) can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language. Furthermore, the program can be run on a programmed application specific integrated circuit for this purpose.
Further, operations of processes described in this embodiment can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The processes described in this embodiment (or variations and/or combinations thereof) may be performed under the control of one or more computer systems configured with executable instructions, and may be implemented as code (e.g., executable instructions, one or more computer programs, or one or more applications) collectively executed on one or more processors, by hardware, or combinations thereof. The computer program includes a plurality of instructions executable by one or more processors.
Further, the method may be implemented in any type of computing platform operatively connected to a suitable interface, including but not limited to a personal computer, mini computer, mainframe, workstation, networked or distributed computing environment, separate or integrated computer platform, or in communication with a charged particle tool or other imaging device, and the like. Aspects of the invention may be embodied in machine-readable code stored on a non-transitory storage medium or device, whether removable or integrated into a computing platform, such as a hard disk, optically read and/or write storage medium, RAM, ROM, or the like, such that it may be read by a programmable computer, which when read by the storage medium or device, is operative to configure and operate the computer to perform the procedures described herein. Further, the machine-readable code, or portions thereof, may be transmitted over a wired or wireless network. The invention described in this embodiment includes these and other different types of non-transitory computer-readable storage media when such media include instructions or programs that implement the steps described above in conjunction with a microprocessor or other data processor. The invention also includes the computer itself when programmed according to the methods and techniques described herein.
A computer program can be applied to input data to perform the functions described in the present embodiment to convert the input data to generate output data that is stored to a non-volatile memory. The output information may also be applied to one or more output devices, such as a display. In a preferred embodiment of the invention, the transformed data represents physical and tangible objects, including particular visual depictions of physical and tangible objects produced on a display.
The above description is only a preferred embodiment of the present invention, and the present invention is not limited to the above embodiment, and any modifications, equivalent substitutions, improvements, etc. within the spirit and principle of the present invention should be included in the protection scope of the present invention as long as the technical effects of the present invention are achieved by the same means. The invention is capable of other modifications and variations in its technical solution and/or its implementation, within the scope of protection of the invention.

Claims (10)

1. A convolutional neural network training method, comprising:
performing a plurality of acquisition processes; each acquisition process is respectively used for acquiring an electroencephalogram signal;
performing time domain data enhancement and frequency domain data enhancement on the electroencephalogram signal;
and training the convolutional neural network by using the electroencephalogram signals subjected to the time domain data enhancement and the frequency domain data enhancement.
2. The convolutional neural network training method of claim 1, wherein the acquisition process comprises:
acquiring electroencephalogram signals generated by a subject during motor imagery through C3, Cz and C4 channels;
classifying the electroencephalogram signals into left-hand motor imagery electroencephalogram signals or right-hand motor imagery electroencephalogram signals;
carrying out classification marking on the electroencephalogram signals;
classifying the electroencephalogram signals into a training set or a testing set; the training set is used for training the convolutional neural network, and the test set is used for testing the convolutional neural network.
3. The convolutional neural network training method of claim 1, further comprising:
carrying out abnormal value screening on the electroencephalogram signals;
and carrying out first band-pass filtering on the electroencephalogram signals.
4. The convolutional neural network training method of any one of claims 1-3, wherein the time domain data enhancement comprises:
decomposing the electroencephalogram signal into data segments in a time domain;
carrying out the data segment exchange among the electroencephalogram signals acquired in each acquisition process; the swapped data segments have the same time domain position;
and carrying out second band-pass filtering on the electroencephalogram signal.
5. The convolutional neural network training method of claim 4, wherein the frequency domain data enhancement comprises:
performing third band-pass filtering on the electroencephalogram signals enhanced by the time domain data; the third band-pass filtering has a plurality of pass bands and yields the frequency components of the electroencephalogram signal;
exchanging the frequency components between the electroencephalogram signals acquired in each acquisition process; the exchanged frequency components have the same frequency domain position.
6. The convolutional neural network training method of claim 5, wherein:
when the intensity of the electroencephalogram signal reaches a preset threshold value, stopping the time domain data enhancement;
and when all the electroencephalogram signals are subjected to the time domain data enhancement at least once, stopping the time domain data enhancement.
7. The convolutional neural network training method of claim 5, wherein the convolutional neural network comprises an input layer, a time convolutional layer, a depth convolutional layer, a first pooling layer, a separable convolutional layer, a second pooling layer, a fully-connected layer and an output layer, which are connected in sequence;
the input layer is used for receiving the electroencephalogram signals;
the time convolution layer is used for extracting time characteristic information from the electroencephalogram signals;
the depth convolution layer is used for extracting spatial characteristic information from the electroencephalogram signal;
the separable convolutional layer is used for extracting frequency characteristic information from the electroencephalogram signal;
the first pooling layer and the second pooling layer are used for compressing and simplifying the temporal feature information, the spatial feature information and the frequency feature information;
the full connection layer is used for fusing the output results of the second pooling layer;
and the output layer is used for carrying out classified output according to the fusion result of the full connection layer.
8. An electroencephalogram signal identification method is characterized by comprising the following steps:
acquiring an electroencephalogram signal to be processed;
inputting the electroencephalogram signal to be processed into a convolutional neural network; the convolutional neural network is trained by the training method according to any one of claims 1 to 7;
acquiring an output result of the convolutional neural network; the output result of the convolutional neural network comprises the type of the electroencephalogram signal.
9. A computer apparatus comprising a memory for storing at least one program and a processor for loading the at least one program to perform the method of any one of claims 1-8.
10. A storage medium having stored therein processor-executable instructions, which when executed by a processor, are configured to perform the method of any one of claims 1-8.
CN202010710647.9A 2020-07-22 2020-07-22 Convolutional neural network training method, electroencephalogram signal identification method, device and medium Active CN111882036B (en)

Priority Applications (1)

Application Number: CN202010710647.9A (granted as CN111882036B)
Priority Date: 2020-07-22
Filing Date: 2020-07-22
Title: Convolutional neural network training method, electroencephalogram signal identification method, device and medium

Applications Claiming Priority (1)

Application Number: CN202010710647.9A (granted as CN111882036B)
Priority Date: 2020-07-22
Filing Date: 2020-07-22
Title: Convolutional neural network training method, electroencephalogram signal identification method, device and medium

Publications (2)

Publication Number: CN111882036A, Publication Date: 2020-11-03
Publication Number: CN111882036B, Publication Date: 2023-10-31

Family

ID=73155193

Family Applications (1)

Application Number: CN202010710647.9A (Active; granted as CN111882036B)
Priority Date: 2020-07-22
Filing Date: 2020-07-22
Title: Convolutional neural network training method, electroencephalogram signal identification method, device and medium

Country Status (1)

Country: CN (CN111882036B)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112370017A (en) * 2020-11-09 2021-02-19 腾讯科技(深圳)有限公司 Training method and device of electroencephalogram classification model and electronic equipment
CN114942410A (en) * 2022-05-31 2022-08-26 哈尔滨工业大学 Interference signal identification method based on data amplification

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109299751A (en) * 2018-11-26 2019-02-01 南开大学 The SSVEP brain electricity classification method of convolutional Neural model based on the enhancing of EMD data
CN109711383A (en) * 2019-01-07 2019-05-03 重庆邮电大学 Convolutional neural networks Mental imagery EEG signal identification method based on time-frequency domain
CN109784242A (en) * 2018-12-31 2019-05-21 陕西师范大学 EEG Noise Cancellation based on one-dimensional residual error convolutional neural networks
CN110059565A (en) * 2019-03-20 2019-07-26 杭州电子科技大学 A kind of P300 EEG signal identification method based on improvement convolutional neural networks
CN110069958A (en) * 2018-01-22 2019-07-30 北京航空航天大学 A kind of EEG signals method for quickly identifying of dense depth convolutional neural networks
CN110163180A (en) * 2019-05-29 2019-08-23 长春思帕德科技有限公司 Mental imagery eeg data classification method and system
CN110263606A (en) * 2018-08-30 2019-09-20 周军 Scalp brain electrical feature based on end-to-end convolutional neural networks extracts classification method
CN110353702A (en) * 2019-07-02 2019-10-22 华南理工大学 A kind of emotion identification method and system based on shallow-layer convolutional neural networks
CN110765920A (en) * 2019-10-18 2020-02-07 西安电子科技大学 Motor imagery classification method based on convolutional neural network
CN110929581A (en) * 2019-10-25 2020-03-27 重庆邮电大学 Electroencephalogram signal identification method based on space-time feature weighted convolutional neural network
CN111012336A (en) * 2019-12-06 2020-04-17 重庆邮电大学 Parallel convolutional network motor imagery electroencephalogram classification method based on spatio-temporal feature fusion

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110069958A (en) * 2018-01-22 2019-07-30 北京航空航天大学 A kind of EEG signals method for quickly identifying of dense depth convolutional neural networks
CN110263606A (en) * 2018-08-30 2019-09-20 周军 Scalp brain electrical feature based on end-to-end convolutional neural networks extracts classification method
CN109299751A (en) * 2018-11-26 2019-02-01 南开大学 The SSVEP brain electricity classification method of convolutional Neural model based on the enhancing of EMD data
CN109784242A (en) * 2018-12-31 2019-05-21 陕西师范大学 EEG Noise Cancellation based on one-dimensional residual error convolutional neural networks
CN109711383A (en) * 2019-01-07 2019-05-03 重庆邮电大学 Convolutional neural networks Mental imagery EEG signal identification method based on time-frequency domain
CN110059565A (en) * 2019-03-20 2019-07-26 杭州电子科技大学 A kind of P300 EEG signal identification method based on improvement convolutional neural networks
CN110163180A (en) * 2019-05-29 2019-08-23 长春思帕德科技有限公司 Mental imagery eeg data classification method and system
CN110353702A (en) * 2019-07-02 2019-10-22 华南理工大学 A kind of emotion identification method and system based on shallow-layer convolutional neural networks
CN110765920A (en) * 2019-10-18 2020-02-07 西安电子科技大学 Motor imagery classification method based on convolutional neural network
CN110929581A (en) * 2019-10-25 2020-03-27 重庆邮电大学 Electroencephalogram signal identification method based on space-time feature weighted convolutional neural network
CN111012336A (en) * 2019-12-06 2020-04-17 重庆邮电大学 Parallel convolutional network motor imagery electroencephalogram classification method based on spatio-temporal feature fusion

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112370017A (en) * 2020-11-09 2021-02-19 腾讯科技(深圳)有限公司 Training method and device of electroencephalogram classification model and electronic equipment
CN114942410A (en) * 2022-05-31 2022-08-26 哈尔滨工业大学 Interference signal identification method based on data amplification
CN114942410B (en) * 2022-05-31 2022-12-20 哈尔滨工业大学 Interference signal identification method based on data amplification

Also Published As

Publication number Publication date
CN111882036B (en) 2023-10-31

Similar Documents

Publication Publication Date Title
CN111317468B (en) Electroencephalogram signal classification method, electroencephalogram signal classification device, computer equipment and storage medium
CN111329474A (en) Electroencephalogram identity recognition method and system based on deep learning and information updating method
CN111882036B (en) Convolutional neural network training method, electroencephalogram signal identification method, device and medium
Xu et al. High accuracy classification of EEG signal
CN108334766B (en) Electronic device, unlocking method and Related product
CN105266804B (en) A kind of brain-electrical signal processing method based on low-rank and sparse matrix decomposition
Leeds et al. Comparing visual representations across human fMRI and computational vision
CN107656612B (en) Large instruction set brain-computer interface method based on P300-SSVEP
CN109965871B (en) Method, system, medium, and apparatus for analyzing brain-computer interface signal
CN113536882B (en) Multi-class motor imagery electroencephalogram signal feature extraction and classification method
CN111671420A (en) Method for extracting features from resting electroencephalogram data and terminal equipment
Caramia et al. Optimizing spatial filter pairs for EEG classification based on phase-synchronization
CN108523883A (en) A kind of continuous Mental imagery identifying system of left and right index finger based on actual act modeling
CN113220120A (en) Self-adaptive motor imagery brain-computer interface training method fusing subjective and objective evaluation
Zhang et al. An amplitudes-perturbation data augmentation method in convolutional neural networks for EEG decoding
KR102300459B1 (en) Apparatus and method for generating a space-frequency feature map for deep-running based brain-computer interface
Fan et al. Joint filter-band-combination and multi-view CNN for electroencephalogram decoding
CN111772629A (en) Brain cognitive skill transplantation method
CN116541751B (en) Electroencephalogram signal classification method based on brain function connection network characteristics
Velásquez-Martínez et al. Motor imagery classification for BCI using common spatial patterns and feature relevance analysis
CN116369950A (en) Target detection method based on electroencephalogram tracing and multi-feature extraction
CN113952707B (en) Motion sensing game action recognition method, scoring method and system based on RFID
CN115392287A (en) Electroencephalogram signal online self-adaptive classification method based on self-supervision learning
CN113662561B (en) Electroencephalogram feature extraction method and device of subband cascade co-space mode
CN115644842A (en) Vital sign signal extraction method and device and storage medium

Legal Events

Code: Description
PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant