CN116842329A - Motor imagery task classification method and system based on electroencephalogram signals and deep learning - Google Patents

Motor imagery task classification method and system based on electroencephalogram signals and deep learning

Info

Publication number
CN116842329A
CN116842329A (application CN202310843420.5A)
Authority
CN
China
Prior art keywords
data
electroencephalogram
signals
motor imagery
signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310843420.5A
Other languages
Chinese (zh)
Inventor
黄辰
王时绘
张龑
张丽
刘小雨
宋建华
吴伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hubei University
Original Assignee
Hubei University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hubei University filed Critical Hubei University
Priority to CN202310843420.5A priority Critical patent/CN116842329A/en
Publication of CN116842329A publication Critical patent/CN116842329A/en
Pending legal-status Critical Current


Classifications

    • A61B5/369 Electroencephalography [EEG]
    • A61B5/372 Analysis of electroencephalograms
    • A61B5/725 Details of waveform analysis using specific filters therefor, e.g. Kalman or adaptive filters
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267 Classification of physiological signals or data involving training the classification device
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G06F18/15 Statistical pre-processing, e.g. techniques for normalisation or restoring missing data
    • G06F18/213 Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06F18/2415 Classification techniques based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G06F18/253 Fusion techniques of extracted features
    • G06N3/0442 Recurrent networks characterised by memory or gating, e.g. long short-term memory [LSTM] or gated recurrent units [GRU]
    • G06N3/045 Combinations of networks
    • G06N3/0464 Convolutional networks [CNN, ConvNet]
    • G06N3/047 Probabilistic or stochastic networks
    • G06N3/048 Activation functions
    • G06N3/08 Learning methods
    • G06V10/764 Image or video recognition using pattern recognition or machine learning, using classification, e.g. of video objects
    • G06V10/806 Fusion of extracted features at the sensor, preprocessing, feature extraction or classification level
    • G06V10/82 Image or video recognition using neural networks
    • G06F2123/02 Data types in the time domain, e.g. time-series data
    • G06F2218/04 Denoising
    • G06F2218/08 Feature extraction
    • G06F2218/12 Classification; Matching


Abstract

The invention discloses a motor imagery task classification method and system based on electroencephalogram (EEG) signals and deep learning. An EEG signal is first acquired and normalized to obtain normalized data values, which are then converted to polar coordinates; the polar coordinate angles are converted into a Gramian matrix, which is stored as a feature map symmetric along the diagonal; finally, the diagonally symmetric feature map is input into a dual-attention hybrid neural network model to obtain the motor imagery task classification result. The invention converts motor imagery EEG signals into Gramian angular field images that preserve temporal dependence and reflect time-domain waveform characteristics, and constructs a deep hybrid neural network to mine information from important regions, thereby alleviating the problems of large inter-subject variability and difficulty in learning effective features, so that motor imagery tasks can be classified accurately.

Description

Motor imagery task classification method and system based on electroencephalogram signals and deep learning
Technical Field
The invention relates to the technical field of brain-computer interfaces, in particular to a motor imagery task classification method and system based on brain electrical signals and deep learning.
Background
In motor imagery tasks, the patient is typically asked to imagine performing specific movements, such as flexing and extending an arm or moving a leg. These imagined movements produce characteristic electroencephalogram signals that can be used to control external devices such as prostheses or wheelchairs. However, because brain structure and motor imagery ability vary from person to person, accurately classifying these motor imagery tasks, and thereby accurately controlling external devices, remains a challenging problem.
Disclosure of Invention
The invention provides a motor imagery task classification method and system based on electroencephalogram signals and deep learning that can accurately classify motor imagery tasks, so that control instructions can be output to external equipment and accurate control of that equipment is achieved.
The invention provides a motor imagery task classification method based on electroencephalogram signals and deep learning, which comprises the following steps:
collecting brain electrical signals;
normalizing the electroencephalogram signals to obtain normalized data values;
performing polar coordinate conversion on the normalized data values by the formula θ_i = arccos(x̃_i) (x̃_i ∈ X̃), r = t_i/N; wherein θ denotes the polar coordinate angle, x̃_i denotes the normalized data value of the time series at time i, X̃ denotes the normalized time series, r denotes the polar radius, t_i denotes the timestamp, and N is a constant;
converting the polar coordinate angles into a Gramian matrix by the formula GADF = [sin(θ_i − θ_j)] = √(I − X̃²)ᵀ·X̃ − X̃ᵀ·√(I − X̃²), and storing the Gramian matrix as a feature map symmetric along the diagonal; wherein I denotes the unit vector [1, 1, …, 1], θ_1 and θ_n denote the polar coordinate angles at time 1 and time n respectively, and X̃ᵀ denotes the transpose of X̃;
and inputting the feature map symmetrical along the diagonal line into a dual-attention hybrid neural network model to obtain a motor imagery task classification result.
Specifically, after the electroencephalogram signal is acquired, the method further comprises:
and filtering and data expansion are carried out on the electroencephalogram signals.
Specifically, the filtering the electroencephalogram signal includes:
calculating the average of the electroencephalogram signals by the formula X̄(t) = (1/N)·Σ_{i=1}^{N} X_i(t) to obtain the reference signal X̄(t); wherein X_i(t) denotes the electroencephalogram signal of each electrode, N is the total number of electrodes, i denotes the electrode index, and t denotes time;
calculating the corrected signal value Y_i(t) by the formula Y_i(t) = X_i(t) − X̄(t).
Specifically, the filtering the electroencephalogram signal includes:
calculating the parameters of a fifth-order zero-phase Butterworth filter from the normalized cut-off frequencies, and constructing the transfer function of the fifth-order zero-phase Butterworth filter from these parameters;
and filtering the electroencephalogram signals according to the transfer function.
Specifically, the performing data expansion on the electroencephalogram signal includes:
sliding the C×T original electroencephalogram signal horizontally along the time axis with step size S, dividing the data into two segments, 0-(T-S) and (T-S)-T;
exchanging the order of the two segments, sliding the (T-S)-T segment to the starting point, and repeating k times until kS ≥ T;
cutting each piece of data into a predetermined number of segments using a sliding window of predetermined length, each segment carrying the same label as the original data.
The invention also provides a motor imagery task classification system based on the electroencephalogram signals and the deep learning, which comprises the following steps:
the electroencephalogram signal acquisition module is used for acquiring electroencephalogram signals;
the normalization module is used for carrying out normalization processing on the electroencephalogram signals to obtain normalized data values;
the polar coordinate conversion module is used for performing polar coordinate conversion on the normalized data values by the formula θ_i = arccos(x̃_i) (x̃_i ∈ X̃), r = t_i/N; wherein θ denotes the polar coordinate angle, x̃_i denotes the normalized data value of the time series at time i, X̃ denotes the normalized time series, r denotes the polar radius, t_i denotes the timestamp, and N is a constant;
a Gramian matrix conversion module for converting the polar coordinate angles into a Gramian matrix by the formula GADF = [sin(θ_i − θ_j)] = √(I − X̃²)ᵀ·X̃ − X̃ᵀ·√(I − X̃²), and saving the Gramian matrix as a feature map symmetric along the diagonal; wherein I denotes the unit vector [1, 1, …, 1], θ_1 and θ_n denote the polar coordinate angles at time 1 and time n respectively, and X̃ᵀ denotes the transpose of X̃;
and the motor imagery task classification module is used for inputting the feature images which are symmetrical along the diagonal line into the dual-attention hybrid neural network model to obtain a motor imagery task classification result.
Specifically, the method further comprises the steps of:
the electroencephalogram signal filtering module is used for filtering the electroencephalogram signals;
and the electroencephalogram data expansion module is used for carrying out data expansion on the electroencephalogram signals.
Specifically, the electroencephalogram signal filtering module at least comprises:
a spatial filtering unit for calculating the average of the electroencephalogram signals by the formula X̄(t) = (1/N)·Σ_{i=1}^{N} X_i(t) to obtain the reference signal X̄(t); wherein X_i(t) denotes the electroencephalogram signal of each electrode, N is the total number of electrodes, i denotes the electrode index, and t denotes time;
a signal correction unit for calculating the corrected signal value Y_i(t) by the formula Y_i(t) = X_i(t) − X̄(t).
Specifically, the electroencephalogram signal filtering module at least comprises:
the frequency filtering unit is used for calculating the parameters of a fifth-order zero-phase Butterworth filter from the normalized cut-off frequencies, constructing the transfer function of the fifth-order zero-phase Butterworth filter from these parameters, and filtering the electroencephalogram signals according to the transfer function.
Specifically, the electroencephalogram data expansion module includes:
the cyclic sliding recombination unit is used for sliding the C×T original electroencephalogram signal horizontally along the time axis with step size S, dividing the data into two segments, 0-(T-S) and (T-S)-T; exchanging the order of the two segments, sliding the (T-S)-T segment to the starting point, and repeating k times until kS ≥ T;
and a data clipping unit for cutting each piece of data into a predetermined number of segments using a sliding window of predetermined length, each segment carrying the same label as the original data.
One or more technical schemes provided by the invention have at least the following technical effects or advantages:
firstly acquiring an electroencephalogram signal, carrying out normalization processing on the electroencephalogram signal to obtain a normalized data value, and carrying out polar coordinate conversion on the normalized data value; converting the polar coordinate angle into a gram matrix, and storing the gram matrix as a characteristic diagram symmetrical along a diagonal line; and finally, inputting the feature images which are symmetrical along the diagonal line into a dual-attention hybrid neural network model to obtain a motor imagery task classification result. According to the invention, the motor imagery electroencephalogram is converted into the gram angle field diagram which keeps time dependence and reflects time domain waveform characteristics, and the depth of the mixed neural network is constructed to excavate information of an important area, so that the problems of large individual variability and difficulty in learning effective characteristics are solved, and the motor imagery tasks can be accurately classified.
In addition, the invention has the following advantages:
1. the common average reference filter and the fifth-order zero-phase Butterworth filter perform spatial filtering and frequency filtering on the data without affecting the characteristics of the original data, removing locally abnormal data, improving signal quality, and raising the signal-to-noise ratio of the data.
2. A new data augmentation scheme is adopted: the time series is divided into two parts along the time axis and their positions are exchanged; this operation is applied cyclically to obtain reconstructed time series, which are then cropped to a fixed size. This increases the amount of data and improves the robustness of the deep learning model.
Drawings
Fig. 1 is a flowchart of a motor imagery task classification method based on electroencephalogram signals and deep learning provided by an embodiment of the invention;
fig. 2 is a schematic diagram of a motor imagery task classification method based on electroencephalogram signals and deep learning according to an embodiment of the present invention;
fig. 3 is a schematic diagram of data expansion in a motor imagery task classification method based on electroencephalogram signals and deep learning according to an embodiment of the present invention;
fig. 4 shows the processing effect of the Gramian angular field in the motor imagery task classification method based on electroencephalogram signals and deep learning according to an embodiment of the present invention;
fig. 5 is a block diagram of a motor imagery task classification system based on electroencephalogram signals and deep learning according to an embodiment of the present invention.
Detailed Description
The embodiment of the invention provides a motor imagery task classification method and a motor imagery task classification system based on electroencephalogram signals and deep learning, which can accurately classify motor imagery tasks, so that control instructions are output to external equipment, and accurate control of the external equipment is realized.
The technical scheme in the embodiment of the invention aims to achieve the technical effects, and the overall thought is as follows:
the embodiment of the invention needs to process the original motor imagery electroencephalogram signals. The signal processing is mainly composed of two parts: signal preprocessing and generating a gram angle field diagram;
specifically, for signal preprocessing, a common average reference filter and a fifth-order zero-phase Butterworth filter are designed to perform spatial and frequency filtering on the motor imagery electroencephalogram data, based on the characteristics of the electroencephalogram data in the time and spatial domains, so as to obtain the target frequency bands;
1) Spatial filtering, applying a common average reference filter, removing internal noise and external noise sources, leaving only unique activity of each electroencephalogram signal in each channel;
2) Frequency filtering: the Mu band and the Beta band are extracted using a fifth-order zero-phase Butterworth filter;
3) And expanding the data, and obtaining new data through sliding, reorganizing and shearing.
The Gramian angular field diagram is generated using a time-series encoding method, namely the Gramian angular difference field (GADF): the time series is mapped onto a unit circle in a polar coordinate system so as to preserve temporal correlation. The electroencephalogram signal is then processed according to the Gramian angular field formula, and the corresponding GADF image is computed;
the main processing flow of the generation of the gram angle field diagram is as follows:
1) Sequence normalization: the time series X = {x_1, x_2, …, x_n} is scaled to [-1, 1] with a scaler;
2) Polar coordinate conversion, converting the result of the calculation in the previous step into a polar coordinate time sequence; the radius represents the distance from the pixel point to the center point of the image, and the angle represents the included angle between the pixel point and the vertical direction;
3) After the conversion into the polar coordinate system, the sequence is converted into a feature map symmetric along the diagonal using the Gramian angular difference field encoding technique.
The embodiment of the invention builds a hybrid neural network model integrating a two-layer attention mechanism, which can extract local feature information from the Gramian angular field image and establish global spatial correlations among the features, while the attention mechanism mines information from important regions. The model combines a convolutional neural network (CNN) with a long short-term memory (LSTM) network and adopts a two-layer attention mechanism. It is divided into five parts: a CNN local feature extraction layer, a local feature attention layer, an LSTM global feature learning layer, a global feature attention layer, and a classification layer. The network first mines local features through the CNN, then enhances the expressive power of the basic local features through an attention mechanism, and then uses the long-term memory capability of the recurrent network to build spatial global correlations among the local features. Next, the attention distribution over the hidden states of the LSTM network is computed by an attention mechanism, and a weighted sum of the hidden states is formed. Finally, the result is sent to the classifier as the optimized motor imagery features to obtain the final classification result.
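The global feature attention step described above (computing an attention distribution over the LSTM hidden states and forming their weighted sum) can be sketched in NumPy as follows; the dot-product scoring vector and all array sizes are illustrative assumptions, not the patent's exact parameterization:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def global_feature_attention(hidden_states, score_vector):
    """Attention-weighted summary of LSTM hidden states.

    hidden_states: (T, H) array, one hidden state per time step.
    score_vector:  (H,) learned scoring vector (assumed form).
    Returns the (H,) context vector and the (T,) attention weights.
    """
    scores = hidden_states @ score_vector  # (T,) unnormalised scores
    weights = softmax(scores)              # attention distribution over steps
    context = weights @ hidden_states      # weighted sum of hidden states
    return context, weights

rng = np.random.default_rng(0)
H = rng.normal(size=(8, 16))   # toy example: 8 time steps, 16-dim hidden states
v = rng.normal(size=16)
context, weights = global_feature_attention(H, v)
print(weights.sum())           # the weights form a probability distribution
print(context.shape)
```

In a trained model the scoring vector would be learned jointly with the LSTM; here it is random purely to show the shapes involved.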
In order to better understand the above technical solutions, the following detailed description will refer to the accompanying drawings and specific embodiments.
Referring to fig. 1 and fig. 2, the motor imagery task classification method based on electroencephalogram signals and deep learning provided by the embodiment of the invention includes:
step S110: collecting brain electrical signals;
in order to improve the quality of the electroencephalogram signals and the robustness of the deep learning model, after the electroencephalogram signals are acquired, the method further comprises the following steps:
and filtering and data expansion are carried out on the brain electrical signals.
In the embodiment of the invention, two modes of filtering the electroencephalogram signal are provided, namely spatial filtering and frequency filtering. Specifically, filtering the electroencephalogram signal includes:
one electrode or several electrodes are selected as reference electrodes; in this embodiment, an electrode located at the center of the head is selected as the reference electrode. The average of the electroencephalogram signals is calculated by the formula X̄(t) = (1/N)·Σ_{i=1}^{N} X_i(t) to obtain the reference signal X̄(t); wherein X_i(t) denotes the electroencephalogram signal of each electrode, N is the total number of electrodes, i denotes the electrode index, and t denotes time;
the corrected signal value Y_i(t) is calculated by the formula Y_i(t) = X_i(t) − X̄(t), that is, the value of the reference signal is subtracted from the signal of each electrode to obtain the corrected signal; wherein Y_i(t) is the corrected signal value of the i-th electrode at time t. Removing the influence of the reference signal from each channel yields the corrected signal, thereby realizing spatial filtering of the electroencephalogram signal.
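A minimal NumPy sketch of the common average reference step above, assuming a channels-by-samples array (the channel count and sample count are illustrative):

```python
import numpy as np

def common_average_reference(eeg):
    """Apply a common average reference (CAR) filter.

    eeg: (N, T) array of N electrode channels over T samples.
    Each channel has the cross-channel mean subtracted per sample,
    i.e. Y_i(t) = X_i(t) - (1/N) * sum_j X_j(t).
    """
    reference = eeg.mean(axis=0, keepdims=True)  # (1, T) reference signal
    return eeg - reference

rng = np.random.default_rng(1)
X = rng.normal(size=(22, 1000))      # e.g. 22 channels, 1000 samples
Y = common_average_reference(X)
print(np.abs(Y.mean(axis=0)).max())  # per-sample channel mean is ~0 after CAR
```

After CAR, the average across channels at every sample is zero, which is exactly the property the reference-subtraction formula enforces.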
Filtering the brain electrical signal, further comprising:
defining the cut-off frequencies of the filter, i.e. the range of signal frequencies to be preserved; calculating the parameters of the fifth-order zero-phase Butterworth filter, including pole positions and zero positions, from the normalized cut-off frequencies. The transfer function of the fifth-order zero-phase Butterworth filter is constructed from these parameters, i.e. the filter is expressed as a ratio between the output signal and the input signal.
The electroencephalogram signals are then filtered according to the transfer function. The output of the filter is the new signal obtained after processing: only signals below or above the specified frequencies are retained, while signals at other frequencies are attenuated or removed, thereby realizing frequency filtering of the electroencephalogram signal.
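As a hedged sketch, the fifth-order zero-phase band-pass filtering described above can be realised with SciPy's Butterworth design plus forward-backward filtering (`filtfilt` provides the zero-phase property); the 8-30 Hz band covering the Mu and Beta rhythms and the 250 Hz sampling rate are illustrative assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_zero_phase(signal, fs, low_hz, high_hz, order=5):
    """Fifth-order zero-phase Butterworth band-pass filter.

    Cut-offs are normalised to the Nyquist frequency, the transfer
    function coefficients (b, a) are designed by `butter`, and
    `filtfilt` applies them forward and backward so the net phase
    shift is zero.
    """
    nyquist = fs / 2.0
    b, a = butter(order, [low_hz / nyquist, high_hz / nyquist], btype="band")
    return filtfilt(b, a, signal)

fs = 250.0                      # assumed EEG sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)
# 10 Hz component (inside the band) plus 50 Hz interference (outside it)
x = np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 50 * t)
y = bandpass_zero_phase(x, fs, 8.0, 30.0)
print(y.shape)
```

Applying the filter forward and then backward squares the magnitude response and cancels the phase response, which is the usual software realisation of a "zero-phase" IIR filter.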
Specifically, referring to fig. 3, performing data expansion on an electroencephalogram signal includes:
sliding the C×T original electroencephalogram signal horizontally along the time axis with step size S, dividing the data into two segments, 0-(T-S) and (T-S)-T;
exchanging the order of the two segments, sliding the (T-S)-T segment to the starting point, and repeating k times until kS ≥ T;
cutting each piece of data into a predetermined number of segments using a sliding window of predetermined length, each segment carrying the same label as the original data.
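The cyclic-shift-and-crop augmentation above can be sketched as follows; moving the trailing S samples to the front with `np.roll` is equivalent to swapping the 0-(T-S) and (T-S)-T segments, and the toy sizes, window length, and stride are illustrative assumptions:

```python
import numpy as np

def cyclic_shift_augment(eeg, step):
    """Generate shifted copies of a (C, T) EEG trial.

    Each copy moves the last `step` samples to the front (swapping the
    0..T-step and T-step..T segments), repeated k times until k*step >= T.
    """
    C, T = eeg.shape
    k = int(np.ceil(T / step))
    return [np.roll(eeg, shift * step, axis=1) for shift in range(1, k + 1)]

def sliding_crop(eeg, win, stride):
    """Cut a (C, T) trial into (C, win) crops; each crop keeps the trial's label."""
    C, T = eeg.shape
    return [eeg[:, s:s + win] for s in range(0, T - win + 1, stride)]

rng = np.random.default_rng(2)
trial = rng.normal(size=(3, 12))               # toy trial: 3 channels, 12 samples
shifted = cyclic_shift_augment(trial, step=4)  # k = ceil(12/4) = 3 copies
crops = sliding_crop(shifted[0], win=8, stride=2)
print(len(shifted), len(crops))
```

Because the shifts only reorder samples within a trial, every augmented copy and crop keeps the original trial's class label, which is what makes this scheme usable for training.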
It should be noted that the order of spatial filtering, frequency filtering and data expansion is not limited in this embodiment of the present invention, i.e., the order may be interchanged.
Step S120: normalizing the electroencephalogram signals to obtain normalized data values;
the specific explanation of this step is as follows:
By the formula x̃_i = 2(x_i − min(X)) / (max(X) − min(X)) − 1, the time series X is scaled by a scaler into the range [−1, 1]; wherein x̃_i represents the normalized data value of the time series at the i-th moment, X represents the time series, x_i represents the data value at the i-th moment, max(X) is the maximum value in the time series, and min(X) is the minimum value in the time series.
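A NumPy sketch of this min-max scaling into [−1, 1] (the function name is illustrative):

```python
import numpy as np

def scale_to_unit_interval(x):
    """Min-max scale a 1-D time series into [-1, 1]:
    x_tilde_i = 2 * (x_i - min(X)) / (max(X) - min(X)) - 1
    """
    x = np.asarray(x, dtype=float)
    return 2.0 * (x - x.min()) / (x.max() - x.min()) - 1.0
```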
Step S130: by the formulas θ_i = arccos(x̃_i), x̃_i ∈ X̃, and r_i = t_i / N, performing polar coordinate conversion on the normalized data values; wherein θ_i represents the polar angle, x̃_i represents the data value of the normalized time series at moment i, X̃ represents the normalized time series, r represents the polar radius, t_i represents the time stamp, and N is a constant; the radius represents the distance from the pixel point to the center point of the image, and the angle represents the included angle between the pixel point and the vertical direction.
Step S140: by the formula G_ij = cos(θ_i + θ_j), equivalently G = X̃ᵀ·X̃ − (√(I − X̃²))ᵀ·√(I − X̃²), converting the polar angles into a Gram matrix, which is stored as a feature map symmetric along the diagonal, as shown in fig. 4; wherein I represents the unit vector [1, 1, …, 1], θ_1 and θ_n are the polar angles at time 1 and time n respectively, and X̃ᵀ represents the transposed vector of X̃;
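Taken together, steps S120 to S140 form a Gramian angular field; a compact sketch built directly from the two formulas above (names illustrative):

```python
import numpy as np

def gramian_angular_field(x_scaled):
    """Gramian angular (summation) field of a series already in [-1, 1].

    theta_i = arccos(x_i); the output G[i, j] = cos(theta_i + theta_j)
    is symmetric about its main diagonal, as the text notes.
    """
    theta = np.arccos(np.clip(x_scaled, -1.0, 1.0))  # polar angles
    return np.cos(theta[:, None] + theta[None, :])   # pairwise angle sums
```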
step S150: and inputting the feature images which are symmetrical along the diagonal line into the dual-attention hybrid neural network model to obtain a motor imagery task classification result, so that a control instruction is output to external equipment, and accurate control of the external equipment is realized.
In this embodiment, the dual-attention hybrid neural network includes five levels, and the main process flow of construction is as follows:
s1: the CNN local feature extraction layer consists of a convolution layer and a down-sampling layer: features are first extracted from the input image by the convolution operation, the feature map is then reduced in dimension by the down-sampling operation, and the serialized feature vectors are finally passed into the LSTM network as input features;
s2: the local feature attention layer proceeds in two main steps: first, the channel attention weights of the local feature map are calculated by introducing a learnable parameter; second, a vector weighted sum over the global spatial positions is calculated;
s3: the LSTM global feature learning layer proceeds in two main steps: first, the output sequences of the CNN local feature extraction layer and the local feature attention layer are passed into the LSTM network; second, a global feature vector is generated;
s4: the global feature attention layer takes the hidden states output by the LSTM as input, defines a query vector, and calculates an attention weight for each position from the inner product of the query vector and the input; the final global feature vector is then generated as the weighted sum of the attention weight vector and the LSTM output features;
s5: the classification layer calculates the loss of the whole model on the training set using the cross-entropy loss function.
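The query-based weighting of s4 can be sketched in NumPy, assuming H holds the T LSTM outputs as rows of dimension D; the function name and shapes are illustrative:

```python
import numpy as np

def attention_pool(H, q):
    """Global feature attention: inner products of the query q with each
    LSTM output H[t] give the scores, a softmax turns them into weights,
    and the pooled global feature is the weighted sum of H.
    """
    scores = H @ q                      # (T,) one score per time step
    w = np.exp(scores - scores.max())   # numerically stable softmax
    w = w / w.sum()
    return w @ H                        # (D,) attention-weighted feature vector
```

A zero query weights all positions equally, recovering a plain average of the LSTM outputs.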
Specifically, inputting the feature map symmetrical along the diagonal line into a dual-attention hybrid neural network model to obtain a motor imagery task classification result, which specifically comprises the following steps:
The obtained feature maps are input into the dual-attention hybrid neural network model. The CNN local feature extraction layer first extracts the spatial features of the multi-dimensional input: convolution is performed over the data in longitudinal order, convolution kernel by convolution kernel, producing simplified data; max pooling is then applied, and the maxima of each feature region are passed to the local feature attention layer as serialized feature vectors. Next, the local feature attention layer computes the correlation between each local feature and the query vector from the channel attention weights and the vector weighted sum over global spatial positions, enhancing the expressive power of the basic local features. The LSTM global feature learning layer then uses its gating units and strong memory capacity to establish the global spatial correlations of the Gramian image information. The global feature attention layer subsequently performs a weighted fusion of the outputs of the global feature learning layer, increasing the weight of the features most relevant to the recognition task. Finally, the classification layer classifies the weighted features fused by the global feature attention layer: a fully connected layer reduces the feature dimension to the number of categories, yielding a 4-dimensional vector, which is normalized by a softmax function to obtain the classification result of the motor imagery feature map.
Referring to fig. 5, a motor imagery task classification system based on electroencephalogram signals and deep learning provided by an embodiment of the present invention includes:
an electroencephalogram signal acquisition module 100 for acquiring an electroencephalogram signal;
in order to improve the quality of the electroencephalogram signals and the robustness of the deep learning model, the system further comprises:
the electroencephalogram signal filtering module is used for filtering the electroencephalogram signals;
and the electroencephalogram data expansion module is used for carrying out data expansion on the electroencephalogram signals.
In the embodiment of the invention, two modes of filtering the electroencephalogram signal are provided, namely spatial filtering and frequency filtering. Specifically, the electroencephalogram signal filtering module at least comprises:
a spatial filtering unit for selecting one electrode or a plurality of electrodes as reference electrodes, and for calculating, by the formula X̄(t) = (1/N)·Σ_{i=1}^{N} X_i(t), the average value of the electroencephalogram signals to obtain the reference signal X̄(t); wherein X_i(t) represents the electroencephalogram signal of each electrode, N is the total number of electrodes, i represents the electrode index, and t represents time;
a signal correction unit for calculating, by the formula Y_i(t) = X_i(t) − X̄(t), the corrected signal value Y_i(t), i.e. subtracting the value of the reference signal from the signal of each electrode to obtain the corrected signal; wherein Y_i(t) is the corrected signal value of the i-th electrode at time t. Removing the influence of the reference signal yields the corrected signal and thereby realizes spatial filtering of the electroencephalogram signal. In this embodiment, an electrode located at the center of the head is selected as the reference electrode.
The electroencephalogram signal filtering module at least comprises:
a frequency filtering unit, configured to define the cut-off frequency of the filter, i.e. the range of signal frequencies to be preserved; to calculate the parameters of a fifth-order zero-phase Butterworth filter from the normalized cut-off frequency, the parameters including the pole positions, the zero positions and the like; to construct the transfer function of the filter from these parameters, i.e. to express the filter as the ratio between the output signal and the input signal; and to filter the electroencephalogram signal according to the transfer function, the output of the filter being the new signal obtained after processing, in which only components below or above the designated frequency are retained while components at other frequencies are attenuated or removed, thereby realizing frequency filtering of the electroencephalogram signal.
An electroencephalogram data expansion module, comprising:
a cyclic sliding recombination unit for sliding the C×T original electroencephalogram signal horizontally along the time axis with a step size of S, which divides the data into two segments, 0 to (T−S) and (T−S) to T; and for exchanging the order of the two segments, i.e. sliding the (T−S) to T segment to the starting point, and repeating k times until kS≥T;
and a data clipping unit for cutting each piece of data into a predetermined number of crops using a window of predetermined length that slides over the data, each crop carrying the same label as the original data.
The normalization module 200 is used for performing normalization processing on the electroencephalogram signals to obtain normalized data values;
specifically, the normalization module 200 is configured to scale, by the formula x̃_i = 2(x_i − min(X)) / (max(X) − min(X)) − 1, the time series X by a scaler into the range [−1, 1]; wherein x̃_i represents the normalized data value of the time series at the i-th moment, X represents the time series, x_i represents the data value at the i-th moment, max(X) is the maximum value in the time series, and min(X) is the minimum value in the time series.
A polar coordinate conversion module 300 for performing, by the formulas θ_i = arccos(x̃_i), x̃_i ∈ X̃, and r_i = t_i / N, polar coordinate conversion on the normalized data values; wherein θ_i represents the polar angle, x̃_i represents the data value of the normalized time series at moment i, X̃ represents the normalized time series, r represents the polar radius, t_i represents the time stamp, and N is a constant; the radius represents the distance from the pixel point to the center point of the image, and the angle represents the included angle between the pixel point and the vertical direction.
A Gram matrix conversion module 400 for converting, by the formula G_ij = cos(θ_i + θ_j), equivalently G = X̃ᵀ·X̃ − (√(I − X̃²))ᵀ·√(I − X̃²), the polar angles into a Gram matrix, which is stored as a feature map symmetric along the diagonal; wherein I represents the unit vector [1, 1, …, 1], θ_1 and θ_n are the polar angles at time 1 and time n respectively, and X̃ᵀ represents the transposed vector of X̃;
the motor imagery task classification module 500 is configured to input a feature map symmetrical along a diagonal line into the dual-attention hybrid neural network model to obtain a motor imagery task classification result.
In this embodiment, the dual-attention hybrid neural network includes five levels, and the main process flow of construction is as follows:
s1: the CNN local feature extraction layer consists of a convolution layer and a down-sampling layer: features are first extracted from the input image by the convolution operation, the feature map is then reduced in dimension by the down-sampling operation, and the serialized feature vectors are finally passed into the LSTM network as input features;
s2: the local feature attention layer proceeds in two main steps: first, the channel attention weights of the local feature map are calculated by introducing a learnable parameter; second, a vector weighted sum over the global spatial positions is calculated;
s3: the LSTM global feature learning layer proceeds in two main steps: first, the output sequences of the CNN local feature extraction layer and the local feature attention layer are passed into the LSTM network; second, a global feature vector is generated;
s4: the global feature attention layer takes the hidden states output by the LSTM as input, defines a query vector, and calculates an attention weight for each position from the inner product of the query vector and the input; the final global feature vector is then generated as the weighted sum of the attention weight vector and the LSTM output features;
s5: the classification layer calculates the loss of the whole model on the training set using the cross-entropy loss function.
Specifically, the motor imagery task classification module 500 is configured to input the feature maps symmetric along the diagonal into the dual-attention hybrid neural network model. The CNN local feature extraction layer first extracts the spatial features of the multi-dimensional input: convolution is performed over the data in longitudinal order, convolution kernel by convolution kernel, producing simplified data; max pooling is then applied, and the maxima of each feature region are passed to the local feature attention layer as serialized feature vectors. Next, the local feature attention layer computes the correlation between each local feature and the query vector from the channel attention weights and the vector weighted sum over global spatial positions, enhancing the expressive power of the basic local features. The LSTM global feature learning layer then uses its gating units and strong memory capacity to establish the global spatial correlations of the Gramian image information. The global feature attention layer subsequently performs a weighted fusion of the outputs of the global feature learning layer, increasing the weight of the features most relevant to the recognition task. Finally, the classification layer classifies the weighted features fused by the global feature attention layer: a fully connected layer reduces the feature dimension to the number of categories, yielding a 4-dimensional vector, which is normalized by a softmax function to obtain the classification result of the motor imagery feature map.
The embodiment of the invention provides a motor imagery task classification method and a motor imagery task classification system based on electroencephalogram signals and deep learning, which can accurately classify motor imagery tasks, so that control instructions are output to external equipment, and accurate control of the external equipment is realized.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Details that are not described in the embodiments of the present invention are well known to those skilled in the art. Finally, it is noted that the above embodiments are intended only to illustrate the technical solution of the present invention and not to limit it; although the present invention has been described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications and equivalents may be made without departing from the spirit and scope of the technical solution of the present invention, which is intended to be covered by the scope of the claims of the present invention.

Claims (10)

1. A motor imagery task classification method based on electroencephalogram signals and deep learning is characterized by comprising the following steps:
collecting brain electrical signals;
normalizing the electroencephalogram signals to obtain normalized data values;
by the formulas θ_i = arccos(x̃_i), x̃_i ∈ X̃, and r_i = t_i / N, performing polar coordinate conversion on the normalized data values; wherein θ_i represents the polar angle, x̃_i represents the data value of the normalized time series at moment i, X̃ represents the normalized time series, r represents the polar radius, t_i represents the time stamp, and N is a constant;
by the formula G_ij = cos(θ_i + θ_j), equivalently G = X̃ᵀ·X̃ − (√(I − X̃²))ᵀ·√(I − X̃²), converting the polar angles into a Gram matrix, which is stored as a feature map symmetric along the diagonal; wherein I represents the unit vector, θ_1 and θ_n are the polar angles at time 1 and time n respectively, and X̃ᵀ represents the transposed vector of X̃;
and inputting the feature map symmetrical along the diagonal line into a dual-attention hybrid neural network model to obtain a motor imagery task classification result.
2. The motor imagery task classification method based on brain signals and deep learning of claim 1, further comprising, after the acquiring of the brain signals:
and filtering and data expansion are carried out on the electroencephalogram signals.
3. The motor imagery task classification method based on brain signals and deep learning of claim 2, wherein the filtering the brain signals includes:
by the formula X̄(t) = (1/N)·Σ_{i=1}^{N} X_i(t), calculating the average value of the electroencephalogram signals to obtain the reference signal X̄(t); wherein X_i(t) represents the electroencephalogram signal of each electrode, N is the total number of electrodes, i represents the electrode index, and t represents time;
by the formula Y_i(t) = X_i(t) − X̄(t), calculating the corrected signal value Y_i(t).
4. The motor imagery task classification method based on brain signals and deep learning of claim 2, wherein the filtering the brain signals includes:
calculating the parameters of a fifth-order zero-phase Butterworth filter according to the normalized cut-off frequency, and constructing the transfer function of the fifth-order zero-phase Butterworth filter through the parameters;
and filtering the electroencephalogram signals according to the transfer function.
5. The motor imagery task classification method based on brain signals and deep learning of claim 2, wherein the performing data expansion on the brain signals comprises:
sliding the C×T original electroencephalogram signal horizontally along the time axis with a step size of S, which divides the data into two segments, 0 to (T−S) and (T−S) to T;
exchanging the order of the two segments, i.e. sliding the (T−S) to T segment to the starting point, and repeating k times until kS≥T;
cutting each piece of data into a predetermined number of crops using a window of predetermined length that slides over the data, each crop carrying the same label as the original data.
6. An electroencephalogram signal and deep learning-based motor imagery task classification system, which is characterized by comprising:
the electroencephalogram signal acquisition module is used for acquiring electroencephalogram signals;
the normalization module is used for carrying out normalization processing on the electroencephalogram signals to obtain normalized data values;
the polar coordinate conversion module is used for performing, by the formulas θ_i = arccos(x̃_i), x̃_i ∈ X̃, and r_i = t_i / N, polar coordinate conversion on the normalized data values; wherein θ_i represents the polar angle, x̃_i represents the data value of the normalized time series at moment i, X̃ represents the normalized time series, r represents the polar radius, t_i represents the time stamp, and N is a constant;
a Gram matrix conversion module for converting, by the formula G_ij = cos(θ_i + θ_j), equivalently G = X̃ᵀ·X̃ − (√(I − X̃²))ᵀ·√(I − X̃²), the polar angles into a Gram matrix, which is stored as a feature map symmetric along the diagonal; wherein I represents the unit vector, θ_1 and θ_n are the polar angles at time 1 and time n respectively, and X̃ᵀ represents the transposed vector of X̃;
and the motor imagery task classification module is used for inputting the feature images which are symmetrical along the diagonal line into the dual-attention hybrid neural network model to obtain a motor imagery task classification result.
7. The motor imagery task classification system based on electroencephalogram signals and deep learning as set forth in claim 6, further comprising:
the electroencephalogram signal filtering module is used for filtering the electroencephalogram signals;
and the electroencephalogram data expansion module is used for carrying out data expansion on the electroencephalogram signals.
8. The motor imagery task classification system based on brain signals and deep learning of claim 7, wherein the brain signal filtering module comprises at least:
a spatial filtering unit for calculating, by the formula X̄(t) = (1/N)·Σ_{i=1}^{N} X_i(t), the average value of the electroencephalogram signals to obtain the reference signal X̄(t); wherein X_i(t) represents the electroencephalogram signal of each electrode, N is the total number of electrodes, i represents the electrode index, and t represents time;
a signal correction unit for calculating, by the formula Y_i(t) = X_i(t) − X̄(t), the corrected signal value Y_i(t).
9. The motor imagery task classification system based on brain signals and deep learning of claim 7, wherein the brain signal filtering module comprises at least:
the frequency filtering unit is used for calculating the parameters of a fifth-order zero-phase Butterworth filter according to the normalized cut-off frequency, for constructing the transfer function of the fifth-order zero-phase Butterworth filter through the parameters, and for filtering the electroencephalogram signals according to the transfer function.
10. The motor imagery task classification system based on brain signals and deep learning of claim 7, wherein the brain signal data expansion module comprises:
a cyclic sliding recombination unit for sliding the C×T original electroencephalogram signal horizontally along the time axis with a step size of S, which divides the data into two segments, 0 to (T−S) and (T−S) to T; and for exchanging the order of the two segments, i.e. sliding the (T−S) to T segment to the starting point, and repeating k times until kS≥T;
and a data clipping unit for cutting each piece of data into a predetermined number of crops using a window of predetermined length that slides over the data, each crop carrying the same label as the original data.
CN202310843420.5A 2023-07-10 2023-07-10 Motor imagery task classification method and system based on electroencephalogram signals and deep learning Pending CN116842329A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310843420.5A CN116842329A (en) 2023-07-10 2023-07-10 Motor imagery task classification method and system based on electroencephalogram signals and deep learning

Publications (1)

Publication Number Publication Date
CN116842329A true CN116842329A (en) 2023-10-03

Family

ID=88174060

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310843420.5A Pending CN116842329A (en) 2023-07-10 2023-07-10 Motor imagery task classification method and system based on electroencephalogram signals and deep learning

Country Status (1)

Country Link
CN (1) CN116842329A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108564039A (en) * 2018-04-16 2018-09-21 北京工业大学 A kind of epileptic seizure prediction method generating confrontation network based on semi-supervised deep layer
CN112418351A (en) * 2020-12-11 2021-02-26 天津大学 Zero sample learning image classification method based on global and local context sensing
CN113768515A (en) * 2021-09-17 2021-12-10 重庆邮电大学 Electrocardiosignal classification method based on deep convolutional neural network
CN114564991A (en) * 2022-02-28 2022-05-31 合肥工业大学 Electroencephalogram signal classification method based on Transformer guide convolution neural network
CN115804602A (en) * 2022-12-21 2023-03-17 西京学院 Electroencephalogram emotion signal detection method, equipment and medium based on attention mechanism and with multi-channel feature fusion
CN116250846A (en) * 2023-03-15 2023-06-13 电子科技大学 Multi-branch motor imagery electroencephalogram signal feature fusion classification method based on data conversion
CN116385580A (en) * 2023-03-15 2023-07-04 大连理工大学 Brain function image noise suppression method based on space-time characteristics

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
YAQI CHU et al.: "Decoding multiclass motor imagery EEG from the same upper limb by combining Riemannian geometry features and partial least squares regression", Journal of Neural Engineering, pages 1-19 *
ZIYANG FU et al.: "Deep Learning Model of Sleep EEG Signal by Using Bidirectional Recurrent Neural Network Encoding and Decoding", MDPI, pages 1-12 *
ZENG Biqing et al.: "Research on sentiment analysis with a hierarchical dual-attention neural network model", CAAI Transactions on Intelligent Systems, vol. 15, no. 3, 26 December 2019 (2019-12-26), pages 460-467 *
LIANG Haopeng et al.: "Small-sample fault diagnosis method for rotating machinery based on GADF and PAM-ResNet", Control and Decision, 1 August 2022 (2022-08-01), pages 1-8 *
CHEN Min: "Research on ECG signal analysis and arrhythmia classification methods", China Master's Theses Full-text Database (Medicine and Health Sciences), pages 062-82 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination