CN110110855A - Brain network reconstruction method based on a deep recurrent neural network and supervised dictionary learning - Google Patents

Brain network reconstruction method based on a deep recurrent neural network and supervised dictionary learning

Info

Publication number
CN110110855A
CN110110855A (application CN201910452205.6A; granted as CN110110855B)
Authority
CN
China
Prior art keywords
brain
network
dictionary
matrix
deep
Prior art date: 2019-05-28
Legal status
Granted
Application number
CN201910452205.6A
Other languages
Chinese (zh)
Other versions
CN110110855B (en)
Inventor
赵世杰
张松瑶
王琦钰
韩军伟
郭雷
Current Assignee
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date: 2019-05-28
Filing date: 2019-05-28
Publication date: 2019-08-09
Application filed by Northwestern Polytechnical University
Priority to CN201910452205.6A
Publication of CN110110855A: 2019-08-09
Application granted; publication of CN110110855B: 2022-04-29
Legal status: Active

Classifications

    • G06F 17/16: Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G06F 17/18: Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis
    • G06N 3/045: Computing arrangements based on biological models; neural networks; combinations of networks
    • G06N 3/084: Neural network learning methods; backpropagation, e.g. using gradient descent

Abstract

The present invention relates to a brain network reconstruction method based on a deep recurrent neural network and supervised dictionary learning. It is a novel brain network reconstruction method that fuses the advantages of model-driven and data-driven approaches and is used to reconstruct diverse and concurrent functional brain networks from task-state fMRI data. Specifically, a deep recurrent neural network is used to automatically derive diverse, adaptive regression variables, and a supervised dictionary learning method then uses these regression variables to reconstruct the task-evoked functional brain networks. The invention proposes a deep recurrent neural network that automatically learns data-driven regression variables; the brain network activation maps of these regression variables are then reconstructed with an efficient supervised dictionary learning and sparse representation method. Experiments show that this method is superior in identifying diverse and complex concurrent brain networks.

Description

Brain network reconstruction method based on a deep recurrent neural network and supervised dictionary learning
Technical field
The invention belongs to the field of medical image processing and relates to a brain network reconstruction method based on a deep recurrent neural network and supervised dictionary learning, which can be applied to the analysis of brain functional activity.
Background art
Task-state functional magnetic resonance imaging (tfMRI) is an emerging brain imaging technique that uses magnetic resonance imaging to measure the hemodynamic changes caused by neuronal activity. Because of its high spatial resolution, non-invasiveness and absence of radiation exposure, it is widely used in brain functional activity analysis and clinical diagnosis. The detection and reconstruction of brain network components from task-state fMRI data is one of the main research topics of task-state functional magnetic resonance imaging and the basis of brain functional activity analysis and its clinical application.
Although many traditional brain network reconstruction methods exist, the primary analysis method for task-state fMRI data is model-driven, namely the general linear model (GLM). The basic idea of this model-driven approach is to reconstruct the task-evoked functional brain networks using hypothesized regression variables derived from the hemodynamic response function (HRF) and its derivatives. A main problem of such methods, however, is that the regression variables are over-simplified and lack adaptability; they neither take the sequential nature of task-state fMRI data into account nor capture other diverse, concurrent brain activity networks. New brain network reconstruction methods tailored to the characteristics of fMRI data and brain activity are therefore needed.
Existing brain network reconstruction methods share the following drawback: the regression variables are over-simplified and lack generality, so many diverse, concurrent brain activity networks are ignored.
Summary of the invention
Technical problem to be solved
In order to overcome the shortcomings of the prior art, the present invention proposes a brain network reconstruction method based on a deep recurrent neural network and supervised dictionary learning.
Technical solution
A brain network reconstruction method based on a deep recurrent neural network and supervised dictionary learning, characterized by the following steps:
Step 1, construct the network model: the network model is a deep recurrent neural network (DRNN) consisting of two recurrent neural layers (RNN) and one fully connected layer (FC); the two RNN layers and the FC layer form a cascaded structure; each RNN layer contains 30 LSTM (long short-term memory) units;
Step 2, train the DRNN constructed in step 1: the task matrix T from the tfMRI data set is used as the input of the DRNN, and the network model is trained until it converges; the outputs of the units of the top RNN layer yield the data-driven regression variables, and the predicted whole-brain signal matrix S is obtained after the fully connected layer;
Step 3: use the whole-brain signal matrix S as the input of supervised dictionary learning; during learning, the part D_c of the dictionary D is kept constant and excluded from the update, and only the remaining part D_l is optimized, yielding the dictionary D and the coefficient matrix A;
Step 4: map each row of the coefficient matrix A onto the standard brain template to generate the data-driven brain network activation maps derived by the DRNN for the motor task, completing the reconstruction of the spatial brain networks.
The DRNN constructed in step 1 and trained in step 2 works as follows: the stimulus matrix at each time point is flattened into a vector of dimension n × 1, and the stimulus vectors of the t time points are concatenated into a matrix T of size n × t, which serves as the input of the DRNN; each unit of the top RNN layer outputs a regression variable representing the brain activity at specific times, and the fully connected layer then produces the whole-brain tfMRI signal matrix S of dimension m × t, where m is the number of voxels and t is the number of time points.
Step 3 in detail: the whole-brain tfMRI signal S is a sparse linear combination of the atoms of an underlying dictionary D; the signal of each dictionary atom in D represents the functional activity of a specific brain network, and the corresponding weight vector in A represents the spatial distribution of that brain network. The dictionary atoms are divided into two parts,
D = [D_c, D_l],
where D_c are the predefined, model-driven dictionary atoms and D_l are the dictionary atoms driven by the tfMRI data; only D_l is optimized, with the following steps:
Step a: input the signal S and the initial dictionary D_0 = [D_c, D_l0], where D_0 is the initial dictionary matrix, D_c are the predefined dictionary atoms, D_l0 is randomly initialized, and T is the number of iterations;
Step b: start the loop with iteration index iter = 1 : T and set i = iter mod n, where T > n;
Step c: extract the signal vector s_i from S;
Step d: sparse coding: A_i = argmin_{A_i} (1/2) ||s_i - D A_i||_2^2 + λ ||A_i||_1;
Step e: update D_l(t) so as to reduce the reconstruction error of s_i under the current sparse code A_i, while D_c remains unchanged;
Step f: end the loop and return the dictionary D and the coefficient matrix A.
Beneficial effects
The brain network reconstruction method based on a deep recurrent neural network and supervised dictionary learning proposed by the present invention is a novel brain network reconstruction method that fuses the advantages of model-driven and data-driven approaches and reconstructs diverse and concurrent functional brain networks from task-state functional magnetic resonance imaging data. Specifically, a deep recurrent neural network is used to automatically derive diverse, adaptive regression variables, and a supervised dictionary learning method then uses these regression variables to reconstruct the task-evoked functional brain networks.
The invention proposes a deep recurrent neural network for automatically learning data-driven regression variables. The brain network activation maps of these regression variables are then reconstructed with an efficient supervised dictionary learning and sparse representation method. Experiments show that this method is superior in identifying diverse and complex concurrent brain networks.
First, compared with conventional methods that use model-driven regression variables, such as the general linear model (GLM), the proposed method derives the regression variables in a data-driven way. Second, in the supervised dictionary learning stage that reconstructs the brain networks, the regressors derived by the DRNN are used as the constant part of the dictionary atoms and only the other dictionary atoms are optimized. This fully reveals the specific brain network activity under task stimulation and also identifies many other concurrent functional brain networks, such as networks with different time delays, demonstrating the robustness and superiority of the model.
Brief description of the drawings
Fig. 1: flow chart of the method of the present invention. The task stimulus matrix T is first used as input; after two RNN layers and one fully connected layer, the whole-brain tfMRI signal matrix S is output. Supervised dictionary learning then yields the dictionary D and the coefficient matrix A, and each row of A reconstructs a spatial brain network under task stimulation.
Fig. 2: part of the data-driven regression variables derived by the DRNN under the motor task.
Fig. 3: examples of individual and group spatial responses reconstructed from the data-driven regression variables derived by the DRNN.
Fig. 4: spatial brain networks reconstructed from regression variables with different time delays under the motor task design.
Specific embodiment
The invention is further described below in conjunction with an embodiment and the accompanying drawings:
Step 1: choose the data set. The HCP data set is considered one of the most systematic and comprehensive neuroimaging data sets; it is therefore used as an example to verify the feasibility of the proposed brain network reconstruction method under task stimulation.
Step 2: construct the deep recurrent neural network model (DRNN). The network model DRNN in this experiment consists of two RNN layers and one fully connected layer, where each RNN layer is composed of 30 LSTM units. The two RNN layers and the fully connected layer form a cascaded structure.
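For illustration, a minimal sketch of such a network is given below in PyTorch. The two recurrent layers with 30 LSTM units each and the fully connected output layer follow the description above; the library choice, the class and variable names, and everything not stated in the text are assumptions for illustration, not part of the patent.

```python
import torch
import torch.nn as nn

class DRNN(nn.Module):
    """Two cascaded recurrent (LSTM) layers followed by one fully connected layer.

    Input : task design matrix T, shaped (batch, t, n) -- t time points, n stimulus entries.
    Output: predicted whole-brain signal, shaped (batch, t, m) -- m voxels
            (the transpose of the m x t matrix S in the text), plus the outputs of the
            top RNN layer, whose 30 units serve as the data-driven regression variables.
    """
    def __init__(self, n_conditions: int, n_voxels: int, n_units: int = 30):
        super().__init__()
        # Two stacked recurrent layers, each with 30 LSTM units.
        self.rnn = nn.LSTM(input_size=n_conditions, hidden_size=n_units,
                           num_layers=2, batch_first=True)
        # Fully connected layer mapping the top-layer outputs to the whole-brain signal.
        self.fc = nn.Linear(n_units, n_voxels)

    def forward(self, task_matrix: torch.Tensor):
        regressors, _ = self.rnn(task_matrix)   # (batch, t, 30): data-driven regressors
        signal = self.fc(regressors)            # (batch, t, m): predicted whole-brain signal
        return signal, regressors
```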
Step 3: train the deep recurrent neural network (DRNN) with the task stimulus matrix T from the HCP data set.
Step 3a: the input of the DRNN is the task design matrix T. The stimulus matrix at each time point is flattened into a vector (of dimension n × 1), and the stimulus vectors over the run are concatenated into a matrix T of size n × t, which serves as the input of the DRNN.
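A minimal NumPy sketch of this reshaping step, assuming the per-time-point stimulus matrices are available as a single stacked array (the file name and array layout are hypothetical):

```python
import numpy as np

# Hypothetical input: one stimulus matrix per time point, stacked along the first axis,
# shape (t, a, b) with a * b = n stimulus entries per time point.
stimulus_frames = np.load("task_design_frames.npy")   # placeholder file name

t = stimulus_frames.shape[0]
T = stimulus_frames.reshape(t, -1).T                   # flatten each frame and stack: (n, t)
n = T.shape[0]
```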
Step 3b: the output of the DRNN is the whole-brain signal matrix S. Each unit of the top RNN layer outputs a signal that represents the brain activity at specific times and serves as a regression variable for the next analysis step. A fully connected layer then produces the whole-brain tfMRI signal matrix S (of dimension m × t, where m is the number of voxels and t is the number of time points).
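Putting the two previous sketches together, the DRNN can be fitted roughly as follows. The patent specifies only that the network is trained to convergence with T as input and the whole-brain signal as output; the mean-squared-error loss, the Adam optimizer, the epoch count and the placeholder data below are assumptions for illustration.

```python
import numpy as np
import torch

# Observed whole-brain tfMRI signals, shape (m, t) in the text; placeholder file name here.
S_observed = np.load("whole_brain_signals.npy")        # shape (m, t)
m = S_observed.shape[0]

# Time-major tensors: (1, t, n) input and (1, t, m) target.
T_in = torch.from_numpy(T.T[None].astype(np.float32))
S_target = torch.from_numpy(S_observed.T[None].astype(np.float32))

model = DRNN(n_conditions=n, n_voxels=m)               # DRNN sketch from step 2
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = torch.nn.MSELoss()

for epoch in range(500):                               # "train until convergence"
    optimizer.zero_grad()
    S_pred, regressors = model(T_in)
    loss = loss_fn(S_pred, S_target)                   # fit predicted to observed signals
    loss.backward()                                    # backpropagation
    optimizer.step()

# After training:
#   S_pred (1, t, m)      -> predicted whole-brain signal matrix S (transposed)
#   regressors (1, t, 30) -> data-driven regression variables of the top RNN layer
```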
Step 4: perform supervised dictionary learning with the whole-brain signal matrix S obtained in step 3 as input.
Step 4a: the whole-brain tfMRI signal matrix S can be sparsely represented by a dictionary D. In dictionary learning, the whole-brain tfMRI signal S is regarded as a sparse linear combination of the atoms of an underlying dictionary D, i.e. s_i = D × A_i and S = D × A, where A is the coefficient matrix of the sparse representation. In particular, the signal of each dictionary atom in D represents the functional activity of a specific brain network, and the corresponding weight vector in A represents the spatial distribution of that brain network.
Step 4b: in the present invention the dictionary atoms are divided into two parts, D = [D_c, D_l], where D_c are the fixed dictionary atoms defined in advance (here, the regression variables derived by the DRNN) and D_l are the dictionary atoms driven by the tfMRI data. In this method, only D_l is optimized.
For the signal S, the cost function is defined as
min_{D, A} (1/2) ||S - D A||_F^2 + λ ||A||_1,
where λ is the regularization parameter. In short, the supervised dictionary learning problem can be rewritten as the matrix factorization problem
min_{D_l, A} (1/2) ||S - [D_c, D_l] A||_F^2 + λ ||A||_1,
in which D_c is held fixed; equivalently, S is approximated as S ≈ D_c A_c + D_l A_l with the coefficient matrix A partitioned accordingly.
The iterative procedure of the method is as follows:
Step a: input the signal S and the initial dictionary D_0 = [D_c, D_l0], where D_0 is the initial dictionary matrix, D_c are the predefined dictionary atoms, D_l0 is randomly initialized, and T is the number of iterations;
Step b: start the loop with iteration index iter = 1 : T and set i = iter mod n, where T > n;
Step c: extract the signal vector s_i from S;
Step d: sparse coding: A_i = argmin_{A_i} (1/2) ||s_i - D A_i||_2^2 + λ ||A_i||_1;
Step e: update D_l(t) so as to reduce the reconstruction error of s_i under the current sparse code A_i, while D_c remains unchanged;
Step f: end the loop and return the dictionary D and the coefficient matrix A.
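The loop in steps a-f can be sketched as below. This is a minimal illustration only: it assumes the DRNN-derived regressors form the fixed sub-dictionary D_c, solves the sparse coding step with an L1-regularized least-squares (Lasso) solver, and refreshes each used atom of D_l from the current residual with unit-norm columns; the patent's exact update rule for step e is not reproduced, and the parameter names and defaults are hypothetical.

```python
import numpy as np
from sklearn.linear_model import Lasso

def supervised_dictionary_learning(S, D_c, k_l, lam=0.1, n_iters=None, seed=0):
    """Supervised dictionary learning sketch: D = [D_c, D_l], only D_l is updated.

    S   : (t, m) signal matrix, one column (time course) per voxel
    D_c : (t, k_c) fixed atoms (e.g. the DRNN-derived regressors)
    k_l : number of data-driven atoms to learn
    Returns D = [D_c, D_l] and the coefficient matrix A of shape (k_c + k_l, m),
    whose rows are the spatial maps of the brain networks.
    """
    rng = np.random.default_rng(seed)
    t, m = S.shape
    k_c = D_c.shape[1]
    D_l = rng.standard_normal((t, k_l))
    D_l /= np.linalg.norm(D_l, axis=0)                  # step a: D_l0 random, unit-norm atoms
    A = np.zeros((k_c + k_l, m))
    n_iters = n_iters or 2 * m                          # step b: more iterations than signals

    for it in range(n_iters):
        i = it % m                                      # i = iter mod (number of signals)
        s_i = S[:, i]                                   # step c: extract s_i
        D = np.hstack([D_c, D_l])
        # Step d: sparse coding  A_i = argmin 1/2 ||s_i - D A_i||^2 + lam * ||A_i||_1
        # (sklearn's Lasso scales the quadratic term by 1/t, hence alpha = lam / t).
        lasso = Lasso(alpha=lam / t, fit_intercept=False, max_iter=2000)
        A[:, i] = lasso.fit(D, s_i).coef_
        # Step e: update only D_l; D_c stays fixed (illustrative per-atom refresh).
        a_c, a_l = A[:k_c, i], A[k_c:, i]
        residual = s_i - D_c @ a_c
        for j in np.flatnonzero(a_l):
            r_j = residual - D_l @ a_l + D_l[:, j] * a_l[j]   # residual without atom j
            norm = np.linalg.norm(r_j)
            if norm > 1e-12:
                D_l[:, j] = r_j / norm
    # Step f: return D and A.
    return np.hstack([D_c, D_l]), A
```

Consistent with the text, the rows of the returned A split into A_c (the maps of the task-evoked networks associated with the fixed atoms) and A_l (the maps of the data-driven networks).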
Consistent with the partitioning of the dictionary D, the coefficient matrix A is likewise split into two parts: A_c, the spatial maps of the brain networks under task stimulation reconstructed from the regression variables derived by the DRNN, and A_l, the spatial distributions of the data-driven, complex concurrent brain networks.
Step 5: reconstruct the task-evoked brain networks from the coefficient matrix A obtained in step 4. Each row of A is mapped onto the standard brain template, generating the data-driven brain network activation maps derived by the DRNN for the motor task and thereby completing the reconstruction of the spatial brain networks.
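As an illustration of this mapping step, each row of A (one coefficient per voxel) can be written back into standard brain space using the brain mask that defined the m voxels; the sketch below uses nilearn's unmask utility, and the file names are placeholders.

```python
import nibabel as nib
import numpy as np
from nilearn.masking import unmask

def save_activation_maps(A: np.ndarray, mask_path: str, out_prefix: str = "brain_network"):
    """Map each row of the coefficient matrix A back into a 3D brain volume and save it.

    A         : (k, m) coefficient matrix; each row is a spatial map over the m voxels.
    mask_path : brain mask (NIfTI) defining the m voxels in standard space.
    """
    mask_img = nib.load(mask_path)
    for idx, row in enumerate(A):
        img = unmask(row, mask_img)                     # (m,) vector -> 3D volume
        nib.save(img, f"{out_prefix}_{idx:03d}.nii.gz")

# Example (placeholder path):
# save_activation_maps(A, "mni152_brain_mask.nii.gz")
```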

Claims (3)

1. A brain network reconstruction method based on a deep recurrent neural network and supervised dictionary learning, characterized by the following steps:
Step 1, construct the network model: the network model is a deep recurrent neural network (DRNN) consisting of two recurrent neural layers (RNN) and one fully connected layer (FC); the two RNN layers and the FC layer form a cascaded structure; each RNN layer contains 30 LSTM (long short-term memory) units;
Step 2, train the DRNN constructed in step 1: the task matrix T from the tfMRI data set is used as the input of the DRNN, and the network model is trained until it converges; the outputs of the units of the top RNN layer yield the data-driven regression variables, and the predicted whole-brain signal matrix S is obtained after the fully connected layer;
Step 3: use the whole-brain signal matrix S as the input of supervised dictionary learning; during learning, the part D_c of the dictionary D is kept constant and excluded from the update, and only the remaining part D_l is optimized, yielding the dictionary D and the coefficient matrix A;
Step 4: map each row of the coefficient matrix A onto the standard brain template to generate the data-driven brain network activation maps derived by the DRNN for the motor task, completing the reconstruction of the spatial brain networks.
2. The brain network reconstruction method based on a deep recurrent neural network and supervised dictionary learning according to claim 1, characterized in that the DRNN constructed in step 1 and trained in step 2 works as follows: the stimulus matrix at each time point is flattened into a vector of dimension n × 1, and the stimulus vectors of the t time points are concatenated into a matrix T of size n × t, which serves as the input of the DRNN; each unit of the top RNN layer outputs a regression variable representing the brain activity at specific times, and the fully connected layer then produces the whole-brain tfMRI signal matrix S of dimension m × t, where m is the number of voxels and t is the number of time points.
3. The brain network reconstruction method based on a deep recurrent neural network and supervised dictionary learning according to claim 1, characterized in that in step 3 the whole-brain tfMRI signal S is a sparse linear combination of the atoms of an underlying dictionary D; the signal of each dictionary atom in D represents the functional activity of a specific brain network, and the corresponding weight vector in A represents the spatial distribution of that brain network; the dictionary atoms are divided into two parts, D = [D_c, D_l], where D_c are the predefined, model-driven dictionary atoms and D_l are the dictionary atoms driven by the tfMRI data, and only D_l is optimized, with the following steps:
Step a: input the signal S and the initial dictionary D_0 = [D_c, D_l0], where D_0 is the initial dictionary matrix, D_c are the predefined dictionary atoms, D_l0 is randomly initialized, and T is the number of iterations;
Step b: start the loop with iteration index iter = 1 : T and set i = iter mod n, where T > n;
Step c: extract the signal vector s_i from S;
Step d: sparse coding: A_i = argmin_{A_i} (1/2) ||s_i - D A_i||_2^2 + λ ||A_i||_1;
Step e: update D_l(t) so as to reduce the reconstruction error of s_i under the current sparse code A_i, while D_c remains unchanged;
Step f: end the loop and return the dictionary D and the coefficient matrix A.
CN201910452205.6A (filed 2019-05-28, priority date 2019-05-28): Brain network reconstruction method based on a deep recurrent neural network and supervised dictionary learning. Status: Active. Granted and published as CN110110855B (en).

Publications (2)

CN110110855A (en): published 2019-08-09
CN110110855B (en): published 2022-04-29

Family ID: 67492686
Family application: CN201910452205.6A (filed 2019-05-28), granted as CN110110855B (en), country CN

Legal Events

Code  Description
PB01  Publication
SE01  Entry into force of request for substantive examination
GR01  Patent grant