CN113505528A - Method, system, device and storage medium for identifying brain neural development time-varying functional connection difference - Google Patents


Info

Publication number
CN113505528A
Authority
CN
China
Prior art keywords: dictionary, sparse, data, time, sparse depth
Prior art date
Legal status: Granted
Application number
CN202110707741.3A
Other languages: Chinese (zh)
Other versions: CN113505528B (en)
Inventor
乔琛
杨岚
李佳嘉
吴娇
于爱菊
龚若林
Current Assignee: Xian Jiaotong University
Original Assignee: Xian Jiaotong University
Application filed by Xian Jiaotong University
Priority to CN202110707741.3A
Publication of CN113505528A
Application granted
Publication of CN113505528B
Legal status: Active

Classifications

    • G06F30/27 Design optimisation, verification or simulation using machine learning, e.g. artificial intelligence, neural networks, support vector machines [SVM] or training a model
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06N3/063 Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means
    • G06N3/08 Learning methods


Abstract

The invention discloses a method, a system, a device and a storage medium for identifying time-varying functional connectivity differences in brain neural development, comprising the following steps: constructing a sparse deep dictionary learning model; training the sparse deep dictionary learning model, wherein during training a sparse deep autoencoder learns a dictionary from the raw data in the latent space while a TL1 norm and a KL divergence enforce the sparse regularization terms; and analyzing the time-varying functional connectivity differences with the trained model. The method, system, device and storage medium can identify time-varying functional connectivity differences in brain neural development while combining the advantage of deep learning in high-level nonlinear feature extraction with the interpretability of dictionary learning.

Description

Method, system, device and storage medium for identifying brain neural development time-varying functional connection difference
Technical Field
The invention belongs to the field of data processing, and relates to a method, a system, a device and a storage medium for identifying time-varying functional connectivity differences in brain neural development.
Background
Most dictionary learning analyses of functional connectivity (FC) and dynamic functional connectivity (dFC) are based on linear dictionary learning, which ignores the nonlinear structure of the data and its higher-level features. To address this, kernel techniques have been used to obtain a nonlinear mapping and then perform dictionary learning in the transformed space, but the interpretability of this approach is questionable. Dictionary learning based on deep models has also been proposed: Sulam et al. propose a multi-layer convolutional sparse coding model that learns a global dictionary from a multi-layer linear combination of serial convolutional dictionaries. Tariyal et al. use greedy deep dictionary learning with a linear multi-layer strategy to learn a hierarchical representation of the data. Bicep et al. propose a nonlinear dictionary learning model based on a deep autoencoder network, from which a dictionary and sparse representations of the encoded data are learned. Mahdizadehagdam et al. build a deep dictionary learning model by stacking multiple layers of linear dictionaries with mutual information and updating each layer's dictionary. All of these methods explore the internal structure of data through multi-layer dictionary learning models. However, they are either based on multi-layer linear models or are only suitable for classification, and do not efficiently capture nonlinear latent structure or identify distinguishable features from the data. It is therefore necessary to find a method that combines the advantage of deep learning in high-level nonlinear feature extraction with the interpretability of dictionary learning.
Disclosure of Invention
It is an object of the present invention to overcome the above-mentioned drawbacks of the prior art and to provide a method, a system, a device and a storage medium that can identify time-varying functional connectivity differences in brain neural development while combining the advantage of deep learning in high-level nonlinear feature extraction with the interpretability of dictionary learning.
In order to achieve the above object, the method for identifying time-varying functional connectivity differences in brain neural development according to the present invention comprises:
constructing a sparse depth dictionary learning model;
training the sparse deep dictionary learning model, wherein during training a sparse deep autoencoder learns the dictionary from the raw data in the latent space while a TL1 norm and a KL divergence enforce the sparse regularization terms;
and analyzing the brain neural development time-varying functional connection difference by using the trained sparse depth dictionary learning model.
Further comprising:
acquiring recorded brain development data;
in the brain development data, summarizing each individual and its corresponding data characteristics and variation values into a piece of unit data, and constructing a data matrix from the unit data of all individuals, the data matrix having sample size N and feature dimension p;
dividing the data matrix into a training set and a test set;
and training the sparse deep dictionary learning model using the training set and the test set.
Set the sparse deep autoencoder to have 2L+1 layers, let r(l) be the number of neurons in the l-th layer, l = 0, 1, …, 2L, with r(2L−l) = r(l). The sparse deep dictionary learning model is expressed as the optimization problem:

$$\min_{W,b,D,V}\; J_1 + \lambda_1 J_2 + \lambda_2\,(J_3 + J_4) + \lambda_3 J_5 + \lambda_4 J_6$$

with $J_1=\sum_{n=1}^{N}\|x_n-f_{2L}(x_n)\|_2^2$ and $J_2=\sum_{n=1}^{N}\|f_L(x_n)-F_L(D)\,v_n\|_2^2$, wherein the training set X = [x₁, x₂, …, x_N] ∈ R^{p×N}; D = [d₁, d₂, …, d_K] ∈ R^{p×K} is a dictionary of X in the original data space; f_{2L}(·) represents the entire encoding and decoding process of the sparse deep autoencoder; f_L(x_n) is the activation (response) of sample x_n in the L-th layer; V = [v₁, v₂, …, v_N] ∈ R^{K×N}, where v_n is the sparse representation of the code of each sample x_n; x_n is encoded as f_L(x_n) and the dictionary D as F_L(D). KL(ρ ∥ ρ̂_j^(l)) is the KL divergence between a Bernoulli random variable with mean ρ and a Bernoulli random variable with mean ρ̂_j^(l), where ρ is a sparsity parameter and ρ̂_j^(l) is the average activation value of neuron j in network layer l; W^(l) is the connection weight matrix between the l-th and (l−1)-th layers, and b^(l) is the bias of the l-th layer; ∥·∥_TL1 denotes the TL1 norm. J₁ obtains a well-performing deep autoencoder by minimizing the error between the original data and its reconstruction; J₂ learns a dictionary of the data in the latent space; J₃ and J₄ are two regularization terms that control the activation of neurons; J₅ and J₆ control the sparsity of the dictionary and of the representations; the parameters λ₁, λ₂, λ₃, λ₄ balance network fitting, dictionary learning, and model complexity.
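As an illustrative sketch of these six terms, the objective can be evaluated with NumPy on toy data. The one-hidden-layer encoder, the TL1 parameter `a`, and the assignment of J₃/J₄ to data versus dictionary activations are all assumptions made here for concreteness, not the patent's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (placeholders, not the patent's actual settings)
p, N, K, r1 = 8, 20, 5, 6        # features, samples, dictionary atoms, latent units

X = rng.standard_normal((p, N))             # data matrix
D = rng.standard_normal((p, K))             # dictionary in the original space
V = rng.standard_normal((K, N))             # sparse codes
W_enc = 0.1 * rng.standard_normal((r1, p))  # encoder weights (one hidden layer)
W_dec = 0.1 * rng.standard_normal((p, r1))  # decoder weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def f_L(A):
    # Latent encoding f_L(.) of the columns of A (a one-layer stand-in)
    return sigmoid(W_enc @ A)

def f_2L(A):
    # Full encode-decode f_2L(.) (linear decoder for simplicity)
    return W_dec @ f_L(A)

def tl1(v, a=1.0):
    # Transformed-L1 penalty sum_i (a+1)|v_i|/(a+|v_i|); `a` is an assumed parameter
    v = np.abs(v)
    return float(np.sum((a + 1.0) * v / (a + v)))

def kl_sparsity(H, rho=0.05, eps=1e-8):
    # Sum over units of KL(rho || rho_hat_j), rho_hat_j = mean activation of unit j
    rho_hat = np.clip(H.mean(axis=1), eps, 1.0 - eps)
    return float(np.sum(rho * np.log(rho / rho_hat)
                        + (1 - rho) * np.log((1 - rho) / (1 - rho_hat))))

J1 = float(np.sum((X - f_2L(X)) ** 2))          # autoencoder reconstruction error
J2 = float(np.sum((f_L(X) - f_L(D) @ V) ** 2))  # dictionary fit in the latent space
J3 = kl_sparsity(f_L(X))   # sparse activations on the data (assumed split of J3/J4)
J4 = kl_sparsity(f_L(D))   # sparse activations on the dictionary atoms (assumed)
J5 = tl1(f_L(D))           # TL1 sparsity of the encoded dictionary (assumed target)
J6 = tl1(V)                # TL1 sparsity of the codes
lam1 = lam2 = lam3 = lam4 = 0.1
loss = J1 + lam1 * J2 + lam2 * (J3 + J4) + lam3 * J5 + lam4 * J6
```

Because every term is a sum of squares, a KL divergence, or a TL1 penalty, each J is non-negative, so the total loss is bounded below by J₁.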
Fixing D and V, f_L, F_L, f_{2L} are optimized to update the network weights W, and the optimization problem becomes:

$$\min_{W,b}\; J_1 + \lambda_1 J_2 + \lambda_2\,(J_3 + J_4) + C_1$$

wherein C₁ is a constant.

Fixing f_L, F_L, f_{2L} and V, the dictionary D is updated, and the optimization problem becomes:

$$\min_{D}\; \lambda_1 J_2 + \lambda_3 J_5 + C_2$$

wherein C₂ is a constant.

Fixing D, f_L, F_L, f_{2L}, V is updated, and the optimization problem is rewritten as:

$$\min_{V}\; \lambda_1 J_2 + \lambda_4 J_6 + C_3$$

wherein C₃ is a constant.
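The alternating scheme can be sketched on the two latent-space subproblems alone. Purely for illustration, the matrix H stands in for the latent codes f_L(X), FD stands in for the encoded dictionary F_L(D), and the TL1 penalty is replaced by a plain ℓ1 penalty so that the V-step has a closed-form soft-thresholding prox; none of these substitutions is the patent's actual choice:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy latent-space quantities (placeholders): H stands in for f_L(X) and
# FD for the encoded dictionary F_L(D); both are random for this sketch.
r, N, K = 6, 40, 5
H = rng.standard_normal((r, N))
FD = rng.standard_normal((r, K))
V = np.zeros((K, N))
lam = 0.1          # sparsity weight (plays the role of lambda_4)

def soft(z, t):
    # Soft-thresholding: prox of a plain l1 penalty, used here in place of TL1
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

for _ in range(50):
    # V-step: one ISTA step on 0.5*||H - FD V||_F^2 + lam*||V||_1, FD fixed
    L = max(np.linalg.norm(FD, 2) ** 2, 1e-8)   # Lipschitz constant of the gradient
    V = soft(V - FD.T @ (FD @ V - H) / L, lam / L)
    # D-step: ridge-regularized least-squares update of FD, V fixed.
    # (Atom normalization, common in dictionary learning, is omitted for simplicity.)
    FD = H @ V.T @ np.linalg.pinv(V @ V.T + 1e-6 * np.eye(K))

recon_err = np.linalg.norm(H - FD @ V) / np.linalg.norm(H)
sparsity = np.mean(V == 0.0)    # fraction of exactly-zero code entries
```

Each block update decreases the shared objective, so the relative reconstruction error ends strictly below its starting value of 1 while the ℓ1 prox keeps some code entries exactly zero.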
A system for identifying time-varying functional connectivity differences in brain neurodevelopment, comprising:
the construction module is used for constructing a sparse depth dictionary learning model;
a training module for training the sparse deep dictionary learning model, wherein during training the sparse deep autoencoder learns the dictionary from the raw data in the latent space while a TL1 norm and a KL divergence enforce the sparse regularization terms;
and the analysis module is used for analyzing the brain neural development time-varying function connection difference by utilizing the trained sparse depth dictionary learning model.
A computer device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method of identifying time-varying functional connectivity differences in brain neural development when executing the computer program.
A computer-readable storage medium, in which a computer program is stored which, when executed by a processor, carries out the steps of the method of identifying time-varying functional connectivity differences in brain neural development.
The invention has the following beneficial effects:
the method, the system, the equipment and the storage medium for recognizing the time-varying functional connection difference of cerebral nerve development learn dictionary from the original data of the potential space based on the sparse depth automatic encoder DAE when the sparse depth dictionary learning model is trained during specific operation and simultaneously use
Figure BDA0003132003870000051
The norm and KL divergence execute sparse regularization items, so that the nonlinear potential structure and higher-level features of data are captured well, meanwhile, the self-adaptive learning capacity of the dictionary can be improved through sparse realization, overfitting of the network is avoided, the trained sparse depth dictionary learning model is reused to analyze the brain neural development time-varying function connection difference, the brain neural development time-varying function connection difference is identified, and the method is convenient and simple to operate and extremely high in practicability.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, are included to provide a further understanding of the invention; they illustrate exemplary embodiments of the invention and together with the description serve to explain, not to limit, the invention. In the drawings:
FIG. 1 is a flow chart of the present invention;
FIG. 2 is a detailed flow chart of the SWPC technique for computing the dFC matrix;
FIG. 3a is a distribution diagram of the functional connections in state 1;
FIG. 3b is a distribution diagram of the functional connections in state 2;
FIG. 3c is a distribution diagram of the functional connections in state 3;
FIG. 3d is a distribution diagram of the functional connections in state 4;
FIG. 3e is a diagram of the functional connections between the 13 RSNs that weaken with age in state 1;
FIG. 3f is a diagram of the functional connections between the 13 RSNs that weaken with age in state 2;
FIG. 3g is a diagram of the functional connections between the 13 RSNs that weaken with age in state 3;
FIG. 3h is a diagram of the functional connections between the 13 RSNs that weaken with age in state 4;
FIG. 3i is a diagram of the functional connections between the 13 RSNs that strengthen with age in state 1;
FIG. 3j is a diagram of the functional connections between the 13 RSNs that strengthen with age in state 2;
FIG. 3k is a diagram of the functional connections between the 13 RSNs that strengthen with age in state 3;
FIG. 3l is a diagram of the functional connections between the 13 RSNs that strengthen with age in state 4;
FIG. 3m is a graph of the statistical differences in DT and mean DT between children and adolescents in the four states;
FIG. 3n is a graph of the statistical differences in FT and mean FT between children and adolescents in the four states;
FIG. 3o is a diagram of the dynamic state analysis of adolescents;
FIG. 3p is a diagram of the dynamic state analysis of children.
Detailed Description
The present invention will be described in detail below with reference to the embodiments and the attached drawings. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
The following detailed description is exemplary in nature and is intended to provide further details of the invention. Unless otherwise defined, all technical terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of exemplary embodiments according to the invention.
Example one
Referring to fig. 1, the method for analyzing brain development data based on sparse deep dictionary learning (SDDL) according to the present invention includes the following steps:
1) acquiring recorded brain development data;
2) in the brain development data, summarizing each individual and its corresponding data characteristics and variation values into a piece of unit data, and constructing a data matrix from the unit data of all individuals, the data matrix having sample size N and feature dimension p;
3) dividing the data matrix obtained in the step 2) into a training set and a test set;
4) establishing a sparse depth dictionary learning model;
5) training the sparse deep dictionary learning model of step 4) with the training set and the test set to obtain a trained sparse deep dictionary learning model;
6) analyzing the time-varying functional connectivity differences in brain neural development using the trained sparse deep dictionary learning model.
Unlike most existing dictionary learning methods, which focus on learning dictionaries in the original data space, the invention learns the dictionary from the raw data in the latent space based on the sparse deep autoencoder (DAE), i.e., the dictionary of the data is learned in the latent space provided by the sparse DAE. In addition, the invention uses a TL1 norm and a KL divergence to enforce the sparse regularization terms, learns the sparse deep autoencoder by an iterative method, and obtains the sparse dictionary and the sparse representations in turn. Specifically:
setting a sparse depth auto-encoder to have 2L +1 layers, including two parts, i.e. an encoder and a decoder, let r (L) be the number of neurons in the L-th layer, L ═ 0,1, Λ,2L, r (2L-L) ═ r (L), sparse depth dictionary learning can be expressed as an optimization problem:
Figure BDA0003132003870000071
wherein X ═ X1,x2,Λ,xN]∈Rp×NFor training data, D ═ D1,d2,Λ,dK]∈Rp×KIs a dictionary of X's in the original data space,
Figure BDA0003132003870000072
representing the entire encoding and decoding process of a sparse depth autoencoder, f2LThe output of (a) is the reconstruction of the input data by the sparse depth auto-encoder.
Figure BDA0003132003870000081
Corresponding to the coding relationship of the input data and its output with the sparse depth autocoder in the L-th layer, i.e. x for each samplen,fL(xn) Is a sample xnActivation or response in L-th layer, non-linear coding of dictionary D
Figure BDA0003132003870000082
Is defined as FL(D)=(fL(d1),fL(d2),Λ,fL(dk))。V=[v1,v2,Λ,vN]∈RK×NEach v innFor each sample xnSparse representation of the code of (1), xnIs coded as fL(xn) Dictionary D is coded as FL(D)。
Figure BDA0003132003870000083
Bernoulli random variable as mean ρ and mean
Figure BDA0003132003870000084
Where p is a sparsity parameter,
Figure BDA0003132003870000085
for the average activation value of network layer l neurons j,
Figure BDA0003132003870000086
between the l-th layer and the l-1 st layerThe connection weight matrix of (a) is,
Figure BDA0003132003870000087
for the deviations of the first layer (0. ltoreq. L. ltoreq.2L), for the sake of convenience, let
Figure BDA0003132003870000088
To represent
Figure BDA00031320038700000811
And (4) norm. J. the design is a square1For obtaining a well-performing depth self-encoder by minimizing the error between the original data and its reconstruction, J2To learn a dictionary of data in a latent space, J3And J4Two regularization terms to control activation of neurons, J5And J6To control the sparsity of dictionaries and representations. Using the parameter lambda respectively1234To balance the complexity of network fitting, dictionary learning, and models.
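For reference on the J₃ and J₄ terms, the KL divergence between two Bernoulli random variables that is standard in sparse autoencoders has the closed form

```latex
\mathrm{KL}\!\left(\rho \,\middle\|\, \hat\rho_j^{(l)}\right)
  = \rho \log\frac{\rho}{\hat\rho_j^{(l)}}
  + (1-\rho)\log\frac{1-\rho}{1-\hat\rho_j^{(l)}},
\qquad
\hat\rho_j^{(l)} = \frac{1}{N}\sum_{n=1}^{N} f_j^{(l)}(x_n),
```

which is zero when the average activation ρ̂_j^(l) equals the target ρ and grows as they diverge, driving most neurons toward near-zero average activation.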
Fixing D and V, f_L, F_L, f_{2L} are optimized to update the network weights W, and the optimization problem is rewritten as:

$$\min_{W,b}\; J_1 + \lambda_1 J_2 + \lambda_2\,(J_3 + J_4) + C_1$$

wherein C₁ is a constant;

fixing f_L, F_L, f_{2L} and V, the dictionary D is updated, and the optimization problem is rewritten as:

$$\min_{D}\; \lambda_1 J_2 + \lambda_3 J_5 + C_2$$

wherein C₂ is a constant;

fixing D, f_L, F_L, f_{2L}, V is updated, and the optimization problem is rewritten as:

$$\min_{V}\; \lambda_1 J_2 + \lambda_4 J_6 + C_3$$

wherein C₃ is a constant.
Example two
The system for identifying time-varying functional connectivity differences in brain neural development comprises:
the construction module is used for constructing a sparse depth dictionary learning model;
a training module for training the sparse deep dictionary learning model, wherein during training the sparse deep autoencoder learns the dictionary from the raw data in the latent space while a TL1 norm and a KL divergence enforce the sparse regularization terms;
and the analysis module is used for analyzing the brain neural development time-varying function connection difference by utilizing the trained sparse depth dictionary learning model.
EXAMPLE III
A computer device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method of identifying time-varying functional connectivity differences in brain neural development when executing the computer program.
Example four
A computer-readable storage medium, in which a computer program is stored which, when executed by a processor, carries out the steps of the method of identifying time-varying functional connectivity differences in brain neural development.
EXAMPLE five
The present invention utilizes sparse deep dictionary learning to study the differences in recurring patterns of dynamic functional connectivity between children and adolescents. The invention uses the Philadelphia Neurodevelopmental Cohort (PNC) data, from a large-scale collaborative study of brain behavior by the University of Pennsylvania and the Children's Hospital of Philadelphia. The data include fMRI data for nearly 900 subjects between 8 and 22 years of age, among them 193 children (103 to 144 months) and 204 adolescents (216 to 271 months). Standard brain imaging preprocessing steps were implemented using SPM12, including motion correction, normalization to standard Montreal Neurological Institute (MNI) space (3 × 3 × 3 mm spatial resolution), and spatial smoothing with a 3 mm full-width half-maximum (FWHM) Gaussian kernel.
A regression routine is used to eliminate the effects of motion, and the functional time series are band-pass filtered in the 0.01 Hz to 0.1 Hz frequency range. The 264 brain regions of interest (ROIs) defined by Power et al. were introduced with a 5 mm sphere radius; to reduce the data dimension, the time series of all voxels within the same brain region were averaged, reducing the data of each subject to a 264 × T matrix, where T = 124 is the number of time points, with a repetition time (TR) of 2 s. To facilitate understanding of the functional connection relationships among the ROIs, 12 resting-state networks (RSNs) are defined over the 264 ROIs, mainly relating to movement, memory, language, vision, cognition and other brain functions: the sensory/somatomotor network (SSN), cingulo-opercular task control network (COTCN), auditory network (AN), default mode network (DMN), memory retrieval network (MRN), visual network (VN), frontoparietal task control network (FPTCN), salience network (SN), subcortical network (SCN), ventral attention network (VAN), dorsal attention network (DAN) and cerebellar network (CN). In addition, 28 ROIs are not closely related to any of the RSNs above and belong to an uncertain network (UN), giving 13 networks in total.
After the 264 × T matrix is obtained, the dFC of each subject is calculated using the sliding-window partial correlation (SWPC) technique. Since a time series has T time points, a window length l and a step size s yield M = (T − l)/s + 1 subsequences. By grid search, the invention selects l = 26 and s = 1, i.e., M = 99 subsequences of the 124-point time series. The correlation coefficient between any two ROIs is calculated for each subsequence, and the dynamic functional connectivity (dFC) matrix P ∈ R^{99×34716} of each subject is computed with the SWPC technique. To reduce computational complexity, 10 of the 99 sliding windows were randomly selected per subject, which still maintains discrimination capability. The total sample size is thus 3970, comprising 1930 children and 2040 adolescents. In addition, feature selection by the notch minimization method was performed 30 times as a preprocessing step to remove noise and irrelevant variables in the data and to reduce the computational complexity of deep dictionary learning. The invention randomly selected 70% of the subjects from the two groups for training and used the rest for testing. The dimension of each sample is determined to be 1677 using the elbow rule.
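The windowing arithmetic can be sketched as follows. Note that this toy uses plain Pearson correlation per window, whereas the patent's SWPC is based on partial correlation, and the 10-ROI series is a stand-in for the 264-ROI data:

```python
import numpy as np

def sliding_window_fc(ts, win=26, stride=1):
    """dFC via sliding-window correlation (a simplified stand-in for SWPC,
    which the patent bases on partial correlation).
    ts: (n_roi, T) ROI time series -> (n_windows, n_pairs) dFC matrix."""
    n_roi, T = ts.shape
    iu = np.triu_indices(n_roi, k=1)     # upper-triangle ROI pairs
    rows = []
    for start in range(0, T - win + 1, stride):
        C = np.corrcoef(ts[:, start:start + win])
        rows.append(C[iu])               # vectorize one window's FC
    return np.asarray(rows)

# Toy example: 10 ROIs over 124 time points (264 ROIs in the patent).
rng = np.random.default_rng(2)
ts = rng.standard_normal((10, 124))
P = sliding_window_fc(ts)
# With T=124, win=26, stride=1 there are 124-26+1 = 99 windows,
# and 10*9/2 = 45 ROI pairs per window, so P has shape (99, 45).
```

With the patent's 264 ROIs the pair count becomes 264·263/2 = 34716, matching the stated P ∈ R^{99×34716}.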
A specific flow chart for computing the dFC matrix by the SWPC technique is shown in fig. 2.
Referring to fig. 2, the deep autoencoder comprises 5 layers of 1677, 900, 300, 900 and 1677 units respectively. The size K of the dictionary D is 18. The parameters λ₁, λ₂, λ₃, λ₄ are all 0.1; α and ρ are 0.05 and 0.001 respectively; η₁, η₂, η₃ are 0.001, 0.001 and 0.01 respectively; and the convergence error threshold is set to 10⁻⁵. The dictionary is initialized by the K-SVD algorithm and the proposed sparse deep dictionary learning is carried out, finally yielding a dictionary D of the preprocessed data based on 3970 samples and 1677 features. The average reconstruction errors (ARE) of SDDL and of several commonly used dictionary learning algorithms (K-SVD, MOD, RLS-DLA, ODL) are 5.2×10⁻³, 7.08×10⁻², 7.07×10⁻², 7.16×10⁻² and 7.21×10⁻² respectively, showing that SDDL has the best reconstruction capability among these dictionary learning algorithms.
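The text does not define ARE; a common choice, used here purely as an assumption, is the mean relative ℓ2 reconstruction error over samples:

```python
import numpy as np

def average_reconstruction_error(X, D, V):
    """Assumed definition of ARE: mean over samples n of
    ||x_n - D v_n||_2 / ||x_n||_2 (the patent does not spell this out)."""
    residual = X - D @ V
    return float(np.mean(np.linalg.norm(residual, axis=0)
                         / np.linalg.norm(X, axis=0)))

rng = np.random.default_rng(4)
D = rng.standard_normal((8, 5))      # toy dictionary (p=8, K=5)
V = rng.standard_normal((5, 20))     # toy codes for N=20 samples
X_exact = D @ V                      # perfectly representable data
X_noisy = X_exact + 0.05 * rng.standard_normal(X_exact.shape)

are_exact = average_reconstruction_error(X_exact, D, V)
are_noisy = average_reconstruction_error(X_noisy, D, V)
```

Under this definition, perfectly representable data gives ARE near zero, and the metric grows with the unexplained fraction of each sample's energy, which makes the magnitude gap reported above (10⁻³ versus 10⁻²) directly interpretable.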
To investigate the dFC differences between children and adolescents, a sparse representation V̂ of the samples is obtained from the resulting dictionary D. D and V̂ are further used to determine the time-varying differences between children and adolescents. Each sparse vector v̂_n of V̂ has dimension K (K = 18 ≪ 1677); it is the sparse representation vector of the n-th sample in the dictionary space. V̂ is further used for the time-varying dFC state analysis, i.e., k-means clustering is applied to V̂ to detect the different recurring patterns (i.e., states) of dFC. By applying the elbow rule to the sum of squared errors, defined as

$$\mathrm{SSE} = \sum_{i=1}^{k} \sum_{x \in C_i} \|x - cc_i\|^2$$

(k is the number of clusters, C_i represents the i-th cluster, x is a point of cluster C_i, and cc_i is the cluster center of C_i), the optimal number of dFC states is 4 for both groups. The proportions of the children group in the four states are 32%, 27%, 25% and 16% respectively, while the proportions of the adolescent group are 29%, 33%, 13% and 25% respectively.
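The elbow-rule SSE can be sketched with a plain k-means on synthetic sparse-representation vectors. The four separated Gaussian blobs below are stand-ins for real V̂ columns, and this naive k-means is an illustration, not the clustering implementation used in the experiments:

```python
import numpy as np

def kmeans_sse(Xv, k, iters=100, seed=0):
    """Plain k-means on the rows of Xv; returns labels and the elbow-rule
    SSE = sum_i sum_{x in C_i} ||x - cc_i||^2."""
    rng = np.random.default_rng(seed)
    centers = Xv[rng.choice(len(Xv), size=k, replace=False)]
    for _ in range(iters):
        d = ((Xv[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = d.argmin(axis=1)
        # Recompute centers, keeping the old center for any empty cluster
        new = np.array([Xv[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    d = ((Xv[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    labels = d.argmin(axis=1)
    sse = float(((Xv - centers[labels]) ** 2).sum())
    return labels, sse

# Toy "sparse representations": 4 blobs of 50 samples, dimension K=18 as in the text.
rng = np.random.default_rng(3)
Xv = np.vstack([rng.normal(m, 0.3, size=(50, 18)) for m in (-2.0, -1.0, 1.0, 2.0)])
sses = [kmeans_sse(Xv, k)[1] for k in range(1, 7)]
# The elbow of this SSE-versus-k curve picks the number of dFC states
# (4 here by construction of the blobs).
```

The SSE drops sharply until k reaches the true number of blobs and flattens afterwards, which is exactly the bend the elbow rule looks for.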
The dictionary D and the cluster centers of the two groups are used to capture the recurring dFC patterns of each group. These four recurring patterns or states reveal different distributions of FC among the 264 ROIs and the 13 resting-state networks (RSNs), and the population differences between children and adolescents are shown in fig. 3a to 3p.
Figures 3a to 3d represent the functional connection distributions of the four states; the numbers on the outer circle represent the 264 ROIs, and the colors of the inner circle represent the RSNs to which the ROIs belong. Figures 3e to 3h indicate the functional connections between the 13 RSNs that weaken with age in each of the four states, and figures 3i to 3l the functional connections that strengthen with age. Figure 3m represents the statistical differences in residence time (DT) and mean DT between children and adolescents in the four states, where * and ** represent significance levels of 0.05 and 0.01 respectively; figure 3n represents the statistical differences in time fraction (FT) and mean FT, where * and ** represent significance levels of 0.05 and 0.01 respectively; figures 3o and 3p represent the dynamic state analyses of adolescents and children.
The SDDL approach reveals the recurring patterns or states of these four different dynamic functional connectivity networks, with significant differences involving the DMN, SSN, SN, COTCN, FPTCN, VN and SCN, which are closely related to information processing, cognition, emotion, working memory, vision and language, further confirming previous studies. It is also found that most functional connections gradually weaken during brain development, i.e., children exhibit a more dispersed pattern of functional connections while adolescents exhibit a more concentrated pattern, and that, as they grow, brain function transitions from an undifferentiated system to specialized neural networks.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting the same, and although the present invention is described in detail with reference to the above embodiments, those of ordinary skill in the art should understand that: modifications and equivalents may be made to the embodiments of the invention without departing from the spirit and scope of the invention, which is to be covered by the claims.

Claims (10)

1. A method for identifying time-varying functional connectivity differences in brain neurodevelopment, comprising:
constructing a sparse deep dictionary learning model;
training the sparse deep dictionary learning model, wherein during training a sparse deep autoencoder learns the dictionary from the raw data in the latent space, while sparse regularization terms are enforced through a norm penalty [formula] and the KL divergence;
and analyzing the time-varying functional connectivity differences of brain neurodevelopment using the trained sparse deep dictionary learning model.
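As an illustrative sketch only (not part of the claims), the KL-divergence sparsity term referred to in claim 1 can be computed as in the standard sparse-autoencoder formulation; the function name and the choice ρ = 0.05 below are assumptions, not values taken from the patent:

```python
import numpy as np

def kl_sparsity_penalty(activations, rho=0.05, eps=1e-10):
    """KL divergence between a Bernoulli variable with mean rho and one
    with mean rho_hat, where rho_hat is the average activation of each
    hidden neuron over the batch (rows = samples, columns = neurons)."""
    rho_hat = np.clip(activations.mean(axis=0), eps, 1 - eps)
    kl = (rho * np.log(rho / rho_hat)
          + (1 - rho) * np.log((1 - rho) / (1 - rho_hat)))
    return kl.sum()
```

The penalty is zero when every neuron's average activation equals ρ and grows as activations drift away from it, which is what drives the hidden code toward sparsity.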
2. The method for identifying time-varying functional connectivity differences in brain neurodevelopment according to claim 1, further comprising:
acquiring recorded brain development data;
in the brain development data, summarizing each individual together with its corresponding data features and their variation values into one piece of unit data, and constructing a data matrix from the unit data of all individuals, the data matrix having sample size N and feature dimension p;
and dividing the data matrix into a training set and a test set.
3. The method for identifying time-varying functional connectivity differences in brain neurodevelopment according to claim 2, wherein the sparse deep dictionary learning model is trained using the training set and the test set.
4. The method for identifying time-varying functional connectivity differences in brain neurodevelopment according to claim 1, wherein the sparse deep autoencoder has 2L+1 layers, r(l) is the number of neurons in layer l, l = 0, 1, …, 2L, with r(2L − l) = r(l), and the sparse deep dictionary learning model is expressed as the optimization problem:

[formula]

wherein the training set X = [x_1, x_2, …, x_N] ∈ R^(p×N); D = [d_1, d_2, …, d_K] ∈ R^(p×K) is the dictionary of X in the original data space; [formula] denotes the entire encoding and decoding process of the sparse deep autoencoder; f_l(x_n) is the activation (response) of sample x_n in layer l; V = [v_1, v_2, …, v_N] ∈ R^(K×N), where v_n is the sparse representation of each sample x_n produced by the encoder; x_n is encoded as f_L(x_n) and the dictionary D is encoded as F_L(D); KL(ρ ∥ ρ̂_j^(l)) is the KL divergence between a Bernoulli random variable with mean ρ and a Bernoulli random variable with mean ρ̂_j^(l), where ρ is the sparsity parameter and ρ̂_j^(l) is the average activation of neuron j in network layer l; W^(l) is the connection weight matrix between layer l and layer l−1; b^(l) is the bias of layer l; ‖·‖ denotes the [formula] norm; J_1 obtains a well-performing deep autoencoder by minimizing the error between the original data and its reconstruction; J_2 learns a dictionary of the data in the latent space; J_3 and J_4 are two regularization terms that control the activation of neurons; J_5 and J_6 control the sparsity of the dictionary and of the representations; and the parameters λ_1, λ_2, λ_3, λ_4 balance the complexity of network fitting, dictionary learning, and the model.
5. The method for identifying time-varying functional connectivity differences in brain neurodevelopment according to claim 4, wherein D and V are fixed, f_L, F_L, f_2L are optimized to update the network weights W, and the optimization problem is converted into:

[formula]

wherein C_1 is a constant.
6. The method for identifying time-varying functional connectivity differences in brain neurodevelopment according to claim 4, wherein f_L, F_L, f_2L and V are fixed, the dictionary D is updated, and the optimization problem is converted into:

[formula]

wherein C_2 is a constant.
7. The method for identifying time-varying functional connectivity differences in brain neurodevelopment according to claim 4, wherein D, f_L, F_L, f_2L are fixed, V is updated, and the optimization problem is rewritten as:

[formula]

wherein C_3 is a constant.
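Claims 5 through 7 describe a block-coordinate scheme: each subproblem fixes all but one block of variables. Under the assumption that the representation-sparsity term is an ℓ1 norm, the V update of claim 7 reduces to a lasso problem in the latent space; a minimal ISTA sketch follows (the step-size choice and function names are illustrative, not from the patent):

```python
import numpy as np

def soft_threshold(Z, t):
    """Proximal operator of t * ||.||_1 (elementwise shrinkage)."""
    return np.sign(Z) * np.maximum(np.abs(Z) - t, 0.0)

def update_V(F_D, H, V, lam=0.1, n_iter=50):
    """ISTA for min_V 0.5 * ||H - F_D @ V||_F^2 + lam * ||V||_1,
    the sparse-coding subproblem of claim 7, with H the latent codes
    f_L(X) and F_D the encoded dictionary F_L(D)."""
    # Step size 1/L with L = Lipschitz constant of the gradient,
    # i.e. the squared spectral norm of F_D.
    step = 1.0 / (np.linalg.norm(F_D, 2) ** 2 + 1e-12)
    for _ in range(n_iter):
        grad = F_D.T @ (F_D @ V - H)
        V = soft_threshold(V - step * grad, step * lam)
    return V
```

With F_D equal to the identity this reduces to elementwise soft-thresholding of H, the closed-form lasso solution, which is a convenient sanity check.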
8. A system for identifying time-varying functional connectivity differences in brain neurodevelopment, comprising:
a construction module for constructing a sparse deep dictionary learning model;
a training module for training the sparse deep dictionary learning model, wherein during training a sparse deep autoencoder learns the dictionary from the raw data in the latent space, while sparse regularization terms are enforced through a norm penalty [formula] and the KL divergence;
and an analysis module for analyzing the time-varying functional connectivity differences of brain neurodevelopment using the trained sparse deep dictionary learning model.
9. A computer device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the method for identifying time-varying functional connectivity differences in brain neurodevelopment according to any one of claims 1 to 7.
10. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the steps of the method for identifying time-varying functional connectivity differences in brain neurodevelopment according to any one of claims 1 to 7.
CN202110707741.3A 2021-06-24 2021-06-24 Method, system, device and storage medium for identifying brain nerve development time-varying function connection difference Active CN113505528B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110707741.3A CN113505528B (en) 2021-06-24 2021-06-24 Method, system, device and storage medium for identifying brain nerve development time-varying function connection difference

Publications (2)

Publication Number Publication Date
CN113505528A true CN113505528A (en) 2021-10-15
CN113505528B CN113505528B (en) 2024-04-05

Family

ID=78010648

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110707741.3A Active CN113505528B (en) 2021-06-24 2021-06-24 Method, system, device and storage medium for identifying brain nerve development time-varying function connection difference

Country Status (1)

Country Link
CN (1) CN113505528B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10671917B1 (en) * 2014-07-23 2020-06-02 Hrl Laboratories, Llc System for mapping extracted Neural activity into Neuroceptual graphs
CN111916204A (en) * 2020-07-08 2020-11-10 西安交通大学 Brain disease data evaluation method based on self-adaptive sparse deep neural network

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
FENG Bao; LIU Xiaogang: "Blind separation method for brain image data based on dictionary sparsity", Computer Engineering, no. 12 *
LI Sumei; CHANG Yongli; HAN Xu; HU Jiajie: "Stereoscopic image quality evaluation based on sparse dictionary learning", Journal of Tianjin University (Science and Technology), no. 01 *

Also Published As

Publication number Publication date
CN113505528B (en) 2024-04-05

Similar Documents

Publication Publication Date Title
CN107273925B (en) Lung parenchyma CT image processing device based on local receptive field and semi-supervised depth self-coding
CN109543727B (en) Semi-supervised anomaly detection method based on competitive reconstruction learning
CN112766355B (en) Electroencephalogram signal emotion recognition method under label noise
CN110991471A (en) Fault diagnosis method for high-speed train traction system
CN112990008A (en) Emotion recognition method and system based on three-dimensional characteristic diagram and convolutional neural network
CN114676720B (en) Mental state identification method and system based on graph neural network
CN113887559A (en) Brain-computer information fusion classification method and system for brain off-loop application
CN117574059A (en) High-resolution brain-electrical-signal deep neural network compression method and brain-computer interface system
CN112155549A (en) ADHD disease diagnosis aid decision-making system based on deep convolution pulse neural network
CN116759067A (en) Liver disease diagnosis method based on reconstruction and Tabular data
CN111209939A (en) SVM classification prediction method with intelligent parameter optimization module
CN108665001B (en) Cross-tested idle state detection method based on deep belief network
CN110569880A (en) Method for decoding visual stimulation by using artificial neural network model
CN114037014A (en) Reference network clustering method based on graph self-encoder
CN113723239A (en) Magnetic resonance image classification method and system based on causal relationship
CN111916204A (en) Brain disease data evaluation method based on self-adaptive sparse deep neural network
CN113505528A (en) Method, system, device and storage medium for identifying brain neural development time-varying functional connection difference
CN115661498A (en) Self-optimization single cell clustering method
CN115346084A (en) Sample processing method, sample processing apparatus, electronic device, storage medium, and program product
CN113539517A (en) Prediction method of time sequence intervention effect
CN117668701B (en) AI artificial intelligence machine learning system and method
CN114332460B (en) Semi-supervised single image rain removing processing method
CN114626408B (en) Electroencephalogram signal classification method and device, electronic equipment, medium and product
CN118296442B (en) Multiple-study cancer subtype classification method, system, device, medium and program product
CN115017125B (en) Data processing method and device for improving KNN method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant