CN115841607A - Brain network structure and similarity joint learning method based on graph attention network - Google Patents
Brain network structure and similarity joint learning method based on graph attention network Download PDFInfo
- Publication number
- CN115841607A CN115841607A CN202211240213.2A CN202211240213A CN115841607A CN 115841607 A CN115841607 A CN 115841607A CN 202211240213 A CN202211240213 A CN 202211240213A CN 115841607 A CN115841607 A CN 115841607A
- Authority
- CN
- China
- Prior art keywords
- brain
- brain network
- similarity
- network structure
- network
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
Images
Landscapes
- Image Analysis (AREA)
Abstract
The invention belongs to the field of deep learning and brain network structures, and particularly relates to a brain network structure and similarity joint learning method based on a graph attention network, which comprises the following steps: performing cortical segmentation processing and morphological feature extraction on the acquired brain image data, and modeling a brain network of a subject into a graph; estimating an initial brain network structure through Pearson correlation calculation; obtaining the similarity between brain network structures through a twin map attention learning network; calculating a graph regularization loss function and a twin network loss function, and constraining the characteristics of an initial brain network structure; and updating the adjacency matrix of the brain network according to the embedding characteristics of the brain network, and obtaining the similarity of the updated brain network structure and the calculated brain network structure. The morphological brain network is effectively estimated by jointly optimizing two tasks of brain network structure estimation and similarity learning, and valuable information is provided for subsequent tasks such as individual identification, disease auxiliary diagnosis and the like.
Description
Technical Field
The invention belongs to the field of deep learning and brain network structures, and particularly relates to a brain network structure and similarity joint learning method based on a graph attention network.
Background
Alzheimer's disease (AD, commonly known as senile dementia) is one of the most common neurodegenerative diseases in the elderly population, and AD is clinically characterized by a decline in memory and other cognitive functions, and there is no effective treatment at present. Mild Cognitive Impairment (MCI), which is the transitional stage between normal aging and AD, is very important for the early treatment of AD disease and for delaying the progression of the disease. The development of computer aided diagnosis technology based on medical images for MCI and other brain diseases has become a research hotspot in the neuroscience field nowadays. However, accurate adjuvant diagnosis of MCI disease remains quite challenging due to the mild symptoms at the MCI stage, and the insignificant changes in brain function and anatomy.
Based on sMRI, an individual morphological brain network is constructed for a single tested object, important evidence can be provided for recognizing a full brain morphological connection mode under the disease states of MCI and the like, and sufficiently sensitive and specific imaging markers are provided for clinical applications of disease auxiliary diagnosis, curative effect evaluation and the like. However, the current individual morphological brain network construction method is still immature, and is limited by the speed and accuracy of cerebral cortex surface segmentation on one hand, and on the other hand, follow-up tasks such as disease diagnosis and the like are not usually considered, and the biological significance of the method is to be improved. Therefore, the research establishes a reliable individual morphology brain network construction method, the MCI brain network abnormal mechanism is disclosed on the basis, the accuracy of the auxiliary diagnosis of the disease is improved, the method is an urgent need for the current MCI disease research, and has important academic significance and clinical value.
At present, most of existing individual morphological brain network construction methods estimate the brain network structure only based on one morphological feature by calculating the similarity of feature mean values between ROIs or the statistical correlation of feature value distribution, and cannot fully reflect the complex characteristics of the cerebral cortex. In recent years, researchers have proposed that the brain network structure is estimated by using various morphological features, and that the ability of diagnosis assistance and the ability of prediction of diseases are improved. These algorithms can be broadly classified into a method based on a pair of ROIs and a method based on a plurality of ROIs according to the calculation manner of morphological connection between nodes.
The morphological connections estimated based on the paired ROI approach reflect the correlation between two ROIs. Pearson correlation analysis, the most commonly used method, directly captures the similarity of node pairs to feature vectors. By utilizing various characteristic distances, the professor of the Hangzhou electronic science and technology university, such as the professor Liehua, calculates the high-order morphological similarity between the ROIs, provides high-level supplementary information for the traditional Pearson correlation method, provides the method for measuring the correlation of characteristic distribution between the ROIs by calculating an index function of the multivariate Euclidean distances, effectively estimates the individual morphological brain network structure and improves the accuracy of auxiliary diagnosis of diseases. While such methods are computationally simple and biologically intuitive, they ignore the potential effects of other regions and may generate false connections. Aiming at the problem, the method based on a plurality of ROIs considers the influence of a plurality of regions, usually estimates the brain network structure by solving the optimization problem with the L1 norm regularization term, introduces sparse prior to the network structure and has better biological significance. Although such methods are still rarely reported in morphological brain network research, in the field of parallel functional brain network construction, it has become a research trend in recent years to construct more reasonable brain network structures by improving sparse regularization terms or introducing new regularization terms.
However, existing brain network architectures are prioritized over any brain network analysis, and the resulting brain network structure may not be optimal for subsequent tasks such as individual identification, disease diagnosis, and the like. In response to this problem, in the related field of graph signal processing (brain networks can be regarded as graphs), researchers have proposed adaptive graph structure adjustment for downstream tasks by using graph neural network technology in geometric deep learning. These pioneering studies provide important ideas for designing estimation methods of brain network structures, but none of them is designed for brain network structures, and does not use information that has proven valuable in brain network construction, such as sparsity and modularity of brain networks.
In summary, the prior art problems are:
1. the existing individual morphological brain network construction method is still immature, is limited by the speed and accuracy of cerebral cortex surface segmentation on one hand, and does not generally consider subsequent tasks such as disease diagnosis and the like on the other hand, so that the biological significance of the method is to be improved;
2. the relevance of feature distribution among the ROIs is measured by calculating an exponential function of multivariate Euclidean distances, the method is simple in calculation and has biological intuition, but potential influences of other regions are ignored, and wrong connection can be generated;
3. the existing brain network structure estimation does not provide a brain network structure estimation method with self-adaptive characteristics aiming at subsequent tasks.
Disclosure of Invention
In order to solve the technical problems, the invention provides a brain network structure and similarity contact learning method based on a graph attention network, which comprises the following steps:
s1: acquiring subject brain sMRI data, performing cortical segmentation processing and morphological feature extraction on the acquired brain sMRI data to obtain morphological features of all brain areas, modeling a subject brain network into a graph, wherein each node in the brain network corresponds to one brain area, and forming a feature matrix of the brain network according to the morphological features of the nodes in each brain area;
s2: mapping the characteristic matrix of the brain network to an adjacent matrix of the image through Pearson correlation calculation, estimating an initial brain network structure, and setting a threshold value to remove weak connection;
s3: inputting the feature matrix and the adjacent matrix of the pair of brain networks after the weak connection is removed into a twin map attention learning network, performing map embedding expression learning through the feature matrix of the brain networks to obtain an embedded feature expression, and calculating the similarity between brain network structures according to the embedded feature expression of the pair of brain networks and the adjacent matrix;
s4: calculating a graph regularization loss function and a twin network loss function, and constraining sparsity, modularity, intra-group similarity and inter-group dissimilarity of an initial brain network structure;
s5: updating an adjacent matrix of the brain network according to the embedded characteristics of the brain network, and combining the adjacent matrix with an initial brain network structure with constraint sparsity, modularity, intra-group similarity and inter-group dissimilarity to obtain an updated brain network structure;
s6: and learning a new embedded feature representation according to the updated feature matrix of the brain network structure, and calculating the similarity of the brain network structure according to the calculated new embedded feature representation.
Preferably, the network of the brain of the subject is modeled as a graph g =by a cortical segmentation process
{ v, ε, x, A }, whereinIs a set of n brain network nodes, each node corresponding to a brain region, and ε is a set of brain network edges; />The feature vector of each node is the average value of morphological feature vectors of all vertexes in a corresponding brain region, and d is the dimension of the node feature;the adjacency matrix of the graph represents the connection relationship between nodes.
Preferably, the initial brain network structure is estimated by mapping the feature matrix of the brain network to the adjacency matrix of the map by pearson correlation computation, which is expressed as:
wherein, a ij Representing the estimated morphological brain network structure,denotes a cross covariance, σ (-) denotes a sigmoid function, which indicates a Hadamrd product, </H>Represents a trainable weight vector->A feature matrix representing the i-th brain area node of the cerebral cortex, <>A feature matrix representing the j-th brain region node of the cerebral cortex.
Preferably, a threshold is set to remove weak links, expressed as:
wherein the content of the first and second substances,representing the morphological brain network structure after removal of the weak link, a ij Representing the estimated morphological brain network structure, ξ represents the threshold.
Preferably, the learning of the graph embedding representation is performed by a feature matrix of the brain network to obtain an embedding feature representation, which is expressed as:
wherein the content of the first and second substances,the currently learned embedded feature represents @>The expression is that H attention heads are connected in series, sigma is sigmoid function, v j Indicates the node in the h-th head of attention, v i Represents a node in the first order area within the h' th attention head, is represented>Indicates the number of nodes in the attention head, and>indicates the attention weight of the h-th head, is>Representing a trainable linear transformation matrix, x j And the characteristic matrix represents the j-th brain area node of the brain network.
Preferably, the similarity between the brain network structures is calculated from the embedded feature representation of a pair of brain networks and the adjacency matrix, and is expressed as:
wherein, O p Representing the similarity score, att (-) represents the graph attention layer, FC (-) represents the fully connected layer,<·,·>the inner product is represented by the sum of the two,an embedded representation representing a brain network, and->Representing the brainA contiguous matrix of the network.
Preferably, the graph regularization loss function is calculated as:
wherein the content of the first and second substances,represents a regularization penalty function, <' >>Represents the optimized brain network structure | · | 1 Represents the loss of sparsity | · |) * Represents a low rank loss, is asserted>Two subjects in a structural pair, λ, representing the brain network 1 And λ 2 Representing a hyperparameter for adjusting the weight between the loss terms.
Preferably, a twin network loss function is calculated, expressed as:
wherein the content of the first and second substances,representing the Hinge loss function, N p The number of the brain network structure pairs with high similarity; y is p For brain network structures to true similarity labels, O p Is the similarity score output by the twin network.
Preferably, S5 is represented by
Wherein the content of the first and second substances,represents the updated brain network structure, and>represents the initial brain network structure, and>represents the updated adjacency matrix, and β represents the weight.
The invention has the beneficial effects that: according to the method, the trainable pearson correlation is adopted to estimate the brain network structure, the twin graph network is adopted to learn the similarity between the brain networks, the iterative updating of the brain network structure and the graph embedding representation are adopted, two tasks of brain network structure estimation and similarity learning are jointly optimized, the morphological brain network is effectively estimated, and valuable information is provided for subsequent tasks such as individual identification, disease diagnosis assistance and the like.
Drawings
FIG. 1 is a flow chart of the present invention;
fig. 2 is a schematic diagram of the brain network structure and the contact learning of the similarity of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
A brain network structure and similarity contact learning method based on a graph attention network is disclosed, as shown in FIG. 2, and comprises the following steps, as shown in FIG. 1:
s1: acquiring subject brain sMRI data, performing cortical segmentation processing and morphological feature extraction on the acquired brain sMRI data to obtain morphological features of all brain areas, modeling a subject brain network into a graph, wherein each node in the brain network corresponds to one brain area, and forming a feature matrix of the brain network according to the morphological features of the nodes in each brain area;
s2: mapping the characteristic matrix of the brain network to an adjacent matrix of the image through Pearson correlation calculation, estimating an initial brain network structure, and setting a threshold value to remove weak connection;
s3: inputting the feature matrix and the adjacent matrix of the pair of brain networks after the weak connection is removed into a twin map attention learning network, performing map embedding expression learning through the feature matrix of the brain networks to obtain an embedded feature expression, and calculating the similarity between brain network structures according to the embedded feature expression of the pair of brain networks and the adjacent matrix;
s4: calculating a graph regularization loss function and a twin network loss function, and constraining sparsity, modularity, intra-group similarity and inter-group dissimilarity of an initial brain network structure;
s5: updating an adjacent matrix of the brain network according to the embedded characteristics of the brain network, and combining the adjacent matrix with an initial brain network structure with constraint sparsity, modularity, intra-group similarity and inter-group dissimilarity to obtain an updated brain network structure;
s6: and learning a new embedded feature representation according to the updated feature matrix of the brain network structure, and calculating the similarity of the brain network structure according to the calculated new embedded feature representation.
Modeling a subject brain network as a graph g = { v, epsilon, x, a } by a cortical segmentation process, whereinIs a set of n brain network nodes, each node corresponding to a brain region, and ε is a set of brain network edges; />The feature vector of each node is the average value of morphological feature vectors of all vertexes in a corresponding brain region, and d is the dimension of the node feature; />The adjacency matrix of the graph represents the connection relationship between nodes.
The cerebral cortex segmentation method comprises the following steps: acquiring a brain sMRI image, reconstructing a cerebral cortex surface, and extracting morphological characteristics of the cerebral cortex surface; the brain sMRI image comprises a target image to be segmented and a segmented atlas image; carrying out feature decomposition on the multidimensional morphological characteristics of the surface of the cerebral cortex by using graph Laplace to obtain the spectral representation of the surface manifold of the cerebral cortex; context information of the cerebral cortex surface of the target image is extracted by using a segmentation module of a U-shaped layer level structure, anatomical prior information of the cerebral cortex surface of a atlas image is extracted by using an auxiliary module of the same U-shaped layer level structure, fusion processing is carried out on the information of the segmentation module and the information of the auxiliary module by using a non-local attention feature fusion module, and a prediction probability map of a segmentation label of the cerebral cortex surface of the target image is output.
Reconstructing the cerebral cortex surface comprises mapping a brain sMRI image to a standard space, correcting the nonuniformity of the brain sMRI image by using an N3 algorithm, removing non-brain tissues, and performing constrained segmentation based on an intensity value and a neighborhood to obtain white brain matter, gray brain matter and cerebrospinal fluid; reconstructing the inner surface of cerebral cortex by using interface of cerebral white matter and cerebral gray matter, and generating the outer surface of cerebral cortex by using interface of cerebral gray matter and cerebrospinal fluid.
The extracting of the morphological characteristics of the surface of the cerebral cortex comprises extracting five morphological characteristics reflecting different geometric attributes of the cerebral cortex for each vertex on the inner surface of the cerebral cortex, and comprises the following steps: an average curvature whose value is the inverse of the radius of the inscribed sphere at the apex; the Gaussian curvature, the value of which is the product of the main curvatures at the vertexes, reflects the bending degrees of the curved surface in different directions at the vertexes; cortical thickness, which is the distance between the white matter surface and the corresponding apex of the gray matter surface; the sulcus depth is the vertical distance from the vertex to the middle curved surface of gray and white matter; the surface area, whose value is the average area of all the adjacent triangular patches of the vertex.
The partitioning module of the U-shaped layer level structure comprises a first coding layer and a first decoding layer; the first coding layer comprises a first input layer, and a plurality of first feature extraction layers, first pooling layers and first non-local attention feature fusion modules which are repeatedly arranged; the first decoding layer comprises a plurality of second feature extraction layers, a first upper pooling layer, a second non-local attention feature fusion module and a first output layer which are repeatedly arranged; the auxiliary module of the U-shaped layer level structure comprises a second coding layer and a second decoding layer; the second coding layer comprises a second input layer, a plurality of third feature extraction layers and a second pooling layer which are repeatedly arranged; the second decoding layer comprises a plurality of fourth feature extraction layers and a second upper pooling layer which are repeatedly arranged; the first pooling layer is connected with a first feature extraction layer and a third feature extraction layer, and the second pooling layer is connected with a second feature extraction layer and a fourth feature extraction layer; the first non-local attention feature fusion module is connected with a first pooling layer and a second pooling layer, and the second non-local attention feature fusion module is connected with a first upper pooling layer and a second upper pooling layer.
The feature matrix of the brain network is mapped to the adjacency matrix of the graph by pearson correlation computation, and the initial brain network structure is estimated, expressed as:
wherein, a ij Representing the estimated morphological brain network structure,denotes a cross covariance, σ (-) denotes a sigmoid function, which denotes a Hadamrd product, w s Represents a trainable weight vector->A feature matrix representing the i-th brain area node of the cerebral cortex, <>A feature matrix representing the j-th brain region node of the cerebral cortex.
Setting a threshold to remove weak links is expressed as:
wherein the content of the first and second substances,representing the morphological brain network structure after removal of the weak link, a ij Representing the estimated morphological brain network structure, ξ represents the threshold.
For the learning of graph embedding representation, a multi-head graph attention mechanism is adopted, graph attention is a graph convolution method based on space and is suitable for an abnormal graph, and attention weight of an h-th head is calculated firstly:
wherein the content of the first and second substances,indicates the attention weight of the h-th head, is>Representing a trainable linear transformation matrix, | representing a serial operation, |>Representing a trainable weight vector, ELU representing an exponential linear unit activation function, T representing a transposition operation, x i Feature matrix, x, representing the ith brain region node of the brain network j A feature matrix, v, representing the j-th brain region node of the brain network j Indicates the node in the h-th head of attention, v i Represents a node in the first order area within the h' th attention head, is represented>Representing nodes within the head of attentionAnd (4) counting.
Then node v i All neighborhood information is aggregated, and all H attention heads are connected in series, so that the embedded feature representation can be obtained:
wherein the content of the first and second substances,the currently learned embedded feature represents @>The expression is that H attention heads are connected in series, sigma is sigmoid function, v j Indicates the node in the h-th head of attention, v i Represents a node in the first order area within the h' th attention head, is represented>Indicates the number of nodes in the attention head, and>indicates the attention weight of the h-th head, is>Representing a trainable linear transformation matrix, x j And the characteristic matrix represents the j-th brain area node of the brain network.
Learning the similarity between the brain network structure according to the embedded feature representation of a pair of brain networks and the adjacent matrix, and expressing as follows:
wherein, O p Representing the similarity score, att (-) represents the graph attention layer, FC (-) represents the fully connected layer,<·,·>the inner product is represented by the sum of the two,an embedded representation representing a brain network, and->A adjacency matrix representing the brain network.
Calculating a graph regularization loss function, expressed as:
wherein, the first and the second end of the pipe are connected with each other,represents a regularization penalty function, <' >>Represents the optimized brain network structure | · | 1 Represents the loss of sparsity | · |) * Represents a low rank loss, is asserted>Two subjects in a structural pair, λ, representing the brain network 1 And λ 2 Representing a hyperparameter for adjusting the weight between the loss terms.
Calculating a twin network loss function expressed as:
wherein the content of the first and second substances,representing the Hinge loss function, N p The number of the brain network structure pairs with high similarity; y is p For brain network structures to true similarity labels, O p Is the similarity score output by the twin network.
Said S5, is represented by
Wherein the content of the first and second substances,represents an updated brain network structure, and +>Represents the initial brain network structure, and>represents the updated adjacency matrix, and β represents the weight.
Learning a new embedded feature representation from the updated feature matrix of the brain network structure:
wherein the content of the first and second substances,represents a new embedded feature representation learned, att (-) represents the graph attention level, and/or>A feature matrix representing the updated brain network structure, based on the evaluation of the brain network structure>A adjacency matrix representing the updated brain network structure;
calculating the similarity of the brain network structure according to the calculated new embedded feature representation:
wherein the content of the first and second substances,an embedded representation representing the updated brain network structure, based on the evaluation of the brain network structure>A adjacency matrix representing the updated brain network structure, att (-) representing the graph attention layer, FC (-) representing the fully-connected layer,<·,·>denotes the inner product, O p Is a similarity score.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.
Claims (9)
1. A brain network structure and similarity contact learning method based on a graph attention network is characterized by comprising the following steps:
s1: acquiring subject brain sMRI data, performing cortical segmentation processing and morphological feature extraction on the acquired brain sMRI data to obtain morphological features of all brain areas, modeling a subject brain network into a graph, wherein each node in the brain network corresponds to one brain area, and forming a feature matrix of the brain network according to the morphological features of the nodes in each brain area;
s2: mapping the characteristic matrix of the brain network to an adjacent matrix of the image through Pearson correlation calculation, estimating an initial brain network structure, and setting a threshold value to remove weak connection;
s3: inputting the feature matrix and the adjacent matrix of the pair of brain networks after the weak connection is removed into a twin map attention learning network, performing map embedding expression learning through the feature matrix of the brain networks to obtain an embedded feature expression, and calculating the similarity between brain network structures according to the embedded feature expression of the pair of brain networks and the adjacent matrix;
s4: calculating a graph regularization loss function and a twin network loss function, and constraining sparsity, modularity, intra-group similarity and inter-group dissimilarity of an initial brain network structure;
s5: updating an adjacent matrix of the brain network according to the embedded characteristics of the brain network, and combining the adjacent matrix with an initial brain network structure with constraint sparsity, modularity, intra-group similarity and inter-group dissimilarity to obtain an updated brain network structure;
s6: and learning a new embedded feature representation according to the updated feature matrix of the brain network structure, and calculating the similarity of the brain network structure according to the new embedded feature representation obtained by learning.
2. The method of claim 1, wherein the brain network of the subject is modeled as a graph g = { v, epsilon, x, a } by a cortical segmentation process, wherein the graph g = { v, epsilon, x, a }, and wherein the structure and similarity of the brain network is based on the graph attention networkIs a set of n brain network nodes, each node corresponding to a brain region, epsilon is a set of brain network edges;the feature vector of each node is the average value of morphological feature vectors of all vertexes in a corresponding brain region, and d is the dimension of the node feature;the adjacency matrix of the graph represents the connection relationship between nodes.
3. The contact learning method of brain network structure and similarity based on graph attention network according to claim 1, characterized in that the initial brain network structure is estimated by mapping the feature matrix of brain network to the adjacency matrix of graph through Pearson correlation calculation, expressed as:
wherein, a ij Representing the estimated morphological brain network structure,denotes the cross-covariance, σ (·) denotes the sigmoid function, and |, denotes the Hadamrd product,a trainable weight vector is represented that is used to represent,a feature matrix representing the ith brain region node of the cerebral cortex,a feature matrix representing the j-th brain region node of the cerebral cortex.
4. The contact learning method of brain network structure and similarity based on graph attention network according to claim 1, characterized in that a threshold is set to remove weak connection, expressed as:
5. The contact learning method of the brain network structure and the similarity based on the graph attention network according to claim 1, wherein the graph embedding representation learning is performed by a feature matrix of the brain network to obtain an embedding feature representation, which is represented as:
wherein the content of the first and second substances,the embedded feature representation that is currently learned,the expression is that H attention heads are connected in series, sigma is sigmoid function, v j Indicates the node in the h-th head of attention, v i Representing nodes in the first-order domain in the h-th attention head,indicating the number of nodes within the attention head,indicating the attention weight of the h-th head,representing a trainable linear transformation matrix, x j A feature matrix representing the jth brain region node of the brain network.
6. The contact learning method based on the brain network structure and the similarity of the graph attention network according to claim 1, wherein the similarity between the brain network structures is calculated according to the embedded feature representation of a pair of brain networks and the adjacency matrix, and is represented as follows:
wherein, O p Representing the similarity score, att (-) representing the graph attention layer, FC (-) representing the full connectivityThe adhesive layer is connected with the base material layer,<·,·>the inner product is represented by the sum of the two,an embedded representation of the brain network is represented,a adjacency matrix representing the brain network.
7. The contact learning method of brain network structure and similarity based on graph attention network according to claim 1, characterized by calculating a graph regularization loss function expressed as:
wherein the content of the first and second substances,a regularization loss function is represented as,represents the optimized brain network structure | · | 1 Represents the loss of sparsity | · |) * Which represents a low rank loss, is represented,two subjects in a structural pair, λ, representing the brain network 1 And λ 2 Representing a hyperparameter for adjusting the weight between the loss terms.
8. The brain network structure and similarity contact learning method based on the graph attention network as claimed in claim 1, wherein the twin network loss function is calculated as:
wherein the content of the first and second substances,representing the Hinge loss function, N p Representing the number of pairs of brain network structures with high similarity; y is p Similarity label, O, representing the brain network structure versus reality p A similarity score is represented.
9. The method for learning brain network structure and similarity based on graph attention network as claimed in claim 1, wherein S5 is expressed as
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211240213.2A CN115841607A (en) | 2022-10-11 | 2022-10-11 | Brain network structure and similarity joint learning method based on graph attention network |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211240213.2A CN115841607A (en) | 2022-10-11 | 2022-10-11 | Brain network structure and similarity joint learning method based on graph attention network |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115841607A true CN115841607A (en) | 2023-03-24 |
Family
ID=85575516
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211240213.2A Pending CN115841607A (en) | 2022-10-11 | 2022-10-11 | Brain network structure and similarity joint learning method based on graph attention network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115841607A (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116051849A (en) * | 2023-04-03 | 2023-05-02 | 之江实验室 | Brain network data feature extraction method and device |
CN116977272A (en) * | 2023-05-05 | 2023-10-31 | 深圳市第二人民医院(深圳市转化医学研究院) | Structural magnetic resonance image processing method based on federal graph annotation force learning |
CN117036727A (en) * | 2023-10-09 | 2023-11-10 | 之江实验室 | Method and device for extracting multi-layer embedded vector features of brain network data |
CN117172294A (en) * | 2023-11-02 | 2023-12-05 | 烟台大学 | Method, system, equipment and storage medium for constructing sparse brain network |
CN117350352A (en) * | 2023-12-06 | 2024-01-05 | 烟台大学 | Learning method, system and equipment from structural brain network to functional connectivity network |
-
2022
- 2022-10-11 CN CN202211240213.2A patent/CN115841607A/en active Pending
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116051849A (en) * | 2023-04-03 | 2023-05-02 | 之江实验室 | Brain network data feature extraction method and device |
CN116051849B (en) * | 2023-04-03 | 2023-07-07 | 之江实验室 | Brain network data feature extraction method and device |
CN116977272A (en) * | 2023-05-05 | 2023-10-31 | 深圳市第二人民医院(深圳市转化医学研究院) | Structural magnetic resonance image processing method based on federal graph annotation force learning |
CN117036727A (en) * | 2023-10-09 | 2023-11-10 | 之江实验室 | Method and device for extracting multi-layer embedded vector features of brain network data |
CN117036727B (en) * | 2023-10-09 | 2024-01-05 | 之江实验室 | Method and device for extracting multi-layer embedded vector features of brain network data |
CN117172294A (en) * | 2023-11-02 | 2023-12-05 | 烟台大学 | Method, system, equipment and storage medium for constructing sparse brain network |
CN117172294B (en) * | 2023-11-02 | 2024-01-26 | 烟台大学 | Method, system, equipment and storage medium for constructing sparse brain network |
CN117350352A (en) * | 2023-12-06 | 2024-01-05 | 烟台大学 | Learning method, system and equipment from structural brain network to functional connectivity network |
CN117350352B (en) * | 2023-12-06 | 2024-02-23 | 烟台大学 | Learning method, system and equipment from structural brain network to functional connectivity network |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Wang et al. | A review on extreme learning machine | |
Nawaz et al. | A deep feature-based real-time system for Alzheimer disease stage detection | |
CN113616184B (en) | Brain network modeling and individual prediction method based on multi-mode magnetic resonance image | |
CN115841607A (en) | Brain network structure and similarity joint learning method based on graph attention network | |
CN113040715B (en) | Human brain function network classification method based on convolutional neural network | |
CN111461232A (en) | Nuclear magnetic resonance image classification method based on multi-strategy batch type active learning | |
CN113763442B (en) | Deformable medical image registration method and system | |
CN113693563B (en) | Brain function network classification method based on hypergraph attention network | |
CN114897914B (en) | Semi-supervised CT image segmentation method based on countermeasure training | |
CN112614131A (en) | Pathological image analysis method based on deformation representation learning | |
CN112766376A (en) | Multi-label eye fundus image identification method based on GACNN | |
Fan et al. | Evolutionary neural architecture search for retinal vessel segmentation | |
Ding et al. | FTransCNN: Fusing Transformer and a CNN based on fuzzy logic for uncertain medical image segmentation | |
CN112529063B (en) | Depth domain adaptive classification method suitable for Parkinson voice data set | |
CN115423754A (en) | Image classification method, device, equipment and storage medium | |
CN113610118A (en) | Fundus image classification method, device, equipment and medium based on multitask course learning | |
CN114299006A (en) | Self-adaptive multi-channel graph convolution network for joint graph comparison learning | |
Shirazi et al. | Deep learning in the healthcare industry: theory and applications | |
CN117036288A (en) | Tumor subtype diagnosis method for full-slice pathological image | |
CN115601346A (en) | Multi-level classification method for knee joint cartilage injury by multi-modal MRI based on deep learning | |
CN117036760A (en) | Multi-view clustering model implementation method based on graph comparison learning | |
CN113951883B (en) | Gender difference detection method based on electroencephalogram signal emotion recognition | |
CN112668543B (en) | Isolated word sign language recognition method based on hand model perception | |
Nalluri et al. | Pneumonia screening on chest X-rays with optimized ensemble model | |
CN116797817A (en) | Autism disease prediction technology based on self-supervision graph convolution model |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination |