CN113035349B - Neural network dynamic fusion method for multi-center screening of genetic metabolic diseases - Google Patents
Neural network dynamic fusion method for multi-center screening of genetic metabolic diseases
- Publication number
- CN113035349B CN113035349B CN202110320409.1A CN202110320409A CN113035349B CN 113035349 B CN113035349 B CN 113035349B CN 202110320409 A CN202110320409 A CN 202110320409A CN 113035349 B CN113035349 B CN 113035349B
- Authority
- CN
- China
- Prior art keywords
- task
- neural network
- node
- screening
- computing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/70—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients
Abstract
The invention discloses a neural network dynamic fusion method for multi-center screening of genetic metabolic diseases. Besides the participating screening centers, two types of nodes must be provisioned: task nodes and computing nodes. The task nodes are responsible for the management, distribution and maintenance of multi-center screening tasks; each screening center is equipped with a computing node, which performs the computation of the joint modeling tasks issued by the task node. The method targets the multi-center screening scenario for genetic metabolic diseases and fills the gap left by the absence of a multi-center joint modeling method. Furthermore, considering that this scenario involves many multi-center screening modeling tasks and that neural network fusion places heavy pressure on the communication load, the method samples probe parameters to evaluate the degree of synchronization of the sub-model iterations on the computing nodes and dynamically adjusts the time points of model fusion, thereby improving fusion efficiency, reducing the number of communication rounds, and effectively lowering the communication load of the whole task.
Description
Technical Field
The invention belongs to the technical field of multi-center screening model construction, relates to a neural network dynamic fusion method, and particularly relates to a neural network dynamic fusion method for multi-center screening of genetic metabolic diseases.
Background
In recent years, the screening of genetic metabolic diseases has evolved from the original closed, independent screening by a single center or single hospital to multi-center collaborative screening, and various alliance forms such as medical consortia and medical communities have appeared, for example national clinical research centers for children's health and disease and national regional medical centers for children, strengthening the resource allocation, precision and efficiency of genetic metabolic disease screening. Meanwhile, the development of artificial intelligence has brought profound change to the medical industry and given rise to the new paradigm of intelligent healthcare. Genetic metabolic disease screening workflows based on artificial intelligence have been optimized and upgraded with intelligent algorithms such as data mining and machine learning, assisting physicians in interpretation and diagnosis and greatly improving screening efficiency.
At present, a number of hospitals and screening centers serve as pilot sites that deploy artificial intelligence aided diagnosis platforms to improve their screening quality, but two problems remain. First, under the multi-center screening mode there is as yet no artificial intelligence joint modeling method for genetic metabolic diseases. Second, the neural network is a common and effective model in existing AI-aided diagnosis platforms, and one bottleneck in extending it to multi-center joint modeling is the pressure that the large volume of parameter and gradient information places on the communication load when models are fused. Because genetic metabolic diseases form a large class of hereditary diseases, many clinical screening and clinical research modeling tasks arise in this scenario, yet the computing resources of hospitals are objectively limited and cannot sustain the concurrent communication of a large number of joint modeling tasks, so multi-center joint modeling efficiency is low.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a neural network dynamic fusion method for multi-center screening of genetic metabolic diseases. The method targets the multi-center screening scenario for genetic metabolic diseases and fills the gap left by the absence of a multi-center joint modeling method. Furthermore, considering that this scenario involves many multi-center screening modeling tasks and that neural network fusion places heavy pressure on the communication load, the method samples probe parameters to evaluate the degree of synchronization of the sub-model iterations on the computing nodes and dynamically adjusts the time points of model fusion, thereby improving fusion efficiency, reducing the number of communication rounds, and effectively lowering the communication load of the whole task.
The technical scheme adopted by the invention is as follows:
A neural network dynamic fusion method for multi-center screening of genetic metabolic diseases targets the multi-center screening scenario for genetic metabolic diseases. Besides the participating screening centers, two types of nodes must be provisioned: task nodes and computing nodes. The task nodes are responsible for the management, distribution and maintenance of multi-center screening tasks, and users interact directly with the task nodes. Each screening center is equipped with a computing node, which performs the computation of the joint modeling tasks issued by the task node; a computing node interacts only with the task node, and users cannot access it directly. The task node and the computing nodes may be connected in any mainstream network topology.
When a user initiates a multi-center screening task T for genetic metabolic diseases, the m screening centers participating in joint modeling must be selected; the structure and hyperparameters of the neural network model must be designed, together with the model training configuration, including the number of global model iterations E_g, the number of local model iterations E_l, and so on; and the parameters of the dynamic fusion method must be configured: the "detection proportion" α, a floating-point number in [0, 1]; the "update stop ratio" β, a floating-point number in (0, 1]; the "high-risk proportion" γ, a floating-point number in (0, 1]; the "risk-free iteration number" E_l*, an integer in [0, E_l]; and the "buffer interval number" δ, an integer in [1, E_l − E_l*].
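The configured values and their ranges can be captured in a small validated container. This is an illustrative sketch, not part of the patent; the class and field names are assumptions, only the symbols and value ranges come from the text above:

```python
from dataclasses import dataclass

@dataclass
class FusionConfig:
    """Hypothetical holder for the dynamic-fusion parameters; names are
    illustrative, only the symbols and value ranges come from the text."""
    E_g: int       # number of global model iterations
    E_l: int       # number of local model iterations
    alpha: float   # "detection proportion", float in [0, 1]
    beta: float    # "update stop ratio", float in (0, 1]
    gamma: float   # "high-risk proportion", float in (0, 1]
    E_l_star: int  # "risk-free iteration number", integer in [0, E_l]
    delta: int     # "buffer interval number", integer in [1, E_l - E_l_star]

    def validate(self):
        assert 0.0 <= self.alpha <= 1.0, "alpha out of [0, 1]"
        assert 0.0 < self.beta <= 1.0, "beta out of (0, 1]"
        assert 0.0 < self.gamma <= 1.0, "gamma out of (0, 1]"
        assert 0 <= self.E_l_star <= self.E_l, "E_l* out of [0, E_l]"
        assert 1 <= self.delta <= self.E_l - self.E_l_star, "delta out of range"
```

The upper bound on δ ensures at least one probe checkpoint fits between the risk-free warm-up and the local iteration budget.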
The process of dynamic fusion of the neural network is as follows:
1) The task node constructs the neural network model f of task T according to the network structure designed by the user and initializes the global neural network f(w), where w denotes the parameters of the global neural network; all computing nodes are initialized as low-risk computing nodes;
2) The task node sends the neural network model f of task T, the hyperparameters, the training configuration, E_l* and δ to the computing nodes C_1, …, C_m of the m screening centers;
3) The task node copies the global neural network parameters w of task T to C_1, …, C_m, forming the local neural networks f(w_1), …, f(w_m); at this point w = w_1 = … = w_m;
4) According to the training configuration, the low-risk computing nodes among C_1, …, C_m train the local neural networks f(w_1), …, f(w_m) on the local data D_1, …, D_m of their respective screening centers, where w_1, …, w_m are the local neural network parameters; once training starts, in general w ≠ w_1 ≠ … ≠ w_m;
5) When the iteration count j of the local neural network f(w_i) reaches E_l*, the computing node C_i of that screening center saves the current local neural network parameters as the optimal parameters w_i*; when j satisfies j > E_l* and (j − E_l*) mod δ = 0, C_i saves the current local neural network parameters w_ij and the current iteration count E_i = j, pauses the training of f(w_i), and sends a model fusion request signal to the task node; the computing node C_i may then execute other tasks;
6) When the task node has received the model fusion request signals of all computing nodes, it randomly selects α·|w| parameters from the parameters w of the global neural network f(w), records the index numbers of the selected parameters, and sends these index numbers to all computing nodes;
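The random selection of probe parameters in step 6 can be sketched with NumPy; the function name and the explicit seeding are assumptions added for illustration:

```python
import numpy as np

def sample_probe_indices(num_params, alpha, seed=None):
    """Pick floor(alpha * num_params) distinct positions of the flattened
    global parameter vector w; these index numbers are what the task node
    broadcasts to the computing nodes."""
    rng = np.random.default_rng(seed)
    k = int(alpha * num_params)
    return np.sort(rng.choice(num_params, size=k, replace=False))
```

Each computing node then answers with only the sub-vector of its w_ij at these positions, so just an α fraction of the parameters crosses the network per probe.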
7) When computing node C_i receives the parameter index numbers sent by the task node, it re-adds task T to its computation task queue;
8) When the task queue of computing node C_i reaches task T, C_i reads the stored local neural network parameters w_ij, extracts the corresponding parameters of w_ij in the order of the parameter index numbers to form a column vector, and sends it to the task node;
9) When the task node has received the parameter vectors of all computing nodes, it assembles them into a parameter matrix and computes, row by row, the upper quartile Q3, the lower quartile Q1, the interquartile range IQR = Q3 − Q1, and the high-risk cutoff values R1 = Q3 + 1.5·IQR and R2 = Q1 − 1.5·IQR;
10) The task node then computes the risk proportion of each computing node; the risk proportion γ_i of a computing node C_i is calculated as
γ_i = (n_i^{>R1} + n_i^{<R2}) / (α·|w|),
where n_i^{>R1} and n_i^{<R2} denote the number of parameters uploaded by C_i that are greater than the cutoff value R1 and less than the cutoff value R2, respectively, and α·|w| is the number of probed parameters;
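The row-wise quartile test of steps 9 and 10 can be sketched as follows. Note one assumption: the original risk-proportion formula is an image that did not survive extraction, so normalizing the outlier counts by the number of probed parameters is a reconstruction; the counting of values above R1 and below R2 follows the text:

```python
import numpy as np

def risk_ratios(P):
    """P: parameter matrix with one row per probed parameter and one
    column per computing node (step 9). Returns the risk proportion of
    each node: the fraction of its probed parameters lying outside the
    per-row cutoffs R1 = Q3 + 1.5*IQR and R2 = Q1 - 1.5*IQR (step 10)."""
    q1 = np.percentile(P, 25, axis=1, keepdims=True)  # lower quartile, per row
    q3 = np.percentile(P, 75, axis=1, keepdims=True)  # upper quartile, per row
    iqr = q3 - q1
    r1 = q3 + 1.5 * iqr   # high-risk upper cutoff
    r2 = q1 - 1.5 * iqr   # high-risk lower cutoff
    outside = (P > r1) | (P < r2)
    return outside.mean(axis=0)  # one risk ratio per computing node
```

A node whose sampled parameters have drifted far from the other centers' values will be flagged in most rows and receive a risk ratio close to 1.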
11) If the risk proportion γ_i of a computing node C_i is greater than or equal to γ, the task node marks C_i as high risk; if γ_i is less than γ, the task node marks C_i as low risk;
12) If the number of computing nodes marked as high risk is greater than β·m, the task node notifies all computing nodes to upload their optimal parameters w_i*, and the global neural network parameters are updated as
w = Σ_{i=1}^{m} (|D_i| / |D|) · w_i*, with |D| = Σ_{i=1}^{m} |D_i|,
where |D_i| and |D| denote the data amounts of the i-th screening center and of all m screening centers, respectively; after the task node has updated the global neural network parameters w, jump to step 16);
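The data-weighted parameter update of step 12 is a weighted-average fusion over the uploaded optimal parameters; a minimal sketch over flattened parameter vectors (the function name is assumed):

```python
import numpy as np

def fuse_global(w_stars, data_sizes):
    """w = sum_i (|D_i| / |D|) * w_i*, with |D| = sum_i |D_i| (step 12).
    w_stars: the optimal parameter vectors w_i* uploaded by the computing
    nodes; data_sizes: the data amounts |D_i| of the screening centers."""
    W = np.asarray(w_stars, dtype=float)   # shape (m, |w|)
    sizes = np.asarray(data_sizes, dtype=float)
    weights = sizes / sizes.sum()          # |D_i| / |D|
    return weights @ W                     # weighted average of the rows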
13) If the number of computing nodes marked as high risk is not greater than β·m, the task node sends a stop-iteration signal to all high-risk computing nodes and a continue-iteration signal to all low-risk computing nodes;
14) Taking C_i as an example, when a high-risk computing node receives the stop-iteration signal, it deletes the parameters w_ij stored on the node and releases the resources of task T for the execution of other tasks;
15) Taking C_i as an example, when a low-risk computing node receives the continue-iteration signal, it updates the optimal parameters w_i* on the node to the parameters w_ij, deletes the stored w_ij, resumes training the local neural network f(w_i), and jumps to step 4);
16) When the global iteration count reaches E_g, the dynamic fusion process of the neural network stops and the global neural network is obtained; otherwise the task node marks all computing nodes as low risk, notifies all computing nodes to clear the storage and temporary variables generated during this global iteration, and jumps to step 3).
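The fuse-or-continue branching of steps 11 to 13 reduces to a simple threshold rule; a sketch with assumed names, where `risk` holds the per-node risk proportions of step 10:

```python
def mark_and_decide(risk, gamma, beta):
    """Steps 11-13: mark every computing node whose risk proportion is
    >= gamma as high risk and the rest as low risk; trigger fusion only
    when more than beta * m nodes are high risk, otherwise let the
    low-risk nodes keep iterating."""
    m = len(risk)
    high = [i for i, r in enumerate(risk) if r >= gamma]
    low = [i for i, r in enumerate(risk) if r < gamma]
    fuse_now = len(high) > beta * m
    return fuse_now, high, low
```

For example, with risk = [0.9, 0.05, 0.8], gamma = 0.5 and beta = 0.5, nodes 0 and 2 are high risk and 2 > 1.5, so fusion is triggered for this global round.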
The beneficial effects of the invention are as follows:
the invention designs a neural network dynamic fusion method for screening multiple centers of genetic metabolic diseases, and provides a parameter detection technology, which can dynamically evaluate the fitting state and the synchronization degree of local models of each center, reduce the number of model fusion, reduce the pressure of model parameters and gradient information exchange on communication load and improve the efficiency of joint modeling.
Drawings
FIG. 1 is a schematic illustration of the process of the present invention;
Detailed Description
The invention will be further described with reference to the accompanying drawings and specific examples.
The invention discloses a neural network dynamic fusion method for multi-center screening of genetic metabolic diseases, targeting the multi-center screening scenario for genetic metabolic diseases. Besides the participating screening centers, two types of nodes must be provisioned: task nodes and computing nodes. The task nodes are responsible for the management, distribution and maintenance of multi-center screening tasks; users interact directly with the task nodes to initiate tasks, build models, orchestrate workflows and perform similar operations. A task node may be built independently on a third-party cloud platform or at a regional medical center (in this example the task node is built on a national regional medical center). Each screening center is equipped with a computing node, which performs the computation of the joint modeling tasks issued by the task node; a computing node interacts only with the task node, and users cannot access it directly. The task node and the computing nodes may be connected in any mainstream network topology (in this example, a star topology).
When a user initiates a multi-center screening task T for genetic metabolic diseases, the m screening centers participating in joint modeling must be selected; the structure and hyperparameters of the neural network model must be designed, together with the model training configuration, including the number of global model iterations E_g, the number of local model iterations E_l, and so on; and the parameters of the dynamic fusion method must be configured: the "detection proportion" α, a floating-point number in [0, 1]; the "update stop ratio" β, a floating-point number in (0, 1]; the "high-risk proportion" γ, a floating-point number in (0, 1]; the "risk-free iteration number" E_l*, an integer in [0, E_l]; and the "buffer interval number" δ, an integer in [1, E_l − E_l*].
As shown in fig. 1, the process of dynamic fusion of the neural network in the method of the present invention is as follows:
1) The task node constructs the neural network model f of task T according to the network structure designed by the user and initializes the global neural network f(w), where w denotes the parameters of the global neural network; all computing nodes are initialized as low-risk computing nodes;
2) The task node sends the neural network model f of task T, the hyperparameters, the training configuration, E_l* and δ to the computing nodes C_1, …, C_m of the m screening centers;
3) The task node copies the global neural network parameters w of task T to C_1, …, C_m, forming the local neural networks f(w_1), …, f(w_m); at this point w = w_1 = … = w_m;
4) According to the training configuration, the low-risk computing nodes among C_1, …, C_m train the local neural networks f(w_1), …, f(w_m) on the local data D_1, …, D_m of their respective screening centers, where w_1, …, w_m are the local neural network parameters; once training starts, in general w ≠ w_1 ≠ … ≠ w_m;
5) When the iteration count j of the local neural network f(w_i) reaches E_l*, the computing node C_i of that screening center saves the current local neural network parameters as the optimal parameters w_i*; when j satisfies j > E_l* and (j − E_l*) mod δ = 0, C_i saves the current local neural network parameters w_ij and the current iteration count E_i = j, pauses the training of f(w_i), and sends a model fusion request signal to the task node; the computing node C_i may then execute other tasks;
6) When the task node has received the model fusion request signals of all computing nodes, it randomly selects α·|w| parameters from the parameters w of the global neural network f(w), records the index numbers of the selected parameters, and sends these index numbers to all computing nodes;
7) When computing node C_i receives the parameter index numbers sent by the task node, it re-adds task T to its computation task queue;
8) When the task queue of computing node C_i reaches task T, C_i reads the stored local neural network parameters w_ij, extracts the corresponding parameters of w_ij in the order of the parameter index numbers to form a column vector, and sends it to the task node;
9) When the task node has received the parameter vectors of all computing nodes, it assembles them into a parameter matrix and computes, row by row, the upper quartile Q3, the lower quartile Q1, the interquartile range IQR = Q3 − Q1, and the high-risk cutoff values R1 = Q3 + 1.5·IQR and R2 = Q1 − 1.5·IQR;
10) The task node then computes the risk proportion of each computing node; the risk proportion γ_i of a computing node C_i is calculated as
γ_i = (n_i^{>R1} + n_i^{<R2}) / (α·|w|),
where n_i^{>R1} and n_i^{<R2} denote the number of parameters uploaded by C_i that are greater than the cutoff value R1 and less than the cutoff value R2, respectively, and α·|w| is the number of probed parameters;
11) If the risk proportion γ_i of a computing node C_i is greater than or equal to γ, the task node marks C_i as high risk; if γ_i is less than γ, the task node marks C_i as low risk;
12) If the number of computing nodes marked as high risk is greater than β·m, the task node notifies all computing nodes to upload their optimal parameters w_i*, and the global neural network parameters are updated as
w = Σ_{i=1}^{m} (|D_i| / |D|) · w_i*, with |D| = Σ_{i=1}^{m} |D_i|,
where |D_i| and |D| denote the data amounts of the i-th screening center and of all m screening centers, respectively; after the task node has updated the global neural network parameters w, jump to step 16);
13) If the number of computing nodes marked as high risk is not greater than β·m, the task node sends a stop-iteration signal to all high-risk computing nodes and a continue-iteration signal to all low-risk computing nodes;
14) Taking C_i as an example, when a high-risk computing node receives the stop-iteration signal, it deletes the parameters w_ij stored on the node and releases the resources of task T for the execution of other tasks;
15) Taking C_i as an example, when a low-risk computing node receives the continue-iteration signal, it updates the optimal parameters w_i* on the node to the parameters w_ij, deletes the stored w_ij, resumes training the local neural network f(w_i), and jumps to step 4);
16) When the global iteration count reaches E_g, the dynamic fusion process of the neural network stops and the global neural network is obtained; otherwise the task node marks all computing nodes as low risk, notifies all computing nodes to clear the storage and temporary variables generated during this global iteration, and jumps to step 3).
Claims (1)
1. A neural network dynamic fusion method for multi-center screening of genetic metabolic diseases, characterized in that the method targets the multi-center screening scenario for genetic metabolic diseases, and besides the participating screening centers, two types of nodes must be provisioned: task nodes and computing nodes, wherein the task nodes are responsible for the management, distribution and maintenance of multi-center screening tasks, and users interact directly with the task nodes; each screening center is equipped with a computing node, which performs the computation of the joint modeling tasks issued by the task node; a computing node interacts only with the task node, and users cannot access it directly; the task node and the computing nodes may be connected in any mainstream network topology;
when a user initiates a multi-center screening task T for genetic metabolic diseases, the m screening centers participating in joint modeling must be selected; the structure and hyperparameters of the neural network model must be designed, together with the model training configuration, including the number of global model iterations E_g and the number of local model iterations E_l; and the parameters of the dynamic fusion method must be configured: the "detection proportion" α, a floating-point number in [0, 1]; the "update stop ratio" β, a floating-point number in (0, 1]; the "high-risk proportion" γ, a floating-point number in (0, 1]; the "risk-free iteration number" E_l*, an integer in [0, E_l]; and the "buffer interval number" δ, an integer in [1, E_l − E_l*];
the process of dynamic fusion of the neural network is as follows:
1) The task node constructs the neural network model f of task T according to the network structure designed by the user and initializes the global neural network f(w), where w denotes the parameters of the global neural network; all computing nodes are initialized as low-risk computing nodes;
2) The task node sends the neural network model f of task T, the hyperparameters, the training configuration, E_l* and δ to the computing nodes C_1, …, C_m of the m screening centers;
3) The task node copies the global neural network parameters w of task T to C_1, …, C_m, forming the local neural networks f(w_1), …, f(w_m); at this point w = w_1 = … = w_m;
4) According to the training configuration, the low-risk computing nodes among C_1, …, C_m train the local neural networks f(w_1), …, f(w_m) on the local data D_1, …, D_m of their respective screening centers, where w_1, …, w_m are the local neural network parameters; once training starts, in general w ≠ w_1 ≠ … ≠ w_m;
5) When the iteration count j of the local neural network f(w_i) reaches E_l*, the computing node C_i of that screening center saves the current local neural network parameters as the optimal parameters w_i*; when j satisfies j > E_l* and (j − E_l*) mod δ = 0, C_i saves the current local neural network parameters w_ij and the current iteration count E_i = j, pauses the training of f(w_i), and sends a model fusion request signal to the task node; the computing node C_i may then execute other tasks;
6) When the task node has received the model fusion request signals of all computing nodes, it randomly selects α·|w| parameters from the parameters w of the global neural network f(w), records the index numbers of the selected parameters, and sends these index numbers to all computing nodes;
7) When computing node C_i receives the parameter index numbers sent by the task node, it re-adds task T to its computation task queue;
8) When the task queue of computing node C_i reaches task T, C_i reads the stored local neural network parameters w_ij, extracts the corresponding parameters of w_ij in the order of the parameter index numbers to form a column vector, and sends it to the task node;
9) When the task node has received the parameter vectors of all computing nodes, it assembles them into a parameter matrix and computes, row by row, the upper quartile Q3, the lower quartile Q1, the interquartile range IQR = Q3 − Q1, and the high-risk cutoff values R1 = Q3 + 1.5·IQR and R2 = Q1 − 1.5·IQR;
10) The task node then computes the risk proportion of each computing node; the risk proportion γ_i of a computing node C_i is calculated as
γ_i = (n_i^{>R1} + n_i^{<R2}) / (α·|w|),
where n_i^{>R1} and n_i^{<R2} denote the number of parameters uploaded by C_i that are greater than the cutoff value R1 and less than the cutoff value R2, respectively, and α·|w| is the number of probed parameters;
11) If the risk proportion γ_i of a computing node C_i is greater than or equal to γ, the task node marks C_i as high risk; if γ_i is less than γ, the task node marks C_i as low risk;
12) If the number of computing nodes marked as high risk is greater than β·m, the task node notifies all computing nodes to upload their optimal parameters w_i*, and the global neural network parameters are updated as
w = Σ_{i=1}^{m} (|D_i| / |D|) · w_i*, with |D| = Σ_{i=1}^{m} |D_i|,
where |D_i| and |D| denote the data amounts of the i-th screening center and of all m screening centers, respectively; after the task node has updated the global neural network parameters w, jump to step 16);
13) If the number of computing nodes marked as high risk is not greater than β·m, the task node sends a stop-iteration signal to all high-risk computing nodes and a continue-iteration signal to all low-risk computing nodes;
14) Taking C_i as an example, when a high-risk computing node receives the stop-iteration signal, it deletes the parameters w_ij stored on the node and releases the resources of task T for the execution of other tasks;
15) Taking C_i as an example, when a low-risk computing node receives the continue-iteration signal, it updates the optimal parameters w_i* on the node to the parameters w_ij, deletes the stored w_ij, resumes training the local neural network f(w_i), and jumps to step 4);
16) When the global iteration count reaches E_g, the dynamic fusion process of the neural network stops and the global neural network is obtained; otherwise the task node marks all computing nodes as low risk, notifies all computing nodes to clear the storage and temporary variables generated during this global iteration, and jumps to step 3).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110320409.1A CN113035349B (en) | 2021-03-25 | 2021-03-25 | Neural network dynamic fusion method for multi-center screening of genetic metabolic diseases |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113035349A CN113035349A (en) | 2021-06-25 |
CN113035349B true CN113035349B (en) | 2024-01-05 |
Family
ID=76473768
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007141325A1 (en) * | 2006-06-09 | 2007-12-13 | Bracco Spa | Method of processing multichannel and multivariate signals and method of classifying sources of multichannel and multivariate signals operating according to such processing method |
CN110473634A (en) * | 2019-04-23 | 2019-11-19 | 浙江大学 | Genetic metabolic disease auxiliary screening method based on multi-domain fusion learning |
WO2020020088A1 (en) * | 2018-07-23 | 2020-01-30 | 第四范式(北京)技术有限公司 | Neural network model training method and system, and prediction method and system |
CN111813858A (en) * | 2020-07-10 | 2020-10-23 | 电子科技大学 | Distributed neural network hybrid synchronous training method based on self-organizing grouping of computing nodes |
CN112151192A (en) * | 2020-10-22 | 2020-12-29 | 浙江大学 | Genetic metabolic disease screening method based on implicit space reprojection |
Non-Patent Citations (1)
Title |
---|
Research on an automatic screening system for diabetic retinopathy based on transfer learning algorithms; Chen Jian; Xiao Sijun; Sun Qiumei; Information Technology and Informatization (07); full text *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||