CN113035349A - Neural network dynamic fusion method for genetic metabolic disease multi-center screening - Google Patents
- Publication number
- CN113035349A (application CN202110320409.1A)
- Authority
- CN
- China
- Prior art keywords
- task
- neural network
- node
- screening
- center
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/70—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients
Abstract
The invention discloses a neural network dynamic fusion method for genetic metabolic disease multi-center screening. In addition to the screening centers themselves, two types of nodes are required: task nodes, which are responsible for the management, distribution and maintenance of the multi-center screening tasks; and computing nodes, one per screening center, which perform the computation of the joint modeling tasks issued by the task node. The method targets the genetic metabolic disease multi-center screening scenario and fills the gap left by the absence of a multi-center joint modeling method. Furthermore, to address the large number of multi-center screening modeling tasks and the heavy communication load of neural network fusion, the method uses exploratory parameter sampling to evaluate how synchronized the iterations of the computing-node models are and dynamically adjusts when model fusion takes place, improving fusion efficiency, reducing the number of communication rounds, and effectively lowering the communication load of the whole task.
Description
Technical Field
The invention belongs to the technical field of multi-center screening model construction, relates to a dynamic neural network fusion method, and in particular to a dynamic neural network fusion method for genetic metabolic disease multi-center screening.
Background
In recent years, the screening of genetic metabolic diseases has evolved from closed, independent screening by a single center or single hospital into joint screening through multi-center cooperation, and various alliances of hospitals and medical communities have appeared, such as the national clinical research center for child health and disease and the national children's regional medical centers, improving the resource allocation, precision, and efficiency of genetic metabolic disease screening. Meanwhile, the development of artificial intelligence has transformed the medical industry and given rise to a new mode of intelligent healthcare. By applying data mining, machine learning, and other intelligent algorithms, the artificial-intelligence-based screening process for genetic metabolic diseases has been optimized and upgraded, assisting doctors in interpretation and diagnosis and greatly improving screening efficiency.
At present, some hospitals and screening centers serve as pilot sites and deploy artificial intelligence auxiliary diagnosis platforms to improve their screening quality, but two problems remain. First, in the multi-center screening mode, there is as yet no artificial intelligence joint modeling method for genetic metabolic diseases. Second, the neural network is a common and effective model in existing auxiliary diagnosis platforms, and one bottleneck in extending it to multi-center joint modeling is the pressure that exchanging large amounts of parameter and gradient information during model fusion places on the communication load. Because genetic metabolic diseases form a large class of hereditary diseases, a screening scenario contains many clinical screening or clinical research modeling tasks, while a hospital's computing resources are objectively limited and cannot bear the concurrent communication of a large number of joint modeling tasks, making multi-center joint modeling inefficient.
Disclosure of Invention
The object of the invention is to provide, in view of the defects of the prior art, a dynamic neural network fusion method for genetic metabolic disease multi-center screening. To address the large number of multi-center screening modeling tasks and the heavy communication load of neural network fusion, the method uses exploratory parameter sampling to evaluate how synchronized the iterations of the computing-node models are and dynamically adjusts when model fusion takes place, improving fusion efficiency, reducing the number of communication rounds, and effectively lowering the communication load of the whole task.
The technical scheme adopted by the invention is as follows:
A neural network dynamic fusion method for genetic metabolic disease multi-center screening targets the multi-center screening scenario, in which two types of nodes are required in addition to the screening centers: task nodes and computing nodes. The task node is responsible for the management, distribution and maintenance of the multi-center screening task, and users interact with it directly. Each screening center is equipped with a computing node that performs the computation of the joint modeling tasks issued by the task node; computing nodes interact only with the task node, and users cannot access them directly. The task node and the computing nodes may be connected in any mainstream network topology.
When a user initiates a genetic metabolic disease multi-center screening task T, the user must select the m screening centers participating in joint modeling; design the structure and hyper-parameters of the neural network model together with the model training configuration, including the number of global iterations E_g and the number of local iterations E_l; and configure the parameters of the dynamic fusion method: the "parameter detection ratio" α, a floating-point number in (0, 1]; the "update stop ratio" β, a floating-point number in (0, 1]; the "parameter high-risk ratio" γ, a floating-point number in (0, 1]; the "risk-free iteration number" E_l*, an integer in [0, E_l]; and the "cache interval number" δ, an integer in [1, E_l − E_l*].
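For concreteness, these configuration parameters and their ranges can be captured in a small validation sketch; the class and field names below are illustrative, not part of the patent:

```python
from dataclasses import dataclass

@dataclass
class FusionConfig:
    """Dynamic-fusion configuration (names illustrative)."""
    m: int          # number of participating screening centers
    E_g: int        # number of global iterations
    E_l: int        # number of local iterations
    alpha: float    # "parameter detection ratio", in (0, 1]
    beta: float     # "update stop ratio", in (0, 1]
    gamma: float    # "parameter high-risk ratio", in (0, 1]
    E_l_star: int   # "risk-free iteration number", in [0, E_l]
    delta: int      # "cache interval number", in [1, E_l - E_l_star]

    def validate(self) -> None:
        # Enforce exactly the ranges stated in the text above.
        assert 0 < self.alpha <= 1, "alpha must be in (0, 1]"
        assert 0 < self.beta <= 1, "beta must be in (0, 1]"
        assert 0 < self.gamma <= 1, "gamma must be in (0, 1]"
        assert 0 <= self.E_l_star <= self.E_l, "E_l* must be in [0, E_l]"
        assert 1 <= self.delta <= self.E_l - self.E_l_star, \
            "delta must be in [1, E_l - E_l*]"

cfg = FusionConfig(m=3, E_g=10, E_l=100, alpha=0.1, beta=0.5,
                   gamma=0.2, E_l_star=20, delta=10)
cfg.validate()
```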
The dynamic fusion process of the neural network is as follows:
1) The task node constructs the neural network model f of task T according to the network structure designed by the user and initializes the global neural network f(w), where w denotes the parameters of the global neural network; all computing nodes are initialized as low-risk computing nodes;
2) The task node sends the neural network model f, the hyper-parameters, the training configuration, E_l* and δ of task T to the computing nodes C_1, …, C_m of the m screening centers;
3) The task node copies the global neural network parameters w of task T to C_1, …, C_m as the local neural networks f(w_1), …, f(w_m) of the screening centers, at which point w = w_1 = … = w_m;
4) Based on the training configuration, the low-risk computing nodes among C_1, …, C_m use the local data D_1, …, D_m of their respective screening centers to start training the local neural networks f(w_1), …, f(w_m), where w_1, …, w_m are the local neural network parameters; from this point on w ≠ w_1 ≠ … ≠ w_m;
5) When the number of iterations j of the local neural network f(w_i) of the ith screening center reaches the risk-free iteration number E_l*, the computing node C_i of that center saves the local parameters at that moment as the optimal parameters w_i*; when j satisfies j > E_l* and (j − E_l*) mod δ = 0, C_i saves the current local parameters w_ij, records the iteration count E_i = j, pauses the iteration of f(w_i), and sends a model fusion request signal to the task node; while waiting, C_i may execute other tasks;
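The checkpoint condition of step 5 can be written as a small predicate. This is a sketch under the reconstruction above (caching starts after the risk-free iterations E_l* and recurs every δ iterations), which is an assumption inferred from the parameter ranges:

```python
def should_cache(j: int, E_l_star: int, delta: int) -> bool:
    """True when local iteration j should checkpoint parameters and
    send a model fusion request: past the risk-free phase (j > E_l*)
    and on a delta-iteration boundary."""
    return j > E_l_star and (j - E_l_star) % delta == 0

# With E_l* = 20 and delta = 10, fusion requests occur at j = 30, 40, 50, ...
points = [j for j in range(1, 61) if should_cache(j, 20, 10)]
```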
6) When the task node has received the model fusion request signals of all computing nodes, it randomly selects α·|w| parameters from all |w| parameters of the global neural network f(w), records the corresponding index numbers, and sends them to all computing nodes;
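The exploratory sampling of step 6 is a uniform random draw of α·|w| index numbers without replacement; a minimal sketch (the function name and use of numpy are illustrative):

```python
import numpy as np

def sample_indices(n_params: int, alpha: float, rng=None) -> np.ndarray:
    """Randomly select alpha * n_params distinct parameter index numbers
    from a global network with n_params parameters, returned sorted so
    that nodes can extract parameters in a consistent order."""
    rng = np.random.default_rng() if rng is None else rng
    k = max(1, int(alpha * n_params))            # at least one index
    return np.sort(rng.choice(n_params, size=k, replace=False))

idx = sample_indices(100, 0.1, np.random.default_rng(0))
```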
7) When computing node C_i receives the parameter index numbers sent by the task node, it adds task T back into its computation task queue;
8) When the task queue of computing node C_i reaches task T, C_i reads the stored local parameters w_ij, extracts from w_ij the parameters corresponding to the index numbers in order to form a column vector v_i, and sends v_i to the task node;
9) After the task node has received the parameter vectors of all computing nodes, it forms the parameter matrix V = [v_1, …, v_m] and computes row-wise the upper quartile Q3, the lower quartile Q1, the interquartile range IQR = Q3 − Q1, and the high-risk cutoff values R1 = Q3 + 1.5·IQR and R2 = Q1 − 1.5·IQR;
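Step 9 applies Tukey-style fences across the m nodes for each sampled parameter; a minimal numpy sketch (function name illustrative; quartiles taken row-wise over the m columns of V):

```python
import numpy as np

def risk_cutoffs(V: np.ndarray):
    """V has shape (n_sampled, m): one sampled parameter per row, one
    computing node per column. Returns per-row high-risk cutoffs
    R1 = Q3 + 1.5*IQR and R2 = Q1 - 1.5*IQR."""
    Q1 = np.percentile(V, 25, axis=1)
    Q3 = np.percentile(V, 75, axis=1)
    IQR = Q3 - Q1
    return Q3 + 1.5 * IQR, Q1 - 1.5 * IQR

# One sampled parameter across four nodes; the value 100 is an outlier.
V = np.array([[1.0, 2.0, 3.0, 100.0]])
R1, R2 = risk_cutoffs(V)
```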
10) The task node then computes the risk proportion of each computing node; the risk proportion γ_i of computing node C_i is calculated as
γ_i = (n_i^{R1} + n_i^{R2}) / (α·|w|)
where n_i^{R1} and n_i^{R2} respectively denote the numbers of parameters in v_i greater than the cutoff value R1 and less than the cutoff value R2;
11) If the risk proportion γ_i of computing node C_i is greater than or equal to γ, the task node marks C_i as high-risk; if γ_i is less than γ, the task node marks C_i as low-risk;
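Steps 10 and 11 can be sketched together as follows. The count-based formula is reconstructed from the text (the risk proportion is the fraction of a node's α·|w| sampled parameters falling outside the cutoffs), so treat the exact denominator as an assumption:

```python
import numpy as np

def mark_nodes(V: np.ndarray, R1: np.ndarray, R2: np.ndarray, gamma: float):
    """V: (n_sampled, m) matrix of sampled parameters; R1, R2: per-row
    cutoffs. Returns each node's risk proportion gamma_i and a boolean
    high-risk flag (gamma_i >= gamma)."""
    outside = (V > R1[:, None]) | (V < R2[:, None])   # (n_sampled, m)
    risk_ratio = outside.mean(axis=0)                 # gamma_i per node
    return risk_ratio, risk_ratio >= gamma

# Two sampled parameters, four nodes; node 4's values exceed the cutoffs.
V = np.array([[1.0, 2.0, 3.0, 100.0],
              [1.0, 2.0, 3.0, 100.0]])
R1 = np.array([50.0, 50.0])
R2 = np.array([0.0, 0.0])
ratios, high = mark_nodes(V, R1, R2, gamma=0.5)
```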
12) If the number of computing nodes marked as high-risk exceeds β·m, the task node notifies all computing nodes to upload their optimal parameters w_1*, …, w_m*, and the global neural network updates its parameters as
w = Σ_{i=1}^{m} (|D_i| / |D|) · w_i*
where |D_i| and |D| denote the data volumes of the ith screening center and of all m screening centers, respectively; after the task node has updated the global neural network parameters w, jump to step 16);
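The update in step 12 is a data-volume-weighted average of the uploaded optimal parameters (the FedAvg-style formula w = Σ_i (|D_i|/|D|)·w_i* reconstructed from the surrounding text); a minimal sketch with an illustrative function name:

```python
import numpy as np

def fuse(optimal_params, data_sizes):
    """optimal_params: list of m parameter vectors w_i*;
    data_sizes: the data volumes |D_i| of the m screening centers.
    Returns w = sum_i (|D_i| / |D|) * w_i*, with |D| = sum_i |D_i|."""
    W = np.stack(optimal_params)                 # shape (m, n_params)
    weights = np.asarray(data_sizes, dtype=float)
    weights /= weights.sum()                     # |D_i| / |D|
    return weights @ W                           # weighted average

# Center 2 holds 3x the data of center 1, so its parameters dominate.
w = fuse([np.array([1.0, 2.0]), np.array([3.0, 4.0])], [1, 3])
```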
13) If the number of computing nodes marked as high-risk does not exceed β·m, the task node sends a stop-iteration signal to all high-risk computing nodes and a continue-iteration signal to all low-risk computing nodes;
14) Taking C_i as an example, when a high-risk computing node receives the stop-iteration signal, it deletes the stored parameters w_ij and releases the resources of task T for other tasks;
15) Taking C_i as an example, when a low-risk computing node receives the continue-iteration signal, it updates the optimal parameters w_i* on the node to w_ij, deletes the stored w_ij, continues the iteration of the local neural network f(w_i), and jumps to step 4);
16) When the number of global iterations reaches E_g, the dynamic fusion process stops and the global neural network is obtained; otherwise, the task node marks all computing nodes as low-risk, notifies all computing nodes to clear the storage and temporary variables generated by this global iteration, and jumps to step 3).
The invention has the beneficial effects that:
the invention designs a neural network dynamic fusion method for genetic metabolic disease multi-center screening, provides a parameter detection technology, can dynamically evaluate the fitting state and the synchronization degree of each central local model, reduces the times of model fusion, reduces the pressure of model parameter and gradient information exchange on communication load, and improves the efficiency of combined modeling.
Drawings
FIG. 1 is a schematic of the process of the present invention;
Detailed Description
The invention is further described with reference to the following figures and specific examples.
The invention relates to a dynamic neural network fusion method for genetic metabolic disease multi-center screening. In this scenario, two types of nodes are required in addition to the screening centers: task nodes and computing nodes. The task node is responsible for the management, distribution and maintenance of the multi-center screening task; users interact with it directly to initiate tasks, construct models, arrange workflows, and so on. The task node may be built on a third-party cloud platform independent of the screening centers or by a regional medical center (in this example, the task node is built on the national regional medical center). Each screening center is equipped with a computing node that performs the computation of the joint modeling tasks issued by the task node; computing nodes interact only with the task node, and users cannot access them directly. The task node and the computing nodes may be connected in any mainstream network topology (in this example, a star topology).
When a user initiates a genetic metabolic disease multi-center screening task T, the user must select the m screening centers participating in joint modeling; design the structure and hyper-parameters of the neural network model together with the model training configuration, including the number of global iterations E_g and the number of local iterations E_l; and configure the parameters of the dynamic fusion method: the "parameter detection ratio" α, a floating-point number in (0, 1]; the "update stop ratio" β, a floating-point number in (0, 1]; the "parameter high-risk ratio" γ, a floating-point number in (0, 1]; the "risk-free iteration number" E_l*, an integer in [0, E_l]; and the "cache interval number" δ, an integer in [1, E_l − E_l*].
As shown in FIG. 1, the dynamic fusion process of the neural network in the method of the invention is as follows:
1) The task node constructs the neural network model f of task T according to the network structure designed by the user and initializes the global neural network f(w), where w denotes the parameters of the global neural network; all computing nodes are initialized as low-risk computing nodes;
2) The task node sends the neural network model f, the hyper-parameters, the training configuration, E_l* and δ of task T to the computing nodes C_1, …, C_m of the m screening centers;
3) The task node copies the global neural network parameters w of task T to C_1, …, C_m as the local neural networks f(w_1), …, f(w_m) of the screening centers, at which point w = w_1 = … = w_m;
4) Based on the training configuration, the low-risk computing nodes among C_1, …, C_m use the local data D_1, …, D_m of their respective screening centers to start training the local neural networks f(w_1), …, f(w_m), where w_1, …, w_m are the local neural network parameters; from this point on w ≠ w_1 ≠ … ≠ w_m;
5) When the number of iterations j of the local neural network f(w_i) of the ith screening center reaches the risk-free iteration number E_l*, the computing node C_i of that center saves the local parameters at that moment as the optimal parameters w_i*; when j satisfies j > E_l* and (j − E_l*) mod δ = 0, C_i saves the current local parameters w_ij, records the iteration count E_i = j, pauses the iteration of f(w_i), and sends a model fusion request signal to the task node; while waiting, C_i may execute other tasks;
6) When the task node has received the model fusion request signals of all computing nodes, it randomly selects α·|w| parameters from all |w| parameters of the global neural network f(w), records the corresponding index numbers, and sends them to all computing nodes;
7) When computing node C_i receives the parameter index numbers sent by the task node, it adds task T back into its computation task queue;
8) When the task queue of computing node C_i reaches task T, C_i reads the stored local parameters w_ij, extracts from w_ij the parameters corresponding to the index numbers in order to form a column vector v_i, and sends v_i to the task node;
9) After the task node has received the parameter vectors of all computing nodes, it forms the parameter matrix V = [v_1, …, v_m] and computes row-wise the upper quartile Q3, the lower quartile Q1, the interquartile range IQR = Q3 − Q1, and the high-risk cutoff values R1 = Q3 + 1.5·IQR and R2 = Q1 − 1.5·IQR;
10) The task node then computes the risk proportion of each computing node; the risk proportion γ_i of computing node C_i is calculated as
γ_i = (n_i^{R1} + n_i^{R2}) / (α·|w|)
where n_i^{R1} and n_i^{R2} respectively denote the numbers of parameters in v_i greater than the cutoff value R1 and less than the cutoff value R2;
11) If the risk proportion γ_i of computing node C_i is greater than or equal to γ, the task node marks C_i as high-risk; if γ_i is less than γ, the task node marks C_i as low-risk;
12) If the number of computing nodes marked as high-risk exceeds β·m, the task node notifies all computing nodes to upload their optimal parameters w_1*, …, w_m*, and the global neural network updates its parameters as
w = Σ_{i=1}^{m} (|D_i| / |D|) · w_i*
where |D_i| and |D| denote the data volumes of the ith screening center and of all m screening centers, respectively; after the task node has updated the global neural network parameters w, jump to step 16);
13) If the number of computing nodes marked as high-risk does not exceed β·m, the task node sends a stop-iteration signal to all high-risk computing nodes and a continue-iteration signal to all low-risk computing nodes;
14) Taking C_i as an example, when a high-risk computing node receives the stop-iteration signal, it deletes the stored parameters w_ij and releases the resources of task T for other tasks;
15) Taking C_i as an example, when a low-risk computing node receives the continue-iteration signal, it updates the optimal parameters w_i* on the node to w_ij, deletes the stored w_ij, continues the iteration of the local neural network f(w_i), and jumps to step 4);
16) When the number of global iterations reaches E_g, the dynamic fusion process stops and the global neural network is obtained; otherwise, the task node marks all computing nodes as low-risk, notifies all computing nodes to clear the storage and temporary variables generated by this global iteration, and jumps to step 3).
Claims (3)
1. A neural network dynamic fusion method for genetic metabolic disease multi-center screening, characterized in that the method targets the genetic metabolic disease multi-center screening scenario, in which two types of nodes are required in addition to the screening centers: task nodes and computing nodes, wherein the task node is responsible for the management, distribution and maintenance of the multi-center screening task and users interact with it directly; each screening center is equipped with a computing node that performs the computation of the joint modeling tasks issued by the task node, the computing node interacts only with the task node, and users cannot access it directly; and the task node and the computing nodes are connected in any mainstream network topology.
2. The neural network dynamic fusion method for genetic metabolic disease multi-center screening according to claim 1, characterized in that when a user initiates a genetic metabolic disease multi-center screening task T, the user must select the m screening centers participating in joint modeling; design the structure and hyper-parameters of the neural network model together with the model training configuration, including the number of global iterations E_g and the number of local iterations E_l; and configure the parameters of the dynamic fusion method: the "parameter detection ratio" α, a floating-point number in (0, 1]; the "update stop ratio" β, a floating-point number in (0, 1]; the "parameter high-risk ratio" γ, a floating-point number in (0, 1]; the "risk-free iteration number" E_l*, an integer in [0, E_l]; and the "cache interval number" δ, an integer in [1, E_l − E_l*].
3. The neural network dynamic fusion method for genetic metabolic disease multi-center screening according to claim 2, characterized in that the dynamic fusion of the neural network proceeds as follows:
1) The task node constructs the neural network model f of task T according to the network structure designed by the user and initializes the global neural network f(w), where w denotes the parameters of the global neural network; all computing nodes are initialized as low-risk computing nodes;
2) The task node sends the neural network model f, the hyper-parameters, the training configuration, E_l* and δ of task T to the computing nodes C_1, …, C_m of the m screening centers;
3) The task node copies the global neural network parameters w of task T to C_1, …, C_m as the local neural networks f(w_1), …, f(w_m) of the screening centers, at which point w = w_1 = … = w_m;
4) Based on the training configuration, the low-risk computing nodes among C_1, …, C_m use the local data D_1, …, D_m of their respective screening centers to start training the local neural networks f(w_1), …, f(w_m), where w_1, …, w_m are the local neural network parameters; from this point on w ≠ w_1 ≠ … ≠ w_m;
5) When the number of iterations j of the local neural network f(w_i) of the ith screening center reaches the risk-free iteration number E_l*, the computing node C_i of that center saves the local parameters at that moment as the optimal parameters w_i*; when j satisfies j > E_l* and (j − E_l*) mod δ = 0, C_i saves the current local parameters w_ij, records the iteration count E_i = j, pauses the iteration of f(w_i), and sends a model fusion request signal to the task node; while waiting, C_i may execute other tasks;
6) When the task node has received the model fusion request signals of all computing nodes, it randomly selects α·|w| parameters from all |w| parameters of the global neural network f(w), records the corresponding index numbers, and sends them to all computing nodes;
7) When computing node C_i receives the parameter index numbers sent by the task node, it adds task T back into its computation task queue;
8) When the task queue of computing node C_i reaches task T, C_i reads the stored local parameters w_ij, extracts from w_ij the parameters corresponding to the index numbers in order to form a column vector v_i, and sends v_i to the task node;
9) After the task node has received the parameter vectors of all computing nodes, it forms the parameter matrix V = [v_1, …, v_m] and computes row-wise the upper quartile Q3, the lower quartile Q1, the interquartile range IQR = Q3 − Q1, and the high-risk cutoff values R1 = Q3 + 1.5·IQR and R2 = Q1 − 1.5·IQR;
10) The task node then computes the risk proportion of each computing node; the risk proportion γ_i of computing node C_i is calculated as
γ_i = (n_i^{R1} + n_i^{R2}) / (α·|w|)
where n_i^{R1} and n_i^{R2} respectively denote the numbers of parameters in v_i greater than the cutoff value R1 and less than the cutoff value R2;
11) If the risk proportion γ_i of computing node C_i is greater than or equal to γ, the task node marks C_i as high-risk; if γ_i is less than γ, the task node marks C_i as low-risk;
12) If the number of computing nodes marked as high-risk exceeds β·m, the task node notifies all computing nodes to upload their optimal parameters w_1*, …, w_m*, and the global neural network updates its parameters as
w = Σ_{i=1}^{m} (|D_i| / |D|) · w_i*
where |D_i| and |D| denote the data volumes of the ith screening center and of all m screening centers, respectively; after the task node has updated the global neural network parameters w, jump to step 16);
13) If the number of computing nodes marked as high-risk does not exceed β·m, the task node sends a stop-iteration signal to all high-risk computing nodes and a continue-iteration signal to all low-risk computing nodes;
14) Taking C_i as an example, when a high-risk computing node receives the stop-iteration signal, it deletes the stored parameters w_ij and releases the resources of task T for other tasks;
15) Taking C_i as an example, when a low-risk computing node receives the continue-iteration signal, it updates the optimal parameters w_i* on the node to w_ij, deletes the stored w_ij, continues the iteration of the local neural network f(w_i), and jumps to step 4);
16) When the number of global iterations reaches E_g, the dynamic fusion process stops and the global neural network is obtained; otherwise, the task node marks all computing nodes as low-risk, notifies all computing nodes to clear the storage and temporary variables generated by this global iteration, and jumps to step 3).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110320409.1A CN113035349B (en) | 2021-03-25 | 2021-03-25 | Neural network dynamic fusion method for multi-center screening of genetic metabolic diseases |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113035349A true CN113035349A (en) | 2021-06-25 |
CN113035349B CN113035349B (en) | 2024-01-05 |
Family
ID=76473768
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110320409.1A Active CN113035349B (en) | 2021-03-25 | 2021-03-25 | Neural network dynamic fusion method for multi-center screening of genetic metabolic diseases |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113035349B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007141325A1 (en) * | 2006-06-09 | 2007-12-13 | Bracco Spa | Method of processing multichannel and multivariate signals and method of classifying sources of multichannel and multivariate signals operating according to such processing method |
CN110473634A (en) * | 2019-04-23 | 2019-11-19 | 浙江大学 | Genetic metabolic disease auxiliary screening method based on multi-domain fusion learning
WO2020020088A1 (en) * | 2018-07-23 | 2020-01-30 | 第四范式(北京)技术有限公司 | Neural network model training method and system, and prediction method and system |
CN111813858A (en) * | 2020-07-10 | 2020-10-23 | 电子科技大学 | Distributed neural network hybrid synchronous training method based on self-organizing grouping of computing nodes |
CN112151192A (en) * | 2020-10-22 | 2020-12-29 | 浙江大学 | Genetic metabolic disease screening method based on implicit space reprojection |
Non-Patent Citations (1)
Title |
---|
CHEN Jian; XIAO Sijun; SUN Qiumei: "Research on an automatic screening system for diabetic retinopathy based on transfer learning algorithms", Information Technology and Informatization, no. 07 *
Also Published As
Publication number | Publication date |
---|---|
CN113035349B (en) | 2024-01-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110060263B (en) | Medical image segmentation method, segmentation device, segmentation system and computer readable medium | |
CN108376558B (en) | Automatic generation method for multi-modal nuclear magnetic resonance image medical record report | |
CN106980899A (en) | Deep learning model and system for predicting flow characteristics on vascular tree blood flow paths | |
CN109559326A (en) | Hemodynamic parameter calculation method, system and electronic device | |
CN107886510A (en) | Prostate MRI segmentation method based on three-dimensional fully convolutional neural networks | |
US11735321B2 (en) | System for the prognostics of the chronic diseases after the medical examination based on the multi-label learning | |
CN109431492A (en) | ECG lead signal simulation and reconstruction method based on neural network algorithm | |
US7734423B2 (en) | Method, system, and apparatus for virtual modeling of biological tissue with adaptive emergent functionality | |
JP2024039598A (en) | Multi-task hybrid supervised medical image segmentation method and system based on federated learning | |
Liang | Evaluation of fitness state of sports training based on self-organizing neural network | |
CN115830041A (en) | 3D medical image segmentation method based on cross-fusion convolution and deformable attention Transformer | |
CN114783571A (en) | Traditional Chinese medicine dynamic diagnosis and treatment scheme optimization method and system based on deep reinforcement learning | |
CN113160986A (en) | Model construction method and system for predicting development of systemic inflammatory response syndrome | |
CN114612408B (en) | Cardiac image processing method based on federal deep learning | |
CN115471716A (en) | Chest radiographic image disease classification model lightweight method based on knowledge distillation | |
Jiang et al. | Machine learning approaches to surrogate multifidelity growth and remodeling models for efficient abdominal aortic aneurysmal applications | |
Sengan et al. | Echocardiographic image segmentation for diagnosing fetal cardiac rhabdomyoma during pregnancy using deep learning | |
CN113035349A (en) | Neural network dynamic fusion method for genetic metabolic disease multi-center screening | |
JP2002537008A (en) | Apparatus and method for modeling a heart by computer | |
CN117038096A (en) | Chronic disease prediction method based on low-resource medical data and knowledge mining | |
CN116090503A (en) | Method for training neural network model based on knowledge distillation and related products | |
US20230094323A1 (en) | System and method for optimizing general purpose biological network for drug response prediction using meta-reinforcement learning agent | |
Bonfa et al. | HERMES: an expert system for the prognosis of hepatic diseases | |
Chen et al. | Predicting resting-state functional connectivity with efficient structural connectivity | |
CN115858820A (en) | Prediction method and device based on medical knowledge graph, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||