CN112932431A - Heart rate identification method based on 1DCNN + Inception Net + GRU fusion network - Google Patents

Heart rate identification method based on 1DCNN + Inception Net + GRU fusion network Download PDF

Info

Publication number
CN112932431A
CN112932431A (application CN202110103335.6A; granted as CN112932431B)
Authority
CN
China
Prior art keywords
data
gru
1dcnn
heart rate
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110103335.6A
Other languages
Chinese (zh)
Other versions
CN112932431B (en)
Inventor
潘晓光
王小华
张娜
董虎弟
宋晓晨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanxi Sanyouhe Smart Information Technology Co Ltd
Original Assignee
Shanxi Sanyouhe Smart Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanxi Sanyouhe Smart Information Technology Co Ltd filed Critical Shanxi Sanyouhe Smart Information Technology Co Ltd
Priority to CN202110103335.6A priority Critical patent/CN112932431B/en
Publication of CN112932431A publication Critical patent/CN112932431A/en
Application granted granted Critical
Publication of CN112932431B publication Critical patent/CN112932431B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024Detecting, measuring or recording pulse rate or heart rate
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7203Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal

Abstract

The invention relates to the technical field of artificial intelligence, in particular to a heart rate identification method based on a 1DCNN + Inception Net + GRU fusion network. The method comprises the following steps: S1, data set construction; S2, data denoising; S3, data segmentation; S4, data set division; S5, model construction; S6, model training; and S7, label processing. The network of the invention combines the advantages of three network structures, CNN, InceptionNet and GRU, and uses the fused deep learning network to analyze the electrocardiosignal (ECG) and classify the heart rate as normal or abnormal. The invention is mainly applied to intelligent heart rate identification.

Description

Heart rate identification method based on 1DCNN + Inception Net + GRU fusion network
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a heart rate identification method based on a 1DCNN + Inception Net + GRU fusion network.
Background
Most existing heart rate identification techniques build their recognition algorithms on manual feature engineering. Arrhythmia covers many kinds of abnormal electrocardiosignals, such as atrial fibrillation, premature beats and ventricular fibrillation, and manual feature engineering cannot comprehensively select and extract features for all of these abnormal heart rates, so the resulting algorithms generalize poorly and cannot meet practical requirements. Manual identification is also easily influenced by the subjective factors of the diagnostician, and electrocardiosignals with weak features are prone to being missed or falsely detected during analysis.
Problems or disadvantages of the prior art: manual identification has a high false detection rate, and conventional heart rate identification techniques do not analyze the electrocardiosignal comprehensively and cannot effectively identify arrhythmia signals under various conditions, so their identification accuracy is low.
Disclosure of Invention
In order to overcome the defects in the prior art, the invention provides a heart rate identification method based on a 1DCNN + Inception Net + GRU fusion network, which is based on a deep learning technology and is used for analyzing an electrocardiosignal and finishing the classification work of arrhythmia/normal.
In order to solve the technical problems, the technical scheme adopted by the invention is as follows:
a heart rate identification method based on a 1DCNN + Inception Net + GRU fusion network comprises the following steps:
s1, data set construction: constructing a data set with a normal/abnormal heart rate classification label based on the standard data set, storing the data set as fast reading NPY data, reconstructing a normal electrocardio signal label as 0, and reconstructing a 15-class arrhythmia signal label as 1;
s2, denoising data: the NPY data are processed by using a high-pass filter and a low-pass filter, so that the influence of noise on NPY data identification is reduced, and the identification accuracy is improved;
s3, data segmentation: segmenting the NPY data by taking 500 time steps as a segment, and establishing segmented data labels based on corresponding labels of the NPY data;
s4, data set division: according to the following steps: 3, randomly dividing the data set into a training set and a testing set;
s5, model construction: the method comprises the steps of constructing a model through fusion of three networks, extracting features of different scales and structures of data by using the three networks, fully analyzing the features of the data, and classifying the data with high precision;
s6, model training: inputting training set data into the model, performing loop iteration training on the model until the model loss does not decrease and the accuracy does not increase, stopping training, and storing the model;
s7, label processing: and (3) obtaining probability data with an output result in the range of (0, 1) after model identification, and processing the label to obtain a binary classification result.
In step S1, the data set used is constructed from the MIT-BIH arrhythmia standard data set; the data content is electrocardiosignal data, consisting of normal electrocardiosignals and 15 classes of arrhythmia signals.
In step S2, the frequency content of the electrocardiosignal generally lies within 1-45 Hz. The data are first high-pass filtered at 0.9 Hz to remove baseline (zero-drift) interference and then low-pass filtered at 46 Hz to eliminate electromyographic interference.
In step S5, the model is formed by connecting four parts in sequence: a 1DCNN layer, an InceptionNet layer, a GRU layer and an FC layer.
The 1DCNN layer consists of 2 1D convolution layers with a kernel size of 5 and a stride of 2; ReLU is used as the activation function after each layer. The ReLU function is f(x) = max(0, x), where x is the input data and f(x) is the data output after activation.
The InceptionNet layer applies SAME convolution with three convolution kernels of sizes 1, 3 and 5; an additional convolution layer with kernel size 1 is placed before the convolution layer with kernel size 5.
Each GRU unit of the GRU layer contains 32 hidden units.
The FC layer consists of 2 fully connected layers and outputs the result through a Sigmoid, where the Sigmoid function is
S(k) = 1 / (1 + e^(-k)),
in which k is the result output by the FC layer and S(k) is the result output after the Sigmoid computation.
In step S7, the trained model performs classification prediction on the test set data: an output greater than 0.5 labels the piece of data as arrhythmia data, and an output less than or equal to 0.5 labels it as normal data.
Compared with the prior art, the invention has the beneficial effects that:
the invention uses the fusion type deep learning network to analyze the Electrocardiosignal (ECG) to obtain the classification of normal/abnormal heart rate. The network integrates the advantages of three network structures of CNN, Inception Net and GRU, firstly CNN is used for preliminarily extracting features, network calculation amount is reduced, then Inception Net is used for carrying out multi-scale analysis on data features and improving data dimensions, more information of data is excavated, GRU is used for carrying out time domain analysis, and finally full-connection classification is carried out on the features. The method carries out comprehensive and deep characteristic analysis on the electrocardiosignals, and has higher identification accuracy and stronger generalization capability.
Drawings
Fig. 1 is a schematic view of the identification process of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are clearly and completely described below, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, a heart rate identification method based on a 1DCNN + InceptionNet + GRU fusion network includes the following steps:
s1, data set construction: constructing a data set with a normal/abnormal heart rate classification label based on the standard data set, storing the data set as fast reading NPY data, reconstructing a normal electrocardio signal label as 0, and reconstructing a 15-class arrhythmia signal label as 1;
s2, denoising data: the NPY data are processed by using a high-pass filter and a low-pass filter, so that the influence of noise on NPY data identification is reduced, and the identification accuracy is improved;
s3, data segmentation: segmenting the NPY data by taking 500 time steps as a segment, and establishing segmented data labels based on corresponding labels of the NPY data;
s4, data set division: according to the following steps: 3, randomly dividing the data set into a training set and a testing set;
s5, model construction: the method comprises the steps of constructing a model by fusing three networks, extracting features of different scales and structures of data by using the three networks, fully analyzing the data features, and classifying the data with high precision;
s6, model training: inputting training set data into the model, performing loop iteration training on the model until the model loss does not decrease and the accuracy does not increase, stopping training, and storing the model;
s7, label processing: and (3) obtaining probability data with an output result in the range of (0, 1) after model identification, and processing the label to obtain a binary classification result.
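Step S3 above can be sketched in a few lines of NumPy (the patent gives no code, so the function and variable names here are illustrative):

```python
import numpy as np

def segment_signal(signal, record_label, seg_len=500):
    """Cut a 1-D ECG recording into non-overlapping seg_len-step segments
    and give every segment the label of the recording it came from."""
    n_segments = len(signal) // seg_len
    trimmed = signal[:n_segments * seg_len]      # drop the incomplete tail
    segments = trimmed.reshape(n_segments, seg_len)
    seg_labels = np.full(n_segments, record_label)
    return segments, seg_labels

# Example: a 2600-step recording labelled 1 (arrhythmia)
ecg = np.random.randn(2600)
segs, lbls = segment_signal(ecg, 1)
print(segs.shape)   # (5, 500)
```

Each row of `segs` is then one 500-step training example carrying the recording's label.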
Preferably, in step S1, the data set used is constructed from the MIT-BIH arrhythmia standard data set; the data content is electrocardiosignal (ECG) data, consisting of normal electrocardiosignals and 15 classes of arrhythmia signals.
Preferably, in step S2, the frequency content of the electrocardiosignal generally lies within 1-45 Hz. The data are first high-pass filtered at 0.9 Hz to remove baseline (zero-drift) interference and then low-pass filtered at 46 Hz to eliminate electromyographic interference.
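The 0.9 Hz high-pass / 46 Hz low-pass denoising of step S2 might be implemented with standard Butterworth filters via SciPy. This is a sketch, not the patent's implementation: the filter order is our choice, and the 360 Hz sampling rate is the MIT-BIH database's rate, which the patent does not state explicitly.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 360.0  # Hz; MIT-BIH sampling rate (assumed, not stated in the patent)

def denoise(ecg, fs=FS):
    """High-pass at 0.9 Hz (baseline drift), then low-pass at 46 Hz (EMG noise).
    filtfilt applies each filter forward and backward for zero phase shift."""
    b_hi, a_hi = butter(4, 0.9 / (fs / 2), btype="highpass")
    b_lo, a_lo = butter(4, 46.0 / (fs / 2), btype="lowpass")
    return filtfilt(b_lo, a_lo, filtfilt(b_hi, a_hi, ecg))

t = np.arange(0, 4, 1 / FS)
raw = np.sin(2 * np.pi * 10 * t) + 2.0   # 10 Hz "ECG" riding on a DC offset
clean = denoise(raw)                     # offset removed, 10 Hz component kept
```

The 10 Hz component lies inside the 0.9-46 Hz passband and survives; the constant offset (a crude stand-in for baseline drift) is removed.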
Preferably, in step S5, the model is formed by connecting a 1DCNN layer, an InceptionNet layer, a GRU layer and an FC layer in this order.
Preferably, the 1DCNN layer consists of 2 1D convolution layers with a kernel size of 5 and a stride of 2; ReLU is used after each layer as the activation function. The ReLU function is f(x) = max(0, x), where x is the input data and f(x) is the data output after activation.
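A minimal NumPy illustration of one such convolution layer (kernel size 5, stride 2, ReLU) shows how each layer roughly halves the sequence length. This is a single-channel, correlation-style sketch with random weights, not the patent's trained network:

```python
import numpy as np

def conv1d_relu(x, kernel, stride=2):
    """Valid 1-D convolution (deep-learning convention: no kernel flip)
    with the given stride, followed by ReLU f(x) = max(0, x)."""
    k = len(kernel)
    out = np.array([np.dot(x[i:i + k], kernel)
                    for i in range(0, len(x) - k + 1, stride)])
    return np.maximum(0.0, out)

x = np.random.randn(500)        # one 500-step ECG segment
w = np.random.randn(5)          # kernel size 5, as in the patent
h1 = conv1d_relu(x, w)          # length (500 - 5)//2 + 1 = 248
h2 = conv1d_relu(h1, w)         # second identical layer: length 122
print(h1.shape, h2.shape)
```

Two stride-2 layers reduce 500 steps to 122, which is the "preliminary feature extraction and data-length reduction" role the description assigns to the 1DCNN layer.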
Preferably, the InceptionNet layer applies SAME convolution with three convolution kernels of sizes 1, 3 and 5; an additional convolution layer with kernel size 1 is placed before the convolution layer with kernel size 5.
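SAME convolution pads the input so the output keeps the input length, which is what lets the three parallel branches be concatenated. A NumPy sketch with random kernels (the 1x1 bottleneck before the size-5 branch is omitted for brevity):

```python
import numpy as np

def same_conv(x, k):
    """'SAME' 1-D convolution with a random kernel of width k:
    zero-pad so the output length equals the input length."""
    kernel = np.random.randn(k)
    pad = k // 2
    xp = np.pad(x, (pad, k - 1 - pad))
    return np.array([np.dot(xp[i:i + k], kernel) for i in range(len(x))])

x = np.random.randn(500)
# Three parallel branches with kernel sizes 1, 3 and 5, stacked
# along a channel axis, Inception-style.
branches = [same_conv(x, k) for k in (1, 3, 5)]
multi_scale = np.stack(branches, axis=-1)   # shape (500, 3)
```

Every branch returns 500 samples regardless of kernel size, so the stack is well-formed; this is the "multi-scale analysis with dimension raising" the description attributes to the InceptionNet layer.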
Preferably, each GRU unit of the GRU layer contains 32 hidden units.
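For reference, the standard GRU update with 32 hidden units can be written out in NumPy. The weights below are random and the cell is the textbook GRU, which the patent's GRU layer is assumed to follow:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_step(x_t, h_prev, W, U, b):
    """One GRU time step. W, U, b stack the update (z), reset (r)
    and candidate (n) parameters along their first axis."""
    Wz, Wr, Wn = W
    Uz, Ur, Un = U
    bz, br, bn = b
    z = sigmoid(x_t @ Wz + h_prev @ Uz + bz)         # update gate
    r = sigmoid(x_t @ Wr + h_prev @ Ur + br)         # reset gate
    n = np.tanh(x_t @ Wn + (r * h_prev) @ Un + bn)   # candidate state
    return (1 - z) * n + z * h_prev                  # new hidden state

input_dim, hidden = 3, 32      # 32 hidden units, as in the patent
rng = np.random.default_rng(0)
W = rng.normal(size=(3, input_dim, hidden))
U = rng.normal(size=(3, hidden, hidden))
b = np.zeros((3, hidden))

h = np.zeros(hidden)
for x_t in rng.normal(size=(500, input_dim)):  # run over a 500-step sequence
    h = gru_step(x_t, h, W, U, b)
print(h.shape)   # (32,)
```

The gating makes each new state a convex combination of the previous state and a tanh-bounded candidate, which is what gives the layer its time-domain, context-carrying behaviour.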
Preferably, the FC layer consists of 2 fully connected layers and outputs the result through a Sigmoid, where the Sigmoid function is
S(k) = 1 / (1 + e^(-k)),
in which k is the result output by the FC layer and S(k) is the result output after the Sigmoid computation.
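The Sigmoid output stage is easy to check numerically (a trivial NumPy sketch; the function name is ours):

```python
import numpy as np

def S(k):
    """Sigmoid output layer: S(k) = 1 / (1 + exp(-k))."""
    return 1.0 / (1.0 + np.exp(-k))

print(S(0.0))   # 0.5: an undecided logit sits exactly on the decision threshold
```

Large positive k maps toward 1 (arrhythmia), large negative k toward 0 (normal), so the raw FC output is squashed into the (0, 1) probability range used in step S7.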
Preferably, in step S7, the trained model performs classification prediction on the test set data: an output greater than 0.5 labels the piece of data as arrhythmia data, and an output less than or equal to 0.5 labels it as normal data.
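The 0.5 thresholding of step S7 is one line of NumPy (the probability values below are illustrative, not model outputs):

```python
import numpy as np

probs = np.array([0.93, 0.12, 0.50, 0.61])   # sigmoid outputs in (0, 1)
labels = (probs > 0.5).astype(int)           # 1 = arrhythmia, 0 = normal
print(labels)   # [1 0 0 1]  (0.50 is not greater than 0.5, so it is normal)
```

Note the strict inequality: an output of exactly 0.5 is classified as normal, matching the "less than or equal to 0.5" rule in the text.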
The method is based on deep learning: it analyzes the electrocardiosignal and completes the arrhythmia/normal classification. The model is constructed by fusing three deep neural networks with excellent performance and, divided by network structure, consists of a 1DCNN layer, an InceptionNet layer, a GRU layer and a fully connected layer. The 1DCNN layer performs a preliminary analysis of the electrocardiosignal data fed into the network; this part extracts preliminary features and reduces the data length. The InceptionNet layer performs multi-scale analysis of the data features, deepens the network, raises the data dimension and mines deep features of the data. The GRU layer performs time-domain analysis of the data features and context-correlation analysis of the input data. The fully connected layer computes over the features extracted by the network to obtain the final classification result.
Although only the preferred embodiments of the present invention have been described in detail, the present invention is not limited to the above embodiments, and various changes can be made without departing from the spirit of the present invention within the knowledge of those skilled in the art, and all changes are encompassed in the scope of the present invention.

Claims (9)

1. A heart rate identification method based on a 1DCNN + Inception Net + GRU fusion network is characterized by comprising the following steps:
S1, data set construction: a data set with normal/abnormal heart rate classification labels is built from a standard data set and stored as NPY data for fast reading; normal electrocardiosignal labels are reconstructed as 0 and the 15 classes of arrhythmia signal labels as 1;
S2, data denoising: the NPY data are processed with a high-pass filter and a low-pass filter, reducing the influence of noise on identification and improving the identification accuracy;
S3, data segmentation: the NPY data are cut into segments of 500 time steps each, and each segment inherits the label of the NPY data it came from;
S4, data set division: the data set is randomly divided into a training set and a test set at a ratio of 7:3;
S5, model construction: the model is built by fusing three networks, which extract features of the data at different scales and with different structures, so that the data features are fully analyzed and the data can be classified with high precision;
S6, model training: the training set data are fed into the model, which is trained iteratively until the loss no longer decreases and the accuracy no longer increases; training then stops and the model is saved;
S7, label processing: after model identification the output is probability data in the range (0, 1), and the labels are processed to obtain the binary classification result.
2. The heart rate identification method based on the 1DCNN + Inception Net + GRU fusion network as claimed in claim 1, wherein: in step S1, the data set used is constructed from the MIT-BIH arrhythmia standard data set; the data content is electrocardiosignal data, consisting of normal electrocardiosignals and 15 classes of arrhythmia signals.
3. The heart rate identification method based on the 1DCNN + Inception Net + GRU fusion network as claimed in claim 1, wherein: in step S2, the frequency content of the electrocardiosignal generally lies within 1-45 Hz; the data are first high-pass filtered at 0.9 Hz to remove baseline (zero-drift) interference and then low-pass filtered at 46 Hz to eliminate electromyographic interference.
4. The heart rate identification method based on the 1DCNN + Inception Net + GRU fusion network as claimed in claim 1, wherein: in step S5, the model is formed by connecting a 1DCNN layer, an InceptionNet layer, a GRU layer and an FC layer in sequence.
5. The heart rate identification method based on the 1DCNN + Inception Net + GRU fusion network as claimed in claim 4, wherein: the 1DCNN layer consists of 2 1D convolution layers with a kernel size of 5 and a stride of 2; ReLU is used as the activation function after each layer, the ReLU function being f(x) = max(0, x), where x is the input data and f(x) is the data output after activation.
6. The heart rate identification method based on the 1DCNN + Inception Net + GRU fusion network as claimed in claim 4, wherein: the InceptionNet layer applies SAME convolution with three convolution kernels of sizes 1, 3 and 5, and an additional convolution layer with kernel size 1 is placed before the convolution layer with kernel size 5.
7. The heart rate identification method based on the 1DCNN + Inception Net + GRU fusion network as claimed in claim 4, wherein: each GRU unit of the GRU layer contains 32 hidden units.
8. The heart rate identification method based on the 1DCNN + Inception Net + GRU fusion network as claimed in claim 4, wherein: the FC layer consists of 2 fully connected layers and outputs the result through a Sigmoid, the Sigmoid function being
S(k) = 1 / (1 + e^(-k)),
where k is the result output by the FC layer and S(k) is the result output after the Sigmoid computation.
9. The heart rate identification method based on the 1DCNN + Inception Net + GRU fusion network as claimed in claim 1, wherein: in step S7, the trained model performs classification prediction on the test set data, an output greater than 0.5 labeling the piece of data as arrhythmia data and an output less than or equal to 0.5 labeling it as normal data.
CN202110103335.6A 2021-01-26 2021-01-26 Heart rate identification method based on 1DCNN + Inception Net + GRU fusion network Active CN112932431B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110103335.6A CN112932431B (en) 2021-01-26 2021-01-26 Heart rate identification method based on 1DCNN + Inception Net + GRU fusion network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110103335.6A CN112932431B (en) 2021-01-26 2021-01-26 Heart rate identification method based on 1DCNN + Inception Net + GRU fusion network

Publications (2)

Publication Number Publication Date
CN112932431A (en) 2021-06-11
CN112932431B (en) 2022-09-27

Family

ID=76236912

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110103335.6A Active CN112932431B (en) 2021-01-26 2021-01-26 Heart rate identification method based on 1DCNN + Inception Net + GRU fusion network

Country Status (1)

Country Link
CN (1) CN112932431B (en)


Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110047230A1 (en) * 2006-11-17 2011-02-24 Mcgee Steven J Method / process / procedure to enable: The Heart Beacon Rainbow Force Tracking
CN106529476A (en) * 2016-11-11 2017-03-22 重庆邮电大学 Deep stack network-based electroencephalogram signal feature extraction and classification method
US9968257B1 (en) * 2017-07-06 2018-05-15 Halsa Labs, LLC Volumetric quantification of cardiovascular structures from medical imaging
CN108256452A (en) * 2018-01-06 2018-07-06 天津大学 A kind of method of the ECG signal classification of feature based fusion
CN108766557A (en) * 2018-05-12 2018-11-06 鲁东大学 Automatic arrhythmia analysis method based on channel signal fused neural network
CN109620210A (en) * 2019-01-28 2019-04-16 山东科技大学 A kind of electrocardiosignal classification method of the CNN based on from coding mode in conjunction with GRU
CN109645980A (en) * 2018-11-14 2019-04-19 天津大学 A kind of rhythm abnormality classification method based on depth migration study
WO2019100566A1 (en) * 2017-11-27 2019-05-31 乐普(北京)医疗器械股份有限公司 Artificial intelligence self-learning-based static electrocardiography analysis method and apparatus
CN109858514A (en) * 2018-12-20 2019-06-07 北京以萨技术股份有限公司 A kind of video behavior classification method neural network based
US20190244108A1 (en) * 2018-02-08 2019-08-08 Cognizant Technology Solutions U.S. Corporation System and Method For Pseudo-Task Augmentation in Deep Multitask Learning
KR20190141326A (en) * 2018-06-14 2019-12-24 한국과학기술원 Method and Apparatus for ECG Arrhythmia Classification using a Deep Convolutional Neural Network
CN110974163A (en) * 2019-12-05 2020-04-10 中国人民解放军总医院 Multi-sensing information fusion control system and control method for oral medical imaging robot
CN111110228A (en) * 2020-01-17 2020-05-08 武汉中旗生物医疗电子有限公司 Electrocardiosignal R wave detection method and device
CN111242098A (en) * 2020-02-27 2020-06-05 西安交通大学 Electrocardiogram data classification method and system combining feature extraction and initiation network
CN111329469A (en) * 2020-03-05 2020-06-26 广州天嵌计算机科技有限公司 Arrhythmia prediction method
CN111358459A (en) * 2020-02-11 2020-07-03 广州视源电子科技股份有限公司 Arrhythmia identification method, device, equipment and storage medium
US20200305799A1 (en) * 2017-11-27 2020-10-01 Lepu Medical Technology (Beijing) Co., Ltd. Artificial intelligence self-learning-based automatic electrocardiography analysis method and apparatus


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
DUBEY, K.; AGARWAL, A.; LATHE, A.S.; KUMAR, R.; SRIVASTAVA, V.: "Self-attention based BiLSTM-CNN classifier for the prediction of ischemic and non-ischemic cardiomyopathy", 《ARXIV》 *
E. K. WANG, X. ZHANG, F. WANG,T.-Y.WU AND C.-M.CHEN: "Multilayer Dense Attention Model for Image Caption", 《ACCESS》 *
YANG HAO, HUANG MAOLIN, CAI ZHIPENG, YAO YINGJIA, LI JIANQING, LIU CHENGYU: "An arrhythmia heartbeat classification model fusing CNN and BiLSTM", Chinese Journal of Biomedical Engineering *
WANG LUDI: "Research on intelligent diagnosis methods for arrhythmia and heart failure", China Doctoral Dissertations Full-text Database *
GUO WEI: "Research and application of electrocardiosignal classification based on deep learning", China Master's Theses Full-text Database *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114041800A (en) * 2021-10-21 2022-02-15 吉林大学 Electrocardiosignal real-time classification method and device and readable storage medium
CN114041800B (en) * 2021-10-21 2024-01-30 吉林大学 Electrocardiosignal real-time classification method and device and readable storage medium

Also Published As

Publication number Publication date
CN112932431B (en) 2022-09-27

Similar Documents

Publication Publication Date Title
CN109948647B (en) Electrocardiogram classification method and system based on depth residual error network
CN111160139B (en) Electrocardiosignal processing method and device and terminal equipment
CN109117730B (en) Real-time electrocardiogram atrial fibrillation judgment method, device and system and storage medium
CN110680302B (en) Automatic identification method for electrocardiosignal characteristic wave
CN108511055B (en) Ventricular premature beat recognition system and method based on classifier fusion and diagnosis rules
CN108154519A (en) Dividing method, device and the storage medium of eye fundus image medium vessels
CN111956208B (en) ECG signal classification method based on ultra-lightweight convolutional neural network
CN110638430B (en) Method for building cascade neural network ECG signal arrhythmia classification model
CN113057648A (en) ECG signal classification method based on composite LSTM structure
CN111291727B (en) Method and device for detecting signal quality by using photoplethysmography
CN107239684A (en) A kind of feature learning method and system for ECG identifications
CN110555380A (en) Finger vein identification method based on Center Loss function
Xu et al. Dual-channel asymmetric convolutional neural network for an efficient retinal blood vessel segmentation in eye fundus images
CN111481192A (en) Electrocardiosignal R wave detection method based on improved U-Net
CN113995419A (en) Atrial fibrillation risk prediction system based on heartbeat rhythm signal and application thereof
CN111460953A (en) Electrocardiosignal classification method based on self-adaptive learning of countermeasure domain
CN112932431B (en) Heart rate identification method based on 1DCNN + Inception Net + GRU fusion network
CN115736944A (en) Atrial fibrillation detection model MCNN-BLSTM based on short-time single lead electrocardiosignal
CN110327033B (en) Myocardial infarction electrocardiogram screening method based on deep neural network
CN111899272B (en) Fundus image blood vessel segmentation method based on coupling neural network and line connector
CN113486752B (en) Emotion recognition method and system based on electrocardiosignal
CN111053552A (en) QRS wave detection method based on deep learning
CN113995417A (en) Electrocardiosignal abnormity prediction method and system based on LSTM self-encoder
CN113171102B (en) ECG data classification method based on continuous deep learning
CN116211315B (en) Single-lead electrocardiosignal auxiliary diagnosis method and diagnosis terminal

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant