CN116468723A - Cardiac resynchronization evaluation method, device, equipment and medium - Google Patents


Info

Publication number: CN116468723A
Application number: CN202310620002.XA
Authority: CN
Other languages: Chinese (zh)
Prior art keywords: image, echocardiogram, learning model, neural network, deep neural
Inventors: 刘竑, 石思远, 崔晨
Original assignee and applicant: Shanghai Weiwei Medical Technology Co., Ltd.
Current assignee: Shanghai Weiwei Medical Technology Co., Ltd. (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)

Classifications

    • A61B 8/5223 — Diagnosis using ultrasonic waves; data or image processing for extracting a diagnostic or physiological parameter from medical diagnostic data
    • A61B 5/352 — Detecting R peaks, e.g. for synchronising diagnostic apparatus; estimating R-R interval
    • A61B 5/366 — Detecting abnormal QRS complex, e.g. widening
    • A61B 5/7267 — Classification of physiological signals or data, e.g. using neural networks, involving training the classification device
    • A61B 8/065 — Measuring blood flow to determine blood output from the heart
    • G06N 3/0464 — Convolutional networks [CNN, ConvNet]
    • G06N 3/08 — Learning methods
    • G06T 7/0012 — Biomedical image inspection
    • G06T 2207/10132 — Ultrasound image
    • G06T 2207/20081 — Training; learning
    • G06T 2207/20084 — Artificial neural networks [ANN]
    • G06T 2207/30048 — Heart; cardiac
    • G06T 2207/30104 — Vascular flow; blood flow; perfusion


Abstract

The invention provides a cardiac resynchronization evaluation method, device, equipment and medium, relating to the technical field of image processing. The method comprises the following steps: acquiring an echocardiogram of a target patient, the echocardiogram comprising a cardiac ultrasound spectrum region image and an electrocardiographic region image; performing spectrum analysis on the echocardiogram of the target patient by using a trained deep neural network learning model to obtain cardiac function parameters; and obtaining a cardiac resynchronization evaluation result of the target patient according to the cardiac function parameters. The trained deep neural network learning model is generated by training with echocardiograms of reference patients as training data and the labeling data of those echocardiograms as labels. The method improves the cardiac resynchronization evaluation effect.

Description

Cardiac resynchronization evaluation method, device, equipment and medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a cardiac resynchronization evaluation method, apparatus, device, and medium.
Background
Cardiac resynchronization therapy (CRT) is a cardiac pacing method for patients with left ventricular contractile dysfunction combined with unsynchronized ventricular activation; it synchronizes or nearly synchronizes the electrical activity of the left and right ventricles by stimulating both ventricles (biventricular pacing) or the left ventricle alone. At present, clinicians mainly measure the cardiac resynchronization effect by the degree of reduction of the QRS width on the electrocardiogram. This approach has two drawbacks: on the one hand, it relies on doctors manually marking key points, so the evaluation result is limited by each doctor's medical experience and is highly subjective; on the other hand, the treatment success rate of this approach has proved low in actual clinical practice.
For this reason, it is highly desirable to provide a cardiac resynchronization evaluation scheme that ameliorates the above problems.
Disclosure of Invention
The invention provides a cardiac resynchronization evaluation method, device, equipment and medium, which comprehensively analyze the electrocardiographic region image and the cardiac ultrasound spectrum region image in an echocardiogram to obtain cardiac function parameters, and evaluate cardiac resynchronization based on these parameters, thereby improving the resynchronization evaluation effect.
In a first aspect, the present invention provides a cardiac resynchronization evaluation method comprising: acquiring an echocardiogram of a target patient, wherein the echocardiogram comprises a cardiac ultrasound spectrum region image and an electrocardiographic region image; performing spectrum analysis on the echocardiogram of the target patient by using a trained deep neural network learning model to obtain cardiac function parameters, wherein the trained deep neural network learning model is generated by training with an echocardiogram of a reference patient as training data and the labeling data of that echocardiogram as labels; and obtaining a cardiac resynchronization evaluation result of the target patient according to the cardiac function parameters.
The cardiac resynchronization evaluation method provided by the invention has the following beneficial effects. By training a deep neural network learning model, the cardiac function parameters required for assessing cardiac synchronization can be computed rapidly with the trained model. Compared with the traditional observation of the QRS width, the embodiment of the invention comprehensively uses parameters from both the electrocardiographic region image and the cardiac ultrasound spectrum region image, improving the resynchronization evaluation effect. A doctor does not need to manually mark key points; results are obtained directly, so the degree of automation is very high. Echocardiograms with local signal loss can be automatically screened and processed, giving better robustness and higher accuracy, and echocardiograms of different patients are recognized well.
In a possible embodiment, before acquiring the echocardiogram of the target patient, the method further comprises: acquiring an echocardiogram of a reference patient and the labeling data of that echocardiogram, wherein the labeling data comprises key reference points of the electrocardiographic region image and key reference points of the cardiac ultrasound spectrum region image; and training a deep neural network learning model on the echocardiogram of the reference patient, with the labeling data as labels. The trained deep neural network learning model is then used to perform spectrum analysis on the echocardiogram of the target patient to obtain cardiac function parameters, and the cardiac resynchronization evaluation result of the target patient is obtained from these parameters.
In a possible embodiment, training the deep neural network learning model on the echocardiogram with the labeling data as labels includes: inputting the electrocardiographic region image of the reference patient to the target detection network part of the deep neural network learning model to obtain a first output result comprising a rectangular frame of the QRS waveform, and cropping the electrocardiographic region image of the reference patient according to this rectangular frame to obtain a region-of-interest sub-image; and inputting the region-of-interest sub-image to the key point detection network part of the deep neural network learning model and detecting its key points to obtain a second output result comprising the coordinates of the key feature points of the electrocardiogram;

inputting the cardiac ultrasound spectrum region image of the reference patient to the target detection network part and the key point detection network part of the deep neural network learning model to obtain a third output result comprising the coordinates of the key feature points of the cardiac ultrasound spectrum region image; and comparing the second output result with the key reference points of the electrocardiographic region image and the third output result with the key reference points of the cardiac ultrasound spectrum region image to obtain comparison results, adjusting the parameters of the deep neural network learning model until the set number of iterations is reached or the loss value of the loss function is smaller than a set threshold, and outputting the trained deep neural network learning model.
In this embodiment, the target detection network part and the key point detection network part of the deep neural network learning model are trained together, completing the training of the model.
In a possible embodiment, before inputting the cardiac ultrasound spectrum region image of the reference patient to the target detection network part and the key point detection network part of the deep neural network learning model, the method further comprises: determining whether the cardiac ultrasound spectrum region image of the reference patient has a signal interruption, and resampling it if so. This embodiment effectively filters out echocardiograms with signal interruptions, avoids inaccurate localization of the key feature point coordinates in some cardiac ultrasound spectrum region images, and effectively improves the accuracy of the trained deep neural network learning model.
In a possible embodiment, determining whether the cardiac ultrasound spectrum region image of the reference patient has an information interruption includes: sampling an expansion area around the rectangular frame of the blood flow waveform of each cardiac cycle to obtain a sampled image, wherein the expansion area comprises the region extending N pixels outward from the boundary of the rectangular frame; performing singular value decomposition and inversion processing on the sampled image and computing the average pixel value inside the rectangle; when this average is larger than a set threshold, determining that the cardiac ultrasound spectrum region image of the reference patient has no information interruption; and when it is smaller than or equal to the set threshold, determining that it has an information interruption. The method of this embodiment effectively detects incomplete blood flow spectra and avoids inaccurate localization of the key feature point coordinates in some cardiac ultrasound spectrum region images.
In a possible embodiment, the cardiac ultrasound spectrum region image includes an apical four-chamber view, an apical five-chamber view, and a parasternal short-axis view. The key feature points of the electrocardiographic region image comprise the start point of the QRS wave and the R peak point. The key feature points of the cardiac ultrasound spectrum region image comprise the E-wave start point, the A-wave end point, the aortic blood flow start point, and the pulmonary artery blood flow start point. The key reference points of the electrocardiographic region image comprise the manually labeled coordinate positions of the start point of each QRS wave and of the R peak point in the electrocardiographic region image; the key reference points of the cardiac ultrasound spectrum region image comprise the manually labeled E-wave start point, A-wave end point, aortic blood flow start point and pulmonary artery blood flow start point of the blood flow waveform of each cardiac cycle in the cardiac ultrasound spectrum region image.
In a possible embodiment, the cardiac function parameters include at least two of the following: the interval between the R peaks of two adjacent cardiac cycles; the interval between the E-wave start point and the A-wave end point in the same cardiac cycle; the interval between the QRS start point and the E-wave start point of the next cardiac cycle; the interval between the QRS start point and the start of the aortic-valve blood flow spectrum; the interval between the QRS start point and the end of the aortic-valve blood flow spectrum; the interval between the A-wave end point and the start of the aortic-valve blood flow spectrum; the interval between the QRS start point and the start of the pulmonary-valve blood flow spectrum; and a parameter based on the E-wave start point and the A-wave end point.
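Given key-point times on a common clock (e.g. in milliseconds), each of these parameters reduces to a simple subtraction. A minimal sketch follows; the dictionary keys and parameter abbreviations are illustrative assumptions, not identifiers from the patent:

```python
def cardiac_function_parameters(kp: dict) -> dict:
    """Derive the timing intervals listed above from key-point times (ms).

    kp maps hypothetical key-point names ("qrs_start", "r_peak", "r_peak_next",
    "e_start", "e_start_next", "a_end", "aortic_flow_start", "aortic_flow_end",
    "pulmonary_flow_start") to times on one shared clock.
    """
    return {
        "RR": kp["r_peak_next"] - kp["r_peak"],                # R peak to next R peak
        "LVFT": kp["a_end"] - kp["e_start"],                   # E-wave start to A-wave end
        "QRS_E": kp["e_start_next"] - kp["qrs_start"],         # QRS start to next E-wave start
        "LPEI": kp["aortic_flow_start"] - kp["qrs_start"],     # QRS start to aortic flow start
        "SD": kp["aortic_flow_end"] - kp["qrs_start"],         # QRS start to aortic flow end
        "IsovolCT": kp["aortic_flow_start"] - kp["a_end"],     # A-wave end to aortic flow start
        "RPEI": kp["pulmonary_flow_start"] - kp["qrs_start"],  # QRS start to pulmonary flow start
    }
```

With detected key points in hand, the whole parameter set therefore costs only a handful of subtractions per cardiac cycle.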
In a second aspect, embodiments of the present application also provide a cardiac resynchronization evaluation device comprising modules/units that perform the method of any one of the possible designs of the first aspect. These modules/units may be implemented by hardware, or by hardware executing corresponding software.
In a third aspect, embodiments of the present application provide an electronic device including a processor and a memory, wherein the memory is configured to store one or more computer programs which, when executed by the processor, enable the electronic device to implement the method of any one of the possible designs of the first aspect.
In a fourth aspect, embodiments of the present application also provide a computer-readable storage medium, including a computer program, which when run on an electronic device causes the electronic device to perform the method of any one of the possible designs of the above aspect.
In a fifth aspect, embodiments of the present application also provide a computer program product which, when run on an electronic device, causes the electronic device to perform the method of any one of the possible designs of the above aspects.
The advantageous effects of the second to fifth aspects are as described for the first aspect and are not repeated here.
Drawings
FIG. 1 is a flowchart of a method for cardiac resynchronization evaluation according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a deep neural network learning model according to an embodiment of the present invention;
FIG. 3 is a schematic flow chart of a training method of a deep neural network learning model according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating determination of whether a cardiac ultrasound spectrum region image has an information interruption according to an embodiment of the present invention;
fig. 5 is a schematic diagram of a fourth-order Hourglass network according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a cardiac resynchronization evaluation device according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
To address the problem that clinicians' reliance on the degree of QRS-width reduction on the electrocardiogram gives a poor measure of the cardiac resynchronization effect, the invention trains a deep neural network learning model on echocardiograms, performs spectrum analysis on the echocardiogram of a target patient with the trained model to obtain cardiac function parameters, and then obtains the cardiac resynchronization evaluation result of the target patient from these parameters. This improves the resynchronization evaluation effect, avoids manual labeling, offers a very high degree of automation, automatically screens and processes echocardiograms with local signal loss, has better robustness and higher accuracy, and recognizes echocardiograms of different patients well.
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application. Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the invention. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise. The term "coupled" includes both direct and indirect connections, unless stated otherwise. The terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated.
In embodiments of the invention, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
Fig. 1 shows a flowchart of a cardiac resynchronization evaluation method according to an embodiment of the present invention, which may be performed by an electronic device having image processing capabilities, the method comprising:
s101, acquiring an echocardiogram of a target patient, wherein the echocardiogram comprises an electrocardiographic region image and an electrocardiographic region image.
S102, performing frequency spectrum analysis on the echocardiogram of the target patient by using the trained deep neural network learning model to obtain cardiac function parameters.
And S103, obtaining a cardiac resynchronization evaluation result of the target patient according to the cardiac function parameter.
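The three-step flow of S101–S103 can be sketched as a small inference pipeline. The model interface, parameter names, and the in-range decision rule below are illustrative assumptions, not the patent's actual implementation:

```python
def evaluate_resynchronization(echocardiogram, model, reference_ranges):
    """Sketch of S101-S103: spectrum analysis with a trained model, then assessment.

    echocardiogram   : image holding the ECG region and the ultrasound spectrum region
    model            : trained network mapping the image to cardiac function parameters
    reference_ranges : {parameter: (low, high)} intervals (an assumed evaluation rule)
    """
    params = model(echocardiogram)                       # S102: e.g. {"RR": 800, "LVFT": 280}
    in_range = {name: lo <= params[name] <= hi
                for name, (lo, hi) in reference_ranges.items()}
    return {
        "parameters": params,
        "in_range": in_range,
        "synchronized": all(in_range.values()),          # S103: illustrative decision
    }
```

The per-parameter in-range flags are kept alongside the overall verdict so a reviewing clinician can see which interval drove the result.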
In a possible embodiment, before the echocardiogram of the target patient is acquired, the deep neural network learning model is first trained with echocardiograms of reference patients as training data and their labeling data as labels, which specifically includes: S104, acquiring an echocardiogram of a reference patient and the labeling data of that echocardiogram; and S105, training the deep neural network learning model on the echocardiogram of the reference patient with the labeling data as labels.
The labeling data comprises the key reference points of the electrocardiographic region image and the key reference points of the cardiac ultrasound spectrum region image; for example, the cardiac ultrasound spectrum region image comprises manually labeled apical four-chamber, apical five-chamber, and parasternal short-axis views. The key reference points of the electrocardiographic region image comprise the manually labeled coordinate positions of the start point of each ventricular depolarization (QRS) wave and of the R peak point; the key reference points of the cardiac ultrasound spectrum region image comprise the manually labeled E-wave start point, A-wave end point, aortic blood flow start point and pulmonary artery blood flow start point of the blood flow waveform of each cardiac cycle.
In one possible embodiment, as shown in FIG. 2, the deep neural network learning model includes a target detection network part and a key point detection network part. Illustratively, the target detection network part may be a YOLO network and the key point detection network part may be an Hourglass network. As can be seen from FIG. 2, the echocardiogram of the reference patient is input into the deep neural network learning model for training, and the trained model can finally output the cardiac function parameters.
Specifically, in connection with fig. 3, the training method for the deep neural network learning model may include the following steps:
and a step a of inputting an electrocardio region image in an echocardiogram of the reference patient to a target detection network part in the deep neural network learning model to obtain a first output result, wherein the first output result comprises a rectangular frame of a QRS waveform. And cutting the electrocardio region image in the echocardiogram of the reference patient according to the rectangular frame of the QRS waveform to obtain a region-of-interest sub-image. Inputting a region-of-interest sub-image to a key point detection network part in the deep neural network learning model, and detecting the key point of the region-of-interest sub-image to obtain a second output result, wherein the second output result comprises coordinates of key feature points of an electrocardiographic region image. The key feature points of the electrocardiographic region image comprise: the start of the QRS wave and the R peak value point.
Step b: input the cardiac ultrasound spectrum region image in the echocardiogram of the reference patient to the target detection network part and the key point detection network part of the deep neural network learning model to obtain a third output result comprising the coordinates of the key feature points of the cardiac ultrasound spectrum region image, namely the E-wave start point, the A-wave end point, the aortic blood flow start and end points, and the pulmonary artery blood flow start and end points.
Optionally, after the cardiac ultrasound spectrum region image of the reference patient is input, it may first be determined whether the image has an information interruption; if so, the image is resampled, otherwise processing continues. In one possible embodiment, the interruption check may be: sampling an expansion area around the rectangular frame of the blood flow waveform of each cardiac cycle to obtain a sampled image, the expansion area comprising the region extending N pixels outward from the boundary of the rectangular frame; performing singular value decomposition and inversion processing on the sampled image and computing the average pixel value inside the rectangle; when this average is larger than a set threshold, determining that there is no information interruption, and when it is smaller than or equal to the set threshold, determining that there is an information interruption. For example, as shown in FIG. 4, after singular value decomposition (SVD) is performed on the rectangles extending 5 pixels beyond the left and right boundaries of the rectangular frame of the spectrum region image, an in-rectangle pixel average of 1 or more indicates a complete blood flow spectrum, while a value below 1 indicates an incomplete blood flow spectrum.
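A hedged sketch of this check follows. The patent does not specify the display polarity, the SVD rank kept, or the exact inversion, so the code assumes a dark waveform on a bright 8-bit background, a rank-1 reconstruction, and a 255-based inversion; under those assumptions a signal-free window scores near 0 and is flagged as interrupted:

```python
import numpy as np

def has_information_interruption(spectrum, box, n_expand=5, threshold=1.0, rank=1):
    """Sketch of the embodiment's SVD interruption check (details are assumptions).

    spectrum : grayscale cardiac ultrasound spectrum region image (H x W array),
               assumed displayed as a dark waveform on a bright background
    box      : (x0, y0, x1, y1) rectangle of one cardiac cycle's blood-flow waveform
    """
    x0, y0, x1, y1 = box
    h, w = spectrum.shape
    # sample the expansion area: the rectangle grown by n_expand pixels per side
    patch = spectrum[max(y0 - n_expand, 0):min(y1 + n_expand, h),
                     max(x0 - n_expand, 0):min(x1 + n_expand, w)].astype(float)
    # singular value decomposition, keeping only the dominant component(s)
    u, s, vt = np.linalg.svd(patch, full_matrices=False)
    recon = (u[:, :rank] * s[:rank]) @ vt[:rank]
    inverted = 255.0 - recon          # inversion: waveform pixels become bright
    # high in-rectangle mean => waveform present => complete spectrum
    return float(inverted.mean()) <= threshold
```

With the default threshold of 1, any substantial waveform drives the inverted mean far above 1, so only windows where the blood flow trace is essentially absent are flagged for resampling.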
Step c: compare the second output result with the key reference points of the electrocardiographic region image and the third output result with the key reference points of the cardiac ultrasound spectrum region image to obtain comparison results, adjust the parameters of the deep neural network learning model until the set number of iterations is reached or the loss value of the loss function is smaller than a set threshold, and output the trained deep neural network learning model.
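Step c's stopping rule (a fixed iteration budget, or loss below a set threshold) can be shown with a generic gradient-descent loop. The loop structure is the point here; `loss_fn` and `grad_fn` stand in for the network's comparison against the key reference points and are illustrative assumptions:

```python
def train_until(params, grad_fn, loss_fn, lr=0.1, max_iters=1000, loss_threshold=1e-4):
    """Generic parameter-adjustment loop mirroring step c's stopping rule."""
    for _ in range(max_iters):                    # stop 1: set number of iterations
        if loss_fn(params) < loss_threshold:      # stop 2: loss below the set threshold
            break
        # adjust the parameters against the gradient of the loss
        params = [p - lr * g for p, g in zip(params, grad_fn(params))]
    return params
```

A one-parameter quadratic makes the early-stop behaviour easy to verify: the loop exits as soon as the squared error falls below the threshold, well before the iteration budget is spent.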
By way of example, parameters in the deep neural network learning model may include configuration variables internal to the neural network learning model and configuration parameters external to the neural network learning model, such as learning rate, number of hidden layers, number of hidden layer units, selection of activation functions, and the like. The loss function may be a classification loss function, e.g. the classification loss function satisfies the following formula:
wherein x_i represents the i-th input value, y_i represents the i-th output value, ŷ_i is the true category, and LOSS is the distance between y_i and the true ŷ_i.
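The formula itself is rendered as an image in the original publication; purely as an illustration of a distance-style loss between predicted and true key points of the kind described, one might use a mean squared distance (the function name and coordinate layout are assumptions):

```python
import numpy as np

def keypoint_distance_loss(pred, ref):
    """Mean squared Euclidean distance between predicted key-point
    coordinates and the manually labeled reference points, both given
    as arrays of shape (n_points, 2)."""
    pred = np.asarray(pred, dtype=float)
    ref = np.asarray(ref, dtype=float)
    # Squared distance per point, averaged over all key points.
    return float(np.mean(np.sum((pred - ref) ** 2, axis=1)))
```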
It should be understood that steps a and b above may be performed in parallel or sequentially, which is not limited in this embodiment. After a trained deep neural network learning model has been obtained according to the above method, the echocardiogram of the target patient can be input into it. The key feature points of the electrocardiographic region image comprise the start point of the QRS wave and the R-wave peak point. The key feature points of the cardiac ultrasound spectral region image comprise the E-wave start point, the A-wave end point, the aortic blood flow start and end points, and the pulmonary artery blood flow start and end points. Based on the E-wave start point and the A-wave end point, the interval between them within the same cardiac cycle, that is, the left ventricular filling time (LVFT), can be calculated; based on the R-wave peak points, the interval between the R-wave peaks of two adjacent cardiac cycles (time between two QRS complexes, RR) can be calculated.
Similarly, based on the start point of the QRS wave and the E-wave start point, the interval between the QRS start point and the E-wave start point of the next cardiac cycle (apical four-chamber view) is calculated, giving the QRS-to-next-E interval (time between the start of the QRS and the onset of the next E wave, QRS-E); based on the QRS start point and the aortic blood flow start point, the interval between the QRS start point and the start of the blood flow spectrum at the aortic valve (apical five-chamber view) is calculated, giving the left ventricular pre-ejection interval (LPEI); based on the QRS start point and the end of the aortic valve blood flow spectrum, the interval between them (apical five-chamber view) is calculated, giving the systolic duration (SD); based on the A-wave end point and the aortic valve blood flow start point, the interval between them (four-chamber and five-chamber views) is calculated, giving the isovolumic contraction time (isovolCT); based on the QRS start point and the pulmonary artery blood flow start point, the interval between the QRS start point and the start of the blood flow spectrum at the pulmonary valve (parasternal short-axis view) is calculated, giving the right ventricular pre-ejection interval (RPEI); and based on the E-wave start point and the A-wave end point, the ratio (apical four-chamber view) of the time between them within the same cardiac cycle to the duration of that cardiac cycle is calculated, giving the diastolic ventricular filling time (DFT).
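Once the key feature points have been located, the interval parameters above reduce to coordinate differences. A sketch, assuming the key-point positions have already been converted to milliseconds and using hypothetical dictionary keys (isovolCT, which spans adjacent cycles, is omitted but follows the same pattern):

```python
def cardiac_function_params(kp, kp_next):
    """Derive interval parameters from key-point times (ms) of one cardiac
    cycle (`kp`) and the following cycle (`kp_next`).  Expected keys:
    'r_peak', 'qrs_start', 'e_start', 'a_end', 'ao_start', 'ao_end',
    'pa_start' (aortic/pulmonary blood-flow spectrum boundaries)."""
    rr = kp_next["r_peak"] - kp["r_peak"]       # RR: peak-to-peak interval
    lvft = kp["a_end"] - kp["e_start"]          # LVFT: E start to A end
    return {
        "RR": rr,
        "LVFT": lvft,
        "QRS-E": kp_next["e_start"] - kp["qrs_start"],  # QRS to next E wave
        "LPEI": kp["ao_start"] - kp["qrs_start"],       # left pre-ejection
        "SD": kp["ao_end"] - kp["qrs_start"],           # systolic duration
        "RPEI": kp["pa_start"] - kp["qrs_start"],       # right pre-ejection
        "DFT": lvft / rr,          # filling time as a fraction of the cycle
    }
```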
In a possible embodiment, when the key-point detection network part is a stacked Hourglass network, the structure consists of an upper and a lower branch: a max-pooling layer first reduces the feature size, after which upsampling restores it and the result is fused with the shallow features, achieving feature fusion at different scales and helping to increase key-point localization accuracy. The first-order Hourglass structure is the simplest bypass addition: the upper branch operates at the original scale, while the lower branch downsamples and then upsamples. A second-order Hourglass nests a first-order structure inside the first-order one, and higher orders nest level by level in the same way, with essentially identical submodules. Hourglass networks are typically second- or fourth-order; fig. 5 illustrates a fourth-order Hourglass structure.
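The bypass-and-nest recursion can be shown with plain arrays; identity mappings stand in for the residual convolution blocks a real Hourglass uses, so this is a structural sketch only:

```python
import numpy as np

def max_pool2(x):
    """2x2 max pooling (spatial dims assumed even)."""
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

def upsample2(x):
    """Nearest-neighbour 2x upsampling."""
    return x.repeat(2, axis=0).repeat(2, axis=1)

def hourglass(x, order):
    """Recursive hourglass: the upper branch keeps the original scale, the
    lower branch max-pools, recurses (nesting one hourglass per order),
    then upsamples, and the two branches are fused by addition."""
    lower = max_pool2(x)                 # lower branch: reduce feature size
    if order > 1:
        lower = hourglass(lower, order - 1)   # nested hourglass
    return x + upsample2(lower)          # fuse with the shallow features
```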
In a possible embodiment, the object detection network part may be a YOLO network, which may include a CBS module, a Focus module, a CSP module and an SPP module. The CBS module is essentially a stack of convolution and normalization layers. The Focus module performs a variant of downsampling whose purpose is to shrink the image, reducing the dimensionality of the feature map while retaining as much of the effective information as possible; it reduces the data volume and thus the model's computational load at the cost of a small amount of information, and is mainly applied in the model's backbone network. The CSP module mainly splits its input into two branches, and the SPP module converts a feature map of arbitrary size passed down from the preceding layer into a feature map of fixed size, allowing the model to learn more image features.
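The fixed-size conversion performed by the SPP module can be illustrated with classic spatial pyramid pooling, which max-pools over grids of several sizes and concatenates the results; the grid levels chosen here and the single-channel simplification are assumptions:

```python
import numpy as np

def spp(feature_map, levels=(1, 2, 4)):
    """Spatial pyramid pooling: max-pool one channel over an n x n grid for
    each level and concatenate, so a feature map of any spatial size yields
    a fixed-length vector (1 + 4 + 16 = 21 values for the default levels)."""
    h, w = feature_map.shape
    out = []
    for n in levels:
        ys = np.linspace(0, h, n + 1).astype(int)   # row bin edges
        xs = np.linspace(0, w, n + 1).astype(int)   # column bin edges
        for i in range(n):
            for j in range(n):
                out.append(feature_map[ys[i]:ys[i + 1], xs[j]:xs[j + 1]].max())
    return np.array(out)
```

Feature maps of different sizes produce the same output length, which is what lets the downstream layers have fixed dimensions.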
Further, since these cardiac function parameters fully describe the synchrony of cardiac motion, cardiac resynchronization evaluation can be performed on their basis. For example, when, compared with the reference cardiac function parameters, the target patient's LVFT is decreased, this indicates that ventricular and atrial motion is not synchronized. For another example, when the LPEI is decreased, the LVFT is decreased and the QRS-E is increased relative to the reference, this indicates that left and right ventricular motion is not synchronized. For yet another example, when the calculated parameters satisfy LVFT/RR < 50%, LPEI - RPEI > 40 ms, or LPEI > 120 ms, the evaluation system issues an abnormality warning, which the physician can analyze further. Compared with the prior art, which judges the resynchronization effect only from the change in QRS width, this greatly improves the evaluation and can guide the placement of the pacemaker in CRT therapy, giving the method substantial clinical significance.
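The screening thresholds given in the example can be wired into a simple rule check; treating the three conditions as independent flags is an assumption, since the text does not state whether they must hold jointly:

```python
def resynchronization_warning(params):
    """Flag the example abnormality conditions from the text; `params`
    holds the interval parameters in milliseconds."""
    reasons = []
    if params["LVFT"] / params["RR"] < 0.5:
        reasons.append("LVFT/RR < 50%")
    if params["LPEI"] - params["RPEI"] > 40:
        reasons.append("LPEI - RPEI > 40 ms")
    if params["LPEI"] > 120:
        reasons.append("LPEI > 120 ms")
    return reasons   # non-empty -> raise a warning for the physician
```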
According to the invention, by training a deep neural network learning model to perform spectral analysis on the echocardiogram of the target patient, the resynchronization evaluation effect can be improved without manual work, giving a very high degree of automation; echocardiograms with local signal loss can be automatically screened and handled, yielding better robustness, higher accuracy, and a good recognition effect across the echocardiograms of different patients.
Based on the same concept as the foregoing embodiments, the embodiments of the present application further provide a cardiac resynchronization evaluation apparatus 600 for implementing the functions of the electronic device in the foregoing method. The apparatus 600 may be, for example, an electronic device or a component in an electronic device, such as a chip system; in the embodiments of the present application, a chip system may consist of a chip or may include a chip and other discrete devices. In one example, as shown in fig. 6, the apparatus 600 includes: an acquisition unit 601 for acquiring an echocardiogram of a target patient, the echocardiogram comprising a cardiac ultrasound spectral region image and an electrocardiographic region image; and an evaluation unit 603 for performing spectral analysis on the echocardiogram of the target patient using the trained deep neural network learning model to obtain cardiac function parameters, and obtaining a cardiac resynchronization evaluation result of the target patient according to the cardiac function parameters.
Optionally, the acquisition unit 601 is further configured to acquire an echocardiogram of a reference patient and the labeling data of that echocardiogram. The apparatus further includes a training unit 602 for training the deep neural network learning model using the echocardiogram of the reference patient with the labeling data as labels.
In an example, when the training unit 602 uses the labeling data as labels to train the deep neural network learning model with the echocardiogram, it is specifically configured to: input the electrocardiographic region image of the reference patient to the target detection network part of the deep neural network learning model to obtain a first output result comprising a rectangular frame of the QRS waveform, and crop the electrocardiographic region image of the reference patient according to that rectangular frame to obtain a region-of-interest sub-image; input the region-of-interest sub-image to the key-point detection network part of the deep neural network learning model and perform key-point detection on it to obtain a second output result comprising the coordinates of the key feature points of the electrocardiographic region image;
input the cardiac ultrasound spectral region image in the echocardiogram of the reference patient to the target detection network part and the key-point detection network part of the deep neural network learning model to obtain a third output result comprising the coordinates of the key feature points of the cardiac ultrasound spectral region image;
and compare the second output result with the key reference points of the electrocardiographic region image and the third output result with the key reference points of the cardiac ultrasound spectral region image to obtain comparison results, and adjust the parameters of the deep neural network learning model until the set number of iterations is reached or the loss value of the loss function falls below a set threshold, then output the trained deep neural network learning model.
In yet another example, before inputting the cardiac ultrasound spectral region image of the reference patient to the target detection network part and the key-point detection network part of the deep neural network learning model, the training unit 602 is further configured to:
judge whether the cardiac ultrasound spectral region image of the reference patient has an information interruption and, if so, resample it.
In yet another example, when judging whether the cardiac ultrasound spectral region image of the reference patient has an information interruption, the training unit 602 is specifically configured to: sample an expansion region around the rectangular frame of the blood flow waveform of each cardiac cycle to obtain a sampled image, the expansion region comprising the area extending N pixels outward from the boundary of the rectangular frame; perform singular value decomposition and inversion on the sampled image to obtain the average value of the pixels inside the rectangle; determine that the image has no information interruption when this average is greater than a set threshold; and determine that the image has an information interruption when the average is less than or equal to the set threshold.
The division of modules in the embodiments of the present application is schematic and is merely a division by logical function; other divisions are possible in actual implementation. In addition, the functional modules in the embodiments of the present application may be integrated in one processor, may exist separately and physically, or two or more modules may be integrated in one module. The integrated modules may be implemented in hardware or as software functional modules.
In yet another example, as shown in fig. 7, a communication apparatus 700 includes at least one processor 710 and a memory 720; the communication apparatus 700 may be the electronic device referred to above. The memory 720 stores a computer program and is coupled to the processor 710. The coupling in the embodiments of the present invention may be an indirect coupling or a communication connection between devices, units or modules, in electrical, mechanical or other form, for the exchange of information between them. As another implementation, the memory 720 may also be located outside the communication apparatus 700. The processor 710 may operate in conjunction with the memory 720 and may invoke the computer program stored in it. At least one of the at least one memory may be included in the processor.
In some embodiments, the communication apparatus 700 may further include a communication interface 730 for communicating with other devices over a transmission medium, so that the communication apparatus 700 can communicate with those devices, which may for example be other electronic devices. The communication interface 730 may be a transceiver, a circuit, a bus, a module or another type of communication interface. The processor 710 receives and transmits information through the communication interface 730 and uses it to implement the methods of the embodiments described above; for example, the communication interface 730 may be configured to receive indication information or to transmit data.
In the embodiments of the present application, the processor may be a general purpose processor, a digital signal processor, an application specific integrated circuit, a field programmable gate array or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or execute the methods, steps, and logic blocks disclosed in the embodiments of the present application. The general purpose processor may be a microprocessor or any conventional processor or the like. The steps of a method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware processor for execution, or in a combination of hardware and software modules in the processor for execution.
In the embodiments of the present application, the memory may be a non-volatile memory, such as a hard disk drive (HDD) or a solid-state drive (SSD), or a volatile memory, such as a random-access memory (RAM). The memory may also be any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited to these. The memory in the embodiments of the present application may also be a circuit or any other device capable of implementing a storage function, for storing a computer program and/or data.
The invention also provides a computer readable storage medium having stored thereon a computer program which when executed by a computer implements the method of the above-described method embodiments.
The invention also provides a computer program product which, when executed by a computer, implements the method of the above-described method embodiments.
The invention also provides a chip or a chip module, which is coupled with the memory and used for executing the computer program stored in the memory, so that the electronic equipment executes the method of the embodiment of the method.
The methods provided in the embodiments of the present application may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product, which includes one or more computer instructions. When the computer program is loaded and executed on a computer, the flows or functions according to the embodiments of the present invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, a network device, a user device, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server or data center to another by wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any medium accessible to a computer, or a data storage device such as a server or data center integrating one or more media. The medium may be a magnetic medium (e.g., a floppy disk, hard disk or magnetic tape), an optical medium (e.g., a digital video disc (DVD)), a semiconductor medium (e.g., an SSD), or the like.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present application without departing from the scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims and the equivalents thereof, the present application is intended to cover such modifications and variations.

Claims (12)

1. A method of cardiac resynchronization evaluation, the method comprising:
acquiring an echocardiogram of a target patient, wherein the echocardiogram comprises a cardiac ultrasound spectral region image and an electrocardiographic region image;
performing spectral analysis on the echocardiogram of the target patient using a trained deep neural network learning model to obtain cardiac function parameters, wherein the trained deep neural network learning model is generated by training using an echocardiogram of a reference patient as training data and the labeling data of that echocardiogram as labels;
and obtaining a cardiac resynchronization evaluation result of the target patient according to the cardiac function parameter.
2. The method of claim 1, further comprising, prior to acquiring the echocardiogram of the target patient:
acquiring an echocardiogram of a reference patient and labeling data of the echocardiogram of the reference patient, wherein the labeling data comprises key reference points of the electrocardiographic region image and key reference points of the cardiac ultrasound spectral region image;
and training the deep neural network learning model by using the labeling data as a label and utilizing the echocardiogram of the reference patient to obtain a trained deep neural network learning model.
3. The method of claim 2, wherein training a deep neural network learning model using the echocardiogram with the labeling data as a label comprises:
inputting an echocardiogram of the reference patient to a target detection network part in the deep neural network learning model to obtain a first output result, wherein the first output result comprises a rectangular frame of a QRS waveform, and cutting an electrocardiographic region image in the echocardiogram of the reference patient according to the rectangular frame of the QRS waveform to obtain a region-of-interest sub-image;
inputting a region-of-interest sub-image into a key point detection network part in the deep neural network learning model, and detecting the key points of the region-of-interest sub-image to obtain a second output result, wherein the second output result comprises coordinates of key feature points of the electrocardio region image;
inputting the cardiac ultrasound spectral region image in the echocardiogram of the reference patient to the target detection network part and the key point detection network part of the deep neural network learning model to obtain a third output result, wherein the third output result comprises coordinates of key feature points of the cardiac ultrasound spectral region image;
comparing the second output result with key reference points of the electrocardiographic region image and the third output result with key reference points of the cardiac ultrasound spectral region image to obtain comparison results, and adjusting parameters in the deep neural network learning model until the set number of iterations is reached or the loss value of the loss function falls below a set threshold, and outputting the trained deep neural network learning model.
4. The method of claim 3, further comprising, prior to inputting the cardiac ultrasound spectral region image of the reference patient into the target detection network part and the key point detection network part of the deep neural network learning model:
judging whether the cardiac ultrasound spectral region image of the reference patient has an information interruption, and if so, resampling the echocardiogram of the reference patient.
5. The method of claim 4, wherein judging whether the cardiac ultrasound spectral region image of the reference patient has an information interruption comprises:
sampling an expansion area around a rectangular frame of the blood flow waveform of each cardiac cycle to obtain a sampling image, wherein the expansion area comprises an area expanding N pixels from the boundary of the rectangular frame to the periphery;
performing singular value decomposition and inversion processing on the sampled image to obtain an average value of pixels inside the rectangle of the sampled image;
when the average value of pixels inside the rectangle is greater than a set threshold value, determining that the cardiac ultrasound spectral region image of the reference patient has no information interruption;
and when the average value of pixels inside the rectangle is less than or equal to the set threshold value, determining that the cardiac ultrasound spectral region image of the reference patient has an information interruption.
6. The method of any one of claims 3 to 5, wherein the cardiac ultrasound spectral region image comprises an apical four-chamber view, an apical five-chamber view, and a parasternal short-axis view;
the key feature points of the electrocardiographic region image comprise: a start point of the QRS wave and an R-wave peak point; the key feature points of the cardiac ultrasound spectral region image comprise: an E-wave start point, an A-wave end point, an aortic blood flow start point, and a pulmonary artery blood flow start point;
the key reference points of the electrocardiographic region image comprise the manually labeled coordinate positions of the start point of each QRS wave and of the R-wave peak point in the electrocardiographic region image; the key reference points of the cardiac ultrasound spectral region image comprise the manually labeled E-wave start point, A-wave end point, aortic blood flow start point, and pulmonary artery blood flow start point of the blood flow waveform of each cardiac cycle in the cardiac ultrasound spectral region image.
7. The method according to any one of claims 2 to 5, wherein the cardiac function parameters include at least two of the following parameters: the interval between the R-wave peaks of two adjacent cardiac cycles, the interval between the E-wave start point and the A-wave end point in the same cardiac cycle, the interval between the QRS start point and the E-wave start point of the next cardiac cycle, the interval between the QRS start point and the start of the aortic valve blood flow spectrum, the interval between the QRS start point and the end of the aortic valve blood flow spectrum, the interval between the A-wave end point and the start of the aortic valve blood flow spectrum, the interval between the QRS start point and the start of the pulmonary valve blood flow spectrum, and the ratio of the time between the E-wave start point and the A-wave end point in the same cardiac cycle to the duration of the cardiac cycle.
8. A cardiac resynchronization evaluation device, the device comprising:
an acquisition unit for acquiring an echocardiogram of a target patient, the echocardiogram comprising a cardiac ultrasound spectral region image and an electrocardiographic region image;
an evaluation unit for performing spectral analysis on the echocardiogram of the target patient using the trained deep neural network learning model to obtain cardiac function parameters, and obtaining a cardiac resynchronization evaluation result of the target patient according to the cardiac function parameters, wherein the trained deep neural network learning model is generated by training using an echocardiogram of a reference patient as training data and the labeling data of that echocardiogram as labels.
9. The apparatus of claim 8, wherein the acquisition unit is further configured to:
acquiring an echocardiogram of a reference patient and labeling data of the echocardiogram of the reference patient, wherein the labeling data comprises key reference points of the electrocardiographic region image and key reference points of the cardiac ultrasound spectral region image;
the device further comprises a training unit, wherein the training unit is used for training the deep neural network learning model by using the labeling data as a label and utilizing the echocardiogram of the reference patient to obtain a trained deep neural network learning model.
10. The apparatus according to claim 9, wherein, when the training unit uses the labeling data as labels to train the deep neural network learning model with the echocardiogram, it is specifically configured to:
input the electrocardiographic region image of the reference patient to the target detection network part of the deep neural network learning model to obtain a first output result comprising a rectangular frame of the QRS waveform, and crop the electrocardiographic region image of the reference patient according to that rectangular frame to obtain a region-of-interest sub-image;
input the region-of-interest sub-image to the key point detection network part of the deep neural network learning model and perform key point detection on it to obtain a second output result comprising coordinates of key feature points of the electrocardiographic region image;
input the cardiac ultrasound spectral region image of the reference patient to the target detection network part and the key point detection network part of the deep neural network learning model to obtain a third output result comprising coordinates of key feature points of the cardiac ultrasound spectral region image;
and compare the second output result with key reference points of the electrocardiographic region image and the third output result with key reference points of the cardiac ultrasound spectral region image to obtain comparison results, and adjust parameters in the deep neural network learning model until the set number of iterations is reached or the loss value of the loss function falls below a set threshold, then output the trained deep neural network learning model.
11. An electronic device comprising a memory and a processor, the memory having stored thereon a computer program executable on the processor, which when executed by the processor causes the electronic device to perform the method of any of claims 1 to 7.
12. A computer readable storage medium having a computer program stored therein, which, when executed by a processor, implements the method of any of claims 1 to 7.
CN202310620002.XA 2023-05-29 2023-05-29 Cardiac resynchronization evaluation method, device, equipment and medium Pending CN116468723A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310620002.XA CN116468723A (en) 2023-05-29 2023-05-29 Cardiac resynchronization evaluation method, device, equipment and medium


Publications (1)

Publication Number Publication Date
CN116468723A true CN116468723A (en) 2023-07-21

Family

ID=87179153

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310620002.XA Pending CN116468723A (en) 2023-05-29 2023-05-29 Cardiac resynchronization evaluation method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN116468723A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117994249A (en) * 2024-04-02 2024-05-07 四川省医学科学院·四川省人民医院 Deep learning-based heart valve regurgitation severity ultrasonic assessment method
CN117994249B (en) * 2024-04-02 2024-06-11 四川省医学科学院·四川省人民医院 Deep learning-based heart valve regurgitation severity ultrasonic assessment method


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination