CN115105032A - Image processing apparatus for evaluating cardiac image and ventricular state identification method

Image processing apparatus for evaluating cardiac image and ventricular state identification method

Info

Publication number
CN115105032A
Authority
CN
China
Prior art keywords
image
gray
interest
region
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110310722.7A
Other languages
Chinese (zh)
Inventor
许鸿生
利建宏
黄宜瑾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Acer Inc
Original Assignee
Acer Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Acer Inc filed Critical Acer Inc
Priority to CN202110310722.7A priority Critical patent/CN115105032A/en
Publication of CN115105032A publication Critical patent/CN115105032A/en
Pending legal-status Critical Current

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 - Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/02007 - Evaluating blood vessel condition, e.g. elasticity, compliance
    • A61B5/02028 - Determining haemodynamic parameters not otherwise provided for, e.g. cardiac contractility or left ventricular ejection fraction
    • A61B5/72 - Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08 - Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0883 - Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the heart
    • A61B8/52 - Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Public Health (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Veterinary Medicine (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Cardiology (AREA)
  • Physiology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Vascular Medicine (AREA)
  • Artificial Intelligence (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention provides an image processing apparatus for evaluating a cardiac image and a ventricular state identification method. In the method, a region of interest is determined from a plurality of target images, gray-scale value changes of a plurality of pixels in the region of interest across those target images are determined, and one or more representative images are obtained according to the gray-scale value changes. The target images contain the pixels within the endocardial contour of the left ventricle, the boundaries of the region of interest lie approximately on the two sides of the bottom of the endocardial contour, and the region of interest corresponds to the mitral valve. The gray-scale value changes are related to the motion of the mitral valve, and the representative images are used to assess the state of the left ventricle. Thus, end systole and end diastole can be identified quickly and accurately.

Description

Image processing device for evaluating cardiac image and ventricular state identification method
Technical Field
The present invention relates to an image recognition technique, and more particularly, to an image processing apparatus and a ventricular state recognition method for evaluating a cardiac image.
Background
Clinically, there are currently many ways to assess the condition of the heart, one of which is to measure the left ventricular ejection fraction (LVEF), i.e., how much blood volume is ejected from the left ventricle at each heartbeat. Notably, measuring the left ventricular ejection fraction requires the end-systolic volume (ESV) and the end-diastolic volume (EDV). Therefore, end systole (ES) and end diastole (ED) must be identified within the cardiac cycle so that the EDV and ESV can be calculated using Simpson's method. Current technology can identify ED from the R wave in the electrocardiogram (ECG), at which point the volume of the left ventricle is at its maximum; ES is identified at the end of the T wave of the ECG, at which point the volume of the left ventricle is at its minimum. However, identifying ES in the ECG is not easy.
Disclosure of Invention
The present invention provides an image processing apparatus for evaluating a cardiac image and a ventricular state identification method, so that end systole and end diastole can be identified quickly and accurately. In the ventricular state identification method, a region of interest is determined from a plurality of target images, gray-scale value changes of a plurality of pixels in the region of interest across the target images are determined, and one or more representative images are obtained according to the gray-scale value changes. The target images contain the pixels within the endocardial contour of the left ventricle, the boundaries of the region of interest lie approximately on the two sides of the bottom of the endocardial contour, and the region of interest corresponds to the mitral valve. The gray-scale value changes are related to the motion of the mitral valve, and the representative images are used to evaluate the state of the left ventricle.
Drawings
The accompanying drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the invention and together with the description serve to explain the principles of the invention.
FIG. 1 is a block diagram of components of an image processing apparatus according to an embodiment of the present invention;
FIG. 2 is a flow chart of a ventricular state identification method according to an embodiment of the present invention;
FIG. 3 is a schematic illustration of left ventricle segmentation in accordance with an embodiment of the present invention;
FIG. 4A is a schematic representation of the End Diastole (ED) of the left ventricle in accordance with an embodiment of the present invention;
FIG. 4B is a schematic representation of the left ventricle between end diastole and End Systole (ES), in accordance with one embodiment of the present invention;
FIG. 4C is a schematic illustration of the end systole of the left ventricle in accordance with one embodiment of the present invention.
Description of the reference numerals
100: image processing apparatus;
110: memory;
130: processor;
S210-S250: steps;
ROI: region of interest;
401: anterior leaflet;
403: posterior leaflet.
Detailed Description
Reference will now be made in detail to exemplary embodiments of the invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings and the description to refer to the same or like parts.
FIG. 1 is a block diagram of components of an image processing apparatus 100 according to an embodiment of the present invention. Referring to fig. 1, the image processing apparatus 100 includes, but is not limited to, a memory 110 and a processor 130. The image processing device 100 may be a desktop computer, a laptop computer, a smart phone, a tablet computer, a server, a medical examination instrument, or other computing device.
The memory 110 may be any type of fixed or removable random access memory (RAM), read-only memory (ROM), flash memory, hard disk drive (HDD), solid-state drive (SSD), or the like. In one embodiment, the memory 110 is used for recording program codes, software modules, configurations, data (such as images, gray-scale values, statistics, states, or volumes) or files, and embodiments thereof will be described in detail later.
The processor 130 is coupled to the memory 110. The processor 130 may be a central processing unit (CPU), a graphics processing unit (GPU), another programmable general-purpose or special-purpose microprocessor, a digital signal processor (DSP), a programmable controller, a field programmable gate array (FPGA), an application-specific integrated circuit (ASIC), a neural network accelerator, or other similar components or a combination thereof. In one embodiment, the processor 130 is configured to execute all or part of the operations of the image processing apparatus 100, and can load and execute the program codes, software modules, files and data recorded in the memory 110.
Hereinafter, the method according to the embodiment of the present invention will be described with reference to various devices, components and modules in the image processing apparatus 100. The various processes of the method may be adapted according to the implementation, and are not limited thereto.
FIG. 2 is a flowchart of a ventricular state identification method according to an embodiment of the invention. Referring to fig. 2, the processor 130 may determine a region of interest (ROI) from a plurality of target images (step S210). Specifically, the target images contain the pixels within the endocardial contour of the left ventricle. In one embodiment, the processor 130 may obtain A4C (apical four-chamber) or A2C (apical two-chamber) cardiac ultrasound images (echocardiography) of consecutive frames covering one or more cardiac cycles. The cardiac ultrasound images of the consecutive frames may be obtained by splitting a recorded video of the heart into consecutive frames, or from the per-frame cardiac images detected by an ultrasound probe and transmitted via wired or wireless transmission.
In one embodiment, the processor 130 may input the A4C or A2C cardiac ultrasound image into a machine learning model (e.g., one based on deep learning, a multi-layer perceptron (MLP), or a support vector machine (SVM)) to segment the image within the endocardial contour of the left ventricle and output it as an endocardial image of the left ventricle. In other embodiments, the processor 130 may identify the endocardial contour based on the scale-invariant feature transform (SIFT), Haar features, AdaBoost, or other recognition techniques, and segment the target image accordingly.
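As a rough illustration of this frame-extraction and segmentation step, the sketch below assumes OpenCV is available and uses a placeholder function, segment_endocardium, standing in for whichever trained model or recognition technique produces the endocardial mask; the function and variable names are illustrative assumptions, not the patent's prescribed interface.

```python
# Minimal sketch: split an echo clip into consecutive grayscale frames and
# keep, for each frame, only the pixels inside the endocardial contour.
import cv2
import numpy as np

def segment_endocardium(gray_frame: np.ndarray) -> np.ndarray:
    """Placeholder: return a uint8 mask with 255 inside the endocardial contour."""
    raise NotImplementedError("plug in a trained segmentation model here")

def load_target_images(video_path: str):
    cap = cv2.VideoCapture(video_path)
    target_images, masks = [], []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        mask = segment_endocardium(gray)
        # Zero out everything outside the contour so only LV pixels remain.
        target_images.append(cv2.bitwise_and(gray, gray, mask=mask))
        masks.append(mask)
    cap.release()
    return target_images, masks
```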
FIG. 3 is a schematic diagram of left ventricle segmentation in accordance with an embodiment of the present invention. Referring to FIG. 3, the left image is an A2C cardiac ultrasound image, from which the endocardial contour of the left ventricle shown on the right can be identified. The processor 130 may segment the target image from the cardiac ultrasound image based on the endocardial contour. It should be noted that fig. 3 shows the image segmentation of a single frame, and the same or similar approach may be used for the segmentation of other frames.
In some embodiments, processor 130 may also directly obtain the target image that has been segmented from the cardiac image. That is, other devices or medical instruments segment the target image first.
In one embodiment, the boundaries of the region of interest lie approximately on the two sides of the bottom of the endocardial contour, and the region of interest corresponds to the mitral valve. The bottom refers to the lower half of the endocardium, or a portion of another proportion or range.
In one embodiment, the processor 130 may determine the region of interest from a binarized image of the endocardial contour. The binarized image contains only two gray-scale values (e.g., full black and full white), so the processor 130 can, starting from the bottom of the left ventricular endocardium, calculate the coordinate positions of the first pixels on the left side and on the right side of the image (i.e., corresponding to the two sides of the bottom). The positions of these two pixels are the left and right boundaries of the region of interest. In the A4C or A2C view, the two side boundaries are located approximately at the leftmost side of the anterior leaflet and the rightmost side of the posterior leaflet of the mitral valve.
In one embodiment, the thickness of the region of interest (i.e., its width in the vertical direction of the A4C or A2C view, or its top and bottom boundaries) may be the thickness of the anterior or posterior leaflet of the mitral valve. It should be noted that the thickness may differ from person to person depending on age, sex, and other factors. For example, the anterior leaflet is 1.3 millimeters (mm) thick below the age of 20, 1.6 mm thick between the ages of 20 and 56, and 3.2 mm thick above the age of 60.
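A minimal sketch of the boundary search described above follows, assuming a binarized endocardial mask (255 inside the contour, 0 outside) and an assumed parameter roi_height_px obtained by converting the leaflet thickness to pixels; the names are illustrative and not part of the patent.

```python
import numpy as np

def mitral_valve_roi(mask: np.ndarray, roi_height_px: int):
    """Return (top, bottom, left, right) pixel bounds of the region of interest."""
    rows = np.flatnonzero(mask.any(axis=1))
    bottom = rows[-1]                      # bottom of the LV endocardium
    cols = np.flatnonzero(mask[bottom])    # first foreground pixels on the
    left, right = cols[0], cols[-1]        # left and right sides at the bottom
    top = max(bottom - roi_height_px, 0)   # ROI spans roughly the leaflet thickness
    return top, bottom, left, right
```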
FIG. 4A is a schematic representation of the end diastole (ED) of the left ventricle in accordance with one embodiment of the present invention. Referring to fig. 4A, the region of interest ROI shown corresponds approximately to the anterior leaflet 401 and the posterior leaflet 403, where the left boundary of the region of interest ROI corresponds to the leftmost side of the anterior leaflet 401 and the right boundary of the region of interest ROI corresponds to the rightmost side of the posterior leaflet 403. The thickness of the region of interest ROI is slightly larger than or approximately equal to the thickness of the anterior leaflet 401 and the posterior leaflet 403.
It should be noted that the shape of the region of interest ROI is not limited to the rectangle shown in fig. 4A, and the region of interest in other embodiments may be a geometric shape such as an ellipse, a diamond, or an irregular shape corresponding to the contour of the mitral valve.
In another embodiment, the processor 130 may determine the boundaries of the region of interest based on image recognition techniques (e.g., neural networks or feature comparisons). For example, the processor 130 identifies a region of interest based on a classifier derived from training samples of marked mitral valve locations or based on image features of the mitral valve, wherein the region of interest approximately corresponds to the contour of the mitral valve with the mitral valve fully open.
The processor 130 may determine gray-scale value changes of a plurality of pixels in the region of interest across the target images (step S230). Specifically, the gray-scale value changes are related to the motion of the mitral valve. As the heart contracts and relaxes, the mitral valve opens and closes, and the process of the mitral valve going from fully open to fully closed (or from fully closed to fully open) is captured in the cardiac images of consecutive frames. Notably, A4C or A2C cardiac ultrasound images are typically acquired in B-mode: when the ultrasound probe emits a sound wave, the intensity of the reflected echo is represented by the brightness of a point, and the image composed of these points (as pixels) is a gray-scale image. In a gray-scale image, the intensity (i.e., gray-scale value) of each pixel is a value from 0 to 255, where 0 represents the darkest and 255 the brightest. As shown in fig. 4A, the mitral valve (consisting of the anterior leaflet 401 and the posterior leaflet 403) appears substantially white.
From the viewpoint of the region of interest (i.e., analyzing or viewing only the image within the region of interest), the motion of the mitral valve causes a different area of the valve to appear within the region of interest in different frames. For example, during the period from full opening to full closure of the mitral valve, part of the anterior leaflet or part of the posterior leaflet gradually moves out of the region of interest.
In one embodiment, the processor 130 may determine the sum of the gray-scale values of the pixels in the region of interest in each target image. It is noted that the area of the mitral valve encompassed by the region of interest may differ between target images. The mitral valve in the gray-scale image is substantially white (i.e., higher gray-scale values) and its surroundings are substantially black (i.e., lower gray-scale values), so the sum of the gray-scale values reflects the area of the mitral valve included in the region of interest. The processor 130 may determine the difference in the sum between the target images (i.e., the change in the mitral valve area contained in the region of interest), and this difference corresponds to the aforementioned gray-scale value change. That is, the difference in the sums of the gray-scale values reflects the motion of the mitral valve.
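The gray-scale summation of step S230 could be sketched as follows; target_images and the ROI bounds reuse the hypothetical names from the earlier sketches and are assumptions rather than the patent's exact interface.

```python
import numpy as np

def roi_gray_sums(target_images, top, bottom, left, right):
    """Sum of gray-scale values inside the ROI per frame, plus frame-to-frame differences."""
    sums = np.array([int(img[top:bottom + 1, left:right + 1].sum())
                     for img in target_images])
    # The differences reflect mitral-valve area moving into or out of the ROI.
    diffs = np.diff(sums)
    return sums, diffs
```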
For example, fig. 4B is a schematic diagram of the left ventricle between end diastole and end systole according to an embodiment of the present invention, and fig. 4C is a schematic diagram of the end systole of the left ventricle according to an embodiment of the present invention. Referring to fig. 4A to 4C, the region of interest ROI may encompass different areas of the anterior leaflet 401 and the posterior leaflet 403 as the heart contracts and relaxes.
In some embodiments, the processor 130 may enhance the contrast of the target image by histogram equalization or other image processing before determining the sum of the gray-scale values of the pixels.
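As one possible form of this preprocessing, the sketch below applies OpenCV's global histogram equalization to a grayscale frame before its gray-scale sum is computed; this is an illustrative choice, and other contrast-enhancement methods would serve equally.

```python
import cv2
import numpy as np

def enhance_contrast(gray_frame: np.ndarray) -> np.ndarray:
    # equalizeHist expects an 8-bit single-channel image.
    return cv2.equalizeHist(gray_frame)
```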
In one embodiment, the processor 130 may determine the maximum and the minimum of the sums over the target images, where the aforementioned difference is related to the maximum and the minimum. It is worth noting that, clinically, the mitral valve is completely closed at the end diastole (ED) of the heart and completely open at the end systole (ES) of the heart. The target image with the smallest sum of gray-scale values in the region of interest corresponds to the mitral valve being fully closed, i.e., to the end diastole of the left ventricle; the target image with the largest sum corresponds to the mitral valve being fully open, i.e., to the end systole of the left ventricle. The processor 130 may use the maximum and the minimum of the gray-scale sums as the representation of the gray-scale value change; that is, the maximum and the minimum represent the end systole and the end diastole, respectively.
In some embodiments, the processor 130 may instead calculate the average of the gray-scale values in the region of interest, or the number of pixels having the maximum gray-scale value, and determine the maximum and minimum among the target images accordingly. For example, the average for end diastole is the smallest and the average for end systole is the largest; alternatively, the number of maximum-gray-value pixels for end diastole is the smallest and that for end systole is the largest.
The processor 130 obtains one or more representative images according to the gray-scale value change (step S250). Specifically, these representative images can be used to evaluate the state of the left ventricle. Clinically, the volume of the left ventricle is largest at end diastole (as shown in fig. 4A) and smallest at end systole (as shown in fig. 4C). In one embodiment, the processor 130 may take the target image with the largest sum of gray-scale values and the target image with the smallest sum of gray-scale values as the representative images; that is, the two representative images correspond to the end-systolic and end-diastolic states, respectively.
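The selection of the two representative images from the per-frame gray-scale sums could be sketched as follows; returning frame indices is an assumption about how the frames are referenced.

```python
import numpy as np

def pick_representative_frames(sums: np.ndarray):
    """Smallest sum -> mitral valve fully closed -> end diastole;
    largest sum -> mitral valve fully open -> end systole."""
    ed_index = int(np.argmin(sums))  # end-diastolic frame
    es_index = int(np.argmax(sums))  # end-systolic frame
    return ed_index, es_index
```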
Therefore, the embodiments of the present invention can determine whether a target image is an end-systolic or end-diastolic image of the heart according to the motion change of the mitral valve in the region of interest. Compared with analyzing the septal annulus, the embodiments of the present invention can identify end systole or end diastole more quickly. For example, within one cardiac cycle, embodiments of the present invention can already identify the end-systolic or end-diastolic state.
In one embodiment, the processor 130 may calculate the volume of the left ventricle in each representative image and determine the volume change of the left ventricle between the largest and the smallest volumes (i.e., the difference between the maximum and minimum volumes). For example, the processor 130 may calculate the area occupied by the pixels within the endocardial contour of the left ventricle in a representative image. The volume change is then used to estimate the ejected blood volume. For example, the processor 130 may determine the left ventricular ejection fraction (LVEF) based on the end-systolic volume (ESV) and the end-diastolic volume (EDV).
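For reference, once the end-diastolic volume (EDV) and end-systolic volume (ESV) have been estimated from the two representative images (e.g., by Simpson's method, which is outside this sketch), the ejection fraction follows from the standard relation LVEF = (EDV - ESV) / EDV x 100%, illustrated below as a simple function rather than the patent's prescribed implementation.

```python
def ejection_fraction(edv: float, esv: float) -> float:
    """Left ventricular ejection fraction in percent: (EDV - ESV) / EDV * 100."""
    return (edv - esv) / edv * 100.0
```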
In other embodiments, depending on the requirements, the processor 130 may take target images of the ventricle in other states as the representative images according to other sums of gray-scale values.
It should be noted that the embodiments of the present invention can identify the end systole or end diastole of the heart from cardiac images in real time or offline. In one embodiment, in the real-time case, the ultrasound probe transmits cardiac images frame by frame, so the processor 130 may accumulate roughly the number of frames covering a few cardiac cycles before identifying end systole or end diastole. In another embodiment, in the offline case, the processor 130 may split an A4C or A2C echocardiographic video into a plurality of cardiac images, take the frames of several cardiac cycles, and then identify end systole or end diastole.
In summary, in the image processing apparatus for evaluating a cardiac image and the ventricular state identification method according to the embodiments of the invention, the gray-scale value changes of the pixels in the region of interest of the images are analyzed to obtain the motion of the mitral valve within the region of interest. In addition, the embodiments of the invention can identify the end-systolic or end-diastolic state of the heart based on the sums of the gray-scale values of the pixels and estimate the ejected blood volume accordingly. Therefore, the different states of the heart during contraction and relaxation can be identified quickly and accurately.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A ventricular state identification method, comprising:
determining a region of interest from a plurality of target images, wherein the target images are for pixels within an endocardial contour of a left ventricle, a boundary of the region of interest is located approximately on both sides of a bottom of the endocardial contour, and the region of interest corresponds to a mitral valve;
determining a gray-scale value change for a plurality of pixels in the region of interest in the target image, wherein the gray-scale value change is related to motion of the mitral valve; and
obtaining at least one representative image according to the gray-scale value variation, wherein the representative image is used for evaluating the state of the left ventricle.
2. The ventricular state identification method as claimed in claim 1, wherein the step of determining the gray-scale value change of the pixels in the region of interest in the target image comprises:
determining a sum of gray-scale values of the pixels in the region of interest; and
determining a difference in the sum between the target images, wherein the difference corresponds to the gray-scale value change.
3. A ventricular state identification method as claimed in claim 2, wherein the step of determining the difference in the sum between the target images includes:
determining a maximum and a minimum of the sum in the target image, wherein the difference is related to the maximum and the minimum, the minimum corresponding to an end-diastole of the left ventricle and the maximum corresponding to an end-systole of the left ventricle; and
taking the maximum and the minimum as a representation of the gray-scale value change.
4. A ventricular state identification method as claimed in claim 3, wherein the step of obtaining the representative image according to the gray-level value change includes:
obtaining the target image having the maximum and the target image having the minimum as the representative images.
5. A ventricular state identification method as claimed in claim 4, further comprising:
determining a volume change of the left ventricle between the largest and the smallest, wherein the volume change is used to estimate a blood ejection volume.
6. An image processing apparatus for evaluating cardiac images, comprising:
a memory storing a plurality of program codes; and
a processor coupled to the memory, wherein the processor is configured to load and execute the program code to:
determining a region of interest from a plurality of target images, wherein the target images are for pixels within an endocardial contour of a left ventricle, a boundary of the region of interest is located approximately on both sides of a bottom of the endocardial contour, and the region of interest corresponds to a mitral valve;
determining a gray-scale value change for a plurality of pixels in the region of interest in the target image, wherein the gray-scale value change is related to motion of the mitral valve; and
obtaining at least one representative image according to the gray-scale value variation, wherein the representative image is used for evaluating the state of the left ventricle.
7. An image processing device for evaluating a cardiac image according to claim 6, wherein the processor is further configured to:
determining a sum of gray scale values of the pixels in the region of interest; and
determining a difference in the sum between the target images, wherein the difference corresponds to the gray-scale value change.
8. The image processing device for evaluating a cardiac image of claim 7, wherein the processor is further configured to:
determining a maximum and a minimum of the sum in the target image, wherein the difference is related to the maximum and the minimum, the minimum corresponding to an end diastole of the left ventricle and the maximum corresponding to an end systole of the left ventricle; and
taking the maximum and the minimum as a representation of the gray-scale value change.
9. The image processing device for evaluating a cardiac image of claim 8, wherein the processor is further configured to:
obtaining the target image having the maximum and the target image having the minimum as the representative images.
10. The image processing device for evaluating a cardiac image of claim 9, wherein the processor is further configured to:
determining a volume change of the left ventricle between the largest and the smallest, wherein the volume change is used to estimate a blood ejection volume.
CN202110310722.7A 2021-03-23 2021-03-23 Image processing apparatus for evaluating cardiac image and ventricular state identification method Pending CN115105032A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110310722.7A CN115105032A (en) 2021-03-23 2021-03-23 Image processing apparatus for evaluating cardiac image and ventricular state identification method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110310722.7A CN115105032A (en) 2021-03-23 2021-03-23 Image processing apparatus for evaluating cardiac image and ventricular state identification method

Publications (1)

Publication Number Publication Date
CN115105032A (en) 2022-09-27

Family

ID=83323030

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110310722.7A Pending CN115105032A (en) 2021-03-23 2021-03-23 Image processing apparatus for evaluating cardiac image and ventricular state identification method

Country Status (1)

Country Link
CN (1) CN115105032A (en)

Similar Documents

Publication Publication Date Title
JP7123891B2 (en) Automation of echocardiographic Doppler examination
CN110009640B (en) Method, apparatus and readable medium for processing cardiac video
WO2017206023A1 (en) Cardiac volume identification analysis system and method
CN110197713B (en) Medical image processing method, device, equipment and medium
Balaji et al. Automatic classification of cardiac views in echocardiogram using histogram and statistical features
CN103260526B (en) There is ultrasonic image-forming system and the method for peak strength measuring ability
US9147258B2 (en) Methods and systems for segmentation in echocardiography
CN111275755B (en) Mitral valve orifice area detection method, system and equipment based on artificial intelligence
WO2022141083A1 (en) Periodic parameter analysis method and ultrasonic imaging system
CN117547306B (en) Left ventricular ejection fraction measurement method, system and device based on M-type ultrasound
CN117017347B (en) Image processing method and system of ultrasonic equipment and ultrasonic equipment
CN116869571B (en) Ultrasonic heart reflux automatic detection and evaluation method, system and device
WO2021152603A1 (en) System and method for classification of strain echocardiograms
US11610312B2 (en) Image processing apparatus for evaluating cardiac images and ventricular status identification method
CN115105032A (en) Image processing apparatus for evaluating cardiac image and ventricular state identification method
de Melo et al. Gradient boosting decision trees for echocardiogram images
Nageswari et al. Preserving the border and curvature of fetal heart chambers through TDyWT perspective geometry wrap segmentation
CN114271850A (en) Ultrasonic detection data processing method and ultrasonic detection data processing device
Upendra et al. Artificial neural network application in classifying the left ventricular function of the human heart using echocardiography
US11803967B2 (en) Methods and systems for bicuspid valve detection with generative modeling
Jaenputra et al. Heart Disease Detection from PSAX Echocardiography View using Ultrasound Portable Based on Machine Learning Method
EP4059441B1 (en) Apparatus for evaluating movement state of heart
Sánchez-Puente et al. Uncertainty to Improve the Automatic Measurement of Left Ventricular Ejection Fraction in 2D Echocardiography Using CNN-Based Segmentation
Ranaweera et al. Artificial Neural Network Application in Classifying the Left Ventricular Function of the Human Heart Using Echocardiography
Thennakoon et al. Automatic classification of left ventricular function of the human heart using echocardiography

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination