CN115148339A - Method for estimating ventricular volume - Google Patents

Method for estimating ventricular volume

Info

Publication number
CN115148339A
CN115148339A (application CN202110343733.5A)
Authority
CN
China
Prior art keywords
reference point
pixel
pixels
distance
surrounding pixels
Prior art date
Legal status
Pending
Application number
CN202110343733.5A
Other languages
Chinese (zh)
Inventor
黄宜瑾
利建宏
许鸿生
Current Assignee
Acer Inc
Original Assignee
Acer Inc
Priority date
Filing date
Publication date
Application filed by Acer Inc filed Critical Acer Inc
Priority to CN202110343733.5A priority Critical patent/CN115148339A/en
Publication of CN115148339A publication Critical patent/CN115148339A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/06 Measuring blood flow
    • A61B8/065 Measuring blood flow to determine blood output from the heart
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B8/5223 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10132 Ultrasound image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30048 Heart; Cardiac

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Public Health (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Primary Health Care (AREA)
  • Physiology (AREA)
  • Epidemiology (AREA)
  • Cardiology (AREA)
  • Artificial Intelligence (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Hematology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Quality & Reliability (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention provides a method for estimating ventricular volume, comprising the following steps: obtaining a left ventricle mask image corresponding to a cardiac ultrasound image, wherein the left ventricle mask image is a binary image; finding 3 reference point pixels in the left ventricle mask image, wherein each reference point pixel has a first value and is surrounded by N surrounding pixels, and the surrounding pixels of each reference point pixel include N1 first surrounding pixels having the first value and N2 second surrounding pixels having a second value; and estimating a left ventricle volume corresponding to the cardiac ultrasound image based on the reference point pixels. The left ventricle volume can therefore be estimated automatically and efficiently without manually marking the apex and the mitral valve points on both sides.

Description

Method for estimating ventricular volume
Technical Field
The present invention relates to a medical image evaluation method, and more particularly, to a method of estimating ventricular volume.
Background
In the prior art, many documents estimate the ventricular volume in cardiac ultrasound images based on Simpson's rule. Generally, before applying Simpson's formula, three specific reference points must be found in the cardiac ultrasound image, corresponding to the apex and the mitral valve (Mitral valve) on each side. After these reference points are found, Simpson's formula can be used to estimate the ventricular volume.
However, these reference points are generally marked manually by medical professionals in the cardiac ultrasound image, and there is no related art that marks the reference points automatically.
Disclosure of Invention
The present invention provides a method for estimating ventricular volume, which can be used to solve the above technical problems.
The invention provides a method for estimating ventricular volume, which is adapted for an electronic device and comprises the following steps: obtaining a left ventricle mask image corresponding to a cardiac ultrasound image, wherein the left ventricle mask image is a binary image; finding 3 reference point pixels in the left ventricle mask image, wherein each reference point pixel has a first value and is surrounded by N surrounding pixels, and the surrounding pixels of each reference point pixel include N1 first surrounding pixels having the first value and N2 second surrounding pixels having a second value; and estimating a left ventricle volume corresponding to the cardiac ultrasound image based on the reference point pixels.
Drawings
The accompanying drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the invention and together with the description serve to explain the principles of the invention.
FIG. 1 is a schematic diagram of an electronic device according to an embodiment of the invention;
FIG. 2 is a flow diagram illustrating a method of estimating ventricular volume in accordance with an embodiment of the present invention;
FIG. 3 is a diagram illustrating an application context according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating the finding of first, second, and third reference point pixels based on the distance between the reference point pixels according to an embodiment of the present invention;
FIG. 5A is a diagram illustrating the finding of reference point pixels corresponding to the apex of the heart according to FIG. 3;
FIG. 5B is a diagram illustrating the finding of reference point pixels corresponding to the left mitral valve according to FIG. 5A;
FIG. 5C is a diagram illustrating the finding of reference point pixels corresponding to the right mitral valve according to FIG. 5B;
FIG. 6 is a flow chart illustrating a method of assisting in assessing a state of motion of a heart, in accordance with an embodiment of the present invention;
FIG. 7A is a diagram illustrating an application context, according to an embodiment of the present invention;
FIG. 7B is another application context diagram illustrated in accordance with FIG. 7A.
Detailed Description
Reference will now be made in detail to exemplary embodiments of the invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings and the description to refer to the same or like parts.
Fig. 1 is a schematic diagram of an electronic device according to an embodiment of the invention. In various embodiments, the electronic device 100 is, for example, a personal computer device, a smart device, and/or a handheld device, but is not limited thereto.
As shown in fig. 1, the electronic device 100 includes a memory circuit 102 and a processor 104. The memory circuit 102 is, for example, any type of fixed or removable random access memory (RAM), read-only memory (ROM), flash memory, hard disk, other similar device, or a combination thereof, and can be used to record a plurality of program codes or modules.
The processor 104 is coupled to the memory circuit 102 and may be a general-purpose processor, a special-purpose processor, a conventional processor, a digital signal processor, a plurality of microprocessors, one or more microprocessors combined with a digital signal processor core, a controller, a microcontroller, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), any other type of integrated circuit, a state machine, an Advanced RISC Machine (ARM)-based processor, or the like.
In an embodiment of the invention, the processor 104 may access the modules and program code recorded in the memory circuit 102 to implement the method for estimating ventricular volume provided by the present invention, the details of which are described below.
Referring to fig. 2 and 3, fig. 2 is a flow chart illustrating a method for estimating ventricular volume according to an embodiment of the present invention, and fig. 3 is a diagram of an application scenario according to an embodiment of the present invention. The method of the present embodiment can be executed by the electronic device 100 of fig. 1, and details of the steps of fig. 2 are described below in conjunction with the components shown in fig. 1 and the scenario of fig. 3.
First, in step S210, the processor 104 may obtain a left ventricle mask image 31 corresponding to the cardiac ultrasound image 30, wherein the left ventricle mask image 31 is a binary image. In an embodiment of the invention, the processor 104 may, for example, input the cardiac ultrasound image 30 (which is, for example, an apical view such as A2C (apical two-chamber) or A4C (apical four-chamber)) into a pre-trained machine learning model, and the machine learning model outputs a corresponding binarized image as the left ventricle mask image 31 in response to the cardiac ultrasound image 30.
In one embodiment, in order to provide the machine learning model with the above capability, the designer may feed the machine learning model with various cardiac ultrasound images in which the ventricle region has been annotated as training data during training. The machine learning model can thereby learn the characteristics of the image region corresponding to the ventricle and, when an unknown cardiac ultrasound image is received, identify the image region corresponding to the ventricle. The machine learning model then sets all pixels in the image region corresponding to the ventricle to a first value (e.g., 255) and all pixels outside that region to a second value (e.g., 0) to generate the corresponding binarized image, but is not limited thereto.
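As a minimal Python illustration of how such a binarized mask could be produced from a model's per-pixel output, assuming a hypothetical pre-trained segmentation model that returns a probability map (the disclosure does not prescribe a particular model, framework, or threshold):

import numpy as np

def to_lv_mask(prob_map: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    # Binarize a per-pixel ventricle probability map into a 255/0 mask.
    # prob_map: H x W array of values in [0, 1] produced by a hypothetical
    # pre-trained segmentation model (placeholder, not specified here).
    return np.where(prob_map >= threshold, 255, 0).astype(np.uint8)

# Usage with an assumed placeholder model:
# prob_map = segment_left_ventricle(cardiac_ultrasound_image)  # hypothetical
# lv_mask = to_lv_mask(prob_map)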
Thereafter, in step S220, the processor 104 may find 3 reference point pixels 311 to 313 in the left ventricular mask image 31.
In embodiments of the present invention, each reference point pixel 311-313 may have a first value (e.g., 255). In addition, each of the reference point pixels 311 to 313 may be surrounded by N (e.g., 8) surrounding pixels, and the surrounding pixels of each of the reference point pixels 311 to 313 may include N1 (e.g., 3) first surrounding pixels having a first value and N2 (e.g., 5) second surrounding pixels having a second value (e.g., 0), where N, N1, and N2 are positive integers.
In one embodiment, among the surrounding pixels of the reference point pixel 311, the first surrounding pixels (i.e., the pixels located in the range 311 a) are arranged in a straight line, and the second surrounding pixels (i.e., the pixels located in the range 311 b) are arranged in a C-shape.
In one embodiment, among the surrounding pixels of the reference point pixel 312, the first surrounding pixels (i.e., the pixels located in the range 312 a) are arranged in an L shape, and the second surrounding pixels (i.e., the pixels located in the range 312 b) are arranged in an L shape.
In one embodiment, among the surrounding pixels of the reference point pixel 313, the first surrounding pixels (i.e., the pixels located in the range 313 a) are arranged in an L shape, and the second surrounding pixels (i.e., the pixels located in the range 313 b) are arranged in an L shape.
In an embodiment of the invention, since the reference point pixels 311 to 313 are each unique in the left ventricle mask image 31, the processor 104 may examine every pixel in the left ventricle mask image 31 one by one and define the 3 pixels meeting the above condition (i.e., having the first value and having, among their 8 surrounding pixels, 3 first surrounding pixels with the first value and 5 second surrounding pixels with the second value) as the reference point pixels 311 to 313.
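The exhaustive pixel-by-pixel search described above can be sketched in Python as follows; this is an illustration only, assuming a 255/0 mask and the example values N = 8, N1 = 3, and N2 = 5:

import numpy as np

FIRST_VALUE, SECOND_VALUE = 255, 0

def is_reference_point(mask: np.ndarray, r: int, c: int, n1: int = 3, n2: int = 5) -> bool:
    # A pixel qualifies if it has the first value and exactly n1 of its 8
    # neighbours have the first value while n2 have the second value.
    if mask[r, c] != FIRST_VALUE:
        return False
    neighbours = [mask[r + dr, c + dc]
                  for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                  if (dr, dc) != (0, 0)]
    return (sum(v == FIRST_VALUE for v in neighbours) == n1 and
            sum(v == SECOND_VALUE for v in neighbours) == n2)

def find_reference_points(mask: np.ndarray) -> list:
    # Brute-force scan over every interior pixel; returns (row, col) tuples.
    h, w = mask.shape
    return [(r, c) for r in range(1, h - 1) for c in range(1, w - 1)
            if is_reference_point(mask, r, c)]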
After that, in step S230, the processor 104 may estimate a left ventricle volume corresponding to the cardiac ultrasound image 30 based on the reference point pixels 311 to 313. In one embodiment, the processor 104 may estimate the distances among the reference point pixels 311 to 313 and accordingly find, among them, the first, second, and third reference point pixels corresponding to the apex of the heart, the first mitral valve (e.g., the left mitral valve), and the second mitral valve (e.g., the right mitral valve), respectively. The processor 104 may then apply Simpson's formula based on the first, second, and third reference point pixels to estimate the left ventricle volume corresponding to the cardiac ultrasound image 30.
Referring to fig. 4, a schematic diagram illustrating the finding of first, second and third reference point pixels based on the distance between the reference point pixels according to an embodiment of the invention is shown. Generally, the distance between the left and right mitral valves should be less than the distance between the apex of the heart and either mitral valve. Therefore, the processor 104 can find out the first, second and third reference point pixels corresponding to the apex of the heart, the first mitral valve and the second mitral valve respectively from the reference point pixels 311-313 based on this principle.
In fig. 4, a first distance D1 may exist between reference point pixel 311 and reference point pixel 312, a second distance D2 may exist between reference point pixel 311 and reference point pixel 313, and a third distance D3 may exist between reference point pixel 312 and reference point pixel 313.
In the scenario of FIG. 4, in response to determining that the first distance D1 and the second distance D2 are both greater than the third distance D3, the processor 104 may define the reference point pixels 311, 312, and 313 as the first, second, and third reference point pixels, respectively.
In another embodiment, in response to determining that the second distance D2 and the third distance D3 are both greater than the first distance D1, the processor 104 may define the reference point pixel 313, the reference point pixel 311, and the reference point pixel 312 as the first, second, and third reference point pixels, respectively. In yet another embodiment, in response to determining that the first distance D1 and the third distance D3 are both greater than the second distance D2, the processor 104 may define the reference point pixel 312, the reference point pixel 311, and the reference point pixel 313 as the first, second, and third reference point pixels, respectively.
In addition, if the cardiac ultrasound image 30 is determined to be an apical view, the highest of the 3 found reference point pixels should correspond to the apex. Thus, in fig. 4, the processor 104 may directly define the highest reference point pixel 311 as the first reference point pixel corresponding to the apex of the heart, and define the remaining reference point pixels 312 and 313 as the second and third reference point pixels corresponding to the two mitral valve points, respectively, but is not limited thereto.
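One way to express the distance rule above in Python is sketched below; it assumes image coordinates given as (row, column) with the row index increasing downward, treats the apex as the point opposite the shortest pairwise distance, and assigns left/right by column order for illustration only:

import math

def classify_reference_points(points):
    # points: list of 3 (row, col) reference point pixels.
    # The two mitral valve points are the closest pair, so the apex is the
    # point that does not lie on the shortest side of the triangle.
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    sides = [(dist(points[1], points[2]), 0),
             (dist(points[0], points[2]), 1),
             (dist(points[0], points[1]), 2)]
    _, apex_idx = min(sides)              # apex is opposite the shortest side
    apex = points[apex_idx]
    mitral = [p for i, p in enumerate(points) if i != apex_idx]
    mitral.sort(key=lambda p: p[1])       # smaller column = left mitral valve
    return apex, mitral[0], mitral[1]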
The processor 104 may then apply Simpson's formula based on the first, second, and third reference point pixels to estimate the left ventricle volume corresponding to the cardiac ultrasound image 30; for the details, reference may be made to the related art documents, which are not repeated herein.
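Since the description defers the formula itself to the related art, the following is only an illustrative Python sketch of the widely used single-plane method of disks (a modified Simpson approach), assuming the disk diameters have already been measured perpendicular to the long axis running from the apex to the midpoint between the two mitral valve points; whether the single-plane or biplane variant is used is not specified here:

import math

def single_plane_simpson_volume(diameters, long_axis_length):
    # Modified Simpson (method of disks), single-plane variant:
    #     V = sum_i (pi / 4) * d_i^2 * (L / n)
    # where d_i are the disk diameters (conventionally n = 20 disks) and L is
    # the long-axis length. Lengths in cm yield a volume in mL (cm^3).
    n = len(diameters)
    slice_height = long_axis_length / n
    return sum(math.pi / 4.0 * d * d * slice_height for d in diameters)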
In addition, to improve the efficiency of finding the reference point pixels 311 to 313, the processor 104 may find the reference point pixels 311 to 313 based on the mechanisms shown in fig. 5A to 5C.
Please refer to fig. 5A, which is a diagram illustrating the finding of the reference point pixel corresponding to the apex of the heart according to fig. 3. As described above, if the cardiac ultrasound image 30 is determined to be an apical view, the highest of the 3 found reference point pixels should correspond to the apex.
In this regard, starting from the topmost pixel row of the left ventricle mask image 31, the processor 104 may scan downward row by row to find a pixel that meets the above condition (i.e., having the first value and having, among its 8 surrounding pixels, 3 first surrounding pixels with the first value and 5 second surrounding pixels with the second value). In FIG. 5A, when a pixel meeting the above condition is found, the processor 104 may directly define that pixel as the reference point pixel 311 corresponding to the apex of the heart and stop the scanning.
Please refer to fig. 5B, which is a diagram illustrating the finding of the reference point pixel corresponding to the left mitral valve according to fig. 5A. In the case where the cardiac ultrasound image 30 is determined to be an apical view, the lower-left one of the 3 found reference point pixels should correspond to the left mitral valve.
In this regard, starting from the lowest pixel row of the left ventricle mask image 31, the processor 104 may scan from left to right, row by row upward, to find a pixel that meets the above condition (i.e., having the first value and having, among its 8 surrounding pixels, 3 first surrounding pixels with the first value and 5 second surrounding pixels with the second value). In FIG. 5B, when a pixel meeting the above condition is found, the processor 104 may directly define that pixel as the reference point pixel 312 corresponding to the left mitral valve and stop the scanning.
Please refer to fig. 5C, which is a diagram illustrating the finding of the reference point pixel corresponding to the right mitral valve according to fig. 5B. In the case where the cardiac ultrasound image 30 is determined to be an apical view, the lower-right one of the 3 found reference point pixels should correspond to the right mitral valve.
In this regard, starting from the lowest pixel row of the left ventricle mask image 31, the processor 104 may scan from right to left, row by row upward, to find a pixel that meets the above condition (i.e., having the first value and having, among its 8 surrounding pixels, 3 first surrounding pixels with the first value and 5 second surrounding pixels with the second value). In FIG. 5C, when a pixel meeting the above condition is found, the processor 104 may first determine whether that pixel has already been defined as another reference point pixel (e.g., the reference point pixel 312). If not, the processor 104 may directly define this pixel as the reference point pixel 313 corresponding to the right mitral valve and stop the scanning. On the other hand, if the pixel has already been defined as another reference point pixel (e.g., the reference point pixel 312), the processor 104 may ignore the pixel and continue scanning upward to find another pixel that meets the above condition. When such a pixel is found, the processor 104 may directly define it as the reference point pixel 313 corresponding to the right mitral valve and stop the scanning.
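The three directional scans of fig. 5A to 5C can be sketched in Python as follows; this is an interpretation under the same 255/0 and 3-of-8/5-of-8 assumptions as above, and the within-row scan order for fig. 5A is an assumption, since only the top-down direction is described:

import numpy as np

FIRST_VALUE, SECOND_VALUE = 255, 0

def _is_reference_point(mask, r, c, n1=3, n2=5):
    if mask[r, c] != FIRST_VALUE:
        return False
    nb = [mask[r + dr, c + dc] for dr in (-1, 0, 1) for dc in (-1, 0, 1)
          if (dr, dc) != (0, 0)]
    return (sum(v == FIRST_VALUE for v in nb) == n1 and
            sum(v == SECOND_VALUE for v in nb) == n2)

def find_three_reference_points(mask: np.ndarray):
    # Returns (apex, left_mitral, right_mitral) as (row, col) tuples.
    h, w = mask.shape
    apex = left = right = None
    # Fig. 5A: scan downward from the top row until the first match (apex).
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            if _is_reference_point(mask, r, c):
                apex = (r, c)
                break
        if apex is not None:
            break
    # Fig. 5B: scan upward from the bottom row, left to right (left mitral).
    for r in range(h - 2, 0, -1):
        for c in range(1, w - 1):
            if _is_reference_point(mask, r, c):
                left = (r, c)
                break
        if left is not None:
            break
    # Fig. 5C: scan upward from the bottom row, right to left, skipping pixels
    # already defined as other reference points (right mitral).
    for r in range(h - 2, 0, -1):
        for c in range(w - 2, 0, -1):
            if _is_reference_point(mask, r, c) and (r, c) not in (apex, left):
                right = (r, c)
                break
        if right is not None:
            break
    return apex, left, right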
In other embodiments, the processor 104 may also find the reference point pixels 311-313 in the left ventricular mask image 31 by other methods, and is not limited to the methods taught in fig. 5A-5C.
As can be seen from the above, after obtaining the left ventricle mask image corresponding to the cardiac ultrasound image, the method for estimating ventricular volume according to the present invention can find the 3 pixels meeting the specific condition (i.e., having the first value and having, among their 8 surrounding pixels, 3 first surrounding pixels with the first value and 5 second surrounding pixels with the second value) as the reference point pixels corresponding to the apex and the mitral valves. The left ventricle volume may then be estimated based on these reference point pixels. Therefore, the invention can estimate the left ventricle volume automatically and efficiently without manually marking the apex of the heart and the mitral valves on both sides.
In another embodiment, the present invention further provides a method for assisting in assessing the motion state of a heart, which can determine whether an abnormal motion state of the heart occurs based on changes in the left ventricle volume. In an embodiment of the invention, the processor 104 may access the modules and program code recorded in the memory circuit 102 to implement the method for assisting in assessing the motion state of the heart provided by the present invention, the details of which are described below.
Please refer to FIG. 6, which is a flow chart illustrating a method for assisting in assessing the motion state of a heart according to an embodiment of the present invention. The method of this embodiment can be executed by the electronic device 100 of fig. 1, and details of the steps in fig. 6 will be described below in conjunction with the components shown in fig. 1.
First, in step S610, the processor 104 may obtain a plurality of consecutive cardiac ultrasound images corresponding to a heart (e.g., a heart of a patient), and accordingly estimate a plurality of left ventricle volumes corresponding to the plurality of cardiac ultrasound images.
In one embodiment, the processor 104 may first obtain the cardiac ultrasound images and determine whether each cardiac ultrasound image belongs to an apical view (e.g., A2C or A4C). In one embodiment, the processor 104 may, for example, determine whether each cardiac ultrasound image belongs to an apical view based on the techniques described in the guidelines published by the American Society of Echocardiography for performing a comprehensive echocardiographic examination in adults; the details can be found in that literature and are not repeated herein.
In response to determining that a cardiac ultrasound image belongs to an apical view, the processor 104 may extract a left ventricle mask image corresponding to the left ventricle of the heart from that cardiac ultrasound image and accordingly estimate the left ventricle volume corresponding to that cardiac ultrasound image.
In one embodiment, the processor 104 may, for example, input each cardiac ultrasound image into the aforementioned machine learning model, and the machine learning model may output the corresponding left ventricle mask image in response to each cardiac ultrasound image.
In an embodiment of the invention, for each left ventricular mask image, the processor 104 may estimate the corresponding left ventricular volume based on the mechanisms taught in fig. 2 to 5C, and therefore, the details thereof are not described herein.
For the purpose of illustrating the concept of the present invention, the following description is supplemented with FIG. 7A, which is a diagram illustrating an application scenario according to an embodiment of the present invention. In fig. 7A, the plurality of consecutive left ventricle volumes obtained in step S610 can be illustrated as the left ventricle volume change plot 700 shown in fig. 7A.
Thereafter, in step S620, the processor 104 can find a plurality of specific extrema 711-715 in the plurality of left ventricular volumes, and accordingly estimate a plurality of time differences T1-T4 between the plurality of specific extrema 711-715.
In one embodiment, the processor 104 may, for example, take the left ventricle volumes corresponding to the end-diastolic volume (EDV) among the above left ventricle volumes as the specific extrema, but is not limited thereto. By definition, each EDV should correspond to the largest left ventricle volume within its heart rate cycle. Accordingly, if the processor 104 determines that the i-th (where i is an integer) of the left ventricle volumes is greater than both the (i-1)-th and the (i+1)-th left ventricle volumes, the processor 104 may determine that the i-th left ventricle volume corresponds to an EDV and further determine that it belongs to one of the specific extrema.
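The local-extremum test above can be written, for example, as the following Python sketch; flipping the comparison finds the ESV minima discussed later instead:

def find_specific_extrema(volumes, mode="EDV"):
    # Return the indices i whose left ventricle volume is a local extremum.
    # mode="EDV": the i-th volume is greater than both neighbours (maximum).
    # mode="ESV": the i-th volume is smaller than both neighbours (minimum).
    indices = []
    for i in range(1, len(volumes) - 1):
        if mode == "EDV" and volumes[i] > volumes[i - 1] and volumes[i] > volumes[i + 1]:
            indices.append(i)
        elif mode == "ESV" and volumes[i] < volumes[i - 1] and volumes[i] < volumes[i + 1]:
            indices.append(i)
    return indices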
In the scenario of FIG. 7A, since the left ventricle volume change plot 700 can be understood to include 5 heart rate cycles, the processor 104 can find 5 EDVs as the specific extrema 711 to 715 based on the above principle. The processor 104 may then estimate the time differences T1 to T4 among the specific extrema 711 to 715.
In general, assuming that the specific extrema found by the processor 104 include the 1st to the K-th (where K is an integer) specific extrema, the time difference between the (j+1)-th specific extremum and the j-th specific extremum can be defined as the j-th time difference, where 1 ≤ j ≤ K-1.
Taking fig. 7A as an example, the time difference T1 (the 1st time difference) is, for example, the time difference between the specific extremum 711 (the 1st specific extremum) and the specific extremum 712 (the 2nd specific extremum). The time difference T2 (the 2nd time difference) is, for example, the time difference between the specific extremum 712 (the 2nd specific extremum) and the specific extremum 713 (the 3rd specific extremum). The time difference T3 (the 3rd time difference) is, for example, the time difference between the specific extremum 713 (the 3rd specific extremum) and the specific extremum 714 (the 4th specific extremum). The time difference T4 (the 4th time difference) is, for example, the time difference between the specific extremum 714 (the 4th specific extremum) and the specific extremum 715 (the 5th specific extremum), but is not limited thereto.
Thereafter, in step S630, the processor 104 may estimate a statistical characteristic value of the time differences T1 to T4 (including, but not limited to, the average of the time differences T1 to T4) based on the time differences T1 to T4. The processor 104 may then determine whether the deviation of each of the time differences T1 to T4 from the statistical characteristic value reaches a predetermined threshold. In various embodiments, the predetermined threshold may be set to any ratio, such as 5%, according to the designer's requirements, but is not limited thereto.
In response to determining that the deviation of at least one of the time differences T1 to T4 from the statistical characteristic value reaches the predetermined threshold, the processor 104 may determine in step S640 that an abnormal motion state (e.g., a cardiac arrhythmia state) occurs in the heart. On the other hand, in response to determining that none of the deviations of the time differences T1 to T4 from the statistical characteristic value reaches the predetermined threshold, the processor 104 may determine that the abnormal motion state does not occur in the heart.
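Steps S620 to S640 can be illustrated with the following Python sketch, assuming the statistical characteristic value is the average of the time differences and the predetermined threshold is a relative ratio such as 5%; both choices are examples, since the description leaves the exact statistic and threshold to the designer:

from statistics import mean

def detect_abnormal_motion(extremum_times, threshold_ratio=0.05):
    # extremum_times: times (e.g., in seconds) at which the specific extrema
    # occur in the image sequence. Returns True if any inter-extremum time
    # difference deviates from the average by more than threshold_ratio.
    diffs = [t2 - t1 for t1, t2 in zip(extremum_times, extremum_times[1:])]
    if len(diffs) < 2:
        return False          # not enough heart rate cycles to judge
    avg = mean(diffs)
    return any(abs(d - avg) / avg > threshold_ratio for d in diffs)

# Usage: an irregular third cycle triggers the abnormal-motion determination.
# detect_abnormal_motion([0.0, 0.8, 1.6, 2.9, 3.7])  # -> True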
In fig. 7A, if the processor 104 determines that the time differences T1-T4 do not deviate from the statistical characteristic value by the predetermined threshold value, the processor 104 may determine that the abnormal motion state such as arrhythmia does not occur in the heart.
Please refer to fig. 7B, which is another application scenario diagram according to fig. 7A. In the present embodiment, it is assumed that the processor 104 obtains the left ventricle volume change plot 700a shown in FIG. 7B according to the previous teachings, and finds therein a plurality of specific extrema 711a to 715a corresponding to the EDV and the corresponding time differences T1' to T4'.
In fig. 7B, assuming that the processor 104 determines that the time difference T2' among the time differences T1' to T4' deviates from the statistical characteristic value of the time differences T1' to T4' by the predetermined threshold, the processor 104 may determine that an abnormal motion state (e.g., a cardiac arrhythmia state) occurs in the heart, but is not limited thereto.
In an embodiment of the present invention, the processor 104 may provide the determination result of whether the abnormal motion state occurs in the heart to the relevant medical personnel as a reference for diagnosis, but is not limited thereto.
In addition, although the above embodiment considers the left ventricle volumes corresponding to the EDV as the specific extrema, in other embodiments the processor 104 may instead consider the left ventricle volumes corresponding to the end-systolic volume (ESV) as the specific extrema. By definition, each ESV should correspond to the smallest left ventricle volume within its heart rate cycle. Accordingly, if the processor 104 determines that the i-th of the left ventricle volumes is smaller than both the (i-1)-th and the (i+1)-th left ventricle volumes, the processor 104 may determine that the i-th left ventricle volume corresponds to an ESV and further determine that it belongs to one of the specific extrema.
Accordingly, in the scenario of FIG. 7B, the processor 104 may find the left ventricle volumes corresponding to the ESV as the specific extrema 711b to 715b, and accordingly estimate the corresponding time differences T1" to T4".
In FIG. 7B, assuming that the processor 104 determines that the time difference T1" among the time differences T1" to T4" deviates from the statistical characteristic value of the time differences T1" to T4" by the predetermined threshold, the processor 104 may determine that an abnormal motion state (e.g., a cardiac arrhythmia state) occurs in the heart, and this result may serve as a reference for diagnosis by the relevant medical personnel.
In some embodiments, if, after examining the cardiac ultrasound images corresponding to fig. 7B, the relevant medical personnel determine that the heart has been misjudged as having the abnormal motion state (i.e., the heart does not actually have the abnormal motion state), they can report this situation to the electronic device 100. In an embodiment of the invention, since such a misjudgment may be caused by the machine learning model's poor recognition of the left ventricle image region, the processor 104 may accordingly retrain the machine learning model to reduce the probability of future misjudgments, but is not limited thereto.
In summary, after obtaining the left ventricle mask image corresponding to the cardiac ultrasound image, the method for estimating ventricular volume according to the present invention can find the 3 pixels meeting the specific condition as the reference point pixels corresponding to the apex and the mitral valves. The left ventricle volume may then be estimated based on these reference point pixels. Therefore, the invention can estimate the left ventricle volume automatically and efficiently without manually marking the apex of the heart and the mitral valves on both sides.
In addition, the method for assisting in assessing the motion state of the heart according to the present invention can find a plurality of specific extrema corresponding to the EDV (or ESV) among the left ventricle volumes corresponding to a plurality of consecutive cardiac ultrasound images, and determine whether an abnormal motion state such as arrhythmia occurs in the heart based on the time differences among the specific extrema. Therefore, the relevant medical personnel can more easily grasp the condition of the heart, thereby reducing the probability of an incorrect assessment (such as calculating a wrong ejection fraction).
Finally, it should be noted that the above embodiments are only used to illustrate the technical solution of the present invention, not to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced, and such modifications or substitutions do not depart from the spirit of the corresponding technical solutions of the embodiments of the present invention.

Claims (8)

1. A method of estimating ventricular volume, adapted for an electronic device, comprising:
obtaining a left ventricle mask image corresponding to a cardiac ultrasound image, wherein the left ventricle mask image is a binary image;
finding 3 reference point pixels in the left ventricle mask image, wherein each of the reference point pixels has a first value, each of the reference point pixels is surrounded by N surrounding pixels, and a plurality of the surrounding pixels of each of the reference point pixels include N1 first surrounding pixels having the first value and N2 second surrounding pixels having a second value, where N, N1, and N2 are positive integers; and
estimating a left ventricular volume corresponding to the cardiac ultrasound image based on the plurality of reference point pixels.
2. The method of claim 1, wherein N1 is 3 and N2 is 5.
3. The method of claim 1, wherein the first value is 255 and the second value is 0.
4. The method of claim 1, wherein the plurality of reference point pixels comprises a first reference point pixel, and of the plurality of surrounding pixels of the first reference point pixel, the plurality of first surrounding pixels are arranged in a straight line and the plurality of second surrounding pixels are arranged in a C-shape.
5. The method of claim 1, wherein the plurality of reference point pixels comprises a second reference point pixel and a third reference point pixel, and of the plurality of surrounding pixels of the second reference point pixel and the third reference point pixel, the plurality of first surrounding pixels are arranged in an L-shape and the plurality of second surrounding pixels are arranged in an L-shape.
6. The method of claim 1, wherein estimating the left ventricular volume corresponding to the cardiac ultrasound image based on the plurality of reference point pixels comprises:
estimating the distances among the plurality of reference point pixels, and finding, among the plurality of reference point pixels, a first reference point pixel, a second reference point pixel, and a third reference point pixel respectively corresponding to an apex of the heart, a first mitral valve, and a second mitral valve; and
applying a Simpson's formula based on the first, second, and third reference point pixels to estimate the left ventricular volume corresponding to the cardiac ultrasound image.
7. The method of claim 6, wherein the plurality of reference point pixels comprises a first pixel, a second pixel, and a third pixel, a first distance exists between the first pixel and the second pixel, a second distance exists between the first pixel and the third pixel, a third distance exists between the second pixel and the third pixel, and the step of estimating the distances among the plurality of reference point pixels and finding, among the plurality of reference point pixels, the first reference point pixel, the second reference point pixel, and the third reference point pixel respectively corresponding to the apex, the first mitral valve, and the second mitral valve comprises:
in response to determining that the first distance and the second distance are both greater than the third distance, defining the first pixel, the second pixel, and the third pixel as the first reference point pixel, the second reference point pixel, and the third reference point pixel, respectively;
in response to determining that the second distance and the third distance are both greater than the first distance, defining the third pixel, the first pixel, and the second pixel as the first reference point pixel, the second reference point pixel, and the third reference point pixel, respectively;
in response to determining that the first distance and the third distance are both greater than the second distance, defining the second pixel, the first pixel, and the third pixel as the first reference point pixel, the second reference point pixel, and the third reference point pixel, respectively.
8. The method of claim 1, wherein estimating the left ventricular volume corresponding to the cardiac ultrasound image based on the plurality of reference point pixels comprises:
in response to determining that the cardiac ultrasound image is an apical view, defining the highest one of the plurality of reference point pixels as a first reference point pixel corresponding to an apex of the heart, and defining the other two of the plurality of reference point pixels as a second reference point pixel and a third reference point pixel respectively corresponding to a first mitral valve and a second mitral valve; and
applying a Simpson's formula based on the first, second, and third reference point pixels to estimate the left ventricular volume corresponding to the cardiac ultrasound image.
CN202110343733.5A 2021-03-30 2021-03-30 Method for estimating ventricular volume Pending CN115148339A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110343733.5A CN115148339A (en) 2021-03-30 2021-03-30 Method for estimating ventricular volume

Publications (1)

Publication Number Publication Date
CN115148339A true CN115148339A (en) 2022-10-04

Family

ID=83404094

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110343733.5A Pending CN115148339A (en) 2021-03-30 2021-03-30 Method for estimating ventricular volume

Country Status (1)

Country Link
CN (1) CN115148339A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105261051A (en) * 2015-09-25 2016-01-20 沈阳东软医疗系统有限公司 Method and apparatus for obtaining image mask
CN108882917A (en) * 2016-05-30 2018-11-23 深圳迈瑞生物医疗电子股份有限公司 A kind of heart volume discriminance analysis system and method
CN110659343A (en) * 2019-09-23 2020-01-07 阿里巴巴集团控股有限公司 Extraction method, device and equipment of geo-fence data
CN110664435A (en) * 2019-09-23 2020-01-10 东软医疗系统股份有限公司 Method and device for acquiring cardiac data and ultrasonic imaging equipment
CN111419280A (en) * 2020-04-29 2020-07-17 中国人民解放军总医院 Artificial intelligence method, apparatus and computer medium for obtaining cardiac pressure volume loop

Similar Documents

Publication Publication Date Title
CN110742653B (en) Cardiac cycle determination method and ultrasonic equipment
US11207055B2 (en) Ultrasound Cardiac Doppler study automation
US8343053B2 (en) Detection of structure in ultrasound M-mode imaging
US11157797B2 (en) Evaluating quality of a product such as a semiconductor substrate
US20200178930A1 (en) Method and system for evaluating cardiac status, electronic device and ultrasonic scanning device
CN115641340A (en) Retina blood vessel image segmentation method based on multi-scale attention gating network
US20150379373A1 (en) Automatic assessment of perceptual visual quality of different image sets
CN111046893B (en) Image similarity determining method and device, image processing method and device
CN110956636A (en) Image processing method and device
Yasrab et al. End-to-end first trimester fetal ultrasound video automated CRL and NT segmentation
CN115148339A (en) Method for estimating ventricular volume
US11844655B2 (en) Method for estimating ventricular volume
CN114638878B (en) Two-dimensional echocardiogram pipe diameter detection method and device based on deep learning
CN115147331A (en) Method for assisting in assessing heart motion state
US20220296210A1 (en) Method for evaluating movement state of heart
KR102432766B1 (en) Magnetic resonance image analysis system and method for alzheimer's disease classification
CN114067081A (en) 3D tooth model segmentation method based on bidirectional enhanced network
KR20210074205A (en) System and method for image classification based positioning
CN112509052A (en) Method and device for detecting fovea maculata, computer equipment and storage medium
Liu et al. Context-endcoding for neural network based skull stripping in magnetic resonance imaging
CN116824116B (en) Super wide angle fundus image identification method, device, equipment and storage medium
US11610312B2 (en) Image processing apparatus for evaluating cardiac images and ventricular status identification method
CN113256593B (en) Tumor image detection method based on task self-adaptive neural network architecture search
US11074468B2 (en) Method of liveness detection and related device
CN113674297B (en) Semantic edge detection method and system based on information chaos degree measurement

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination