WO2008078265A2 - Medical imaging system - Google Patents

Medical imaging system

Info

Publication number
WO2008078265A2
Authority
WO
WIPO (PCT)
Prior art keywords
interest
borders
feature
region
information
Prior art date
Application number
PCT/IB2007/055161
Other languages
French (fr)
Other versions
WO2008078265A3 (en)
Inventor
Pascal Allain
Olivier Gerard
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Priority to JP2009543558A priority Critical patent/JP2010514486A/en
Publication of WO2008078265A2 publication Critical patent/WO2008078265A2/en
Publication of WO2008078265A3 publication Critical patent/WO2008078265A3/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/12 Edge-based segmentation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/98 Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
    • G06V10/987 Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns with the intervention of an operator
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10132 Ultrasound image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30048 Heart; Cardiac
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03 Recognition of patterns in medical or anatomical images
    • G06V2201/031 Recognition of patterns in medical or anatomical images of internal organs

Abstract

The invention relates to a medical imaging system. First, determination of the borders of at least one region of a feature of interest is performed. Second, a confidence level which is representative of the determination of the borders of at least one region of the feature of interest is computed. Third, an information representative of at least one region of the feature of interest where the associated borders have a confidence level which is lower than a predefined threshold is displayed.

Description

Medical imaging system
FIELD OF THE INVENTION
The present invention relates to a medical imaging system, and to a corresponding method. The invention finds, in particular, its application in the domain of ultrasound imaging.
BACKGROUND OF THE INVENTION
A known medical imaging system makes it possible to acquire a sequence of 3D images of a feature of interest of a body, such as the left ventricle of the heart, and to display it on a screen, to determine the borders of such a feature of interest and display them on the screen, to detect visually on the screen errors of border determination, and to correct such errors manually. In order to detect such errors visually, the user chooses a region of interest on a 3D sequence and looks at all the 2D images composing a 3D image. Then, when the user sees that there is an error of border determination, he may correct it manually. One drawback of said imaging system is that the user is lost if he wants to see, and correct if necessary, all the borders of the regions of the feature of interest of the 3D sequence, because he has to extract all the corresponding 2D images, one 3D image being composed of about a hundred 2D images. He has to look at about 3000 images in a cardiac cycle, which is very tedious.
SUMMARY OF THE INVENTION
It is an object of embodiments of the invention to propose a system which permits a user to save time and helps him with the correction of the borders of regions of a feature of interest.
To this end, the system comprises, in an embodiment, controlling means for controlling the following operations: - automatic determination of the borders of at least one region of a feature of interest in a sequence of images of a part of a body,
- computation of a confidence level which is representative of the determination of the borders of at least one region of the feature of interest, - display of an information representative of at least one region of the feature of interest where the associated borders have a confidence level which is lower than a predefined threshold.
Although the invention is well adapted for a sequence of images comprising a plurality of images, such as a sequence of images representative of a whole cardiac cycle, the invention may be used with a sequence of images comprising a single image. Therefore, the expression "sequence of images" should also be understood as meaning "at least one image".
The display of an information representative of a border with its confidence level permits the user to save time, as he will automatically see the borders on which he has to focus and which may need to be corrected.
According to a non-limiting embodiment, the displayed information is a map of the confidence levels associated respectively to the borders of a plurality of regions of the feature of interest. It permits the user to have a global view of the regions of a feature of interest and their associated confidence level.
According to a non-limiting embodiment, the controlling means permit the control of the display of a second information representative of at least one region of the feature of interest whose borders have been corrected. It permits the user to follow his own modifications.
According to a non-limiting embodiment, the controlling means permit the control of the automatic display of a 2D slice view of one region of low confidence based on the information. It permits guiding the user in his corrections. The present invention also relates to a method for medical imaging which comprises the steps of :
- determining automatically the borders of at least one region of a feature of interest in a sequence of images of a part of a body, - computing a confidence level which is representative of the determination of the borders of at least one region of the feature of interest,
- displaying an information representative of at least one region of the feature of interest where the associated borders have a confidence level which is lower than a predefined threshold.
The present invention finally relates to a computer program product comprising program instructions for implementing said method when said program is executed by a processor.
These and other aspects of the invention will be apparent from and will be elucidated with reference to the embodiments described hereinafter.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will now be described in more detail, by way of non-limiting examples, with reference to the accompanying drawings, wherein:
- Fig.1 is a schematic diagram of a system according to an embodiment of the invention which cooperates with a probe ;
- Fig.2 is a schematic drawing of a feature of interest such as the left ventricle of a heart, from which a sequence of images is acquired via a system according to an embodiment of the invention ;
- Fig.3 is a first view of a segmentation of a feature of interest such as the left ventricle of a heart, which may be used by the system according to an embodiment of the invention;
- Fig.4 is a second view of a segmentation of a feature of interest such as the left ventricle of a heart, which may be used by the system according to an embodiment of the invention ;
- Fig.5 is a display of a feature of interest such as the left ventricle of a heart with regions having borders with low confidence levels, performed by the system according to an embodiment of the invention ;
- Fig.6 is another display of the borders of some regions of a feature of interest such as the left ventricle of a heart, performed by the system according to an embodiment of the invention ;
- Fig.7 is a first display of a map of confidence levels associated to different regions of a feature of interest such as the left ventricle of a heart, performed by the system according to an embodiment of the invention ;
- Fig.8 is a second display of a map of confidence levels associated to different regions of a feature of interest such as the left ventricle of a heart, performed by the system according to an embodiment of the invention ; and
- Fig.9 represents a diagram of a method for medical imaging according to an embodiment of the invention.
DETAILED DESCRIPTION OF THE INVENTION
A system SYS in accordance with an embodiment of the invention is described with reference to Fig.1.
It cooperates with a transducer array TAR and its associated electronics, the whole forming a probe PRB. The system SYS comprises a controller CTRL for controlling the following operations :
- acquisition of a sequence of images SQ of a part of a body;
- automatic determination of the borders B of at least one region RI of a feature of interest FI in a sequence of images SQ of a part of a body, - computation of a confidence level CL associated to the borders B of at least one region RI of the feature of interest FI which is representative of the determination of said borders,
- display of an information IN representative of at least one region RI of the feature of interest FI where the associated borders have a confidence level CL which is lower than a predefined threshold TH.
The system SYS further optionally comprises a screen SCR for displaying the sequences SQ of images acquired, such as an LCD screen, and a user interface M USER.
The system SYS may comprise a memory MEM in order to save the images I acquired.
In an embodiment, the controller CTRL is further arranged to control the display of the sequence of images SQ, and the automatic display of a 2D slice view of one region RI of low confidence based on the information IN.
It is to be noted that the controller CTRL comprises a microprocessor that can be pre-programmed by means of instructions or that can be programmed by a user of the system SYS, for instance via the interface M USER.
It is to be noted that an image I is a 3D grey-level image that may be split up into 2D slices, which is usually called an MPR ("Multiplanar Reconstruction") view.
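As an illustration of the MPR idea only (not the patent's implementation), the following minimal Python/NumPy sketch cuts a synthetic 3D grey-level volume into axis-aligned 2D slices; the array shape, the `volume` name and the slice positions are assumptions made for the example.

```python
import numpy as np

# Synthetic 3D grey-level image (depth x height x width), values in [0, 255].
rng = np.random.default_rng(0)
volume = rng.integers(0, 256, size=(64, 128, 128), dtype=np.uint8)

def mpr_slices(vol, axis, positions):
    """Return 2D slices of a 3D volume along one axis (a crude stack of MPR-like views)."""
    return [np.take(vol, p, axis=axis) for p in positions]

# Extract three evenly spaced transverse slices.
slices = mpr_slices(volume, axis=0, positions=[16, 32, 48])
for i, sl in enumerate(slices):
    print(f"slice {i}: shape={sl.shape}, mean grey level={sl.mean():.1f}")
```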
Such a system SYS may be used in ultrasound, in particular, where organ measurements need to be performed, such as the left ventricle LV of a heart.
It is recalled that a heart is composed of a left and a right ventricle LV and RV, an aorta AO, and a left and a right atrium LA and RA, as shown in Fig.2, and that the arterial blood goes from the left ventricle LV to the aorta AO, while the right ventricle RV expels the venous blood received from the right atrium RA into the pulmonary artery. As the way the left ventricle LV works is indicative of the health of the heart, one focuses more particularly on said left ventricle LV when using the ultrasound imaging system SYS.
Referring now to Fig.3, the inner wall of the left ventricle LV of the heart may be segmented into seventeen segments SG as defined in the standard "Standardized Myocardial Segmentation and Nomenclature for Tomographic Imaging of the Heart" by the Cardiac Imaging Committee of the Council on Clinical Cardiology of the American Heart Association. Thus, Fig.3 is a display of such a segmentation on a circumferential polar plot called a "bull's eye", and Fig.4 is a 3D view of such a segmentation. The seventeen segments are named by the standard. For example, segment number 17 is the apex, and segments number 1 and 7, which identify the locations of the anterior wall at the base and mid-cavity, are named basal anterior and mid-anterior. Such a segmentation may be used by the ultrasound imaging system as described below. In order to acquire images of the left ventricle LV, the ultrasonic probe PRB is applied on the body of a patient, at the apex near the heart in a non-limiting embodiment, and the imaging system SYS performs the operations described hereinafter.
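For reference, a minimal Python sketch listing the seventeen segment names; the text above only names segments 1, 7 and 17 explicitly, so the remaining names are taken from the cited AHA standard as commonly quoted, not from this document.

```python
# Seventeen-segment model of the left ventricle (AHA standardized segmentation).
# Only segments 1, 7 and 17 are named in the description; the rest are listed
# here as commonly given for that standard.
AHA_SEGMENTS = {
    1: "basal anterior",       2: "basal anteroseptal",
    3: "basal inferoseptal",   4: "basal inferior",
    5: "basal inferolateral",  6: "basal anterolateral",
    7: "mid anterior",         8: "mid anteroseptal",
    9: "mid inferoseptal",    10: "mid inferior",
   11: "mid inferolateral",   12: "mid anterolateral",
   13: "apical anterior",     14: "apical septal",
   15: "apical inferior",     16: "apical lateral",
   17: "apex",
}

for sg in (1, 7, 17):  # the segments explicitly named in the text
    print(sg, AHA_SEGMENTS[sg])
```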
1) Acquisition of a sequence of three-dimensional images SQ.
The user of the system SYS moves the probe PRB on the part of the body which is of interest, here the heart, and more particularly the left ventricle LV. A sequence of grey-level three-dimensional images is acquired. The sequence of images SQ is displayed on the screen SCR. It is to be noted that a sequence SQ of three-dimensional images is acquired at about 20 Hz and that a sequence SQ is composed of about 20 three-dimensional images. It is to be noted that, in order to view the entire volume of the left ventricle LV, the image acquisition is performed over four cardiac cycles, one fourth of the left ventricle LV being acquired at each cardiac cycle. This 3D acquisition permits volumes to be obtained.
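The figures above (about 20 volumes per cycle at about 20 Hz, the full volume assembled from four quarter-acquisitions over four cardiac cycles) can be illustrated with a small sketch; the array layout and the stitching axis are assumptions made for the example, not the system's actual reconstruction.

```python
import numpy as np

# Illustrative numbers from the text: ~20 gated volumes per cardiac cycle, and the
# whole left-ventricle volume assembled from four quarter-acquisitions, one per cycle.
frames_per_cycle = 20
quarter_shape = (frames_per_cycle, 32, 32, 32)   # (phase, z, y, x) -- assumed layout
rng = np.random.default_rng(1)
quarters = [rng.random(quarter_shape) for _ in range(4)]  # one quarter per cardiac cycle

# Stitch the four quarters side by side along one spatial axis for each cardiac phase.
full_sequence = np.concatenate(quarters, axis=3)
print(full_sequence.shape)   # (20, 32, 32, 128): 20 gated volumes covering the whole LV
```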
It should be noted that acquisition of the sequence of images SQ is not necessary to the invention. In the embodiment of Fig.1, the controller CTRL also controls this acquisition; however, this acquisition may be controlled by a separate system. For instance, the acquisition may be performed by an acquisition system and the sequence of images sent, for instance by means of a wireless connection, to a system comprising means for controlling automatic determination of the borders of at least one region RI of a feature of interest FI in the sequence of images SQ, for computing a confidence level associated to the borders of at least one region of the feature of interest which is representative of the determination of said borders, and for displaying an information representative of at least one region of the feature of interest where the associated borders have a confidence level which is lower than a predefined threshold.
2) Automatic determination of the borders B of at least one region RI of a feature of interest FI in a sequence of images SQ of a part of a body. This operation permits determination of the position of the borders B of the different regions RI of the left ventricle LV in each image I of the sequence SQ, usually by a method called segmentation of the left ventricle LV. Such a segmentation is well described, for instance, in the document "Efficient Model-Based Quantification of Left Ventricular Function in 3-D Echocardiography - O. Gerard, A. Collet-Billon, J-M. Rouet, M. Jacob, M. Fradkin and C. Allouche - IEEE Transactions on Medical Imaging, Vol.21, N°9, September 2002".
It is to be noted that a region RI is composed of at least one voxel and may be composed of a plurality of voxels. The automatic determination may for instance be based on a measurement of the movement of the left ventricle LV. In this case, it may be based, for instance, on feature characterization in the image such as gradient-based edges, density level, or texture tracking.
It is to be noted that the type of feature characterization used is chosen according to an anatomic model, as a function of the regions of the left ventricle LV seen in the sequence SQ of images.
Of course any method for automatic determination of the borders B of a region RI of a feature of interest FI in an image I may be used.
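As a toy illustration of the gradient-based feature characterization mentioned above (and not the model-based segmentation of the cited reference), here is a minimal sketch that flags high-gradient pixels of a 2D slice as border candidates.

```python
import numpy as np

def gradient_magnitude(slice_2d):
    """Per-pixel gradient magnitude of a 2D grey-level slice (finite differences)."""
    gy, gx = np.gradient(slice_2d.astype(float))
    return np.hypot(gx, gy)

def edge_candidates(slice_2d, fraction=0.9):
    """Mark pixels whose gradient is in the top (1 - fraction) quantile as border candidates."""
    g = gradient_magnitude(slice_2d)
    return g >= np.quantile(g, fraction)

# Toy slice: a bright disc on a dark background, standing in for a ventricle cavity.
yy, xx = np.mgrid[0:128, 0:128]
toy = np.where((yy - 64) ** 2 + (xx - 64) ** 2 < 40 ** 2, 200.0, 50.0)
mask = edge_candidates(toy)
print("candidate border pixels:", int(mask.sum()))
```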
When the borders B have been estimated, one may determine the evolution of the surface of the left ventricle LV by acquiring a parametric image IP based, for instance, on the velocity information of different regions RI. A color is associated to the velocity information of the regions RI of the left ventricle LV in order to be displayed as a parametric image IP on the screen SCR. For example, a red color can be used when the regions contract, whereas a blue color can be used when the regions relax. When the left ventricle LV is working correctly, the whole left ventricle LV should be displayed in red when it contracts, and in blue when it relaxes. If this is not the case, the left ventricle LV is displayed partly in red and partly in blue: the colors are not uniform. It means that some regions RI contract or relax later than other regions RI because their speed peaks are different. Hence, the evolution of the surface shows whether there is any asynchronism in the left ventricle LV. Of course, information other than the velocity may be used for the parametric image IP, such as, for instance, deformation, displacement or acceleration information. The acquisition of a parametric image is well described, for instance, in the document "Strain and strain rate parametric imaging. A new method for post processing to 3-/4-dimensional images from three standard apical planes. Preliminary data on feasibility, artefact and regional dyssynergy visualisation" by A. Stoylen, C.B. Ingul, H. Torp - Cardiovascular Ultrasound 2003, 1:11, doi:10.1186/1476-7120-1-11 - Department of Circulation and Medical Imaging, Faculty of Medicine, Norwegian University of Science and Technology, Trondheim, Norway.
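A minimal sketch of the velocity-based colouring described above, using hypothetical per-region wall positions sampled at 20 Hz; the sign convention (inward motion = contraction = red) is an assumption made for the illustration.

```python
import numpy as np

# Hypothetical per-region wall positions (mm) over part of a cardiac cycle at 20 Hz:
# rows = regions RI, columns = frames of the sequence SQ.
frame_rate_hz = 20.0
positions_mm = np.array([
    [30.0, 29.0, 28.0, 27.5, 28.5, 29.5],   # region contracting, then relaxing
    [31.0, 31.0, 30.5, 29.0, 28.0, 29.0],   # region contracting later (asynchronous)
])

velocity_mm_s = np.diff(positions_mm, axis=1) * frame_rate_hz  # simple finite difference

# Colouring convention used in the text: red = contraction, blue = relaxation.
colours = np.where(velocity_mm_s < 0, "red", "blue")
print(velocity_mm_s)
print(colours)
```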
3) Computation of a confidence level CL which is representative of the determination of the borders B of at least one region RI of the feature of interest FI.
Because the image quality of an ultrasound acquisition depends on several factors, such as, for instance, the patient echogenicity, the line density, the limited field of view, etc., a confidence level CL is associated to the determination of the borders B of the different regions RI of the left ventricle LV. This confidence level depends on a local estimation of the feature characterization used to determine the borders B, as described before, such as a threshold on the grey level for the density level, the gradient level for gradient-based edges, and global/local statistics for texture tracking.
The computation of a confidence level is performed for each image I of the acquired sequence SQ of images.
Such a confidence level associated to a measurement is well described, for instance, in the document "A confidence measure based moving object extraction system built for compressed domain - R. Wang, H.J. Zhang, Y. Q. Zhang - IEEE International Symposium on Circuits and Systems, May 28-31, 2000, Geneva, Switzerland - ISCAS2000".
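As a hedged sketch of one possible confidence measure of the kind described above (gradient support along the detected border, normalised to [0, 1]); the exact measure used by the system is not specified here, and this is not the method of the cited references.

```python
import numpy as np

def border_confidence(slice_2d, border_mask):
    """Crude confidence level CL: mean gradient magnitude on the border pixels,
    normalised by the strongest gradient in the slice (0 = no support, 1 = strong edge)."""
    gy, gx = np.gradient(slice_2d.astype(float))
    g = np.hypot(gx, gy)
    if not border_mask.any() or g.max() == 0:
        return 0.0
    return float(g[border_mask].mean() / g.max())

# Toy example: a sharp disc border yields a much higher CL than a flat image (CL = 0).
yy, xx = np.mgrid[0:64, 0:64]
disc = np.where((yy - 32) ** 2 + (xx - 32) ** 2 < 20 ** 2, 200.0, 50.0)
ring = np.abs(np.hypot(yy - 32, xx - 32) - 20) < 1.0   # pixels on the assumed border
print("CL on a sharp border :", round(border_confidence(disc, ring), 2))
print("CL on a flat image   :", round(border_confidence(np.full_like(disc, 50.0), ring), 2))
```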
4) Display of an information IN representative of at least one region RI of the feature of interest FI where the associated borders B have a confidence level CL which is lower than a predefined threshold TH. The threshold TH may be, for instance, 60%. Of course, any other threshold value may be defined.
The display is performed for each image I of the sequence SQ of images acquired. For instance, the information IN is the 3D image of the left ventricle LV with colors on regions RI as illustrated in Fig.5, each color being associated to a value of the confidence level CL. The regions RI1 and RI2 have a low confidence level CL. Of course, another color may be used for the other regions, which have an associated confidence level greater than the threshold TH. Hence, a map of the confidence levels CL associated respectively to the borders B of a plurality of regions RI of the feature of interest FI which have a low or high confidence level may be displayed.
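A minimal sketch of the thresholding behind the display of the information IN, with invented region names and confidence values; the colours are placeholders for whatever colour coding the system uses.

```python
# Flag the regions RI whose confidence level CL falls below the threshold TH
# (60% in the example above). Region names and CL values are invented for illustration.
TH = 0.60

confidence_levels = {"RI1": 0.35, "RI2": 0.48, "RI3": 0.92, "RI4": 0.75}

def confidence_map(levels, threshold):
    """Map each region to a display colour so that low-confidence regions stand out."""
    return {region: ("orange" if cl < threshold else "grey")
            for region, cl in levels.items()}

low_confidence = [r for r, cl in confidence_levels.items() if cl < TH]
print("regions to review:", low_confidence)           # ['RI1', 'RI2']
print("display colours  :", confidence_map(confidence_levels, TH))
```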
Hence, with the information IN displayed, the user may either validate or correct manually the borders B of the regions RI which have a low confidence level CL.
In order for the user to manually modify the borders B, the user interface M USER comprises manual editing tools.
5) Automatic display of a 2D slice view of one region of low confidence based on the information IN.
In order to help the user to correct manually the borders B of the regions of low confidence level, the system SYS, based on the confidence map MP, may slice automatically the 3D images at positions of low confidence. For instance, an MPR ("Multiplanar Reconstruction") 2D slice view through a low-confidence part of the image will be automatically displayed to the user, with colors indicating the regions (for instance the regions RI1, RI2) to review, as illustrated in Fig.6. More than one MPR view may be displayed. Three perpendicular planes slicing through a region of interest RI may be displayed, for instance.
If the region RI shown by the system SYS is of interest for the user, he may correct its borders. The system SYS will then automatically move the region of interest RI toward the region with the second lowest confidence, etc. If not, the user indicates to the system that he wants to move to the next region with low confidence, and so on. In another embodiment, the user may himself choose the MPR views with the use of an orthoviewer P as illustrated in Fig.5. The user may use the orthoviewer P and move it until he sees the 2D slices of the regions of interest, such as the 2D slices for the regions RI1 and RI2, for instance.
Automatic positioning on regions RI of low confidence helps to guide the user in his corrections. The user thus saves time.
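A minimal sketch of automatically positioning three perpendicular, axis-aligned slice views through the centroid of the lowest-confidence region; true MPR planes may be arbitrarily oriented, so this axis-aligned version is a simplifying assumption, as are the region masks and confidence values.

```python
import numpy as np

def slices_through_lowest_confidence(volume, region_masks, confidence_levels):
    """Pick the region with the lowest CL and return three perpendicular
    axis-aligned slices through its centroid (a stand-in for the MPR views)."""
    worst = min(confidence_levels, key=confidence_levels.get)
    zc, yc, xc = (int(round(c)) for c in np.argwhere(region_masks[worst]).mean(axis=0))
    return worst, {
        "transverse": volume[zc, :, :],
        "coronal":    volume[:, yc, :],
        "sagittal":   volume[:, :, xc],
    }

# Toy volume and two hypothetical region masks RI1 / RI2.
vol = np.zeros((32, 64, 64))
masks = {"RI1": np.zeros_like(vol, dtype=bool), "RI2": np.zeros_like(vol, dtype=bool)}
masks["RI1"][10:14, 20:30, 20:30] = True
masks["RI2"][20:24, 40:50, 40:50] = True
worst, views = slices_through_lowest_confidence(vol, masks, {"RI1": 0.35, "RI2": 0.48})
print("region shown first:", worst, {k: v.shape for k, v in views.items()})
```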
6) Update the information IN and display.
Whenever the user corrects a border B, the information may be updated in order to show the corrections of the user and to assign a reliable confidence level CL to these corrections. Another color may be used in order to indicate to the user which parts are effectively covered by his corrections.
Hence, the original confidence information IN, called intrinsic, may always be displayed, and updated confidence information INu, called extrinsic, may be displayed in parallel. This extrinsic confidence information INu stores all the regions and the extent of the user interaction, along with its history. The extrinsic confidence information INu thus allows displaying to the user where he has made modifications. With this information, a change in the density of the interventions (e.g. many interventions in one part of the image and none in other parts) will be a warning sign to explore the non-modified regions.
It is to be noted that, after a correction is performed by the user, one may perform step 3), the computation of a local estimation of the confidence level, again. This permits the estimation to be redone taking into account the corrections of the user.
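A minimal sketch of the intrinsic/extrinsic bookkeeping described in this step, with invented region names and confidence values; how the re-estimated confidence is actually computed is left to step 3) and is not modelled here.

```python
# Intrinsic confidence IN comes from the automatic estimation; extrinsic confidence
# INu records which regions the user has edited, along with the editing history.
intrinsic = {"RI1": 0.35, "RI2": 0.48, "RI3": 0.92}
extrinsic = {region: False for region in intrinsic}   # True once the user corrected it
history = []                                          # chronological log of edits

def user_corrects(region, new_cl):
    """Record a manual correction and the re-estimated confidence level."""
    extrinsic[region] = True
    history.append(region)
    intrinsic[region] = new_cl        # stand-in for re-running step 3) on the corrected border

user_corrects("RI1", 0.95)
unreviewed = [r for r, edited in extrinsic.items() if not edited and intrinsic[r] < 0.60]
print("edited so far:", history, "| still to review:", unreviewed)   # ['RI2']
```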
7) Merge of a parametric image IP with the intrinsic IN and extrinsic INu confidence information. As a final result, the user may have a final parametric image IF which is a merge between the parametric image IP, which shows for instance the velocity information as described before, and the intrinsic IN and extrinsic INu confidence information, as illustrated in Fig.7 and Fig.8. Fig.7 is a representation in 2D of the final parametric image IF. Regions at a low confidence level have been erased from the parametric image IP. Therefore, the user may see which segments SG of the parametric representation are covered by a region or a plurality of regions RI with a low confidence level, for instance.
Fig.8 is a representation in 3D of the final parametric image IF where regions at low confidence level have been erased from the parametric image IP.
Hence, the resulting parametric image IF permits the identification and display of the regions where final measurements (e.g. wall dyssynchrony) are really reliable.
It is to be noted that the intrinsic and extrinsic confidence information are here a combination of all the intrinsic and extrinsic information of each image I.
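A minimal sketch of the merge of step 7): pixels of the parametric image IP whose combined confidence falls below the threshold are erased (marked as NaN here); combining intrinsic and extrinsic information by taking their minimum is an assumption made for the example.

```python
import numpy as np

def final_parametric_image(ip, confidence, threshold=0.60):
    """Merge step: erase from the parametric image IP every pixel whose
    combined confidence is below the threshold; NaN marks erased regions."""
    out = ip.astype(float).copy()
    out[confidence < threshold] = np.nan
    return out

# Toy 2D parametric image (e.g. velocities) and a per-pixel confidence map that
# combines intrinsic and extrinsic information (here simply their minimum).
ip = np.array([[1.0, 2.0], [-1.5, 0.5]])
intrinsic = np.array([[0.9, 0.3], [0.8, 0.7]])
extrinsic = np.array([[1.0, 1.0], [0.5, 0.9]])
combined = np.minimum(intrinsic, extrinsic)
print(final_parametric_image(ip, combined))   # low-confidence pixels become nan
```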
In order for the user to obtain the final parametric image IF, the user interface M USER comprises adequate means such as a button for example.
Fig.9 illustrates the method for medical imaging according to an embodiment of the invention where the different operations controlled by the system SYS are shown.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be capable of designing many alternative embodiments without departing from the scope of the invention as defined by the appended claims. The examples used have been described for an echocardiographic sequence of images, but of course they may be extended to any images coming from other imaging modalities.
In the claims, any reference signs placed in parentheses shall not be construed as limiting the claims. The words "comprising" and "comprises", and the like, do not exclude the presence of elements or steps other than those listed in any claim or the specification as a whole. The singular reference of an element does not exclude the plural reference of such elements and vice-versa.
The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In a device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Claims

1. A medical imaging system, comprising controlling means (CTRL) for controlling the following operations :
- automatic determination of the borders (B) of at least one region (RI) of a feature of interest (FI) in a sequence of images (SQ) of a part of a body, - computation of a confidence level (CL) which is representative of the determination of the borders (B) of at least one region (RI) of the feature of interest
(FI),
- display of an information (IN) representative of at least one region (RI) of the feature of interest (FI) where the associated borders (B) have a confidence level (CL) which is lower than a predefined threshold (TH).
2. A system as claimed in claim 1, wherein the displayed information (IN) is a map of the confidence levels (CL) associated respectively to the borders (B) of a plurality of regions of the feature of interest (FI).
3. A system as claimed in claim 1, wherein the controlling means (CTRL) permits the control of the display of a second information (INu) representative of at least one region (RI) of the feature of interest (FI) whose borders (B) have been corrected.
4. A system as claimed in claim 1, wherein the controlling means (CTRL) permits the control of the automatic display of a 2D slice view of one region (RI) of low confidence based on the information (IN).
5. A method for medical imaging, comprising the steps of : - Determining automatically the borders (B) of at least one region (RI) of a feature of interest (FI) in a sequence of images (SQ) of a part of a body,
- Computing a confidence level (CL) which is representative of the determination of the borders (B) of at least one region (RI) of the feature of interest (FI), - Displaying an information (I) representative of at least one region (RI) of the feature of interest (FI) where the associated borders (B) have a confidence level (CL) which is lower than a predefined threshold (TH).
6. A computer program product comprising program instructions for implementing, when said program is executed by a processor, the method as claimed in the preceding claim.
PCT/IB2007/055161 2006-12-26 2007-12-17 Medical imaging system WO2008078265A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009543558A JP2010514486A (en) 2006-12-26 2007-12-17 Medical imaging system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP06301294 2006-12-26
EP06301294.2 2006-12-26

Publications (2)

Publication Number Publication Date
WO2008078265A2 true WO2008078265A2 (en) 2008-07-03
WO2008078265A3 WO2008078265A3 (en) 2009-02-05

Family

ID=39563030

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2007/055161 WO2008078265A2 (en) 2006-12-26 2007-12-17 Medical imaging system

Country Status (5)

Country Link
JP (1) JP2010514486A (en)
KR (1) KR20090098839A (en)
CN (1) CN101568941A (en)
RU (1) RU2009128709A (en)
WO (1) WO2008078265A2 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2389662A1 (en) * 2009-01-23 2011-11-30 Koninklijke Philips Electronics N.V. Cardiac image processing and analysis
WO2012091763A1 (en) 2010-12-27 2012-07-05 St. Jude Medical, Atrial Fibrillation Division, Inc. Refinement of an anatomical model using ultrasound
US9763587B2 (en) 2010-06-10 2017-09-19 Biosense Webster (Israel), Ltd. Operator-controlled map point density
EP3578109A4 (en) * 2017-02-01 2020-01-22 Fujifilm Corporation Ultrasound diagnostic device, ultrasound diagnostic device control method and ultrasound diagnostic device control program
US10743844B2 (en) 2014-07-29 2020-08-18 Koninklijke Philips N.V. Ultrasound imaging apparatus
WO2020234653A1 (en) * 2019-05-20 2020-11-26 Aranz Healthcare Limited Automated or partially automated anatomical surface assessment methods, devices and systems

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8200466B2 (en) 2008-07-21 2012-06-12 The Board Of Trustees Of The Leland Stanford Junior University Method for tuning patient-specific cardiovascular simulations
US9405886B2 (en) 2009-03-17 2016-08-02 The Board Of Trustees Of The Leland Stanford Junior University Method for determining cardiovascular information
US8315812B2 (en) 2010-08-12 2012-11-20 Heartflow, Inc. Method and system for patient-specific modeling of blood flow
JP5847454B2 (en) * 2011-06-23 2016-01-20 キヤノン株式会社 Subject information acquisition apparatus, display control method, and program
JP5987640B2 (en) * 2012-11-05 2016-09-07 コニカミノルタ株式会社 Method and apparatus for three-dimensional restoration of subject using ultrasound
US20210100530A1 (en) * 2019-10-04 2021-04-08 GE Precision Healthcare LLC Methods and systems for diagnosing tendon damage via ultrasound imaging

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020072671A1 (en) 2000-12-07 2002-06-13 Cedric Chenal Automated border detection in ultrasonic diagnostic images

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006085266A1 (en) * 2005-02-08 2006-08-17 Philips Intellectual Property & Standard Gmbh Medical image viewing protocols

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020072671A1 (en) 2000-12-07 2002-06-13 Cedric Chenal Automated border detection in ultrasonic diagnostic images

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PHILIP K P ET AL., AUTOMATIC DETECTION OF MYOCARDIAL CONTOURS IN CINE-COMPUTED TOMOGRAPHIC IMAGES

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2389662A1 (en) * 2009-01-23 2011-11-30 Koninklijke Philips Electronics N.V. Cardiac image processing and analysis
US9763587B2 (en) 2010-06-10 2017-09-19 Biosense Webster (Israel), Ltd. Operator-controlled map point density
US10568532B2 (en) 2010-06-10 2020-02-25 Biosense Webster (Israel) Ltd. Operator-controlled map point density
WO2012091763A1 (en) 2010-12-27 2012-07-05 St. Jude Medical, Atrial Fibrillation Division, Inc. Refinement of an anatomical model using ultrasound
EP2618739A4 (en) * 2010-12-27 2015-07-01 St Jude Medical Atrial Fibrill Refinement of an anatomical model using ultrasound
US10524765B2 (en) 2010-12-27 2020-01-07 St. Jude Medical, Atrial Fibrillation Division, Inc. Refinement of an anatomical model using ultrasound
US10743844B2 (en) 2014-07-29 2020-08-18 Koninklijke Philips N.V. Ultrasound imaging apparatus
EP3578109A4 (en) * 2017-02-01 2020-01-22 Fujifilm Corporation Ultrasound diagnostic device, ultrasound diagnostic device control method and ultrasound diagnostic device control program
US11589842B2 (en) 2017-02-01 2023-02-28 Fujifilm Corporation Ultrasound diagnostic apparatus, method for controlling ultrasound diagnostic apparatus, and program for controlling ultrasound diagnostic apparatus
WO2020234653A1 (en) * 2019-05-20 2020-11-26 Aranz Healthcare Limited Automated or partially automated anatomical surface assessment methods, devices and systems

Also Published As

Publication number Publication date
WO2008078265A3 (en) 2009-02-05
JP2010514486A (en) 2010-05-06
RU2009128709A (en) 2011-02-10
KR20090098839A (en) 2009-09-17
CN101568941A (en) 2009-10-28

Similar Documents

Publication Publication Date Title
WO2008078265A2 (en) Medical imaging system
US7715609B2 (en) Method for automatically determining the position and orientation of the left ventricle in 3D image data records of the heart
JP4918048B2 (en) Image processing apparatus and method
US8620040B2 (en) Method for determining a 2D contour of a vessel structure imaged in 3D image data
US20160117797A1 (en) Image Processing Apparatus and Image Processing Method
US10019804B2 (en) Medical image processing apparatus, method, and program
CN103222879A (en) System and method for identifying an optimal image frame for ultrasound imaging
EP2936364B1 (en) Method and apparatus for simulating blood flow under patient-specific boundary conditions derived from an estimated cardiac ejection output
US9462952B2 (en) System and method for estimating artery compliance and resistance from 4D cardiac images and pressure measurements
CN107427279A (en) Use the Ultrasonic Diagnosis of the cardiac function of the cardiac module chamber with user's control
WO2019075279A1 (en) Artificially intelligent ejection fraction determination
US20090010505A1 (en) Method, a System and a Computer Program for Segmenting a Structure in a Dataset
EP3120323B1 (en) Image processing apparatus and method for segmenting a region of interest
CN110956076B (en) Method and system for structure identification in three-dimensional ultrasound data based on volume rendering
US7689018B2 (en) Anomaly detection in volume data structure information
WO2010020933A2 (en) Processing cardiac data for personalized aha diagram
WO2010086810A1 (en) Transmural perfusion gradient image analysis
EP2059173A1 (en) System and method for measuring left ventricular torsion
JP2019082745A5 (en)
CN112308845B (en) Left ventricle segmentation method and device and electronic equipment
EP3244798B1 (en) Adaptive segmentation for rotational c-arm computed tomography with a reduced angular range
Goyal et al. MRI image based patient specific computational model reconstruction of the left ventricle cavity and myocardium
CN107427282A (en) Ultrasonic Diagnosis to cardiac function is split by single-degree-of-freedom chamber
US10832492B2 (en) Panoramic visualization of coronary arterial tree
US9286677B2 (en) Reorientation of cardiac images

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200780048158.4

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07859398

Country of ref document: EP

Kind code of ref document: A2

WWE Wipo information: entry into national phase

Ref document number: 2007859398

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 1020097013099

Country of ref document: KR

ENP Entry into the national phase

Ref document number: 2009543558

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 4393/CHENP/2009

Country of ref document: IN

ENP Entry into the national phase

Ref document number: 2009128709

Country of ref document: RU

Kind code of ref document: A