US20120306862A1 - Image processing device, method and program


Info

Publication number
US20120306862A1
Authority
US
United States
Prior art keywords
moving image
image
characteristic part
phase
ultrasonic
Prior art date
Legal status
Abandoned
Application number
US13/478,220
Inventor
Yuanzhong Li
Current Assignee
Fujifilm Corp
Original Assignee
Fujifilm Corp
Priority date
Filing date
Publication date
Application filed by Fujifilm Corp filed Critical Fujifilm Corp
Assigned to FUJIFILM CORPORATION. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LI, YUANZHONG
Publication of US20120306862A1


Classifications

    • G06T3/14
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 Determination of transform parameters for the alignment of images (image registration) using feature-based methods
    • A61B6/507 Clinical applications involving determination of haemodynamic parameters, e.g. perfusion CT
    • G06T2207/10016 Video; Image sequence
    • G06T2207/10081 Computed x-ray tomography [CT]
    • G06T2207/10088 Magnetic resonance imaging [MRI]
    • G06T2207/10132 Ultrasound image
    • G06T2207/10136 3D ultrasound image
    • G06T2207/30048 Heart; Cardiac

Definitions

  • FIG. 2 is a flow chart illustrating the flow of image processing of this embodiment.
  • FIG. 3 shows an example of the displayed superimposed image. Now, the flow of a process carried out by the functions of the WS 6 (image processing device) of this embodiment is described in detail using FIGS. 2 and 3 . This embodiment is described in conjunction with the case of heart examination as an example.
  • a moving image of the chest of the subject covering one period of heart beat is taken with a CT apparatus or the like, and the resulting three-dimensional moving image V1 (volume data), with accompanying information added, is transferred as a DICOM file to the data server 4 and stored in the mass storage 5.
  • the volume data is a collection of voxels representing a density distribution in a three-dimensional space; each voxel value indicates X-ray absorption or the like.
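  • as a concrete illustration of this voxel representation, the following is a minimal sketch of stacking a DICOM slice series into a (z, y, x) voxel array with pydicom and numpy; the one-file-per-slice layout, the sort key and the presence of the rescale tags are assumptions for illustration, not details from the patent:

```python
# Sketch only: assumes one DICOM file per slice and standard CT rescale tags.
from pathlib import Path

import numpy as np
import pydicom

def load_ct_volume(series_dir: str) -> np.ndarray:
    """Stack a directory of DICOM slices into a (z, y, x) voxel array."""
    slices = [pydicom.dcmread(p) for p in sorted(Path(series_dir).glob("*.dcm"))]
    # Order slices along the patient z axis using Image Position (Patient).
    slices.sort(key=lambda ds: float(ds.ImagePositionPatient[2]))
    volume = np.stack([ds.pixel_array.astype(np.float32) for ds in slices])
    # Map stored values to Hounsfield-like units via the rescale tags.
    slope = float(getattr(slices[0], "RescaleSlope", 1.0))
    intercept = float(getattr(slices[0], "RescaleIntercept", 0.0))
    return volume * slope + intercept
```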
  • a moving image of the chest of the same subject is taken by transesophageal echocardiography (TEE), i.e., ultrasound imaging carried out by inserting an ultrasound probe through the mouth into the esophagus, and the resulting three-dimensional ultrasonic moving image V2 is transferred to the data server 4 and stored in the mass storage 5.
  • the image obtaining means 61 sends the inputted information to the data server 4 together with a request to search for and transfer the corresponding file stored in the mass storage 5.
  • on receiving this request, the data server 4 searches for the requested file in the mass storage 5 and transfers it to the image obtaining means 61.
  • the image obtaining means 61 obtains the three-dimensional moving image V1 and the three-dimensional ultrasonic moving image V2 contained in the file transferred from the data server 4 and stores them in the memory (S01).
  • the characteristic part extracting means 62 extracts, as the predetermined characteristic part, the mitral valve MV, which is the heart valve located between the left ventricle LV and the left atrium LA, from each of the three-dimensional moving image V1 and the three-dimensional ultrasonic moving image V2 (S02).
  • the characteristic part extracting means 62 segments the mitral valve MV of the heart captured in the moving images V1 and V2 in time series, over at least one period of heart beat, according to the method taught in the Non-Patent Document cited later in this section, and extracts information identifying the position of each sample point on the contour of the mitral valve MV in each frame image of the moving images V1 and V2.
  • the phase obtaining means 63 obtains the phase of heart beat of each frame image based on the position of each sample point on the contour of the mitral valve MV captured in the moving images V1 and V2 (S03).
  • one period of heart beat includes a systole and a diastole.
  • at the end of systole, the aortic valve AV changes from the open state to the closed state and the mitral valve MV starts to open from the closed state.
  • at the end of diastole, the mitral valve MV changes to the closed state and the aortic valve AV starts to open from the closed state.
  • the phase obtaining means 63 identifies the end of diastole and the end of systole to determine the phases of heart beat.
  • the shape of the mitral valve MV captured in each of the moving images V1 and V2 is obtained at predetermined time intervals using predetermined parameters according to the method taught in the Non-Patent Document cited later in this section, and the state of opening and closing of the mitral valve MV is identified from that shape to obtain the phase of heart beat corresponding to that state.
  • the state of opening and closing of the mitral valve MV and the predetermined parameters representing the shape of the mitral valve MV are associated with each other and stored.
  • the phase obtaining means 63 identifies, for each moving image V1, V2, a frame in which the mitral valve MV has changed from the open state to the closed state as a frame corresponding to the end of diastole of the heart beat. Also, the phase obtaining means 63 identifies, for each moving image V1, V2, a frame in which the mitral valve MV has changed from the closed state to the open state (start-to-open state) as a frame corresponding to the end of systole of the heart beat.
  • the predetermined parameters representing the shape of the mitral valve MV may, for example, be distances between specific sample points on the contour of the mitral valve MV.
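  • a minimal sketch of classifying the valve state from such distance parameters and locating the end-of-diastole/end-of-systole frames, assuming the contour sample points are available as a numpy array; the choice of opposing point indices and the opening threshold are illustrative assumptions, not values from the patent:

```python
import numpy as np

def valve_open_states(contours: np.ndarray, open_threshold: float) -> np.ndarray:
    """contours: (n_frames, n_points, 3) mitral valve sample points per frame.
    Returns True for frames in which the valve is classified as open."""
    # Shape parameter: distance between two roughly opposing leaflet points
    # (indices 0 and n_points // 2 are an illustrative choice).
    gap = np.linalg.norm(contours[:, 0] - contours[:, contours.shape[1] // 2], axis=1)
    return gap > open_threshold

def phase_landmarks(is_open: np.ndarray) -> tuple[list[int], list[int]]:
    """End of diastole: frames where the valve goes open -> closed.
    End of systole: frames where it goes closed -> open."""
    ed = [i for i in range(1, len(is_open)) if is_open[i - 1] and not is_open[i]]
    es = [i for i in range(1, len(is_open)) if not is_open[i - 1] and is_open[i]]
    return ed, es
```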
  • the associating means 64 temporally associates the frames forming the moving images V1 and V2 with each other such that the moving images V1 and V2 are aligned with each other with respect to the phases corresponding to the end of systole and the end of diastole (the associating means 64 may perform interpolation in the time axis direction, as necessary) (S04).
  • the frame images of the images V1 and V2 showing the same phase are associated with each other, and the spatial positions of the same characteristic part shown in the associated frame images are associated with each other.
  • the associating means 64 associates the frames of these images using the moving image having the smaller number of frames for one period as the reference.
  • the frames of the moving image having the greater number of frames for one period may be appropriately decimated, as necessary.
  • interpolation may be performed using a known method so that each pair of corresponding frame images of the moving images shows the same phase.
  • alternatively, the phase of each frame image of one of the moving images may be obtained, and an interpolated frame image having the shape corresponding to that phase may be generated, by a known method, from the frame images of the other moving image before and after that phase; the frame images of the one moving image are then associated with the generated frame images such that each associated pair shows the same phase (see the sketch below).
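  • a minimal sketch of this temporal association, assuming the frames of each moving image cover exactly one period and that a simple per-pixel linear interpolation is acceptable (the patent leaves the interpolation method open):

```python
import numpy as np

def phase_indices(n_ref: int, n_other: int) -> np.ndarray:
    """For a reference image with n_ref frames per period (the smaller count),
    return the fractional frame index of the other image at the same phase."""
    phases = np.linspace(0.0, 1.0, n_ref, endpoint=False)
    return phases * n_other

def interpolate_frame(frames: np.ndarray, t: float) -> np.ndarray:
    """Linearly interpolate between the frames before and after index t."""
    lo = int(np.floor(t)) % len(frames)
    hi = (lo + 1) % len(frames)
    w = float(t - np.floor(t))
    return (1.0 - w) * frames[lo] + w * frames[hi]
```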
  • the image generating means 65 generates volume rendered images for a series of frame images extracted from the three-dimensional moving image V1 by the above-described operation. For a series of frame images extracted from the three-dimensional ultrasonic moving image V2, the image generating means 65 generates images by transforming the coordinate system of the three-dimensional ultrasonic moving image V2 into the coordinate system of the three-dimensional moving image V1 so that the images V1 and V2 show the characteristic part associated by the associating means 64 in the same position, the same direction and the same size. Then, the image generating means 65 generates the superimposed image of the moving images V1 and V2 by a known method and stores the superimposed image in the mass storage 5 (S05).
  • the image generating means 65 achieves the spatial alignment by transforming the coordinate system of one of the images into that of the other so that both show the same characteristic part in the same spatial position, based on the positions associated by the associating means 64, and by correcting the transformed coordinate system as appropriate so that both images show the characteristic part in the same position, the same direction and the same size (a sketch of one such transform follows).
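  • one standard way to realize such a transform is a least-squares similarity transform (rotation, scale, translation) computed from corresponding sample points of the characteristic part in the two images; the sketch below uses Umeyama's method, which is our assumption, since the patent only requires that the images come to show the characteristic part in the same position, direction and size:

```python
import numpy as np

def similarity_transform(src: np.ndarray, dst: np.ndarray):
    """Least-squares rotation R, scale s and translation t with
    dst_i ~= s * R @ src_i + t, for (n, 3) corresponding points
    (Umeyama, 1991)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)
    U, D, Vt = np.linalg.svd(cov)
    S = np.diag([1.0, 1.0, float(np.sign(np.linalg.det(U @ Vt)))])
    R = U @ S @ Vt                     # proper rotation (no reflection)
    s = np.trace(np.diag(D) @ S) / src_c.var(axis=0).sum()
    t = mu_d - s * R @ mu_s
    return R, s, t
```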
  • the associating means 64 obtains pixel spacing information of the three-dimensional moving image V1 and the three-dimensional ultrasonic moving image V2 from the DICOM header information of each image, and enlarges or reduces the moving images V1 and V2, as appropriate, based on the pixel spacing information to provide the series of frame images extracted from the three-dimensional moving image V1 and the three-dimensional ultrasonic moving image V2 with the same pixel spacing.
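  • a minimal sketch of this spacing normalization, assuming the voxel spacing in millimetres has already been read from the DICOM header (e.g., Pixel Spacing and slice spacing) and that linear interpolation is adequate:

```python
import numpy as np
from scipy.ndimage import zoom

def resample_to_spacing(volume: np.ndarray,
                        spacing: tuple[float, float, float],
                        target: tuple[float, float, float]) -> np.ndarray:
    """Resample a (z, y, x) volume from its own voxel spacing (mm) to the
    target spacing so both moving images share the same pixel spacing."""
    factors = [s / t for s, t in zip(spacing, target)]
    return zoom(volume, zoom=factors, order=1)  # linear interpolation
```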
  • the superimposed image generated by the image generating means 65 of this embodiment shows voxel values based on the three-dimensional moving image V1 at a predetermined transparency by volume rendering, and, as shown by arrow C in FIG. 3, shows voxel values and the direction of blood flow based on the three-dimensional ultrasonic moving image V2 by the known color Doppler method.
  • the image generating means 65 may apply any of various known generation methods that allow display of the superimposed images of the series of frame images extracted from the moving images V1 and V2 such that the superimposed images show the same characteristic part in the same spatial position, the same direction and the same size.
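  • a minimal sketch of one such superimposition for a single aligned 2D frame: the CT frame is rendered in grayscale and the color Doppler signal is alpha-blended on top; the blending weight and the mask convention are illustrative assumptions:

```python
import numpy as np

def superimpose(ct_frame: np.ndarray, doppler_rgb: np.ndarray,
                doppler_mask: np.ndarray, alpha: float = 0.6) -> np.ndarray:
    """Blend a grayscale CT frame (already aligned and resampled) with a
    color Doppler overlay; doppler_mask marks pixels carrying flow signal."""
    ct = (ct_frame - ct_frame.min()) / max(float(np.ptp(ct_frame)), 1e-6)
    base = np.repeat(ct[..., None], 3, axis=2)      # grayscale -> RGB
    out = base.copy()
    m = doppler_mask.astype(bool)
    out[m] = (1.0 - alpha) * base[m] + alpha * doppler_rgb[m]
    return out
```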
  • the display controlling means 66 obtains the superimposed moving image (superimposed frame images) generated by the image generating means 65, and causes the display 7 to display the superimposed image, as shown in FIG. 3 (S06).
  • the phases of the periodic motion of the body part are obtained based on the shape of the characteristic part captured in the three-dimensional moving image V1 and the ultrasonic moving image V2, the two moving images are aligned with each other with respect to the phases, and the spatial alignment is achieved based on the position of the characteristic part captured in the moving images. Therefore, even in a case where the electrocardiographic data of one of the moving images is not available, the moving images can appropriately be associated with each other.
  • the user can understand the object of observation by compensating for low resolution areas of the ultrasonic moving image V2 with the high spatial resolution of the three-dimensional moving image V1, and can easily understand the information that is obtained only from the ultrasonic moving image at the same time. Therefore, the user can efficiently and accurately conduct the imaging diagnosis.
  • the superimposed image of the ultrasonic moving image V2, which is shown by the color Doppler method based on the Doppler shift of blood flow, and the three-dimensional moving image V1 is displayed. The user can therefore intuitively grasp the body part of interest at high spatial resolution from the three-dimensional moving image while simultaneously viewing the blood flow information that is obtained only from the ultrasonic moving image.
  • the shape of the characteristic part is automatically recognized and extracted from the three-dimensional moving image and the three-dimensional ultrasonic moving image, eliminating the need for manual extraction by the user; the shape of the characteristic part can thus be extracted efficiently and easily.
  • the periodic motion of the heart is accurately identified based on the state of opening and closing of the valves of the heart, allowing the phases to be obtained reliably.
  • the phase obtaining means obtains the phases by identifying the end of diastole and/or systole of the heart based on the state of opening and closing of the mitral valve and/or the aortic valve among the valves of the heart. Therefore, more accurate identification of the periodic motion of the heart is achieved based on the change of the shape of the characteristic part in response to the heart beat.
  • the associating means 64 aligns the moving images V1 and V2 with respect to the phases of heart beat (the end of systole and the end of diastole), thereby more accurately associating the moving images V1 and V2 with each other.
  • the phases of both the three-dimensional moving image V1 and the ultrasonic moving image V2 are obtained by automatic recognition, and therefore the phases are easily and accurately obtained and associated.
  • alternatively, for one of the moving images, the phases of the periodic motion may be identified based on the shape of the characteristic part, and, for the other of the moving images, the phases may be obtained based on the DICOM header information or the like.
  • in that case, the automatic recognition is applied to only one of the moving images, minimizing the increase in computational load and efficiently obtaining the phases of both moving images.
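  • a minimal sketch of reading such header-based phase information with pydicom; it assumes the acquisition recorded the per-frame Trigger Time attribute (tag (0018,1060), milliseconds from the R-wave), which is not guaranteed for every data set:

```python
import pydicom

def frame_trigger_times(frame_paths: list[str]) -> list[float]:
    """Trigger Time per frame file, as a proxy for the cardiac phase."""
    times = []
    for path in frame_paths:
        ds = pydicom.dcmread(path, stop_before_pixels=True)
        times.append(float(ds.TriggerTime))  # raises if the tag is absent
    return times
```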
  • the image generating means 65 generates the superimposed image by obtaining the pixel spacing from the accompanying information of each of the three-dimensional moving image V1 and the ultrasonic moving image V2, and providing the moving images with the same pixel spacing based on the obtained pixel spacing. This facilitates obtaining the pixel spacing of each moving image and accurately matching the sizes of the moving images to generate the superimposed image.
  • the above-described image processing is carried out based on the three-dimensional moving image V1 taken with a CT or MR apparatus and the three-dimensional ultrasonic moving image V2, and this provides the user with more detailed understanding of the object of observation.
  • the characteristic part extracting operation according to this embodiment may be achieved by applying the method taught in Y. Zheng et al., "Four-Chamber Heart Modeling and Automatic Segmentation for 3D Cardiac CT Volumes Using Marginal Space Learning and Steerable Features", IEEE Transactions on Medical Imaging, Vol. 27, pp. 1668-1681, 2008 (the Non-Patent Document referred to above). It should be noted that the characteristic part extracting means 62 may apply any of various known methods that can extract a characteristic part of a structure from the two three-dimensional moving images V1 and V2.
  • alternatively, the user may manually input the position and shape of the characteristic part, such as the valves of the heart, using a mouse or the like for each of the moving images V1 and V2, and the image processing device may obtain such inputs to extract the position and the shape of the characteristic part.
  • the phase obtaining means 63 may determine the phases of heart beat by any method that exploits the fact that, at the end of systole, the aortic valve AV changes from the open state to the closed state and the mitral valve MV starts to open from the closed state, and that, at the end of diastole, the mitral valve MV changes to the closed state and the aortic valve AV starts to open from the closed state.
  • for example, a period from a point when the mitral valve MV starts to open from the closed state (the end of systole) to a point when the mitral valve MV next starts to open from the closed state may be detected as the one period of heart beat, and the images V1 and V2 associated such that they are aligned with respect to the phase of the end of systole. Alternatively, a period from a point when the mitral valve MV changes from the open state to the closed state (the end of diastole) to a point when the mitral valve MV next changes from the open state to the closed state may be detected as the one period of heart beat, and the images associated such that they are aligned with respect to the phase of the end of diastole.
  • the phases of heart beat may also be determined based on the state of opening and closing of the aortic valve AV in place of the mitral valve MV, or the opening-and-closing information of the mitral valve MV and that of the aortic valve AV may be weighted and combined to determine the phase of heart beat (a sketch of the period detection follows).
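  • a minimal sketch of detecting one period this way, reusing the end-of-systole events found from the valve states above and assigning each frame a normalized phase in [0, 1):

```python
import numpy as np

def one_period_phases(es_events: list[int]) -> tuple[range, np.ndarray]:
    """One period of heart beat spans two successive end-of-systole events;
    each frame inside it gets a normalized phase in [0, 1)."""
    if len(es_events) < 2:
        raise ValueError("need at least two end-of-systole events")
    start, end = es_events[0], es_events[1]
    frames = range(start, end)
    phases = (np.arange(start, end) - start) / float(end - start)
    return frames, phases
```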
  • any of various characteristic parts, such as the left ventricle LV, the left atrium LA, the right ventricle RV, the right atrium RA, the valves MV, AV, PV and TV and the apex AC of the heart, as shown in FIG. 3, or any combination of these characteristic parts, may be used as the predetermined characteristic part. As long as the periodic motion of the heart is identified based on the periodic change of the shape with the phase of heart beat, similarly to this embodiment, the phases are accurately obtained. Where more than one characteristic part is used to identify the periodic motion of the heart, the phases are obtained more accurately based on the multiple pieces of information.
  • the phase obtaining means 63 may arbitrarily specify a period used to associate the moving images V1 and V2 with each other.
  • the phase obtaining means 63 receives an input by the user via a mouse and/or keyboard and identifies the period specified by the user using any known method.
  • for example, period selection buttons corresponding to the two or more periods contained in the three-dimensional moving image V1 or the three-dimensional ultrasonic moving image V2 may be displayed, or the user may be prompted to input, via a keyboard or the like, the start time of one of the periods contained in the three-dimensional moving image V1 or the three-dimensional ultrasonic moving image V2; the phase obtaining means 63 then receives the user's selection of the period.
  • FIG. 4 shows an example of the displayed superimposed image according to a modification of the above-described embodiment.
  • although the above-described embodiment is described in conjunction with the three-dimensional ultrasonic moving image V2 as an example, it is apparent to those skilled in the art that the invention is similarly applicable to a two-dimensional moving image as long as the image shows a cross section containing a recognizable characteristic part of a body part, such as a cross section P showing the ventricles LV and RV, the atriums LA and RA, the valves MV, AV, PV and TV and the apex AC of the heart, as shown in FIG. 4.
  • in this case, the user can observe, with respect to a predetermined cross section including the characteristic part, a high spatial resolution image taken with a CT or MR apparatus, and can at the same time obtain information, such as blood flow information, that is available only from a two-dimensional moving image taken with an ultrasonic diagnostics apparatus. This helps the user conduct the imaging diagnosis accurately.
  • the predetermined body part may be any body part that makes a predetermined periodic motion, such as flexion and extension of a knee joint.
  • in this case, one or more parts forming the knee joint may be segmented, parameters representing the state of flexion and extension of the knee, such as distances between predetermined points on the thigh bone and the shinbone, may be obtained from the segmented parts, and the phases of the periodic motion from the flexed state to the extended state may be obtained based on those parameters (see the sketch below).
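  • a minimal sketch of such a flexion/extension parameter, assuming one landmark point per bone has been obtained per frame; the min-max normalization to a [0, 1] phase is an illustrative choice:

```python
import numpy as np

def flexion_parameters(femur_pts: np.ndarray, tibia_pts: np.ndarray) -> np.ndarray:
    """Per-frame distance between predetermined landmark points on the
    thigh bone and the shinbone; femur_pts/tibia_pts are (n_frames, 3)."""
    return np.linalg.norm(femur_pts - tibia_pts, axis=1)

def flexion_phases(params: np.ndarray) -> np.ndarray:
    """Map the parameter to a phase in [0, 1] over one period
    (0 = fully flexed, 1 = fully extended, by min-max normalization)."""
    return (params - params.min()) / max(float(np.ptp(params)), 1e-9)
```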
  • the alignment of the three-dimensional moving image V1 taken with a CT or MR apparatus and the ultrasonic moving image V2 may instead be achieved by transforming the coordinate system of the three-dimensional moving image V1 into the coordinate system of the ultrasonic moving image V2.
  • the associating means 64 may associate the phases of the three-dimensional moving image V1 taken with a CT or MR apparatus and the ultrasonic moving image V2 for only a part of one period of the periodic motion, for one period, or for two or more periods.
  • the image processing program of the invention may be installed on two or more computers in a distributed manner to cause the two or more computers to function as the image processing device.

Abstract

A three-dimensional moving image and an ultrasonic moving image showing a body part making periodic motion are obtained, and, from the moving images, a characteristic part having a shape that changes with the periodic motion is extracted. Phases of the periodic motion captured in the moving images are obtained. For at least one of the moving images, the phases are obtained based on the shape of the extracted characteristic part. For each phase, the positions of the characteristic part shown in the three-dimensional moving image and the ultrasonic moving image are associated with each other based on the extracted characteristic part and the obtained phases. A superimposed image is generated by aligning, for each phase, the positions of the characteristic part shown in the three-dimensional moving image and the ultrasonic moving image with each other based on the associated positions of the characteristic part and the phases, and displayed.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to an image processing device, an image processing method and an image processing program for aligning and displaying a three-dimensional moving image taken with a CT or MR apparatus with a two-dimensional or three-dimensional moving image taken with an ultrasonic diagnostics apparatus.
  • 2. Description of the Related Art
  • In the medical field, detailed three-dimensional moving images, which are obtained by imaging a body part of a patient making a predetermined periodic motion, such as the heart, with a modality having high spatial resolution and contrast resolution, such as a CT or MR apparatus, are widely used for imaging diagnosis.
  • On the other hand, ultrasonic moving images taken with an ultrasonic diagnostics apparatus, which poses no risk of radiation exposure, allows examination with simple equipment, and provides information about blood flow based on the reflection of ultrasonic waves by the flowing blood, are also effective for imaging diagnosis. Further, along with the development of ultrasound diagnostic techniques, three-dimensional ultrasonic moving images of subjects have become available as the ultrasonic moving images, in addition to conventional ultrasonic moving images obtained with respect to a predetermined two-dimensional cross section of subjects.
  • In order to use the advantages of both of the above-described moving images, doctors conduct the imaging diagnosis while displaying both moving images in a state where they show the same phase of heart beat, based on electrocardiographic data obtained during imaging of each moving image. At this time, the user (doctor) references one of the images while manually changing the position and direction shown in the other image to match those shown in the referenced image, so that the moving images showing the same position and phase are displayed at the same time on a display in a manner allowing comparison therebetween. However, while a CT or MR image is taken in a fixed position and orientation relative to the subject, an ultrasonic image is taken by pressing an ultrasound probe against the subject at an arbitrary angle. Therefore, it is difficult to identify the position and direction of the ultrasonic image relative to the subject. Further, it is necessary to identify a frame image of the CT or MRI moving image and a frame image of the ultrasonic moving image that correspond to the same phase. Displaying these moving images in a manner allowing comparison therebetween thus imposes significant time and labor on the user.
  • Japanese Unexamined Patent Publication No. 2003-153877 (hereinafter, Patent Document 1) has proposed a technique which involves: extracting a predetermined characteristic part, such as a blood vessel area including a blood flow image, from ultrasonic image data of an examined body part; aligning the position of the predetermined characteristic part shown in the ultrasonic image with the position of the predetermined characteristic part shown in an MR image obtained in advance; correcting the MR image such that the MR image and the ultrasonic image show the predetermined characteristic part in the same position; and superimposing the corrected MR image and the ultrasonic image and displaying the superimposed image on a display device.
  • Further, Japanese Unexamined Patent Publication No. 2009-022459 (hereinafter, Patent Document 2) has proposed a technique which involves: synchronizing timing of a three-dimensional CT moving image with timing of a three-dimensional ultrasonic moving image based on electrocardiographic data; reconstructing the ultrasonic image by transforming a spatial coordinate system of the ultrasonic moving image into a spatial coordinate system of the CT moving image using a transformation matrix; and displaying the reconstructed moving images being aligned and superimposed.
  • According to the technique taught in Patent Document 1, it is possible to spatially align a three-dimensional still image taken with a CT apparatus with an ultrasonic still image. However, with the technique taught in Patent Document 1, a three-dimensional moving image taken with a CT apparatus and an ultrasonic moving image cannot be associated with respect to the phases of heart beat, and therefore it is difficult to display the superimposed moving images such that they show the same phase and the same spatial position. Further, according to the technique taught in Patent Document 2, it is necessary to obtain electrocardiographic data corresponding to the moving images to align them with each other with respect to the phase of heart beat. Therefore, in a case where the electrocardiographic data corresponding to one or both of the moving images is not available, it is difficult to associate the corresponding phases of heart beat shown in the moving images with each other.
  • SUMMARY OF THE INVENTION
  • In view of the above-described circumstances, the present invention is directed to providing an image processing device, an image processing method and an image processing program that facilitate aligning a three-dimensional moving image taken with a CT or MR apparatus with an ultrasonic moving image with respect to phases of a periodic motion and the position thereof to generate and display a superimposed image of the three-dimensional moving image taken with a CT or MR apparatus and the ultrasonic moving image.
  • An aspect of the image processing device according to the invention is an image processing device including: image obtaining means for obtaining a three-dimensional moving image showing a body part of a patient and an ultrasonic moving image showing the body part, the body part making a predetermined periodic motion; characteristic part extracting means for extracting, from a plurality of frame images forming the obtained three-dimensional moving image and ultrasonic moving image, a predetermined characteristic part having a shape that changes in response to the periodic motion; phase obtaining means for obtaining phases of the periodic motion captured in the three-dimensional moving image and the ultrasonic moving image, wherein, for at least one of the three-dimensional moving image and the ultrasonic moving image, the phases are obtained based on the shape of the extracted characteristic part; associating means for associating, for each phase, a position of the characteristic part in a frame image forming the three-dimensional moving image corresponding to the phase with a position of the characteristic part in a frame image forming the ultrasonic moving image corresponding to the phase based on the extracted characteristic part and the obtained phases; image generating means for generating a superimposed image of the three-dimensional moving image and the ultrasonic moving image by aligning, for each phase, a position of the characteristic part in a frame image forming the three-dimensional moving image corresponding to the phase with a position of the characteristic part in a frame image forming the ultrasonic moving image corresponding to the phase based on the associated positions of the characteristic part and the phases; and display controlling means for displaying the generated superimposed image on a display device.
  • An aspect of the image processing method according to the invention is an image processing method including: obtaining a three-dimensional moving image showing a body part of a patient and an ultrasonic moving image showing the body part, the body part making a predetermined periodic motion; extracting, from a plurality of frame images forming the obtained three-dimensional moving image and ultrasonic moving image, a predetermined characteristic part having a shape that changes in response to the periodic motion; obtaining phases of the periodic motion captured in the three-dimensional moving image and the ultrasonic moving image, wherein, for at least one of the three-dimensional moving image and the ultrasonic moving image, the phases are obtained based on the shape of the extracted characteristic part; associating, for each phase, a position of the characteristic part in a frame image forming the three-dimensional moving image corresponding to the phase with a position of the characteristic part in a frame image forming the ultrasonic moving image corresponding to the phase based on the extracted characteristic part and the obtained phases; generating a superimposed image of the three-dimensional moving image and the ultrasonic moving image by aligning, for each phase, a position of the characteristic part in a frame image forming the three-dimensional moving image corresponding to the phase with a position of the characteristic part in a frame image forming the ultrasonic moving image corresponding to the phase based on the associated positions of the characteristic part and the phases; and displaying the generated superimposed image on a display device.
  • An aspect of the image processing program according to the invention is an image processing program for causing a computer to function as: image obtaining means for obtaining a three-dimensional moving image showing a body part of a patient and an ultrasonic moving image showing the body part, the body part making a predetermined periodic motion; characteristic part extracting means for extracting, from a plurality of frame images forming the obtained three-dimensional moving image and ultrasonic moving image, a predetermined characteristic part having a shape that changes in response to the periodic motion; phase obtaining means for obtaining phases of the periodic motion captured in the three-dimensional moving image and the ultrasonic moving image, wherein, for at least one of the three-dimensional moving image and the ultrasonic moving image, the phases are obtained based on the shape of the extracted characteristic part; associating means for associating, for each phase, a position of the characteristic part in a frame image forming the three-dimensional moving image corresponding to the phase with a position of the characteristic part in a frame image forming the ultrasonic moving image corresponding to the phase based on the extracted characteristic part and the obtained phases; image generating means for generating a superimposed image of the three-dimensional moving image and the ultrasonic moving image by aligning, for each phase, a position of the characteristic part in a frame image forming the three-dimensional moving image corresponding to the phase with a position of the characteristic part in a frame image forming the ultrasonic moving image corresponding to the phase based on the associated positions of the characteristic part and the phases; and display controlling means for displaying the generated superimposed image on a display device.
  • The “body part” herein may be any body part that makes a predetermined periodic motion, and a typical example thereof is the heart. The “predetermined periodic motion” herein may be any repeatable motion where each part included in the body part moves in a predetermined direction within a predetermined range, such as heart beat, respiration in lungs, flexion and extension of a joint, etc. Further, in the invention, in the case where the body part making the periodic motion is the heart, the characteristic part may be any of the ventricles, the atriums, the muscles, the valves and the apex of the heart. In this case, the phase obtaining means may obtain the phases based on a state of opening and closing of any of the valves of the heart. The phase obtaining means may optionally obtain the phases by identifying the end of diastole and/or systole of the heart based on the state of opening and closing of the mitral valve and/or the aortic valve among the valves of the heart.
  • In the invention, the three-dimensional moving image may be any three-dimensional moving image that shows the shape of a body part of a subject, and an example thereof is a three-dimensional moving image taken with a CT or MRI apparatus.
  • In the invention, the ultrasonic moving image may be a three-dimensional moving image or a moving image showing a cross section including the characteristic part.
  • The “phases of the periodic motion” herein refers to stages of the periodic motion, and the number of phases for one period of the periodic motion may be any number.
  • In the invention, the characteristic part extracting means may automatically extract the characteristic part.
  • In the invention, for at least one of the three-dimensional moving image and the ultrasonic moving image, the phase obtaining means may obtain the phases from accompanying information of the moving image.
  • In the invention, the image generating means may generate the superimposed image by obtaining pixel spacing from accompanying information of each of the three-dimensional moving image and the ultrasonic moving image and providing the moving images with the same pixel spacing based on the obtained pixel spacing.
  • In the invention, the image generating means may generate the superimposed image by superimposing, on the three-dimensional moving image, the ultrasonic moving image shown by a color Doppler method based on Doppler shift of blood flow.
  • According to the image processing device, method and program of the invention, a three-dimensional moving image showing a body part, which makes a predetermined periodic motion, of a patient and an ultrasonic moving image showing the body part are obtained; from a plurality of frame images forming the obtained three-dimensional moving image and ultrasonic moving image, a predetermined characteristic part having a shape that changes in response to the periodic motion is extracted; phases of the periodic motion captured in each of the three-dimensional moving image and the ultrasonic moving image are obtained, wherein, for at least one of the three-dimensional moving image and the ultrasonic moving image, the phases are obtained based on the shape of the extracted characteristic part; for each phase, a position of the characteristic part in a frame image forming the three-dimensional moving image corresponding to the phase is associated with a position of the characteristic part in a frame image forming the ultrasonic moving image corresponding to the phase based on the extracted characteristic part and the obtained phases; a superimposed image of the three-dimensional moving image and the ultrasonic moving image is generated by aligning, for each phase, a position of the characteristic part in a frame image forming the three-dimensional moving image corresponding to the phase with a position of the characteristic part in a frame image forming the ultrasonic moving image corresponding to the phase based on the associated positions of the characteristic part and the phases; and the generated superimposed image is displayed on a display device. Therefore, even when there is no electrocardiographic data available, the superimposed image of the three-dimensional moving image and the ultrasonic moving image can easily be generated based on the shape of the characteristic part. With this superimposed image, the user can understand the object of observation by compensating for low resolution areas of the ultrasonic moving image with the high spatial resolution of the three-dimensional moving image, and can easily understand the information that is obtained only from the ultrasonic moving image at the same time. Therefore, the user can efficiently and accurately conduct the imaging diagnosis.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a diagram illustrating the schematic configuration of an image processing device according to one embodiment of the present invention,
  • FIG. 2 is a diagram illustrating the flow of a process carried out by the image processing device according to one embodiment of the invention,
  • FIG. 3 is a diagram illustrating an example of a superimposed image displayed by the image processing device according to one embodiment of the invention, and
  • FIG. 4 is a diagram illustrating an example of a superimposed image displayed by a modification of the image processing device according to one embodiment of the invention.
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Hereinafter, embodiments of an image processing device, an image processing program and an image processing method of the present invention will be described in detail with reference to the drawings.
  • FIG. 1 illustrates the schematic configuration of a hospital system 1 incorporating an image processing device 6 according to one embodiment of the invention. The hospital system 1 includes an examination room system 3, a data server 4 and a diagnosis workstation (WS) 6, which are connected with each other via a local area network (LAN) 2.
  • The examination room system 3 includes various modalities 32 for imaging a subject, and an examination room workstation (WS) 31 used for checking and controlling images outputted from the individual modalities. The modalities 32 in this example include a CT (Computed Tomography) apparatus and an MRI (Magnetic Resonance Imaging) apparatus, which are able to obtain a shape image representing shape information of the heart, and also include an ultrasonic diagnostic apparatus, etc. Among these modalities 32, the CT apparatus and the MRI apparatus are compliant with the DICOM (Digital Imaging and Communications in Medicine) standard, and output the obtained volume data as a DICOM file with accompanying information added.
  • The file outputted from each modality 32 is transferred to the data server 4 by the examination room WS 31. The data server 4 is formed by a computer with relatively high processing capacity, including a high-performance processor and a mass memory, on which a software program providing the function of a database management system (DBMS) is implemented. The program is stored in a storage, loaded into the memory upon startup, and executed by the processor. The data server 4 causes the file transferred from the examination room WS 31 to be stored in a mass storage 5. Further, in response to a search request from the diagnosis WS 6, the data server 4 selects a file that meets the search condition from the files stored in the mass storage 5 and sends the file to the diagnosis WS 6.
  • The diagnosis WS 6 is formed by a general-purpose workstation including a standard processor, a memory and a storage, on which the image processing program for assisting diagnosis is implemented. The image processing program is installed on the diagnosis WS 6 from a recording medium, such as a DVD, or is downloaded from a server computer connected via the network and then installed. A display 7 and an input device 8, such as a mouse and a keyboard, are connected to the diagnosis WS 6.
  • The image processing program implemented on the diagnosis WS 6 is formed by sets of program modules for accomplishing various functions, among them a set of program modules for accomplishing the image processing function. The program is stored in the storage, loaded into the memory upon startup, and executed by the processor. With this, the diagnosis WS 6 operates as: image obtaining means 61 for obtaining a three-dimensional moving image V1 showing a body part (the heart), which makes a predetermined periodic motion, of a patient and an ultrasonic moving image V2 showing the body part; characteristic part extracting means 62 for extracting, from a plurality of frame images forming the obtained three-dimensional moving image V1 and ultrasonic moving image V2, a predetermined characteristic part (mitral valve MV) having a shape that changes in response to the periodic motion; phase obtaining means 63 for obtaining phases of the periodic motion captured in the three-dimensional moving image V1 and the ultrasonic moving image V2, wherein, for at least one of the three-dimensional moving image V1 and the ultrasonic moving image V2, the phases are obtained based on the shape of the extracted characteristic part; associating means 64 for associating, for each phase, the position of the characteristic part in a frame image forming the three-dimensional moving image corresponding to the phase with the position of the characteristic part in a frame image forming the ultrasonic moving image corresponding to the phase based on the extracted characteristic part and the obtained phases; image generating means 65 for generating a superimposed image of the three-dimensional moving image and the ultrasonic moving image by aligning, for each phase, a position of the characteristic part in a frame image forming the three-dimensional moving image V1 corresponding to the phase with a position of the characteristic part in a frame image forming the ultrasonic moving image V2 corresponding to the phase based on the associated positions of the characteristic part and the phases; and display controlling means 66 for displaying the generated superimposed image on the display device 7.
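  • To make the division of labor among these means concrete, the following is a minimal Python sketch of the processing chain; the helper names, the normalization of phases to [0, 1) and the nearest-phase pairing rule are illustrative assumptions of this sketch, not details prescribed by the embodiment:

```python
import numpy as np

def fuse_moving_images(ct_frames, us_frames, extract, phase_of, superimpose):
    """Sketch of the chain formed by means 62-65. `extract` maps a frame to
    contour sample points of the characteristic part, `phase_of` maps those
    points to a phase in [0, 1), and `superimpose` blends one pair of
    spatially aligned frames; all three are supplied by the caller."""
    ct_phases = np.array([phase_of(extract(f)) for f in ct_frames])
    us_phases = np.array([phase_of(extract(f)) for f in us_frames])
    # Pair each CT frame with the ultrasound frame nearest in phase
    # (circular wrap-around of the phase is ignored for brevity).
    pairs = [(i, int(np.argmin(np.abs(us_phases - p))))
             for i, p in enumerate(ct_phases)]
    return [superimpose(ct_frames[i], us_frames[j]) for i, j in pairs]
```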
  • FIG. 2 is a flow chart illustrating the flow of image processing of this embodiment. FIG. 3 shows an example of the displayed superimposed image. Now, the flow of a process carried out by the functions of the WS 6 (image processing device) of this embodiment is described in detail using FIGS. 2 and 3. This embodiment is described in conjunction with the case of heart examination as an example.
  • Prior to the process of this embodiment, during the heart examination, a moving image of the chest of the subject covering one period of heart beat is taken using a CT apparatus, or the like, and the thus taken three-dimensional moving image V1 (volume data), with the accompanying information added thereto, is transferred as a DICOM file to the data server 4 and stored in the mass storage 5. The volume data is formed by a collection of pieces of voxel data representing a density distribution in a three-dimensional space; in each piece of voxel data, X-ray absorption, or the like, is indicated as the voxel value. Further, a moving image of the chest of the same subject is taken by transesophageal echocardiography (TEE), which is ultrasound imaging carried out by inserting an ultrasound probe through the mouth into the esophagus, and the thus taken three-dimensional ultrasonic moving image V2 is transferred to the data server 4 and stored in the mass storage 5.
  • First, when an image processing function for the heart is selected on an initial screen and the patient ID number, the examination number, etc., are inputted on a predetermined input screen, the image obtaining means 61 sends the inputted information to the data server 4 together with a request to search for and transfer the corresponding file stored in the mass storage 5.
  • Upon receiving the above-described request, the data server 4 searches for the requested file in the mass storage 5 and transfers it to the image obtaining means 61. The image obtaining means 61 obtains the three-dimensional moving image V1 and the three-dimensional ultrasonic moving image V2 contained in the transferred file and stores them in the memory (S01).
  • Subsequently, the characteristic part extracting means 62 extracts, as the predetermined characteristic part, the mitral valve MV, which is the heart valve located between the left ventricle LV and the left atrium LA, from each of the three-dimensional moving image V1 and the three-dimensional ultrasonic moving image V2 (S02).
  • In this example, the method taught in R. I. Ionasec et al., “Patient-Specific Modeling and Quantification of the Aortic and Mitral Valves From 4-D Cardiac CT and TEE”, IEEE TRANSACTIONS ON MEDICAL IMAGING, VOL. 29, NO. 9, pp. 1636-1651, 2010, is applied to the operations to extract the characteristic part and obtain the phases from the three-dimensional moving image V1 and the three-dimensional ultrasonic moving image V2.
  • The characteristic part extracting means 62 segments the mitral valve MV of the heart captured in the moving images V1 and V2 in time series for at least one period of heart beat according to the method taught in the above-mentioned Non-Patent Document, and extracts information for identifying the position of each sample point on the contour of the mitral valve MV in each frame image forming the moving images V1 and V2.
  • Then, the phase obtaining means 63 obtains the phase of heart beat of each frame image based on the position of each sample point on the contour of the mitral valve MV captured in the moving images V1 and V2 (S03).
  • One period of heart beat includes a systole and a diastole. At the end of the systole, the aortic valve AV changes from the open state to the closed state and the mitral valve MV starts to open from the closed state. At the end of the diastole, the mitral valve MV changes to the closed state and the aortic valve AV starts to open from the closed state. Using this property, the phase obtaining means 63 locates the end of diastole and the end of systole to identify the phases of heart beat.
  • In this embodiment, the shape of the mitral valve MV captured in each of the moving images V1 and V2 is obtained at predetermined time intervals using predetermined parameters according to the method taught in the above-mentioned Non-Patent Document, and the state of opening and closing of the mitral valve MV is identified based on the shape of the mitral valve MV to obtain the phase of heart beat corresponding to the state of opening and closing of the mitral valve MV. In this embodiment, for each of the moving images V1 and V2, the state of opening and closing of the mitral valve MV and the predetermined parameters representing the shape of the mitral valve MV are associated with each other and stored. The phase obtaining means 63 identifies, for each moving image V1, V2, a frame in which the mitral valve MV has changed from the open state to the closed state as a frame corresponding to the end of diastole of the heart beat. Also, the phase obtaining means 63 identifies, for each moving image V1, V2, a frame in which the mitral valve MV has changed from the closed state to the open state (start-to-open state) as a frame corresponding to the end of systole of the heart beat. The predetermined parameters representing the shape of the mitral valve MV may, for example, be distances between specific sample points on the contour of the mitral valve MV.
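  • As a sketch of this step, suppose the stored shape parameters have already been reduced to a per-frame boolean that is True while the mitral valve MV is open (for instance by thresholding a distance between specific contour sample points; the reduction and threshold are assumptions of this sketch). The end of systole and the end of diastole then fall exactly where that boolean flips:

```python
import numpy as np

def detect_phase_landmarks(valve_open):
    """Find the frames where the valve state flips: a closed->open flip
    marks the end of systole, an open->closed flip the end of diastole."""
    v = np.asarray(valve_open, dtype=bool)
    flips = np.flatnonzero(v[1:] != v[:-1]) + 1   # first frame of each new state
    end_systole = [int(i) for i in flips if v[i]]       # valve just opened
    end_diastole = [int(i) for i in flips if not v[i]]  # valve just closed
    return end_systole, end_diastole
```

  • For example, detect_phase_landmarks([True, True, False, False, True]) returns ([4], [2]): the valve closes at frame 2 (end of diastole) and opens again at frame 4 (end of systole).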
  • Then, the associating means 64 temporally associates the frames forming the moving images V1 and V2 with each other such that the moving images V1 and V2 are aligned with each other with respect to the phases corresponding to the end of systole and the end of diastole (the associating means 64 may perform interpolation in the time axis direction, as necessary) (S04). In this example, the frame images of the images V1 and V2 showing the same phase are associated with each other, and the spatial positions of the same characteristic part shown in the associated frame images are associated with each other.
  • In the above-described operation, in the case where the number of frames for one period differs between the three-dimensional moving image V1 and the three-dimensional ultrasonic moving image V2, the associating means 64 associates the frames of these images using the moving image having the smaller number of frames for one period as the reference. For example, the frames of the moving image having the greater number of frames for one period may be appropriately decimated, as necessary. Further, in the case where the phases of the associated frames are slightly out of alignment, interpolation may be performed using a known method so that each pair of corresponding frame images shows the same phase. For example, the phase of each frame image forming one of the moving images may be obtained, and then, using the frame images of the other moving image immediately before and after the obtained phase, an interpolated frame image of the other moving image having the shape corresponding to the obtained phase may be generated by a known method; the frame images of the one moving image are then associated with the thus generated frame images of the other moving image such that each pair of associated frame images shows the same phase, as sketched below.
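  • A minimal sketch of such interpolation, assuming the phases of each moving image have been normalized to [0, 1) over one period and that linear blending of neighboring frame intensities is an acceptable stand-in for true shape-based interpolation:

```python
import numpy as np

def interpolate_frame(frames, phases, target_phase):
    """Synthesize a frame at `target_phase` by linearly blending the two
    temporally neighboring frames. `frames` holds equally shaped arrays
    and `phases` their ascending phase values in [0, 1)."""
    phases = np.asarray(phases, dtype=float)
    idx = int(np.searchsorted(phases, target_phase))
    if idx == 0 or idx == len(phases):
        # Outside the sampled range: wrap to the first frame, since the motion is periodic.
        return frames[idx % len(phases)]
    p0, p1 = phases[idx - 1], phases[idx]
    w = (target_phase - p0) / (p1 - p0) if p1 > p0 else 0.0
    return (1.0 - w) * frames[idx - 1] + w * frames[idx]
```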
  • The image generating means 65 generates volume rendered images for a series of frame images extracted from the three-dimensional moving image V1 by the above-described operation. For a series of frame images extracted from the three-dimensional ultrasonic moving image V2, the image generating means 65 generates images by transforming the coordinate system of the three-dimensional ultrasonic moving image V2 into the coordinate system of the three-dimensional moving image V1 so that the images V1 and V2 show the characteristic part associated by the associating means 64 in the same position, the same direction and the same size. Then, the image generating means 65 generates the superimposed image of the moving images V1 and V2 by a known method and stores the superimposed image in the storage 5 (S05).
  • Specifically, the image generating means 65 achieves the spatial alignment by transforming the coordinate system of one of the images into the coordinate system of the other based on the positions associated by the associating means 64, and appropriately correcting the transformed coordinate system, so that the images show the same characteristic part in the same spatial position, the same direction and the same size. It should be noted that, during this spatial alignment, the image generating means 65 obtains pixel spacing information of the three-dimensional moving image V1 and the three-dimensional ultrasonic moving image V2 from the DICOM header information of each image, and enlarges or reduces the moving images V1 and V2, as appropriate, based on the pixel spacing information to provide the series of frame images extracted from the three-dimensional moving image V1 and the three-dimensional ultrasonic moving image V2 with the same pixel spacing.
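  • The embodiment does not prescribe a particular estimator for this coordinate transform; one standard choice, shown below purely as an assumption, is the least-squares similarity transform (Umeyama's method) computed from corresponding sample points on the characteristic part, which yields the rotation, uniform scale and translation at once:

```python
import numpy as np

def similarity_transform(src_pts, dst_pts):
    """Least-squares similarity transform (scale s, rotation R, translation t)
    mapping src_pts onto dst_pts, both (N, 3) arrays of corresponding sample
    points on the characteristic part. Apply to a point x as s * R @ x + t."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U @ Vt))        # guard against a reflection
    D = np.diag([1.0, 1.0, d])
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / src_c.var(axis=0).sum()
    t = mu_d - s * R @ mu_s
    return s, R, t
```

  • Resampling each frame of the moving image V2 through the transform so estimated brings the characteristic part into the same position, direction and size as in V1.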
  • Further, as shown in FIG. 3, the superimposed image generated by the image generating means 65 of this embodiment shows voxel values based on the three-dimensional moving image V1 at a predetermined transparency by volume rendering, and as shown by arrow C in FIG. 3, shows voxel values and the direction of blood flow based on the three-dimensional ultrasonic moving image V2 by the known color Doppler method.
  • As the method for generating the superimposed images of the series of frame images extracted from the three-dimensional moving image V1 and the three-dimensional ultrasonic moving image V2, the image generating means 65 may apply any of various known generation methods that allow display of the superimposed images of the series of frame images extracted from the moving images V1 and V2 such that the superimposed images show the same characteristic part in the same spatial position, the same direction and the same size.
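  • One such method, sketched here under the assumption that the rendered CT frame and the color-Doppler overlay have already been brought onto the same pixel grid, is plain alpha blending restricted to the pixels where the ultrasound frame carries flow information (the opacity value is an arbitrary placeholder):

```python
import numpy as np

def superimpose(ct_rgb, doppler_rgb, doppler_mask, alpha=0.6):
    """Blend a rendered CT frame with the color-Doppler overlay. The CT
    rendering shows through unchanged wherever `doppler_mask` is False."""
    out = np.asarray(ct_rgb, dtype=float).copy()
    m = np.asarray(doppler_mask, dtype=bool)
    overlay = np.asarray(doppler_rgb, dtype=float)
    out[m] = (1.0 - alpha) * out[m] + alpha * overlay[m]
    return out.astype(np.uint8)
```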
  • The display controlling means 66 obtains the superimposed moving image (superimposed frame images) generated by the image generating means 65, and causes the display 7 to display the superimposed image, as shown in FIG. 3 (S06).
  • As described above, according to this embodiment, the phases of the periodic motion of the body part that makes a predetermined motion are obtained based on the shape of the characteristic part captured in the three-dimensional moving image V1 and the ultrasonic moving image V2 and the three-dimensional moving image V1 and the ultrasonic moving image V2 are aligned with each other with respect to the phases to achieve the spatial alignment based on the position of the characteristic part captured in the moving images. Therefore, even in a case where the electrocardiographic data of one of the moving images is not available, the moving images can appropriately be associated with each other. Further, by generating and displaying the superimposed image of the associated moving images, the user can understand the object of observation by compensating for low resolution areas of the ultrasonic moving image V2 with the high spatial resolution of the three-dimensional moving image V1, and can easily understand the information that is obtained only from the ultrasonic moving image at the same time. Therefore, the user can efficiently and accurately conduct the imaging diagnosis.
  • In this embodiment, the superimposed image of the ultrasonic moving image V2, which is shown by the color Doppler method based on the Doppler shift of blood flow, and the three-dimensional moving image V1 is displayed. Therefore, the user can intuitively grasp the body part of interest at high spatial resolution based on the three-dimensional moving image while simultaneously seeing the blood flow information, which is obtained only from the ultrasonic moving image.
  • In this embodiment, the shape of the characteristic part is automatically recognized and extracted from the three-dimensional moving image and the three-dimensional ultrasonic moving image, which eliminates the need for manual operation by the user to extract the characteristic part; thus the shape of the characteristic part can be extracted efficiently and easily.
  • In the case where the body part that makes a periodic motion is the heart and the predetermined characteristic part is any of the valves of the heart, the periodic motion of the heart is accurately identified based on the state of opening and closing of the valves of the heart, so that the phases are obtained favorably. Further, in this case, the phase obtaining means obtains the phases by identifying the end of diastole and/or systole of the heart based on the state of opening and closing of the mitral valve and/or the aortic valve among the valves of the heart. Therefore, more accurate identification of the periodic motion of the heart is achieved based on the change of the shape of the characteristic part in response to the heart beat. Still further, in this embodiment, the associating means 64 aligns the moving images V1 and V2 with respect to the phases of heart beat (the end of systole and the end of diastole), thereby associating the moving images V1 and V2 with each other more accurately.
  • In this embodiment, the phases of both the three-dimensional moving image V1 and the ultrasonic moving image V2 are obtained by automatic recognition, and therefore the phases are easily and accurately obtained and associated. Alternatively, for one of the three-dimensional moving image and the three-dimensional ultrasonic moving image, the phases of the periodic motion may be identified based on the shape of the characteristic part, and for the other moving image, the phases may be obtained from the DICOM header information, or the like. In this case, the automatic recognition is applied to only one of the moving images, which minimizes the increase in computational load and allows the phases of the moving images to be obtained efficiently, as in the sketch below.
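  • As a sketch of reading the phases from the accompanying information, the per-frame DICOM datasets of a gated acquisition often carry the TriggerTime attribute (milliseconds elapsed since the R-wave); whether the tag is actually populated depends on the modality and vendor, so the snippet below treats its presence as an assumption:

```python
import pydicom

def phases_from_headers(paths, rr_interval_ms):
    """Read TriggerTime from each per-frame DICOM file and normalize it by
    the R-R interval to get a phase in [0, 1). Raises if the tag is absent."""
    phases = []
    for path in paths:
        ds = pydicom.dcmread(path, stop_before_pixels=True)
        phases.append(float(ds.TriggerTime) % rr_interval_ms / rr_interval_ms)
    return phases
```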
  • In this embodiment, the image generating means 65 generates the superimposed image by obtaining the pixel spacing from the accompanying information of each of the three-dimensional moving image V1 and the ultrasonic moving image V2, and providing the moving images with the same pixel spacing based on the obtained pixel spacing. This makes it easy to obtain the pixel spacing of each moving image and to bring the moving images accurately to the same scale for generating the superimposed image.
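  • A minimal sketch of that normalization, assuming the spacings (millimeters per voxel along each axis) have already been read from the accompanying information (e.g. the DICOM PixelSpacing and slice spacing attributes) and that trilinear resampling is acceptable:

```python
from scipy.ndimage import zoom

def resample_to_spacing(volume, own_spacing, target_spacing):
    """Resample a 3D frame to `target_spacing`; applying this to both moving
    images gives them the same pixel spacing before superimposition."""
    factors = [o / t for o, t in zip(own_spacing, target_spacing)]
    return zoom(volume, factors, order=1)   # order=1: trilinear interpolation
```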
  • In this embodiment, the above-described image processing is carried out based on the three-dimensional moving image V1 taken with a CT or MR apparatus and the three-dimensional ultrasonic moving image V2, and this provides the user with more detailed understanding of the object of observation.
  • The characteristic part extracting operation according to this embodiment may be achieved by applying the method taught in Y. Zheng et al., "Four-Chamber Heart Modeling and Automatic Segmentation for 3D Cardiac CT Volumes Using Marginal Space Learning and Steerable Features", IEEE TRANSACTIONS ON MEDICAL IMAGING, Vol. 27, pp. 1668-1681, 2008. It should be noted that the characteristic part extracting means 62 may apply any of various known methods that can extract a characteristic part of a structure from the two three-dimensional moving images V1 and V2. For example, the user may manually input the position and shape of the characteristic part, such as the valves of the heart, using a mouse, or the like, for each of the three-dimensional moving images V1 and V2, and the image processing device may obtain such inputs to extract the position and the shape of the characteristic part.
  • The phase obtaining means 63 may determine the phases of heart beat by any method that uses the property that the aortic valve AV changes from the open state to the closed state and the mitral valve MV starts to open from the closed state at the end of systole, and that the mitral valve MV changes to the closed state and the aortic valve AV starts to open from the closed state at the end of diastole. For example, a period from a point when the mitral valve MV starts to open from the closed state (the end of systole) to the point when the mitral valve MV next starts to open from the closed state may be detected as one period of heart beat, so that the images V1 and V2 are associated and aligned with respect to the phase of the end of systole; or a period from a point when the mitral valve MV changes from the open state to the closed state (the end of diastole) to the point when the mitral valve MV next changes from the open state to the closed state may be detected as one period of heart beat, so that the images V1 and V2 are associated and aligned with respect to the phase of the end of diastole. Alternatively, the phases of heart beat may be determined based on the state of opening and closing of the aortic valve AV in place of the mitral valve MV, or information of the state of opening and closing of the mitral valve MV and information of the state of opening and closing of the aortic valve AV may be weighted and combined to determine the phases of heart beat.
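  • The weighting mentioned above could, for instance, combine a per-frame phase estimate derived from the mitral valve MV with one derived from the aortic valve AV; since phases are positions on a cycle, averaging them on the unit circle avoids wrap-around errors (the weight value is an arbitrary placeholder):

```python
import numpy as np

def weighted_phase(mv_phase, av_phase, w_mv=0.5):
    """Circular weighted mean of two per-frame phase estimates in [0, 1)."""
    a = 2.0 * np.pi * np.asarray(mv_phase, dtype=float)
    b = 2.0 * np.pi * np.asarray(av_phase, dtype=float)
    z = w_mv * np.exp(1j * a) + (1.0 - w_mv) * np.exp(1j * b)
    return (np.angle(z) / (2.0 * np.pi)) % 1.0
```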
  • In the case where any of various characteristic parts, such as the left ventricle LV, the left atrium LA, the right ventricle RV, the right atrium RA, the valves MV, AV, PV and TV and the apex AC of the heart, as shown in FIG. 3, or any combination of these characteristic parts is used as the predetermined characteristic part to identify the periodic motion of the heart based on the periodic change of the shape depending on the phase of heart beat, similarly to this embodiment, the phases are accurately obtained. In the case where more than one characteristic part is used to identify the periodic motion of the heart, the phases are obtained still more accurately based on the multiple pieces of information.
  • If the moving image V1 and/or the moving image V2 contains two or more periods of the periodic motion, the phase obtaining means 63 may arbitrarily specify the period used to associate the moving images V1 and V2 with each other. The phase obtaining means 63 according to this embodiment receives an input made by the user via a mouse and/or keyboard and identifies the period specified by the user using any known method. For example, period selection buttons corresponding to the two or more periods contained in the three-dimensional moving image V1 or the three-dimensional ultrasonic moving image V2 may be displayed to receive the selection of a period by the user, or the user may be prompted to input, via a keyboard, or the like, the start time of one of the periods contained in the three-dimensional moving image V1 or the three-dimensional ultrasonic moving image V2, and the phase obtaining means 63 may receive the selection of the period by the user.
  • FIG. 4 shows an example of the displayed superimposed image according to a modification of the above-described embodiment. Although the above-described embodiment is described in conjunction with the three-dimensional ultrasonic moving image V2 as an example, it is apparent to those skilled in the art that the invention is similarly applicable to a two-dimensional moving image as long as the image shows a cross section containing a recognizable characteristic part of a body part, such as a cross section P showing the ventricles LV and RV, the atriums LA and RA, the valves MV, AV, PV and TV and the apex AC of the heart, as shown in FIG. 4. The user can observe, with respect to a predetermined cross section including the characteristic part, a high spatial resolution image taken with a CT or MR apparatus, and can at the same time understand information, such as information of blood flow, which is obtained only from a two-dimensional moving image taken with an ultrasonic diagnostic apparatus. This helps the user conduct the imaging diagnosis accurately.
  • It should be noted that the present invention is not limited to this embodiment. For example, the predetermined body part may be any body part that makes a predetermined periodic motion, such as flexion and extension of a knee joint. In the case where the invention is applied to flexion and extension of the knee, or the like, one or more parts forming the knee joint may be segmented, parameters representing the state of flexion and extension of the knee, such as distances between predetermined points on the thigh bone and the shinbone, may be obtained from the segmented parts, and the phases of the periodic motion from the flexed state to the extended state may be obtained based on those parameters, as in the sketch below.
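  • For the knee example, the flexion/extension parameter could be as simple as a normalized landmark distance; the choice of landmarks and the normalizing bounds below are assumptions made purely for illustration:

```python
import numpy as np

def flexion_phase(femur_pt, tibia_pt, d_min, d_max):
    """Map the distance between a landmark on the thigh bone and one on the
    shinbone to a flexion/extension phase in [0, 1], where d_min and d_max
    are the distances observed at full flexion and full extension."""
    d = np.linalg.norm(np.asarray(femur_pt, float) - np.asarray(tibia_pt, float))
    return float(np.clip((d - d_min) / (d_max - d_min), 0.0, 1.0))
```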
  • It should be noted that the alignment of the three-dimensional moving image V1 taken with a CT or MR apparatus and the ultrasonic moving image V2 may be achieved by transforming the coordinate system of the three-dimensional moving image V1 taken with a CT or MR apparatus into the coordinate system of the ultrasonic moving image V2.
  • The associating means 64 may associate the phases of the three-dimensional moving image V1 taken with a CT or MR apparatus and the ultrasonic moving image V2 for only a part of one period of periodic motion, for one period of periodic motion, or for two or more periods of periodic motion.
  • Although the embodiments of the present invention have been described with respect to the case where the image processing program of the invention is implemented on a single diagnosis WS to cause the WS to function as the image processing device, the image processing program may be installed on two or more computers in a distributed manner to cause the two or more computers to function as the image processing device.

Claims (11)

1. An image processing device comprising:
an image obtaining unit for obtaining a three-dimensional moving image showing a body part of a patient and an ultrasonic moving image showing the body part, the body part making a predetermined periodic motion;
a characteristic part extracting unit for extracting, from a plurality of frame images forming the obtained three-dimensional moving image and ultrasonic moving image, a predetermined characteristic part having a shape that changes in response to the periodic motion;
a phase obtaining unit for obtaining phases of the periodic motion captured in the three-dimensional moving image and the ultrasonic moving image, wherein, for at least one of the three-dimensional moving image and the ultrasonic moving image, the phases are obtained based on the shape of the extracted characteristic part;
an associating unit for associating, for each phase, a position of the characteristic part in a frame image forming the three-dimensional moving image corresponding to the phase with a position of the characteristic part in a frame image forming the ultrasonic moving image corresponding to the phase based on the extracted characteristic part and the obtained phases;
an image generating unit for generating a superimposed image of the three-dimensional moving image and the ultrasonic moving image by aligning, for each phase, a position of the characteristic part in a frame image forming the three-dimensional moving image corresponding to the phase with a position of the characteristic part in a frame image forming the ultrasonic moving image corresponding to the phase based on the associated positions of the characteristic part and the phases; and
a display controlling unit for displaying the generated superimposed image on a display device.
2. The image processing device as claimed in claim 1, wherein the characteristic part extracting unit automatically extracts the characteristic part.
3. The image processing device as claimed in claim 1, wherein the body part making the periodic motion is a heart, and the characteristic part is any of ventricles, atriums, muscles, valves and an apex of the heart.
4. The image processing device as claimed in claim 3, wherein the phase obtaining unit obtains the phases based on a state of opening and closing of any of the valves of the heart.
5. The image processing device as claimed in claim 4, wherein the phase obtaining unit obtains the phases by identifying an end of diastole and/or systole of the heart based on the state of opening and closing of a mitral valve and/or an aortic valve among the valves of the heart.
6. The image processing device as claimed in claim 1, wherein, for at least one of the three-dimensional moving image and the ultrasonic moving image, the phase obtaining unit obtains the phases from accompanying information of the moving image.
7. The image processing device as claimed in claim 1, wherein the image generating unit generates the superimposed image by obtaining pixel spacing from accompanying information of each of the three-dimensional moving image and the ultrasonic moving image and providing the moving images with the same pixel spacing based on the obtained pixel spacing.
8. The image processing device as claimed in claim 1, wherein the image generating unit generates the superimposed image by superimposing, on the three-dimensional moving image, the ultrasonic moving image shown by a color Doppler method based on Doppler shift of blood flow.
9. The image processing device as claimed in claim 1, wherein the ultrasonic moving image is a moving image showing a cross section including the characteristic part.
10. An image processing method comprising:
obtaining a three-dimensional moving image showing a body part of a patient and an ultrasonic moving image showing the body part, the body part making a predetermined periodic motion;
extracting, from a plurality of frame images forming the obtained three-dimensional moving image and ultrasonic moving image, a predetermined characteristic part having a shape that changes in response to the periodic motion;
obtaining phases of the periodic motion captured in the three-dimensional moving image and the ultrasonic moving image, wherein, for at least one of the three-dimensional moving image and the ultrasonic moving image, the phases are obtained based on the shape of the extracted characteristic part;
associating, for each phase, a position of the characteristic part in a frame image forming the three-dimensional moving image corresponding to the phase with a position of the characteristic part in a frame image forming the ultrasonic moving image corresponding to the phase based on the extracted characteristic part and the obtained phases;
generating a superimposed image of the three-dimensional moving image and the ultrasonic moving image by aligning, for each phase, a position of the characteristic part in a frame image forming the three-dimensional moving image corresponding to the phase with a position of the characteristic part in a frame image forming the ultrasonic moving image corresponding to the phase based on the associated positions of the characteristic part and the phases; and
displaying the generated superimposed image on a display device.
11. A non-transitory storage medium containing an image processing program for causing a computer to function as:
an image obtaining unit for obtaining a three-dimensional moving image showing a body part of a patient and an ultrasonic moving image showing the body part, the body part making a predetermined periodic motion;
a characteristic part extracting unit for extracting, from a plurality of frame images forming the obtained three-dimensional moving image and ultrasonic moving image, a predetermined characteristic part having a shape that changes in response to the periodic motion;
a phase obtaining unit for obtaining phases of the periodic motion captured in the three-dimensional moving image and the ultrasonic moving image, wherein, for at least one of the three-dimensional moving image and the ultrasonic moving image, the phases are obtained based on the shape of the extracted characteristic part;
an associating unit for associating, for each phase, a position of the characteristic part in a frame image forming the three-dimensional moving image corresponding to the phase with a position of the characteristic part in a frame image forming the ultrasonic moving image corresponding to the phase based on the extracted characteristic part and the obtained phases;
an image generating unit for generating a superimposed image of the three-dimensional moving image and the ultrasonic moving image by aligning, for each phase, a position of the characteristic part in a frame image forming the three-dimensional moving image corresponding to the phase with a position of the characteristic part in a frame image forming the ultrasonic moving image corresponding to the phase based on the associated positions of the characteristic part and the phases; and
a display controlling unit for displaying the generated superimposed image on a display device.
US13/478,220 2011-05-30 2012-05-23 Image processing device, method and program Abandoned US20120306862A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP120352/2011 2011-05-30
JP2011120352A JP5501292B2 (en) 2011-05-30 2011-05-30 Image processing apparatus, image processing method, and image processing program

Publications (1)

Publication Number Publication Date
US20120306862A1 2012-12-06

Family

ID=47261311

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/478,220 Abandoned US20120306862A1 (en) 2011-05-30 2012-05-23 Image processing device, method and program
US13/480,352 Expired - Fee Related US9224188B2 (en) 2011-05-30 2012-05-24 Image processing device, method and program

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/480,352 Expired - Fee Related US9224188B2 (en) 2011-05-30 2012-05-24 Image processing device, method and program

Country Status (2)

Country Link
US (2) US20120306862A1 (en)
JP (1) JP5501292B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018140073A (en) * 2017-02-28 2018-09-13 キヤノンメディカルシステムズ株式会社 Ultrasonic diagnostic apparatus, image processing apparatus and image processing program

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6130669B2 (en) * 2013-01-07 2017-05-17 セイコーインスツル株式会社 Biological information detection apparatus and biological information detection program
KR101531183B1 (en) * 2013-12-13 2015-06-25 기초과학연구원 Apparatus and method for ecocardiography image processing using navier-stokes equation
US20210137491A1 (en) * 2018-03-29 2021-05-13 Terumo Kabushiki Kaisha Information selection device
US20230084352A1 (en) * 2021-09-10 2023-03-16 Cerner Innovation, Inc. Linking graph

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3834365B2 (en) * 1996-10-16 2006-10-18 アロカ株式会社 Ultrasonic diagnostic equipment
JP3878462B2 (en) * 2001-11-22 2007-02-07 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Diagnostic imaging support system
US7678050B2 (en) * 2004-08-24 2010-03-16 General Electric Company Method and apparatus for detecting cardiac events
JP4786246B2 (en) 2005-08-03 2011-10-05 株式会社東芝 Image processing apparatus and image processing system
US20070167806A1 (en) * 2005-11-28 2007-07-19 Koninklijke Philips Electronics N.V. Multi-modality imaging and treatment
US7415093B2 (en) * 2006-10-30 2008-08-19 General Electric Company Method and apparatus of CT cardiac diagnostic imaging using motion a priori information from 3D ultrasound and ECG gating
US8165361B2 (en) * 2008-01-14 2012-04-24 General Electric Company System and method for image based multiple-modality cardiac image alignment
JP2009247739A (en) 2008-04-09 2009-10-29 Toshiba Corp Medical image processing and displaying device, computer processing program thereof, and ultrasonic diagnosing equipment
WO2010014977A1 (en) * 2008-08-01 2010-02-04 The Trustees Of Columbia University In The City Of New York Systems and methods for matching and imaging tissue characteristics
US8771189B2 (en) * 2009-03-18 2014-07-08 Siemens Medical Solutions Usa, Inc. Valve assessment from medical diagnostic imaging data
US8659603B2 (en) * 2009-03-31 2014-02-25 General Electric Company System and method for center point trajectory mapping

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003079627A (en) * 2001-09-14 2003-03-18 Aloka Co Ltd Cardiac wall movement evaluation apparatus
US20030187350A1 (en) * 2002-03-29 2003-10-02 Jun Omiya Image processing device and ultrasonic diagnostic device
JP2009022459A (en) * 2007-07-18 2009-02-05 Toshiba Corp Medical image processing display device and its processing program
JP2010029283A (en) * 2008-07-25 2010-02-12 Konica Minolta Medical & Graphic Inc Program, portable storage medium, and information processor

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
R.I. Ionasec, et al., "Patient-Specific Modeling and Quantification of the Aortic and Mitral Valves From 4-D Cardiac CT and TEE", IEEE Transactions on medical imaging, vol. 29, No. 9, pp. 1636-1651, 2010 *

Also Published As

Publication number Publication date
JP5501292B2 (en) 2014-05-21
JP2012245221A (en) 2012-12-13
US9224188B2 (en) 2015-12-29
US20120306863A1 (en) 2012-12-06

Legal Events

Date Code Title Description
AS Assignment

Owner name: FUJIFILM CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LI, YUANZHONG;REEL/FRAME:028344/0707

Effective date: 20120426

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION