WO2012153539A1 - Medical image processing apparatus and method - Google Patents
Medical image processing apparatus and method
- Publication number
- WO2012153539A1 (PCT/JP2012/003093)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- boundary
- coordinate system
- left ventricular
- volume data
- image processing
- Prior art date
Classifications
- G06T7/12—Edge-based segmentation
- G06T7/149—Segmentation involving deformable models, e.g. active contour models
- A61B6/463—Displaying multiple images or images and diagnostic data on one display
- A61B6/466—Displaying means adapted to display 3D data
- A61B6/503—Radiation diagnosis specially adapted for diagnosis of the heart
- A61B6/5211—Data or image processing involving processing of medical diagnostic data
- A61B6/5217—Extracting a diagnostic or physiological parameter from medical diagnostic data
- A61B6/032—Transmission computed tomography [CT]
- A61B8/0883—Detecting organic movements or changes for diagnosis of the heart
- A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
- A61B8/5207—Ultrasonic diagnosis involving processing of raw data to produce diagnostic data
- G16H50/20—ICT for computer-aided diagnosis, e.g. based on medical expert systems
- G16H50/30—ICT for calculating health indices; for individual health risk assessment
- G06T2207/10072—Tomographic images
- G06T2207/30048—Heart; Cardiac
- G06T2207/30101—Blood vessel; Artery; Vein; Vascular
Definitions
- Embodiments described herein relate generally to a medical image processing apparatus and method.
- A conventional medical image processing apparatus accesses heart volume data, fits a deformable boundary model to structures in the volume data to detect the boundary, specifies one or more feature points based on the boundary model, and displays an image of the heart based on those feature points.
- An embodiment of the present invention has been made to solve the above-described problems, and its object is to provide a medical image processing apparatus and method that can improve the accuracy of heart boundary detection.
- An embodiment of the present invention comprises an acquisition unit that acquires heart volume data; a coordinate system detection unit that detects, from the volume data, a three-dimensional left ventricular coordinate system having three axes including at least the left ventricular long axis of the heart; a boundary detection unit that, using a boundary model expressed in the left ventricular coordinate system, deforms the boundary model so as to reduce the error between the boundary pattern obtained by applying the boundary model to the volume data and a predetermined boundary pattern model, thereby detecting the left ventricular boundary from the volume data; and a display unit that displays a cross-sectional image orthogonal to at least one of the three axes of the left ventricular coordinate system and displays the detected left ventricular boundary on that cross-sectional image.
- FIG. 1 is a block diagram showing a configuration of a medical image processing apparatus according to a first embodiment.
- A flowchart likewise showing the operation of the apparatus.
- A flowchart of the first detection method of the coordinate system detection unit, and an explanatory diagram of the detected left ventricular coordinate system.
- FIG. 3 is a vw plan view of the heart showing a boundary pattern model.
- A graph showing a boundary pattern model.
- A diagram for explaining the correction process according to the sixth embodiment, and a diagram showing an example of correction.
- A diagram showing a detection example by the boundary detection unit according to the sixth embodiment, and a flowchart illustrating a processing example of the medical image processing apparatus according to the sixth embodiment.
- A diagram showing a configuration example of the medical image processing apparatus according to the seventh embodiment, and a flowchart illustrating a processing example of the medical image processing apparatus according to the seventh embodiment.
- The medical image processing apparatus 10 is an apparatus that displays images related to the heart.
- FIG. 1 is a block diagram showing the medical image processing apparatus 10.
- The medical image processing apparatus 10 includes an acquisition unit 101, a coordinate system detection unit 102, a boundary detection unit 103, a display unit 104, and a storage unit 110 that stores a three-dimensional left ventricular boundary model (hereinafter simply referred to as the "boundary model").
- The acquisition unit 101 acquires three-dimensional volume data related to the heart and sends the data to the coordinate system detection unit 102, the boundary detection unit 103, and the display unit 104.
- The coordinate system detection unit 102 detects the left ventricular coordinate system from the volume data and sends it to the boundary detection unit 103 and the display unit 104.
- The boundary detection unit 103 detects the three-dimensional left ventricular boundary using the boundary model stored in the storage unit 110 and sends it to the display unit 104.
- The display unit 104 displays the volume data, the left ventricular coordinate system, and the left ventricular boundary.
- FIG. 2 is a flowchart showing the operation of the medical image processing apparatus 10.
- In step 1, the acquisition unit 101 acquires volume data in which the heart is imaged from an external imaging device or the like.
- Volume data here means a grayscale image having a three-dimensional extent in the spatial directions.
- Three-dimensional heart volume data can generally be acquired by a medical image diagnostic apparatus such as an X-ray CT apparatus, an MRI apparatus, an ultrasonic diagnostic apparatus, or a nuclear medicine diagnostic apparatus, but the modality is not limited to these.
- The volume data can be acquired directly from the imaging device, or from an external medium such as a server, a personal computer, an HDD, or a DVD on which the imaged data is stored.
- The volume data acquired by the acquisition unit 101 is sent to the coordinate system detection unit 102, the boundary detection unit 103, and the display unit 104.
- Noise removal processing, contrast enhancement processing, and image enlargement/reduction processing may be applied to the volume data so that it is suitable for each processing step.
- When enlargement/reduction processing is applied, the scale relationship between the volume data passed to each unit is stored.
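- The preprocessing just described can be sketched as follows. This is an illustrative example, not the patent's implementation: the `window` values and integer `scale` factor are assumptions chosen for the sketch. The point is that the scale factor is returned alongside the processed volume so that the relationship between the original and reduced coordinate systems is stored.

```python
import numpy as np

def preprocess(volume, window=(0.0, 400.0), scale=2):
    """Contrast-enhance the volume by windowing, then reduce it by an
    integer factor, returning the scale so the relationship between the
    original and reduced volume data coordinate systems is kept."""
    lo, hi = window
    v = np.clip((volume - lo) / (hi - lo), 0.0, 1.0)  # contrast enhancement
    v = v[::scale, ::scale, ::scale]                  # crude size reduction
    return v, scale

vol = np.random.rand(32, 32, 32) * 400.0
small, s = preprocess(vol)
# A voxel (i, j, k) in the reduced volume corresponds to roughly
# (i*s, j*s, k*s) in the original volume data coordinate system.
```

- In practice a proper anti-aliased resampling would replace the strided slicing, but the bookkeeping of the stored scale relationship is the same.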
- In step 2, the coordinate system detection unit 102 detects the left ventricular coordinate system from the volume data sent from the acquisition unit 101.
- The "left ventricular coordinate system" is a three-dimensional coordinate system with at least three axes including the left ventricular long axis (hereinafter simply referred to as the "long axis").
- FIGS. 3 to 6 show the relationship between the left ventricular coordinate system and the volume data coordinate system.
- FIG. 3 is a diagram showing the volume data coordinate system based on the human body.
- FIG. 4 is a diagram showing cross-sectional images perpendicular to the axes of the volume data coordinate system.
- FIG. 5 overlays the volume data coordinate system and the left ventricular coordinate system.
- FIG. 6 is a diagram showing cross-sectional images perpendicular to the axes of the left ventricular coordinate system.
- The volume data is generally stored in a volume data coordinate system based on the coordinate system of the imaging device.
- When the volume data is imaged with a CT apparatus, for example, the volume data coordinate system consists of the body-axis direction (z-axis), the dorsal-abdominal direction (y-axis), and the left-right direction (x-axis).
- The cross-sectional images orthogonal to these axial directions are referred to as the axial view, the sagittal view, and the coronal view.
- The position, direction, and size of the heart within the volume data vary with individual differences, breathing, heartbeat, the imaging device, and the relative positional relationship to the human body, so the position and posture of the heart in the volume data must be specified.
- The left ventricular coordinate system is therefore detected to set the position and direction of the left ventricle in the volume data.
- As a first detection method of the left ventricular coordinate system there is, for example, the method shown in Non-Patent Document 1 (Sakada Yuki et al., "Automatic detection of diagnostic reference cross sections from echocardiographic volume data using random trees", Proceedings of the 16th Image Sensing Symposium, IS3-24).
- The first detection method will be described with reference to the figures.
- FIG. 7 is a flowchart of the first detection method.
- In the left ventricular coordinate system, the coordinate system detection unit 102 sets in advance the long axis (u-axis), an axis (v-axis) orthogonal to the long axis and lying in the cross section in which a four-chamber cross-sectional image can be observed, and an axis (w-axis) orthogonal to these two axes; as shown in FIG. 5, a left ventricular cross-sectional image pattern based on the left ventricular coordinate system centered on the left ventricular center p is also created in advance.
- The "left ventricular cross-sectional image pattern" means a left ventricular cross-sectional image used for pattern matching.
- In step 102, the coordinate system detection unit 102 fits the left ventricular cross-sectional image pattern to the volume data while searching for the best-matching position, posture, and scale, and sets the left ventricular coordinate system in the volume data as illustrated.
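- As a minimal illustration of this kind of pattern search, the sketch below exhaustively slides a 2D pattern over an image and scores each offset with normalized cross-correlation. The full method additionally searches posture and scale in 3D, which is omitted here; all names are hypothetical.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def match_pattern(image, pattern):
    """Search every translation for the best-matching position."""
    H, W = image.shape
    h, w = pattern.shape
    best_score, best_pos = -2.0, (0, 0)
    for y in range(H - h + 1):
        for x in range(W - w + 1):
            s = ncc(image[y:y + h, x:x + w], pattern)
            if s > best_score:
                best_score, best_pos = s, (y, x)
    return best_pos, best_score

rng = np.random.default_rng(0)
image = rng.random((40, 40))
pattern = image[12:20, 25:33].copy()   # plant the pattern at offset (12, 25)
pos, score = match_pattern(image, pattern)
# pos recovers (12, 25); the score is 1.0 up to floating-point error
```

- By the Cauchy-Schwarz inequality the NCC score never exceeds 1, so the exact location of the planted pattern is guaranteed to win the exhaustive search.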
- A second detection method of the left ventricular coordinate system will be described with reference to the flowchart of FIG. 9 and to FIG. 10.
- A three-dimensional coordinate system can be set from three points that do not lie on a straight line in three-dimensional space.
- The coordinate system detection unit 102 detects the apex position, the mitral valve position, and the right ventricular corner point from the volume data.
- The coordinate system detection unit 102 detects the apex position by pattern matching between the volume data and an image pattern around the apex position learned in advance.
- The "image pattern" means an image used for pattern matching.
- The coordinate system detection unit 102 likewise detects the mitral valve position and the right ventricular corner point by pattern matching between the volume data and an image pattern around the mitral valve position and an image pattern around the right ventricular corner point.
- In step 112, the coordinate system detection unit 102 sets the midpoint between the apex position and the mitral valve position as the origin p.
- In step 113, the coordinate system detection unit 102 sets the vector from the origin p toward the apex position as the long axis (u-axis) (see FIG. 10).
- In step 114, the coordinate system detection unit 102 sets a vector that is orthogonal to the long axis (u-axis) and passes through the right ventricular corner point as the second axis (v-axis) (see FIG. 10).
- The right ventricular corner point can also be detected by matching against a previously learned image pattern in a cross section orthogonal to the detected long axis.
- In step 115, the coordinate system detection unit 102 sets the direction orthogonal to the long axis (u-axis) and the second axis (v-axis) as the third axis (w-axis) (see FIG. 10).
- The third point in the heart is not limited to the right ventricular corner point; the left ventricular coordinate system can likewise be set by detecting, for example, the tricuspid valve position or the left ventricular outflow tract position in the same manner.
- The detection method of the left ventricular coordinate system is not limited to these; any detection method that can set a three-dimensional coordinate system having the long axis as one of its constituent axes may be used.
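- Steps 112 to 115 can be sketched as follows (an illustrative NumPy version, not the patent's code): the origin is the apex-mitral midpoint, the u-axis points toward the apex, the v-axis is the component of the direction to the right ventricular corner point orthogonal to u, and the w-axis is their cross product.

```python
import numpy as np

def left_ventricular_axes(apex, mitral, rv_corner):
    """Build the left ventricular coordinate system from three points
    that do not lie on a straight line."""
    apex, mitral, rv_corner = (np.asarray(p, float)
                               for p in (apex, mitral, rv_corner))
    p = (apex + mitral) / 2.0             # step 112: origin
    u = apex - p
    u /= np.linalg.norm(u)                # step 113: long axis (u-axis)
    d = rv_corner - p
    v = d - np.dot(d, u) * u              # step 114: orthogonal to u,
    v /= np.linalg.norm(v)                #   toward the RV corner point
    w = np.cross(u, v)                    # step 115: third axis (w-axis)
    return p, u, v, w

p, u, v, w = left_ventricular_axes([0, 0, 8], [0, 0, 0], [3, 1, 4])
# p = (0, 0, 4); u, v, w form a right-handed orthonormal basis
```

- The cross product of two orthonormal vectors is itself a unit vector, so no further normalization of w is needed.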
- In step 3, the boundary detection unit 103 detects the left ventricular boundary using the left ventricular coordinate system obtained by the coordinate system detection unit 102 and the boundary model stored in the storage unit 110.
- The left ventricular boundary detected by the boundary detection unit 103 means the endocardial boundary between the myocardium surrounding the left ventricle (the gray region enclosed by the solid line in FIG. 6) and the left ventricular lumen (the white region), the epicardial boundary between that myocardium and the outside of the left ventricle, or both.
- FIG. 6(a) is a short-axis cross-sectional image (Short Axis View: vw plan view), (b) is a four-chamber cross-sectional image (4 Chamber View: uv plan view), and (c) is a two-chamber cross-sectional image (2 Chamber View: wu plan view).
- The endocardial boundary of the left ventricle can be approximated by a bowl-shaped boundary model whose tip lies at the apex position. Once the left ventricular coordinate system has been detected, the boundary model can be approximated, for example, by a quadric surface having the long axis (u-axis) as its central axis, as shown in FIG. 11.
- The boundary model of FIG. 11 is given by equation (1).
- This boundary model has four variables a1, a2, a3, and a4, and can define a convex paraboloid in the left ventricular coordinate system. Boundary detection in the present embodiment therefore amounts to obtaining the boundary model variables (parameters) a1, a2, a3, and a4 in the volume data for which the three-dimensional left ventricular coordinate system has been obtained.
- To obtain the variables a1 to a4, an energy that can be computed between the boundary model and the volume data is defined, initial values are given to the variables, and the variables are iteratively optimized so that the defined energy decreases.
- Fitting a boundary model by such energy minimization is a general method, used, for example, for fitting snakes and other active contour models.
- This embodiment is characterized in that the boundary model presupposes detection of the left ventricular coordinate system. By detecting the left ventricular coordinate system, not only can the three-dimensional shape be expressed in a simple form such as equation (1), but devices such as limiting the range of each variable by obtaining its prior distribution also become possible. Further, once the left ventricular coordinate system has been detected, rough left ventricular myocardial regions such as the lateral wall, anterior wall, inferior wall, and septum can be identified, and accuracy can be improved by processing adapted to each region.
- For example, the boundary detection unit 103 updates the variables a1 to a4 so that the luminance derivative in the normal direction becomes large at points on the three-dimensional boundary. Since the normal direction is naturally three-dimensional, the energy can be computed by taking the derivative (difference) in the three-dimensional direction in the volume data.
- A general search algorithm such as a greedy algorithm, simulated annealing, or a genetic algorithm can be used to update the model parameters for energy minimization.
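- Since equation (1) itself is not reproduced in this text, the sketch below assumes a hypothetical four-parameter quadric of the same character (a paraboloid about the u-axis), and replaces the image-gradient energy with a simple squared-residual energy over sampled boundary points so the example is self-contained; the greedy update loop is the part that mirrors the described method.

```python
import numpy as np

# Hypothetical stand-in for equation (1): u(v, w) = a1 + a2*v^2 + a3*w^2 + a4*v*w
def model_u(a, v, w):
    return a[0] + a[1] * v ** 2 + a[2] * w ** 2 + a[3] * v * w

def energy(a, pts):
    """Simplified energy: squared residuals at sampled boundary points.
    (The embodiment instead evaluates the luminance derivative along the
    surface normal in the volume data.)"""
    v, w, u = pts
    return float(np.sum((model_u(a, v, w) - u) ** 2))

def greedy_fit(pts, a0, step=1.0, iters=200):
    """Greedy search: try +/- step on each variable, keep any move that
    lowers the energy, and halve the step once no move helps."""
    a = np.array(a0, float)
    e = energy(a, pts)
    for _ in range(iters):
        improved = False
        for i in range(len(a)):
            for delta in (step, -step):
                cand = a.copy()
                cand[i] += delta
                ec = energy(cand, pts)
                if ec < e:
                    a, e, improved = cand, ec, True
        if not improved:
            step *= 0.5
    return a, e

true = np.array([4.0, -0.5, -0.25, 0.1])   # synthetic "correct" boundary
rng = np.random.default_rng(1)
v = rng.uniform(-2, 2, 200)
w = rng.uniform(-2, 2, 200)
pts = (v, w, model_u(true, v, w))
a, e = greedy_fit(pts, [0.0, 0.0, 0.0, 0.0])
# a converges toward `true` and the energy toward 0
```

- The energy is convex in the parameters here, so the greedy coordinate moves with a shrinking step settle at the minimum; simulated annealing or a genetic algorithm would be used instead when the energy landscape has local minima.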
- The display unit 104 displays the volume data obtained by the acquisition unit 101, the left ventricular coordinate system obtained by the coordinate system detection unit 102, and the three-dimensional left ventricular boundary obtained by the boundary detection unit 103 on a display device such as a monitor, projector, or printer.
- The display unit 104 is not limited to the above devices as long as it can display a cross-sectional image of the volume data and the detected boundary.
- The cross section of the volume data displayed on the display unit 104 is determined based on the left ventricular coordinate system.
- A cross section orthogonal to the long-axis (u-axis) direction of the left ventricular coordinate system is a left ventricular short-axis image familiar to the user (medical worker).
- The boundary detection result can therefore be confirmed on a cross-sectional image in which the position and posture of the heart are easily recognized by the user.
- When the long-axis (u-axis) direction and the direction defining the four-chamber cross-sectional image (v-axis direction) are detected as the left ventricular coordinate system, the four-chamber cross-sectional image and the left ventricular boundary on that cross section can be displayed in addition to the short-axis image, making it easier to grasp the overall three-dimensional boundary detection result and the detection result of the left ventricular coordinate system.
- The same effect can be obtained by detecting the v-axis as the direction defining a three-chamber or two-chamber cross-sectional image.
- In this case, the three cross sections are two cross sections containing the long axis and a short-axis cross section, each centered on the left ventricular center. Since these cross-sectional images are orthogonal to each other, their positional relationship is easy to grasp, and it is easy to confirm, together with the detected left ventricular boundary, whether the left ventricular coordinate system is correct.
- As described above, the coordinate system detection unit 102 detects the left ventricular coordinate system, and the boundary detection unit 103 detects the left ventricular boundary based on it, enabling highly accurate boundary detection.
- The boundary detection result can also be displayed on the display unit 104 in an easy-to-understand manner for confirmation.
- The medical image processing apparatus 10 according to the second embodiment will be described with reference to its block diagram.
- The medical image processing apparatus 10 of this embodiment is a first modification of the medical image processing apparatus 10 of the first embodiment.
- The coordinate system detection unit 102 coordinate-transforms the volume data obtained by the acquisition unit 101 into the detected left ventricular coordinate system, thereby simplifying subsequent processing.
- Since the coordinate system detection unit 102 provides the coordinate-transformed image to the boundary detection unit 103 and the display unit 104, volume data need not be provided from the acquisition unit 101, and the data flow can be simplified as shown in the figure.
- A medical image processing apparatus 10 according to the third embodiment will be described.
- The medical image processing apparatus 10 of this embodiment is a second modification of the medical image processing apparatus 10 of the first embodiment.
- In the first embodiment, the method by which the boundary detection unit 103 detects the endocardial boundary (inner myocardial boundary) of the left ventricle was described.
- Detection of the epicardial boundary (outer myocardial boundary) can be handled in the same manner.
- If the storage unit 110 stores a boundary model for the endocardial boundary and a boundary model for the epicardial boundary, the boundary detection unit 103 can detect the endocardial and epicardial boundaries independently, which makes it possible to reduce the time required for boundary detection by using a parallel computer.
- Alternatively, the boundary detection unit 103 may detect the endocardial and epicardial boundaries integrally if the storage unit 110 stores an integrated boundary model in which the two are combined.
- A medical image processing apparatus 20 according to the fourth embodiment will be described with reference to the figures.
- FIG. 13 is a block diagram showing the medical image processing apparatus 20.
- The medical image processing apparatus 20 includes an acquisition unit 201 that acquires volume data, a coordinate system detection unit 202 that detects the left ventricular coordinate system, a boundary detection unit 203 that detects the left ventricular boundary, an evaluation unit 204 that evaluates the detected left ventricular boundary, a display unit 205 that displays the evaluation result, a first storage unit 210 that stores the boundary model, and a second storage unit 220 that stores the boundary pattern model.
- the boundary model is represented by the sum of an average shape and a linear combination of a plurality of basis vectors obtained by principal component analysis of three-dimensional boundaries taught in a plurality of volume data collected in advance.
- This representation method is called the "dynamic shape model" and is described in Non-Patent Document 2 (T. F. Cootes and C. J. Taylor, "Statistical Models of Appearance for Computer Vision", http://personalpages.manchester.ac.uk/staff/timothy.f.cootes/Models/app_models.pdf).
- This expression method can handle the inner and outer boundaries simultaneously by treating the coordinates of the points on the inner and outer boundaries as one vector.
- alignment using the left ventricular coordinate system can be performed on the assumption that the left ventricular coordinate system is detected.
- A left ventricular coordinate system and a three-dimensional boundary are taught for each of a plurality of volume data collected in advance. Because the boundaries taught in the individual volume data are expressed in independent three-dimensional image coordinate systems, two boundaries of exactly the same shape still yield boundary shape vectors with different values if their positions in the respective images differ. Therefore, the boundary shape is normalized using the left ventricular coordinate system in order to absorb this difference between coordinate systems. Normalization is performed by coordinate-transforming the points on the boundary into the left ventricular coordinate system in each volume data.
- FIG. 15 is a diagram of the boundary model learned in this manner, and is a left ventricular boundary model in which the major axis and minor axis directions are expressed by 18 points.
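The linear-sum representation described above can be sketched as follows in Python; the 4-point toy boundary, the single basis vector, and the function name are illustrative assumptions, not values from the embodiment.

```python
import numpy as np

def reconstruct_boundary(mean_shape, basis, weights):
    """Reconstruct a boundary shape as: mean + sum_i w_i * b_i.

    mean_shape : (2N,) flattened (x, y) coordinates of the average shape
    basis      : (K, 2N) principal-component basis vectors
    weights    : (K,) weights controlling the deformation
    """
    return mean_shape + weights @ basis

# Toy example: a 4-point boundary (2N = 8) with one basis vector.
mean_shape = np.array([0.0, 1.0, 1.0, 0.0, 0.0, -1.0, -1.0, 0.0])
basis = np.array([[0.1, 0.0, 0.0, 0.1, -0.1, 0.0, 0.0, -0.1]])
shape = reconstruct_boundary(mean_shape, basis, np.array([2.0]))
```

Varying the weights deforms the reconstructed boundary within the subspace spanned by the basis vectors, which is what the boundary detection step later exploits.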
- the boundary pattern model is an image pattern around a three-dimensional boundary.
- The second storage unit 220 collects boundary patterns around the three-dimensional boundaries taught in a plurality of learning volume data collected in advance, and stores them learned as a boundary pattern model. The learning of the boundary pattern model will be described with reference to FIGS. 16 and 17.
- FIG. 16 is a vw plan view of the heart.
- The vertical axis represents pixel values (for example, luminance values), and the horizontal axis represents positions (numbered 1 to 30) on a straight line passing through the inner and outer boundaries, which is divided by a plurality of points (30 points in the figure). That is, for FIG. 16, the inner point of the myocardial region of the heart corresponds to No. 10 in FIG. 17, and the outer point corresponds to No. 20. Further, in the vw cross-sectional view of FIG. 16, the plane is divided into 18 parts at equal angles in the circumferential direction (the direction indicated by the arrow) around the origin p, and the pattern of the 30 profiled pixel values on the straight line at each divided position corresponds to series 1 to 18 in FIG. 17.
- The pixel value patterns represented by the series in FIG. 17 are patterns obtained by averaging the pixel value patterns of each series collected from a plurality of learning volume data (hereinafter referred to as "boundary patterns"). Since one such averaged pattern (boundary pattern) exists for each of the points (18 points) on the boundary in the major axis (u-axis) direction, these boundary patterns are collectively referred to as the boundary pattern model.
- In other words, a profile of the linear image passing through two points with the same point number (same series number), as shown in FIG. 16, is collected from each of the plurality of learning volume data. Since there are as many such profiles as there are learning volume data, the pixel values of these profiles are averaged to create one boundary pattern, as shown in FIG. 17. Since there are as many boundary patterns as there are points on the boundary in the u-axis direction, these are collectively referred to as the boundary pattern model.
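The averaging over learning volumes described above can be sketched as follows; the array dimensions (18 boundary points, 30 samples per profile) follow the figures, while the random data, the number of volumes, and the function name are illustrative assumptions.

```python
import numpy as np

def build_boundary_pattern_model(profiles):
    """Average same-numbered luminance profiles over the learning volumes.

    profiles : (n_volumes, n_boundary_points, profile_len) array, where
               profiles[v, k] is the 30-sample profile for boundary point k
               in learning volume v.
    Returns an (n_boundary_points, profile_len) array: one averaged
    boundary pattern per point, which together form the model.
    """
    return profiles.mean(axis=0)

# Toy data: 3 learning volumes, 18 boundary points, 30 samples per profile.
rng = np.random.default_rng(0)
profiles = rng.random((3, 18, 30))
model = build_boundary_pattern_model(profiles)
```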
- FIG. 14 is a flowchart showing the operation of the medical image processing apparatus 20.
- In step 21, the acquisition unit 201 acquires volume data in which the heart is imaged.
- the coordinate system detection unit 202 detects the left ventricular coordinate system using the volume data obtained from the acquisition unit 201. This detection method is the same as in the first embodiment.
- The boundary detection unit 203 detects the left ventricular boundary from the volume data obtained from the acquisition unit 201, using the left ventricular coordinate system obtained by the coordinate system detection unit 202, the boundary model stored in the first storage unit 210, and the boundary pattern model stored in the second storage unit 220.
- the boundary detection unit 203 of this embodiment detects the intima boundary and the epicardial boundary of the left ventricle as follows.
- the boundary detection unit 203 applies the boundary model represented by the dynamic shape model to the volume data.
- the boundary detection unit 203 obtains a boundary pattern (hereinafter referred to as “input boundary pattern”) at the position of the boundary model applied to the volume data.
- the input boundary pattern is expressed by pixel values in the volume data.
- The boundary detection unit 203 then obtains the energy necessary for detecting the boundary. This energy is defined as an error between the input boundary pattern and the boundary pattern model described above. Examples of the "error" include the squared error and the sum of absolute differences between the luminance value of each pixel of the input boundary pattern and the luminance value of the corresponding pixel in the boundary pattern model.
- the boundary detection unit 203 deforms the boundary model represented by the dynamic shape model so as to reduce the energy.
- The deformation changes, for example, the weights of the linear combination of basis vectors added to the average shape that represents the dynamic shape model. Decreasing the energy is equivalent to deforming the boundary model so that the pattern around it resembles the boundary pattern model.
- the boundary detection unit 203 obtains a boundary model that is deformed so that the energy (the error between the input boundary pattern and the boundary pattern model) is minimized.
- the intima boundary and epicardial boundary of the left ventricle detected by this boundary model are the intima boundary and epicardial boundary to be finally detected.
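The energy-minimizing fit in the steps above can be sketched as a toy one-dimensional search; the grid search, the single weight, and the synthetic sampling function stand in for the actual model deformation and volume-data interpolation, and are assumptions for illustration only.

```python
import numpy as np

def energy(input_pattern, pattern_model):
    """Squared error between an input boundary pattern and the model."""
    return float(np.sum((input_pattern - pattern_model) ** 2))

def fit_boundary(pattern_model, sample_pattern, weight_grid):
    # Deform the boundary model (here reduced to a single weight) so that
    # the pattern sampled at its position best matches the pattern model.
    best_w, best_e = None, np.inf
    for w in weight_grid:
        e = energy(sample_pattern(w), pattern_model)
        if e < best_e:
            best_w, best_e = w, e
    return best_w, best_e

# Toy setup: the "true" boundary position corresponds to weight 0.5;
# sample_pattern stands in for interpolating profiles from the volume data.
model = np.linspace(0.0, 1.0, 30)
sample = lambda w: model + (w - 0.5)
w, e = fit_boundary(model, sample, np.linspace(-1.0, 1.0, 201))
```

The minimized energy found here plays the role of the "final energy" that the evaluation unit later compares against a threshold.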
- the evaluation unit 204 compares the minimized energy (hereinafter referred to as “final energy”) with a predetermined threshold value.
- The display unit 205 may display the comparison result with the threshold value (that is, the evaluation result). Here, it is evaluated whether or not the final energy is lower than the threshold value, and the following is performed according to the result: if the final energy is equal to or less than the threshold value, the process proceeds to step 25 (in the case of Yes); if the final energy is higher than the threshold value, the process proceeds to step 26 (in the case of No).
- In step 25, since the final energy is equal to or less than the threshold value, the display unit 205 displays the cross-sectional image and the boundary detection result using the left ventricular coordinate system, the left ventricular boundary, and the volume data.
- In step 26, since the final energy is higher than the threshold value, the display unit 205 notifies the user that there is a high possibility that the boundary detection is not accurate. That is, a high final energy means that the left ventricular boundary was detected at a position showing a pattern different from the boundary pattern model. In this case, it is possible that the detection of the left ventricular coordinate system was incorrect and the correct boundary position could not be found, so a message indicating that the left ventricular boundary could not be detected is displayed. Alternatively, measures such as detecting the left ventricular coordinate system again under changed conditions can be taken automatically.
- As described above, the coordinate system detection unit 202 detects the left ventricular coordinate system, and the boundary detection unit 203 detects the left ventricular boundary based on the left ventricular coordinate system, thereby enabling highly accurate boundary detection.
- The confirmation of the boundary detection result can be made easy to understand on the display unit 205, and a fail-safe function can be provided, such as determining whether or not the boundary detection result is acceptable and displaying the determination, or performing the process again.
- The boundary pattern model has been described as a pattern obtained by averaging over a plurality of learning volume data. Alternatively, profiles relating to the images may be collected from the plurality of learning volume data together with the standard deviation and covariance between the boundary patterns. Boundary pattern models can also be constructed using known pattern recognition techniques, such as subspace methods, or discriminators trained on the collected profiles on the boundary and profiles away from the boundary.
- a medical image processing apparatus 20 according to the fifth embodiment will be described.
- the medical image processing apparatus 20 of the present embodiment is a first modification of the medical image processing apparatus 20 of the fourth embodiment.
- A fail-safe function is provided for the automatic detection process. For example, if the volume data acquired by the acquisition unit 201 does not capture the entire heart, or if the heart is not actually captured, detection of the left ventricular coordinate system may fail depending on the contents of the volume data itself. If detection of the left ventricular coordinate system fails, the display unit 205 notifies the user that the left ventricular coordinate system cannot be detected. Then, by displaying the volume data on the basis of the image coordinate system, a mechanism is provided that allows the user to confirm whether the volume data used for detection of the left ventricular coordinate system is appropriate, thereby reducing work errors.
- As in step 26 of the fourth embodiment, in the final evaluation performed after the boundary detection process, the user can be notified of a boundary detection failure based on the comparison between the final energy at the detected left ventricular boundary and the threshold value. At this time, by displaying only the volume data without displaying the detected boundary, the user can easily check whether the volume data is appropriate.
- the left ventricular coordinate system can be detected again under different predetermined conditions.
- Examples of the different conditions include changing the range in which the left ventricular coordinate system is searched for, the random numbers used during the search, or the boundary model used.
- the user may be allowed to set the left ventricular coordinate system.
- the user may be provided with a user interface for designating the left ventricular coordinate system, and the boundary detection process may be executed again after the teaching is completed.
- a fail-safe function can be provided for failures due to fully automatic detection.
- a medical image processing apparatus 30 according to the sixth embodiment will be described with reference to FIGS.
- FIG. 18 is a diagram illustrating a configuration example of an image processing system in which the medical image processing apparatus 30 according to the sixth embodiment is installed.
- the image processing system shown in FIG. 18 has a medical image diagnostic apparatus 100, an image storage apparatus 200, and a medical image processing apparatus 30.
- Each device illustrated in FIG. 18 is in a state where it can communicate with each other directly or indirectly by, for example, a hospital LAN (Local Area Network) installed in a hospital.
- The image processing system described here assumes a case where a PACS (Picture Archiving and Communication System) has been introduced.
- each device transmits and receives medical images and the like according to the DICOM (Digital Imaging and Communications in Medicine) standard.
- the medical image diagnostic apparatus 100 is an X-ray diagnostic apparatus, an X-ray CT (Computed Tomography) apparatus, an MRI (Magnetic Resonance Imaging) apparatus, an ultrasonic diagnostic apparatus, or a group of these apparatuses.
- the medical image photographed by the medical image diagnostic apparatus 100 is two-dimensional image data or three-dimensional image data (volume data).
- the medical image captured by the medical image diagnostic apparatus 100 is two-dimensional moving image data or three-dimensional moving image data in which these image data are captured in time series.
- imaging performed by an X-ray CT apparatus which is an example of the medical image diagnostic apparatus 100 will be briefly described.
- the X-ray CT apparatus has a rotating frame that can rotate while supporting an X-ray tube that irradiates X-rays and an X-ray detector that detects X-rays transmitted through the subject at opposing positions.
- the X-ray CT apparatus collects projection data by rotating a rotating frame while irradiating X-rays from an X-ray tube, and reconstructs an X-ray CT image from the projection data.
- the X-ray CT image is a tomographic image on the rotation plane (axial plane) between the X-ray tube and the X-ray detector.
- In the X-ray detector, a plurality of detection element rows, each consisting of X-ray detection elements arrayed in the channel direction, are arrayed along the body axis direction of the subject.
- For example, an X-ray CT apparatus having an X-ray detector in which 16 detection element rows are arranged reconstructs, from projection data acquired by one rotation of the rotating frame, a plurality of (for example, 16) X-ray CT images along the body axis direction of the subject.
- The X-ray CT apparatus can reconstruct, for example, 500 X-ray CT images covering the entire heart as volume data by rotating the rotating frame while moving the top plate on which the subject is placed (a helical scan).
- Depending on the X-ray detector, volume data covering the entire heart can also be reconstructed merely by performing a conventional scan in which the rotating frame is rotated once.
- the X-ray CT apparatus can take an X-ray CT image in time series by continuously performing a helical scan and a conventional scan.
- The MRI apparatus can reconstruct an MRI image of an arbitrary cross section, or MRI images (volume data) of an arbitrary plurality of cross sections, using MR signals acquired while changing the phase encoding gradient magnetic field, the slice selection gradient magnetic field, and the frequency encoding gradient magnetic field.
- the ultrasonic diagnostic apparatus can generate an ultrasonic image of an arbitrary cross section by an operator adjusting the position of an ultrasonic probe that performs two-dimensional ultrasonic scanning.
- the ultrasonic diagnostic apparatus can generate a three-dimensional ultrasonic image (volume data) by performing a three-dimensional scanning of ultrasonic waves by using a mechanical scan probe or a 2D probe.
- the X-ray diagnostic apparatus performs imaging while fixing the position of the C-arm that supports the X-ray tube and the X-ray detector, thereby generating a two-dimensional X-ray image. Further, the X-ray diagnostic apparatus can generate a three-dimensional X-ray image (volume data) by rotating the C arm.
- The image storage device 200 is a database that stores medical images. Specifically, the image storage device 200 stores the medical images transmitted from the medical image diagnostic apparatus 100 in the storage unit of its own apparatus. The medical images stored in the image storage device 200 are stored in association with incidental information such as a patient ID, examination ID, apparatus ID, and series ID.
- the medical image processing apparatus 30 is, for example, a workstation or a PC (Personal Computer) used by a doctor or laboratory technician working in a hospital to interpret a medical image.
- An operator of the medical image processing apparatus 30 can acquire a necessary medical image from the image storage apparatus 200 by performing a search using a patient ID, examination ID, apparatus ID, series ID, and the like.
- the medical image processing apparatus 30 according to the sixth embodiment is an apparatus that performs various types of image processing on medical images in addition to displaying medical images for interpretation.
- the medical image processing apparatus 30 according to the sixth embodiment has a function of performing various image processes for supporting image diagnosis.
- The application of the image processing system described above is not limited to the case where a PACS is introduced.
- the image processing system is similarly applied when an electronic medical record system for managing an electronic medical record attached with a medical image is introduced.
- the image storage device 200 is a database that stores electronic medical records.
- the image processing system is similarly applied when HIS (Hospital Information System) and RIS (Radiology Information System) are introduced.
- The image processing system is not limited to the configuration example described above. The functions and the division of roles among the devices may be changed as appropriate according to the operation mode.
- the medical image processing apparatus 30 may acquire a medical image directly from the medical image diagnostic apparatus 100 or a storage medium such as a DVD.
- the medical image processing apparatus 30 according to the sixth embodiment detects the boundary of the myocardium in the input image captured including the subject's heart as image processing for image diagnosis support. For example, the medical image processing apparatus 30 according to the sixth embodiment detects a myocardial boundary in an input image that is a medical image photographed including the heart of a subject after injection of a contrast medium.
- the input image is image data in which, in a two-dimensional space or a three-dimensional space, the shape of the heart and the shadow of each part by the contrast agent are depicted in shades of brightness values.
- the input image is a plurality of contrast image data along a time series that is taken continuously.
- the input image may be taken after injecting the contrast agent, and at least a part of the myocardium around the left ventricle may be depicted.
- FIG. 19 is a diagram illustrating a configuration example of the medical image processing apparatus 30 according to the sixth embodiment.
- the medical image processing apparatus 30 according to the sixth embodiment includes an input unit 11, a display unit 12, a communication unit 13, a storage unit 14, and a control unit 15.
- the input unit 11 is a mouse, a keyboard, a trackball, or the like, and receives input of various operations on the medical image processing apparatus 30 from the operator. Specifically, the input unit 11 according to the sixth embodiment receives input of information for acquiring a medical image to be subjected to image processing from the image storage device 200. For example, the input unit 11 accepts input of patient ID, examination ID, device ID, series ID, and the like. Further, the input unit 11 according to the sixth embodiment accepts input of conditions for various processes performed by the control unit 15 described later. The input unit 11 also serves as an acquisition unit that acquires three-dimensional volume data of the heart.
- the display unit 12 is a monitor, for example, and displays various information. Specifically, the display unit 12 according to the sixth embodiment displays a GUI (Graphical User Interface) for receiving various operations from the operator, a medical image, and the like.
- the communication unit 13 is a NIC (Network Interface Card) or the like, and communicates with other devices. For example, the communication unit 13 transmits information such as a patient ID received by the input unit 11 to the image storage device 200 and receives a medical image from the image storage device 200.
- The storage unit 14 is a hard disk, a semiconductor memory element, or the like, and stores various information. Specifically, the storage unit 14 according to the sixth embodiment stores information used for various processes performed by the control unit 15 described later. More specifically, as shown in FIG. 19, the storage unit 14 according to the sixth embodiment includes an image storage unit 141, a site-specific template storage unit 142, a boundary pattern model storage unit 143, and a boundary model storage unit 144.
- The image storage unit 141 stores medical images acquired from the image storage device 200 via the communication unit 13, processing results of the control unit 15, and the like. Specifically, the image storage unit 141 stores an input image acquired from the image storage device 200 as a myocardial boundary detection target. For example, the image storage unit 141 stores, as the input image, a four-chamber cross-sectional image, a three-chamber cross-sectional image, or a two-chamber cross-sectional image of a heart stained by a contrast agent, or volume data of the whole heart stained by a contrast agent.
- the site-specific template storage unit 142 stores a template for detecting each site of the heart by pattern matching in association with each site.
- The boundary pattern model storage unit 143 stores a boundary pattern model in which the luminance value patterns of the myocardium and of the region around the myocardial boundary, in a medical image that includes the heart stained by a contrast agent, are modeled by learning.
- the boundary model storage unit 144 stores a boundary model in which the boundary shape of the myocardium in the image used for learning the boundary pattern model is modeled by learning.
- the site-specific template, the boundary pattern model, and the boundary model will be described in detail later.
- The control unit 15 is an electronic circuit such as a CPU (Central Processing Unit) or an MPU (Micro Processing Unit), or an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array), and performs overall control of the medical image processing apparatus 30.
- control unit 15 controls the display of the GUI and the display of medical images on the display unit 12.
- control unit 15 controls transmission / reception performed with the image storage apparatus 200 via the communication unit 13.
- control unit 15 controls the storage and reading of various data such as medical images in the storage unit 14.
- control unit 15 performs various image processing on the medical image.
- As image processing, the control unit 15 performs various rendering processes for displaying a medical image that is volume data on the display unit 12.
- As the rendering process, the control unit 15 performs a volume rendering process that generates a two-dimensional image reflecting the three-dimensional information of the volume data, and a process of reconstructing an MPR image from the volume data by the cross-section reconstruction method (MPR: Multi Planar Reconstruction).
- Hereinafter, a case will be described in which a four-chamber cross-sectional image of the heart, obtained from volume data that is captured by an X-ray CT apparatus and includes the heart stained by a contrast agent, is used as the input image for myocardial boundary detection.
- the operator inputs a display request for, for example, three orthogonal cross sections (axial surface, coronal surface, sagittal surface) of the volume data via the input unit 11.
- the control unit 15 generates three orthogonal cross sections from the volume data and causes the display unit 12 to display them.
- the operator refers to the three orthogonal cross sections and sets a cross section in which the right atrium, the right ventricle, the left atrium, and the left ventricle are all depicted.
- the control unit 15 reconstructs a four-chamber cross-sectional image (MPR image) using the set cross-section, and stores the reconstructed four-chamber cross-sectional image as an input image in the image storage unit 141.
- the cross section set by the operator is, for example, a cross section parallel to the displayed coronal plane or an oblique cross section.
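The reconstruction of such an oblique cross section (MPR image) from volume data can be sketched roughly as follows; this nearest-neighbour sampler and its function name are simplifying assumptions for illustration, not the reconstruction method of the embodiment.

```python
import numpy as np

def mpr_slice(volume, origin, u_dir, v_dir, size, spacing=1.0):
    """Sample an oblique cross section (MPR image) from volume data.

    The plane is spanned by vectors u_dir and v_dir around `origin`
    (all in voxel coordinates); nearest-neighbour sampling keeps the
    sketch short, while a real implementation would interpolate.
    """
    u_dir = np.asarray(u_dir, float)
    v_dir = np.asarray(v_dir, float)
    h, w = size
    image = np.zeros((h, w), dtype=volume.dtype)
    for i in range(h):
        for j in range(w):
            p = (np.asarray(origin, float)
                 + (i - h // 2) * spacing * u_dir
                 + (j - w // 2) * spacing * v_dir)
            idx = np.round(p).astype(int)
            if np.all(idx >= 0) and np.all(idx < volume.shape):
                image[i, j] = volume[tuple(idx)]
    return image

# Toy volume with a single bright voxel at its centre.
vol = np.zeros((9, 9, 9))
vol[4, 4, 4] = 1.0
img = mpr_slice(vol, origin=(4, 4, 4),
                u_dir=(1, 0, 0), v_dir=(0, 1, 0), size=(9, 9))
```

Setting u_dir and v_dir to non-axis-aligned vectors yields the kind of oblique four-chamber cross section the operator defines above.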
- FIG. 20 is a diagram illustrating an example of an input image.
- The input image illustrated in FIG. 20 is a four-chamber cross-sectional image of the heart in the volume data captured by the X-ray CT apparatus.
- The input image includes the right atrium, the right ventricle, the left atrium, and the left ventricle, which are the four heart chambers surrounded by the myocardium, and further includes the tricuspid valve located at the boundary between the right atrium and the right ventricle, and the mitral valve located between the left atrium and the left ventricle.
- the control unit 15 includes a coordinate system detection unit 151, a part region detection unit 152, a calculation unit 153, a correction unit 154, and a boundary detection unit 155.
- the myocardial boundary in the input image is detected by the functions of these processing units.
- the control unit 15 according to the sixth embodiment detects the boundary of the myocardium around the left ventricle in the four-chamber cross-sectional image that is the input image shown in FIG.
- Hereinafter, the boundary pattern model and the boundary model used for boundary detection will be described in detail, and then the processing of the coordinate system detection unit 151, the part region detection unit 152, the calculation unit 153, the correction unit 154, and the boundary detection unit 155 will be described in detail.
- the boundary pattern model and the boundary model are created in advance using the same learning image group and stored in the storage unit 14 in order to perform boundary detection. Specifically, a boundary model is created from the learning image group, and then a boundary pattern model is created using the learning image group and the boundary model.
- A case will be described in which the control unit 15 creates the boundary pattern model and the boundary model and stores them in the storage unit 14. However, in this embodiment, an apparatus other than the medical image processing apparatus 30 may create the boundary pattern model and the boundary model. In such a case, for example, the control unit 15 receives the boundary pattern model and the boundary model via the communication unit 13 and stores them in the storage unit 14. Alternatively, for example, the control unit 15 reads the boundary pattern model and the boundary model stored in a storage medium and stores them in the storage unit 14.
- FIG. 21 is a diagram illustrating an example of a learning image group.
- the learning image group is a four-chamber cross-sectional image of each of a plurality of subjects imaged by contrast-enhanced CT examination of the heart.
- the boundary pattern model according to the sixth embodiment is data obtained by learning the myocardium around the left ventricle and the luminance value pattern around the myocardial boundary in the four-chamber cross-sectional image group shown in FIG.
- the boundary model according to the sixth embodiment is data in which the boundary shape of the myocardium around the left ventricle in the four-chamber cross-sectional image group shown in FIG. 21 is modeled by learning.
- FIG. 22 is a diagram illustrating an example of creating a boundary model
- FIG. 23 is a diagram illustrating an example of creating a boundary pattern model.
- each boundary is represented by a two-dimensional point group.
- The model creator sets a plurality of pairs of a point on the intima of the left ventricle and a point on the epicardium of the left ventricle along the myocardium in each of the four-chamber cross-sectional images constituting the learning image group.
- As a result, the intima shape of the left ventricle of each learning image is represented by "N" vectors, and the epicardial shape of the left ventricle of each learning image is represented by "N" vectors. That is, "2N" vectors for each learning image form a boundary vector group for learning the boundary shape.
- the sizes and directions of the four-chamber cross-sectional images constituting the learning image group are different.
- Thus, the shapes expressed by the respective boundary vector groups are expressed in different image coordinate systems. Therefore, in the creation of the boundary model, the shapes are normalized using the left ventricular coordinate system obtained from each learning image.
- Specifically, the control unit 15 obtains the long axis of the left ventricle, the center of the left ventricle, and the short axis perpendicular to the long axis from each learning image. For example, the control unit 15 uses the line segment connecting the position of the mitral valve and the apex of the left ventricle as the long axis, and takes the midpoint of that line segment as the center of the left ventricle. For example, the control unit 15 sets the direction perpendicular to the long axis direction and directed toward the right ventricle as the short axis direction. As a result, in each learning image, as shown in the lower left diagram of FIG. 22, an orthogonal coordinate system of the left ventricle defined by the center position vector, and by the long axis vector and short axis vector with the center position as the origin, is obtained. Then, the control unit 15 normalizes the orthogonal coordinate system obtained from each learning image by setting the scale of the left ventricular coordinate system so that the length from the left ventricular center position to the apex of the left ventricle becomes "1". Then, the control unit 15 converts the coordinate system of the boundary vector group using the normalized orthogonal coordinate system for each learning image. As a result, the control unit 15 obtains boundary vector groups for learning that are normalized by the normalized orthogonal coordinate system.
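The construction of the normalized left ventricular coordinate system can be sketched in two dimensions as follows; the landmark positions are toy values and the function names are assumptions for illustration.

```python
import numpy as np

def lv_coordinate_system(mitral_valve, apex, toward_rv):
    """Build the normalized left-ventricular coordinate system.

    Long axis: mitral valve -> apex. Origin: midpoint of that segment.
    Short axis: component of `toward_rv` orthogonal to the long axis.
    Scale: the distance from the centre to the apex is normalized to 1.
    Returns (origin, long_axis_unit, short_axis_unit, scale).
    """
    mv, ap, rv = (np.asarray(p, float) for p in (mitral_valve, apex, toward_rv))
    origin = (mv + ap) / 2
    long_axis = ap - origin
    scale = np.linalg.norm(long_axis)          # centre-to-apex length
    u = long_axis / scale
    v = rv - origin
    v = v - (v @ u) * u                        # orthogonalize against u
    v = v / np.linalg.norm(v)
    return origin, u, v, scale

def to_lv_coords(point, origin, u, v, scale):
    """Normalize an image-coordinate point into the LV coordinate system."""
    d = (np.asarray(point, float) - origin) / scale
    return np.array([d @ u, d @ v])

# Toy landmarks: mitral valve at (0, 0), apex at (0, -4), RV toward (3, -2).
origin, u, v, s = lv_coordinate_system((0, 0), (0, -4), (3, -2))
```

In this toy setup the apex maps to (1, 0) and the mitral valve to (-1, 0), so boundary points from differently sized and oriented images become directly comparable.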
- Then, the control unit 15 calculates an average shape and shape basis vectors by performing principal component analysis on the normalized boundary vector groups for learning.
- the control unit 15 stores the average shape and the shape basis vector in the boundary model storage unit 144 as a boundary model.
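The principal component analysis step can be sketched with a singular value decomposition of the centred boundary vectors; the random data are stand-ins for the normalized learning boundary vectors, and the function name is an assumption.

```python
import numpy as np

def learn_boundary_model(boundary_vectors, n_components):
    """Learn a boundary model by principal component analysis.

    boundary_vectors : (n_images, 2N) normalized boundary vectors
                       (flattened intima + epicardium point coordinates).
    Returns (mean_shape, shape_basis), with shape_basis of shape
    (n_components, 2N).
    """
    X = np.asarray(boundary_vectors, float)
    mean_shape = X.mean(axis=0)
    # SVD of the centred data yields the principal components as rows of vt.
    _, _, vt = np.linalg.svd(X - mean_shape, full_matrices=False)
    return mean_shape, vt[:n_components]

# Toy data: 20 learning images, 2N = 8 coordinates per boundary vector.
rng = np.random.default_rng(1)
shapes = rng.random((20, 8))
mean_shape, basis = learn_boundary_model(shapes, n_components=3)
```

The resulting mean shape and basis vectors are exactly the two components stored as the boundary model.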
- the average shape as the boundary model is a vector group expressing the average shape of the intima shape and the epicardial shape of the left ventricle, as indicated by two dotted lines in the lower right diagram of FIG.
- Specifically, the control unit 15 cuts out a luminance value profile on a line segment passing through an intima boundary point and an epicardial boundary point set as a pair from each learning image. Since a plurality of pairs of intima boundary points and epicardial boundary points are set, the control unit 15 cuts out a plurality of luminance value profiles from one learning image. Then, the control unit 15 performs luminance value pattern learning using the plurality of luminance value profiles cut out from the respective learning images. For example, the control unit 15 cuts out the luminance value profile using a pair of two points (an intima boundary point and an epicardial boundary point) set in the learning image shown in the upper diagram of FIG. 23.
- the line segment used for the luminance value profile is set as shown in the upper left diagram of FIG. 23, for example.
- the control unit 15 sets a straight line obtained by extending a line segment connecting the intima boundary point and the epicardial boundary point inward (left ventricle lumen side) and outward (left ventricle outer side).
- on the set straight line, the control unit 15 sets a point (inner point) located inward by the same distance as the distance between the inner and outer membranes, and a point (outer point) located outward by the same distance as the distance between the inner and outer membranes.
- the control unit 15 cuts out the luminance value profile on the line segment connecting the inner point and the outer point.
- the space between the medial point and the intimal boundary point corresponds to the left ventricular cavity
- the space between the intimal boundary point and the epicardial boundary point corresponds to the myocardium
- the space between the epicardial boundary point and the outer point corresponds to the outside of the left ventricle.
- the control unit 15 cuts out the luminance value profile in a direction from the inner point to the outer point.
- control unit 15 sets a plurality of line segments connecting the inner point and the outer point in one learning image. As a result, the control unit 15 acquires the luminance value in the range surrounded by the two-dot chain lines shown in the upper right diagram of FIG. 23 as the luminance value pattern in the peripheral region of the myocardial boundary. Then, the control unit 15 creates a boundary pattern model by learning the luminance value pattern in the peripheral area of the myocardial boundary in each learning image.
- the control unit 15 identifies corresponding pairs between the learning images by converting the left ventricular coordinate system of each learning image into the normalized orthogonal coordinate system. Then, for example, the control unit 15 creates the boundary pattern model by calculating the average value of the luminance value profiles across the identified pairs. Alternatively, the control unit 15 calculates the average value and standard deviation of the luminance value profiles as the boundary pattern model.
- the lower part of FIG. 23 shows an example of the boundary pattern model created by the above-described processing.
- the boundary pattern model is a luminance value sequence in which the luminance values of the pixels are arranged in a direction from the left ventricular lumen to the outside of the left ventricle for each pair (P).
- the control unit 15 associates the portions “left ventricle lumen, myocardium, outside the left ventricle” where the pixels of each luminance value are located. Further, for example, the control unit 15 sets the arrangement order of the luminance value sequence (the arrangement order of the pairs) as “the order toward the side wall, the apex, and the septum”.
- for example, in the luminance value string “P: 1”, the boundary pattern model stores the luminance values “a1 to a10” of the ten pixels constituting the left ventricular lumen, the luminance values “a11 to a20” of the ten pixels constituting the myocardium, and the luminance values “a21 to a30” of the ten pixels constituting the outside of the left ventricle.
- it may be associated with whether the part outside the left ventricle is outside the heart or the right ventricle.
- the luminance values “a21 to a30” of the ten pixels constituting the outside of the left ventricle are associated as “outside the heart”.
- the luminance value string length differs for each pair (P).
- for example, in the luminance value string “P: 200”, the boundary pattern model stores the luminance values of the 13 pixels constituting the left ventricular lumen, the luminance values of the 13 pixels constituting the myocardium, and the luminance values of the 13 pixels constituting the outside of the left ventricle.
- the luminance value string length may be fixed, and the interval of pixels acquired for creating the luminance value string may be varied according to the distance between the inner and outer membranes.
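- fixing the luminance value string length while varying the pixel interval according to the inner/outer membrane distance can be sketched as a simple resampling; linear interpolation and the fixed length of 30 are illustrative assumptions:

```python
import numpy as np

def resample_profile(profile, fixed_len=30):
    """Resample a variable-length luminance string to a fixed length (sketch).

    The effective sampling interval varies with the inner/outer membrane
    distance, so every pair yields the same number of luminance values.
    """
    profile = np.asarray(profile, dtype=float)
    src = np.linspace(0.0, 1.0, len(profile))
    dst = np.linspace(0.0, 1.0, fixed_len)
    return np.interp(dst, src, profile)
```

With a fixed length, every pair's string has the cavity, myocardium, and outside portions at the same positions, which simplifies the later per-pixel correspondence.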
- the control unit 15 stores the boundary pattern model illustrated in the lower diagram of FIG. 23 in the boundary pattern model storage unit 143. In subsequent processing, the control unit 15 can acquire from the boundary pattern model information such as that the first pixel of the luminance value string “P: 1” is a left ventricular region pixel with the luminance value “a1”, and that the fifteenth pixel is a myocardial region pixel with the luminance value “a15”.
- the coordinate system detection unit 151 detects at least the long axis of the heart as a left ventricular coordinate system indicating the position and orientation of the heart in the input image from the input image.
- the coordinate system detection unit 151 detects at least the long axis of the left ventricle as the left ventricular coordinate system from the input image.
- the coordinate system detection unit 151 detects the short axis of the heart in the input image as the left ventricular coordinate system together with the long axis.
- the information detection method performed by the coordinate system detection unit 151 will be described roughly as two types: a first information detection method and a second information detection method.
- the coordinate system detection unit 151 detects the position of the mitral valve and the position of the apex from the input image, and takes the midpoint of the line segment connecting the position of the mitral valve and the position of the apex as the left ventricular center position. Then, the coordinate system detection unit 151 detects the vector from the left ventricular center position to the apex position as the long axis.
- to detect each of the parts described above, the coordinate system detection unit 151 uses the site-specific templates for pattern matching stored in advance in the site-specific template storage unit 142.
- FIG. 24 is a diagram illustrating an example of a site-specific template.
- the site-specific template storage unit 142 stores a luminance pattern around a previously learned apex (see the dotted circle in the figure) as an apex template.
- the site-specific template storage unit 142 stores a luminance pattern around a mitral valve (see a dotted circle in the drawing) learned in advance as a mitral valve template.
- the coordinate system detection unit 151 performs pattern matching between the input image and the mitral valve template to detect the mitral valve position. Similarly, the coordinate system detection unit 151 performs pattern matching between the input image and the apex template to detect the position of the apex.
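- the pattern matching between the input image and a site-specific template can be sketched, for example, with exhaustive normalized cross-correlation; this is a minimal illustration, not the patent's prescribed matching measure:

```python
import numpy as np

def match_template(image, template):
    """Locate a part template (e.g. mitral valve or apex) in an image.

    Exhaustive normalized cross-correlation; returns the top-left corner of
    the best-matching window.  An unoptimized sketch for illustration.
    """
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    tn = np.linalg.norm(t)
    best, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            w = w - w.mean()
            denom = np.linalg.norm(w) * tn
            score = (w * t).sum() / denom if denom > 0 else -np.inf
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos
```

In practice a library routine such as `skimage.feature.match_template` would be used instead of this double loop.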
- FIG. 25 is a diagram illustrating an example of a left ventricular coordinate system. As shown in FIG. 25, the coordinate system detection unit 151 detects the center position of the left ventricle and the long axis as the left ventricular coordinate system. As a result, the coordinate system detection unit 151 detects the origin and the long axis vector in the left ventricular coordinate system of the input image.
- when performing the first information detection method, the coordinate system detection unit 151 further detects a short axis in the four-chamber cross-sectional image that is the input image. Specifically, the coordinate system detection unit 151 uses a site-specific template, as in the long axis detection. For example, the coordinate system detection unit 151 uses a right ventricular corner point template as the site-specific template.
- FIG. 26 is a diagram illustrating an example of a part used for short axis detection.
- the corner point of the right ventricle is, as illustrated in FIG. 26, the point located outermost on the outer periphery of the right ventricle in the two-chamber cross-sectional image in which the left ventricle and the right ventricle are depicted.
- the coordinate system detection unit 151 performs pattern matching between the input image and the corner point template to detect the position of the corner point of the right ventricle.
- the coordinate system detection unit 151 obtains a line segment orthogonal to the long axis from the corner point of the right ventricle, and sets the short axis as shown in FIG. 25 by translating the obtained line segment to the origin. Thereby, the coordinate system detection unit 151 detects a short axis vector in the left ventricular coordinate system of the input image.
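- the derivation of the short axis vector from the corner point can be sketched as follows, assuming it amounts to taking the component of the corner point, relative to the left ventricular center, that is perpendicular to the long axis, then translating it to the origin; this reading of the step above is an assumption:

```python
import numpy as np

def short_axis_from_corner(center, u_long, corner):
    """Derive a unit short-axis vector from the RV corner point (sketch).

    The component of (corner - center) perpendicular to the long axis is
    the direction of a segment orthogonal to the long axis through the
    corner point; translating it to the origin gives the short axis vector.
    """
    center, u_long, corner = (np.asarray(p, float) for p in (center, u_long, corner))
    d = corner - center
    perp = d - np.dot(d, u_long) * u_long  # orthogonal to the long axis
    return perp / np.linalg.norm(perp)
```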
- note that the coordinate system detection unit 151 may also detect the position of the tricuspid valve using a tricuspid valve template, in addition to the corner point of the right ventricle. Note that when the short axis of the three-chamber cross-sectional image is detected by the first information detection method, the coordinate system detection unit 151 detects the left ventricular outflow tract, through which blood flows from the left ventricle to the aorta. Similarly, when the short axis of the two-chamber cross-sectional image is detected by the first information detection method, the coordinate system detection unit 151 detects the anterior wall point.
- in the second information detection method, the coordinate system detection unit 151 detects the long axis by pattern matching the input image against a long axis template, which is a luminance pattern around the long axis learned in advance.
- the “periphery of the major axis” indicates a rectangular area that is uniquely determined from the major axis.
- similarly, the coordinate system detection unit 151 detects the short axis by pattern matching the input image against a short axis template, which is a luminance pattern around the short axis learned in advance.
- note that the sixth embodiment may detect the long axis and the short axis as the left ventricular coordinate system by methods other than those described above. Further, the sixth embodiment may detect only the long axis information as the left ventricular coordinate system.
- the method for detecting the left ventricular coordinate system is not limited to the above-described method, and any method can be used.
- the part region detection unit 152 detects a predetermined part region in the input image using the left ventricular coordinate system. Specifically, the part region detection unit 152 detects, as the predetermined part region, a region including at least one of the ventricle, the atrium, the left ventricular outflow tract, the annulus, the papillary muscle, the myocardium, and the coronary artery. Furthermore, in the sixth embodiment, the part region detection unit 152 detects a plurality of part regions using the left ventricular coordinate system. In the sixth embodiment, a case will be described in which the part region detection unit 152 uses the left ventricular coordinate system to detect a left ventricular region (left ventricular lumen region), a myocardial region, and a right ventricular region (right ventricular lumen region). Note that a part region includes at least one pixel. FIGS. 27 to 31 are diagrams showing methods for detecting a part region.
- the part region detection unit 152 detects a predetermined range determined by the long axis included in the left ventricular coordinate system as a part region in the input image.
- for example, the part region detection unit 152 detects a partial range of the long axis as the left ventricular region.
- for example, the part region detection unit 152 detects the solid line portion on the long axis as the left ventricular region.
- alternatively, for example, the part region detection unit 152 detects a rectangle of a predetermined size including the long axis as the left ventricular region.
- note that the partial range on the long axis and the size of the rectangle are statistically obtained values, and such values are set in advance in the part region detection unit 152.
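- detecting a part region from a predetermined range around the long axis can be sketched as follows; the rectangle extents are placeholders for the statistically preset values mentioned above, and the 2-D mask representation is an illustrative assumption:

```python
import numpy as np

def lv_region_mask(shape, center, u_long, along=(-0.3, 0.6), half_width=0.2, scale=1.0):
    """Mark the left ventricular region as a rectangle including the long axis.

    A pixel is flagged when its coordinate along the long axis falls within
    `along` and its distance from the axis is below `half_width` (both in
    normalized LV units).
    """
    center, u_long = np.asarray(center, float), np.asarray(u_long, float)
    u_short = np.array([-u_long[1], u_long[0]])    # in-plane perpendicular
    rr, cc = np.mgrid[0:shape[0], 0:shape[1]]
    d = np.stack([rr - center[0], cc - center[1]], axis=-1) / scale
    t = d @ u_long                                  # along the long axis
    s = d @ u_short                                 # across the long axis
    return (t >= along[0]) & (t <= along[1]) & (np.abs(s) <= half_width)
```

The right ventricular and myocardial rectangles described next follow the same pattern, only with the range offset from the long axis by the preset “vector L1”/“vector L3” displacements.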
- the detection of the right ventricular region includes a first detection method using the long axis and a second detection method using the long axis and the short axis.
- in the first detection method, the part region detection unit 152 detects, for example, a region within a predetermined range located at a predetermined distance from the long axis, indicated by the hatched portion in FIG. 28, as the right ventricular region.
- specifically, a “vector L1” is set whose start point is the midpoint of the line segment between the left ventricular center and the apex, and whose end point is located at a predetermined distance from that midpoint toward the left side of the image. In the method illustrated in FIG. 28, “vector L1” is a vector orthogonal to the long axis.
- then, for example, the part region detection unit 152 detects, as the right ventricular region, a rectangle of a predetermined size referenced to the end point of “vector L1”, whose long sides are two line segments orthogonal to “vector L1” and whose short sides are two line segments parallel to “vector L1”.
- note that the distance from the midpoint to the end point and the size of the rectangle are statistically obtained values, and such values are set in advance in the part region detection unit 152.
- the left-right direction in the input image can be acquired from information such as the coordinate system of the medical image diagnostic apparatus 100 and the posture of the subject, which are given as supplementary information to the medical image in accordance with the DICOM standard.
- the second detection method of the right ventricular region is performed when short axis information is detected as the left ventricular coordinate system. That is, in the second detection method, the part region detection unit 152 detects a predetermined range determined by the long axis direction and the short axis direction as a part region (right ventricular region) in the input image. For example, in the second detection method of the right ventricular region, the part region detection unit 152 detects, as the right ventricular region, a line segment on “vector L2”, which is parallel to the short axis vector and whose start point is the midpoint of the line segment between the left ventricular center and the apex, as shown by the solid line in FIG. 29.
- alternatively, the part region detection unit 152 detects a rectangle of a predetermined size including “vector L2” as the right ventricular region. Note that the start position of “vector L2”, the position and length of the line segment on “vector L2”, and the size of the rectangle are statistically obtained values, and such values are set in advance in the part region detection unit 152.
- the detection of the myocardial region includes the first detection method using the long axis and the second detection method using the boundary model described above.
- in the first detection method of the myocardial region, the part region detection unit 152 detects, as the myocardial region, a region of a predetermined size located at a predetermined distance from the long axis, for example as indicated by the hatched rectangle in FIG. 30.
- specifically, as shown in FIG. 30, a “vector L3” is set whose start point is a point located at a predetermined distance along the long axis from the left ventricular center, and whose end point is located at a predetermined distance from the start point toward the right side of the image.
- “vector L3” is a vector orthogonal to the long axis. Then, for example, the part region detection unit 152 detects, as the myocardial region, a rectangle of a predetermined size referenced to the end point of “vector L3”, whose long sides are two line segments orthogonal to “vector L3” and whose short sides are two line segments parallel to “vector L3”. Note that the positions of the start point and end point and the size of the rectangle are statistically obtained values, and such values are set in advance in the part region detection unit 152. Further, the left-right direction in the input image can be acquired from the supplementary information of the medical image as described above.
- in the second detection method of the myocardial region, the part region detection unit 152 converts the boundary model (average shape) indicated by the two dotted lines in FIG. 31 into the left ventricular coordinate system of the input image. Then, the part region detection unit 152 fits the converted boundary model to the input image based on the left ventricular coordinate system (the center position of the left ventricle and the position of the apex).
- the boundary model is, for example, a bowl shape centered on the long axis, and can be fitted uniquely by using the long axis information of the input image and the long axis information of the converted boundary model. Then, the part region detection unit 152 obtains the center line of the fitted boundary model (see the solid curve shown in FIG. 31) as the myocardial region.
- note that the myocardial region obtained by the above processing is a rough region, and in the sixth embodiment a rectangle or the like within the fitted boundary model may be detected as the myocardial region in addition to the center line. Further, when performing the second detection method of the myocardial region, the part region detection unit 152 may detect the myocardial region from an intermediate-stage result of the boundary model fitting process that is executed repeatedly by the boundary detection unit 155 in a later step.
- when a plurality of boundary models are available, the part region detection unit 152 may select, from the plurality of boundary models, the boundary model that best matches the left ventricular coordinate system of the input image, and detect the myocardial region using the selected boundary model.
- in detecting the myocardial region, the part region detection unit 152 may also use short axis information.
- for example, the part region detection unit 152 may detect, as the myocardial region, a predetermined range on a straight line parallel to the short axis vector that is statistically known to be myocardium.
- note that the method for detecting a part region is not limited to the above-described methods, and any method can be used as long as the part region can be specified using the left ventricular coordinate system.
- the calculation unit 153 calculates the degree of staining indicating the contrast agent concentration in a predetermined region based on the left ventricular coordinate system.
- the calculation unit 153 performs a staining degree calculation process using the part region detected by the part region detection unit 152.
- alternatively, the processing of the calculation unit 153 may be performed without performing the processing of the part region detection unit 152.
- the concentration of the contrast agent has a correlation with the luminance value.
- the luminance value increases as the concentration of the contrast agent increases. Therefore, the staining degree can be calculated from the luminance value in the input image.
- the calculation unit 153 calculates a statistical representative value of the luminance value string of the plurality of pixels constituting a part region as the staining degree of that part region. For example, the calculation unit 153 takes the luminance values of all the pixels constituting the part region as the luminance value string. Alternatively, for example, the calculation unit 153 randomly selects a predetermined number of pixels from all the pixels constituting the part region and takes the luminance values of the selected pixels as the luminance value string. Then, the calculation unit 153 takes the median value of the luminance value string as the staining degree of the part region.
- the statistical representative value is not limited to the median value, and may be, for example, a mode value, a maximum value, a minimum value, an average value, or a standard deviation of a luminance value sequence.
- alternatively, to remove the influence of image noise, the representative value used as the staining degree may be calculated as the top N-th value in the luminance value string after sorting it in order of luminance value.
- the representative value used as the staining degree may be a combination of a plurality of representative values such as an average value and a standard deviation.
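- the staining degree calculation can be sketched as follows; the choice among median, mean, and top-N-th value follows the alternatives listed above, while the function signature itself is illustrative:

```python
import numpy as np

def staining_degree(luminances, stat="median", top_n=None):
    """Statistical representative value of a part region's luminance string.

    Returns the median by default; `stat="mean"` returns the average, and
    `top_n` instead returns the N-th largest value after sorting, which
    suppresses the influence of isolated noise pixels.
    """
    v = np.sort(np.asarray(luminances, dtype=float))[::-1]  # descending
    if top_n is not None:
        return v[top_n - 1]
    if stat == "mean":
        return v.mean()
    return np.median(v)
```

Because contrast agent concentration correlates with luminance, a higher representative value indicates stronger staining of that part region.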
- using the staining degree of each part region, the correction unit 154 performs correction so that the luminance value of the boundary pattern model and the luminance value of the input image approach each other in the corresponding part region.
- that is, the correction unit 154 according to the sixth embodiment generates a corrected boundary pattern model by applying a correction process using the staining degree to the boundary pattern model.
- specifically, based on the staining degree of a part region, the correction unit 154 according to the sixth embodiment corrects the luminance values of the boundary pattern model in that part region so that they approach the luminance values of the input image in the same part region, thereby generating the corrected boundary pattern model.
- FIG. 32 is a diagram for explaining the correction processing according to the sixth embodiment. In the following description, it is assumed that the luminance value of the “i” th pixel of the boundary pattern model is “ai” and the correction unit 154 corrects “ai” as “pi”.
- the correction unit 154 calculates “pi” by the following equation (3), for example.
- here, “dl” is the staining degree of the left ventricular region.
- “al” is a representative value (for example, an average value) of luminance values of all the pixels associated with the left ventricular chamber in the same luminance value sequence as “ai” in the boundary pattern model.
- “an” is the luminance value of the pixel associated as the intima boundary point of the myocardium in the same pair’s luminance value string as “ai”, as shown in FIG. 32. Note that the correction unit 154 can acquire “an” by taking, as the intima boundary point, the first of the pixels associated as the myocardium in that luminance value string.
- the correction unit 154 when “ai” is greater than “dl”, the correction unit 154 performs an arithmetic process according to the expression (3), thereby determining “ai” as a reference.
- the value “pi” is calculated by reducing the value of “ai” so that “becomes a value close to“ dl ”.
- the correction unit 154 calculates “pi” by, for example, the following equation (4).
- “dr” is the degree of staining of the right ventricular region.
- “ar” is a representative value (for example, an average value) of luminance values of all the pixels associated with the right ventricular chamber in the same luminance value sequence as “ai” in the boundary pattern model.
- “ap” is the luminance value of the pixel associated as the epicardial boundary point of the myocardium in the same pair’s luminance value string as “ai”, as shown in FIG. 32. The correction unit 154 can acquire “ap” by taking, as the epicardial boundary point, the last of the pixels associated as the myocardium in that luminance value string.
- for example, when “ai” is greater than “dr”, the correction unit 154 calculates the value “pi” by performing the arithmetic process of equation (4), reducing the value of “ai” so that it becomes a value close to “dr”.
- the correction unit 154 compares the luminance value “ai” with a threshold value to determine whether the “i”-th pixel is a pixel outside the heart or a pixel in the right ventricle.
- the threshold value is a value obtained statistically.
- the correction unit 154 calculates “pi” by, for example, the following equation (5).
- “dm” is the staining degree of the myocardial region.
- “am” is a representative value (for example, an average value) of luminance values of all the pixels associated with the myocardium in the same luminance value sequence as “ai” in the boundary pattern model.
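- equations (3) to (5) themselves are not reproduced in this excerpt; the following sketch shows one plausible linear form consistent with the descriptions of “dl”, “al”, “an”, “dr”, “ar”, “ap”, “dm”, and “am” above, and should not be taken as the patent's exact equations:

```python
def correct_cavity(ai, dl, al, an):
    """Eq. (3)-style correction for left-ventricular-cavity pixels (assumed form).

    Rescales ai so the model cavity level al maps onto the measured staining
    degree dl while the intima boundary value an is preserved.
    """
    return an + (ai - an) * (dl - an) / (al - an)

def correct_outside(ai, dr, ar, ap):
    """Eq. (4)-style correction for right-ventricle pixels (assumed form).

    Fixes the epicardial boundary value ap and maps the model level ar
    onto the measured staining degree dr.
    """
    return ap + (ai - ap) * (dr - ap) / (ar - ap)

def correct_myocardium(ai, dm, am):
    """Eq. (5)-style correction for myocardial pixels (assumed form).

    Scales the model luminance so its representative level am maps to dm.
    """
    return ai * dm / am
```

Under this form, a model pixel at exactly the cavity level “al” is corrected to the measured staining degree “dl”, which is the behavior described in the surrounding text.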
- FIG. 33 is a diagram illustrating a correction example of the boundary pattern model.
- the luminance value profile illustrated in this correction example runs from a pixel in the left ventricle toward a pixel in the right ventricle, plotting the luminance values around the septum, which is the myocardium around the left ventricle.
- it shows that, from the boundary pattern model before correction to the boundary pattern model after correction, the luminance values increase in the left ventricle, the myocardium, and the right ventricle. This indicates that the staining degree of the input image is higher than that of the learning image group.
- in the correction processing, the luminance value of the boundary pattern model may also be rounded according to the staining degree.
- note that the method of correcting the boundary pattern model is not limited to the arithmetic processing of equations (3) to (5). The correction unit 154 may perform any calculation processing that combines addition, subtraction, multiplication, division, or rounding with the staining degree, as long as the luminance values of the boundary pattern model can be corrected to approach the luminance values of the input image.
- the boundary detection unit 155 detects the boundary of the myocardium in the input image using the data corrected by the correction unit 154. That is, the boundary detection unit 155 according to the sixth embodiment detects the boundary of the myocardium in the input image using the corrected boundary pattern model. Specifically, the boundary detection unit 155 according to the sixth embodiment detects the boundary of the myocardium in the input image using the corrected boundary pattern model and the boundary model. More specifically, the boundary detection unit 155 according to the sixth embodiment detects the boundary of the myocardium around the left ventricle in the input image using the corrected boundary pattern model and the boundary model.
- the boundary detection unit 155 performs matching between the luminance pattern around the boundary and the corrected boundary pattern model when the boundary model is applied to the input image while variously changing the boundary model. Then, the boundary detection unit 155 searches the boundary shape in which the luminance pattern most matched with the corrected boundary pattern model is obtained from the boundary shapes obtained by changing the boundary model, so that the myocardium around the left ventricle in the input image is obtained. Detect boundary of.
- the boundary detection unit 155 initially sets “b” described in Expression (2) to “0” and sets the initial energy value to infinity (processing 1). Then, the boundary detection unit 155 generates the shape “x” based on the current “b” (processing 2). Then, the boundary detection unit 155 converts the generated coordinate system of the shape “x” into the left ventricular coordinate system of the input image obtained by the processing of the coordinate system detection unit 151 (processing 3).
- the boundary detection unit 155 applies the shape “x” after the coordinate conversion to the input image using the left ventricular coordinate system of the input image, and cuts out the luminance pattern around the boundary (processing 4). Then, the boundary detection unit 155 calculates an error (for example, a normalized square error) between the cut-out luminance pattern and the corrected boundary pattern model, and calculates the current energy (Process 5). Then, the boundary detection unit 155 determines the magnitude relationship between the current energy and the previous energy (processing 6). If the current energy is larger than the previous energy as a result of the processing 6, the boundary detection unit 155 ends the processing.
- otherwise, the boundary detection unit 155 updates the current value of “b” (process 7). In the first iteration, the current energy is smaller than the initial (infinite) energy, so process 7 is performed. The boundary detection unit 155 repeats processes 2 through 7. When it is determined in process 6 that the current energy is greater than the previous energy, the boundary detection unit 155 takes the “b” that gave the previous energy as the “b” that minimizes the energy.
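- processes 1 to 7 can be sketched as the following loop; the energy function stands in for processes 2 to 5 (generate the shape “x” from “b”, transform it to the left ventricular coordinate system, cut out the luminance pattern, and score it against the corrected boundary pattern model), and the fixed-step update of “b” is an illustrative assumption, since the patent does not specify the update rule here:

```python
import numpy as np

def fit_boundary(energy_of_b, b0=0.0, step=0.1, max_iter=100):
    """Minimal sketch of the iterative fitting of processes 1-7.

    Iterates the shape parameter b until the energy stops decreasing, then
    returns the b that gave the lowest energy and that energy.
    """
    b, prev_energy = b0, np.inf          # process 1: initial energy infinite
    for _ in range(max_iter):
        e = energy_of_b(b)               # processes 2-5
        if e > prev_energy:              # process 6: energy rose -> stop
            return b - step, prev_energy # keep the b that gave lower energy
        prev_energy = e
        b += step                        # process 7: update b
    return b, prev_energy
```

In the actual apparatus the energy would be, for example, the normalized square error between the cut-out luminance pattern and the corrected boundary pattern model.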
- FIG. 34 is a diagram illustrating an example of detection by the boundary detection unit according to the sixth embodiment.
- the boundary detection unit 155 detects the myocardial boundary of the left ventricle of the four-chamber cross-sectional image that is the input image, and displays it on the display unit 12.
- the control unit 15 performs the above-described processing on each of a plurality of time-series input images. Then, for example, in a myocardial perfusion examination, the control unit 15 uses the boundary of the myocardium around the left ventricle detected in each input image to obtain the temporal change in the contrast agent concentration (staining degree) of the myocardium, and generates a perfusion image from which the blood flow dynamics of the myocardium can be analyzed.
- the control unit 15 stores the myocardial boundary detection result and the perfusion image in the image storage unit 141 and causes the display unit 12 to display the perfusion image.
- note that the control unit 15 may use the boundary of the myocardium around the left ventricle detected in each input image to calculate an index value of heart wall motion, such as the ejection fraction (EF) of the left ventricle.
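- the ejection fraction (EF) mentioned above is conventionally computed from the end-diastolic and end-systolic left ventricular volumes, which in this context would be derived from the boundaries detected in the time-series input images; the helper below is a standard-formula sketch, not code from the patent:

```python
def ejection_fraction(edv, esv):
    """Left ventricular ejection fraction in percent.

    edv: end-diastolic volume, esv: end-systolic volume, both derived from
    the detected myocardial boundaries at the corresponding cardiac phases.
    """
    return 100.0 * (edv - esv) / edv
```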
- FIG. 35 is a flowchart illustrating a processing example of the medical image processing apparatus 30 according to the sixth embodiment.
- the medical image processing apparatus 30 determines whether a request for detecting a myocardial boundary for an input image has been received (step S101).
- the medical image processing apparatus 30 waits until the detection request is received.
- the coordinate system detection unit 151 detects the left ventricular coordinate system of the heart in the input image (step S102), and the part region detection unit 152 detects a part region from the left ventricular coordinate system (step S103). For example, the coordinate system detection unit 151 detects the long axis and short axis of the left ventricle in the input image, and the part region detection unit 152 detects the left ventricular region, the right ventricular region, and the myocardial region of the input image.
- the calculation unit 153 calculates the staining degree of each part region (step S104), and the correction unit 154 corrects the boundary pattern model using the staining degree to generate a corrected boundary pattern model (step S105).
- the boundary detection unit 155 detects the myocardial boundary of the input image from the input image, the corrected boundary pattern model, and the boundary model (step S106), and ends the process.
- the boundary pattern model is corrected according to the degree of staining, and the boundary of the myocardium in the input image is detected using the corrected boundary pattern model.
- the luminance pattern around the myocardial boundary of the input image is matched with the boundary pattern model.
- the staining degree of each part of the heart in the input image differs depending on the elapsed time after contrast agent injection, the pulse of each individual, and so on, and the boundary pattern model is not model data that covers all such variations in staining degree.
- as a result, the matching accuracy between the luminance pattern around the myocardial boundary of the input image and the boundary pattern model may fall, lowering the boundary detection accuracy.
- Therefore, in the sixth embodiment, the boundary pattern model in each part region is corrected so as to approach the luminance values of the input image in that part region, according to the degree of staining of each part region of the input image for which boundary detection is performed.
- Matching against the luminance pattern around the myocardial boundary of the input image is then performed with the corrected boundary pattern model.
- In the above description, the corrected boundary pattern model is generated based on the degree of staining of each of a plurality of part regions.
- However, the sixth embodiment is also applicable when the corrected boundary pattern model is generated based on the degree of staining of a single part region.
- For example, the calculation unit 153 may calculate the degree of staining of the left ventricular region of the input image specified from the long axis, and the correction unit 154 may generate the corrected boundary pattern model using the degree of staining of the left ventricular region.
- In such a case, the corrected boundary pattern model may be data in which only the luminance values of the pixels in the left ventricular lumen are corrected based on the degree of staining of the left ventricular region, or data in which all luminance values are corrected.
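The two variants described here, correcting only the left-ventricular-lumen pixels of the boundary pattern model or correcting all of its luminance values, can be sketched as a single hypothetical helper. The additive shift toward the measured degree of staining is an assumption; the embodiment's exact arithmetic is given by its own equations.

```python
import numpy as np

def correct_boundary_pattern_model(model, lv_mask, lv_staining, correct_all=False):
    """Shift model luminance toward the observed degree of staining.

    model:       array of boundary pattern model luminance values
    lv_mask:     boolean mask of the pixels in the left ventricular lumen
    lv_staining: degree of staining (e.g. mean luminance) of the LV region
    correct_all: if True, apply the offset to every pixel, not just the lumen
    """
    offset = lv_staining - float(model[lv_mask].mean())
    corrected = model.astype(float).copy()
    if correct_all:
        corrected += offset          # correct all luminance values
    else:
        corrected[lv_mask] += offset  # correct only the LV-lumen pixels
    return corrected
```

With `correct_all=False` the non-lumen model values are left untouched, matching the first variant in the text.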
- Further, using the detection result of the boundary detection unit 155, the processes of the part region detection unit 152, the calculation unit 153, and the correction unit 154 may be performed again, and the boundary detection unit 155 may perform boundary detection again using the regenerated corrected boundary pattern model.
- The number of repetitions when the processes of the part region detection unit 152, the calculation unit 153, the correction unit 154, and the boundary detection unit 155 are repeated is set manually by the operator, for example as a numerical value such as “three times”.
- the processes of the part region detection unit 152, the calculation unit 153, the correction unit 154, and the boundary detection unit 155 may be repeated until the energy calculated by the boundary detection unit 155 is minimized.
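The repetition strategies above, a fixed operator-chosen count or stopping once the matching energy no longer decreases, can be sketched as one generic loop. `detect` and `recompute_model` are hypothetical callables standing in for the boundary detection unit and the region/staining/correction chain.

```python
def refine_boundary(input_image, model, detect, recompute_model,
                    max_iters=3, tol=1e-9):
    """Repeat detection and model regeneration until the energy computed
    by the boundary detection stops decreasing or max_iters is reached.

    detect(image, model)          -> (boundary, energy)
    recompute_model(image, bnd)   -> regenerated corrected model
    """
    best_boundary, best_energy = None, float("inf")
    for _ in range(max_iters):
        boundary, energy = detect(input_image, model)
        if energy >= best_energy - tol:
            break  # energy no longer decreasing: stop iterating
        best_boundary, best_energy = boundary, energy
        model = recompute_model(input_image, boundary)
    return best_boundary, best_energy
```

Setting `max_iters` reproduces the operator-chosen count (e.g. "three times"); the `tol` test reproduces the energy-minimization stop.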
- In the above description, the boundary pattern model and the boundary model are created from a four-chamber cross-sectional image group in order to detect the boundary of the myocardium around the left ventricle in a four-chamber cross-sectional image.
- However, to detect the boundary of the myocardium around the left ventricle in a two-chamber cross-sectional image or a three-chamber cross-sectional image, the boundary pattern model and the boundary model may be created from a two-chamber cross-sectional image group or a three-chamber cross-sectional image group, respectively.
- The boundary pattern model and the boundary model may also be created not only for the left ventricle but for each of the four heart chambers.
- In such a case, the control unit 15 can also detect the right atrial myocardial boundary, the right ventricular myocardial boundary, and the left atrial myocardial boundary in the input image.
- The boundary pattern model and the boundary model may also be created as three-dimensional information by using a volume data group as the learning image group.
- In such a case, volume data can be used as the input image subjected to boundary detection.
- The control unit 15 can also perform boundary detection by extracting the corresponding cross-section information from the three-dimensional boundary pattern model and the three-dimensional boundary model.
- The sixth embodiment may be applied with the cardiac phases of the learning image group aligned, for example unified at diastole. Further, a boundary pattern model and a boundary model may be created for each cardiac phase by sorting the learning image group by cardiac phase. Furthermore, the boundary pattern model and the boundary model may be created by grouping according to physical characteristics of the subject, such as age, sex, height, and weight.
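Keeping one boundary pattern model and boundary model per cardiac phase and subject group amounts to a keyed lookup. The keying scheme below (phase, sex, age band) is purely illustrative; the actual grouping criteria are whatever the models were learned under.

```python
def select_models(model_library, phase, sex, age):
    """Return the (boundary pattern model, boundary model) pair learned
    for the matching cardiac phase and subject group. Collapsing age into
    a band is a hypothetical choice of physical-characteristic grouping."""
    age_band = "under60" if age < 60 else "60plus"
    return model_library[(phase, sex, age_band)]
```

A missing key would mean no model was learned for that group, which a real system would need to handle (e.g. by falling back to a phase-only model).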
- Moreover, an X-ray image, an MRI image, or an ultrasonic image can be used as the input image by creating a boundary pattern model and a boundary model for each type of medical image.
- a medical image processing apparatus 40 according to the seventh embodiment will be described with reference to FIGS.
- FIG. 36 is a diagram illustrating a configuration example of the medical image processing apparatus 40 according to the seventh embodiment.
- The image processing system according to the seventh embodiment includes, like the image processing system according to the sixth embodiment, the medical image diagnostic apparatus 100 and the image storage device 200. As illustrated in FIG. 36, the image processing system according to the seventh embodiment replaces the medical image processing apparatus 30 according to the sixth embodiment with the medical image processing apparatus 40 according to the seventh embodiment.
- the medical image processing apparatus 40 includes an input unit 21, a display unit 22, a communication unit 23, a storage unit 24, and a control unit 25.
- the storage unit 24 includes an image storage unit 241, a site-specific template storage unit 242, a boundary pattern model storage unit 243, and a boundary model storage unit 244.
- the control unit 25 includes a coordinate system detection unit 251, a part region detection unit 252, a calculation unit 253, a correction unit 254, and a boundary detection unit 255.
- the input unit 21, the display unit 22, and the communication unit 23 illustrated in FIG. 36 have the same functions as the input unit 11, the display unit 12, and the communication unit 13 described with reference to FIG.
- The image storage unit 241, the part-specific template storage unit 242, the boundary pattern model storage unit 243, and the boundary model storage unit 244 included in the storage unit 24 illustrated in FIG. 36 store the same data as the image storage unit 141, the part-specific template storage unit 142, the boundary pattern model storage unit 143, and the boundary model storage unit 144 illustrated in FIG. 19.
- The coordinate system detection unit 251, the part region detection unit 252, and the calculation unit 253 included in the control unit 25 illustrated in FIG. 36 perform processing similar to that of the coordinate system detection unit 151, the part region detection unit 152, and the calculation unit 153 of the sixth embodiment.
- the input unit 21 also serves as an acquisition unit that acquires three-dimensional volume data of the heart.
- the correction unit 254 performs the following correction processing.
- the correction unit 254 performs a correction process using the degree of staining on the input image to generate a corrected input image.
- Specifically, based on the degree of staining of a part region, the correction unit 254 according to the seventh embodiment generates a corrected input image by performing correction that brings the luminance values of the input image in that part region closer to the luminance values of the boundary pattern model in the same region.
- In the following description, the luminance value of the “i”-th pixel of the input image is denoted by “Ii”, and the corrected value obtained by the correction unit 254 correcting “Ii” is denoted by “I′i”.
- First, the correction unit 254 calculates “a′”, which is the average of the luminance values of the pixels of the boundary pattern model corresponding to the part where the “i”-th pixel of the input image is located. Then, the correction unit 254 acquires, from the processing result of the calculation unit 253, the degree of staining “d′” of the part where the “i”-th pixel of the input image is located.
- For example, among the degrees of staining of the part regions calculated by the calculation unit 253, the correction unit 254 identifies the part region whose degree of staining is closest to “Ii” as the part where the “i”-th pixel of the input image is located.
- The correction unit 254 then calculates “I′i” by the following equation (6).
- Alternatively, the correction unit 254 may calculate “I′i” by the following equation (7).
- the correction unit 254 generates a corrected input image by performing such processing on all the pixels of the input image.
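The per-pixel procedure above, assigning each pixel to the part region whose degree of staining is closest to its luminance and then moving it toward the model's mean for that region, can be sketched as follows. Since equations (6) and (7) are not reproduced in this excerpt, the additive form I′i = Ii + (a′ − d′) is an assumption.

```python
import numpy as np

def correct_input_image(image, region_stats):
    """Generate a corrected input image.

    region_stats: list of (d, a) pairs per part region, where
      d = degree of staining of the region in the input image,
      a = mean luminance of the boundary pattern model in that region.
    Each pixel is assigned to the region whose degree of staining is
    closest to its own luminance, then shifted by (a - d); the additive
    shift is an assumption in place of equations (6)/(7).
    """
    d = np.array([s[0] for s in region_stats], dtype=float)
    a = np.array([s[1] for s in region_stats], dtype=float)
    flat = image.astype(float).ravel()
    nearest = np.abs(flat[:, None] - d[None, :]).argmin(axis=1)
    corrected = flat + (a[nearest] - d[nearest])
    return corrected.reshape(image.shape)
```

Applying this to every pixel yields the corrected input image used as the matching target in step S206.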
- The luminance values of the input image may also be rounded according to the degree of staining.
- That is, the method of correcting the input image is not limited to the arithmetic processing of equation (6) or equation (7). The correction unit 254 may perform any arithmetic processing, using a combination of addition, subtraction, multiplication, division, or rounding, as long as the luminance values of the input image can be corrected so as to approach the values of the boundary pattern model.
- the boundary detection unit 255 detects the boundary of the myocardium in the corrected input image using the boundary pattern model. Specifically, the boundary detection unit 255 detects the boundary of the myocardium in the corrected input image using the boundary pattern model and the boundary model.
- In other words, the boundary detection unit 255 performs matching between the boundary pattern model and the luminance pattern around the boundary obtained when the boundary model is applied to the corrected input image, while deforming the boundary model in various ways. The boundary detection unit 255 then searches, among the boundary shapes obtained by deforming the boundary model, for the boundary shape whose luminance pattern best matches the boundary pattern model, thereby detecting the myocardial boundary around the left ventricle in the corrected input image, that is, in the input image.
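The search described here, scoring each candidate boundary shape by how well the luminance pattern sampled around it matches the boundary pattern model and keeping the best, can be sketched generically. `sample_pattern` is a hypothetical callable that extracts the luminance pattern around a given boundary; sum of squared differences is one possible matching score.

```python
import numpy as np

def detect_boundary(corrected_image, pattern_model, candidate_boundaries,
                    sample_pattern):
    """Return the candidate boundary whose surrounding luminance pattern
    best matches the boundary pattern model, and its matching score."""
    best, best_score = None, float("inf")
    for boundary in candidate_boundaries:
        pattern = sample_pattern(corrected_image, boundary)
        score = float(((pattern - pattern_model) ** 2).sum())
        if score < best_score:
            best, best_score = boundary, score
    return best, best_score
```

In the embodiment, `candidate_boundaries` would be the shapes produced by deforming the boundary model within the left ventricular coordinate system.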
- The details of the detection process performed by the boundary detection unit 255 are the same as those in the sixth embodiment, except that the corrected input image is used instead of the input image and the boundary pattern model is used instead of the corrected boundary pattern model; a description thereof is therefore omitted.
- FIG. 37 is a flowchart illustrating a processing example of the medical image processing apparatus 40 according to the seventh embodiment.
- the medical image processing apparatus 40 determines whether a request for detecting a myocardial boundary for an input image has been received (step S201).
- the medical image processing apparatus 40 stands by until the detection request is received.
- When the detection request is received, the coordinate system detection unit 251 detects the left ventricular coordinate system of the heart in the input image (step S202), and the part region detection unit 252 detects a part region from the left ventricular coordinate system (step S203).
- The calculation unit 253 calculates the degree of staining of the part region (step S204), and the correction unit 254 corrects the input image using the degree of staining to generate a corrected input image (step S205).
- The boundary detection unit 255 detects the myocardial boundary of the input image from the corrected input image, the boundary pattern model, and the boundary model (step S206), and the process ends.
- As described above, in the seventh embodiment, the input image is corrected according to the degree of staining, and the corrected input image is used as the matching target of the boundary pattern model, thereby detecting the myocardial boundary in the input image.
- As a result, the influence of variations in the degree of contrast-agent staining of the input image can be reduced, and the matching accuracy can be improved.
- The correction unit 254 may specify the part where the “i”-th pixel of the input image is located by using the statistical information on the long axis and the short axis described for the processing of the part region detection unit 152 according to the sixth embodiment. In such a case, the correction unit 254 specifies the heart chambers and heart walls only within a predetermined range, and therefore generates a corrected input image in which only part of the input image is corrected. For this reason, in the seventh embodiment, the correction unit 254 may, for example, enlarge the predetermined range using the detection result of the boundary detection unit 255 and generate the corrected input image again, and the boundary detection unit 255 may detect the boundary again.
- In order to improve the detection accuracy of the myocardial boundary, it is desirable to repeat, a predetermined number of times, the re-specification of the part region by the correction unit 254, the regeneration of the corrected input image, and the boundary re-detection by the boundary detection unit 255.
- A corrected input image in which only part of the input image is corrected may also be generated by performing the part specification only on some of the pixels. In such a case as well, the myocardial boundary detection accuracy can be improved by repeating the re-specification of the part by the correction unit 254, the regeneration of the corrected input image, and the boundary re-detection by the boundary detection unit 255.
- Except that “the correction target is the input image and boundary detection is performed using the corrected input image and the boundary pattern model”, the contents described in the sixth embodiment are applicable to the seventh embodiment.
- As described above, in the sixth and seventh embodiments, correction is performed so that the values of a predetermined part region in the boundary pattern model and in the input image approach each other, and the myocardial boundary is detected using the corrected data.
- In the sixth and seventh embodiments described above, the case where the boundary detection of the myocardium is performed using both the boundary pattern model and the boundary model has been described.
- However, the sixth and seventh embodiments may also perform the boundary detection of the myocardium without using the boundary model.
- As described above, the boundary pattern model is information in which a plurality of luminance value sequences of pixels associated with information on heart parts are arranged. In addition, in the boundary pattern model, spatial order information of the heart is embedded in the arrangement order of the pixels within each luminance value sequence and in the arrangement order of the luminance value sequences. Therefore, in the sixth embodiment, the boundary detection unit 155 can detect the boundary of the myocardium by brute-force matching of the corrected boundary pattern model against the input image while varying, for example, the distance between pixels in the corrected boundary pattern model, under conditions constrained by the spatial information of the heart given by the left ventricular coordinate system and the corrected boundary pattern model.
- Similarly, in the seventh embodiment, the boundary detection unit 255 can detect the boundary of the myocardium by brute-force matching of the boundary pattern model against the corrected input image while varying, for example, the distance between pixels in the boundary pattern model, under conditions constrained by the spatial information of the heart given by the left ventricular coordinate system and the boundary pattern model.
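Matching without a boundary model, as described above, reduces to a brute-force search that slides the luminance-value sequence of the pattern model over the image while also varying the spacing between its pixels. A minimal one-dimensional sketch follows; in practice the profile would be sampled along the left ventricular coordinate system, and the spacing range would be constrained by the heart's spatial information.

```python
import numpy as np

def match_without_boundary_model(image_profile, model_values, spacings):
    """Brute-force match of a pattern-model luminance sequence against a
    1D luminance profile, varying the inter-pixel spacing of the model.

    Returns (best_score, best_spacing, best_offset), where score is the
    sum of squared differences (lower is better)."""
    best = None
    n = len(model_values)
    for s in spacings:
        span = (n - 1) * s  # profile samples covered by the stretched model
        for off in range(len(image_profile) - span):
            samples = image_profile[off: off + span + 1: s]
            score = float(((samples - model_values) ** 2).sum())
            if best is None or score < best[0]:
                best = (score, s, off)
    return best
```

The nested loop is the brute force: every allowed spacing and every allowed offset is scored, and the lowest-error placement gives the detected boundary position.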
- The image processing methods described in the sixth and seventh embodiments above may also be executed in the medical image diagnostic apparatus 100.
- the image processing program executed by the medical image processing apparatus 30 of the sixth embodiment and the medical image processing apparatus 40 of the seventh embodiment is provided by being incorporated in advance in a ROM or the like.
- The image processing program executed by the medical image processing apparatus 30 according to the sixth embodiment or the medical image processing apparatus 40 according to the seventh embodiment may also be recorded and provided on a computer-readable recording medium such as a CD-ROM, a flexible disk (FD), a CD-R, or a DVD (Digital Versatile Disc), as a file in an installable or executable format.
- The image processing program executed by the medical image processing apparatus 30 of the sixth embodiment and the medical image processing apparatus 40 of the seventh embodiment may also be stored on a computer connected to a network such as the Internet and provided by being downloaded via the network.
- The image processing program executed by the medical image processing apparatus 30 of the sixth embodiment and the medical image processing apparatus 40 of the seventh embodiment may also be configured to be provided or distributed via a network such as the Internet.
- The image processing program executed by the medical image processing apparatus 30 of the sixth embodiment and the medical image processing apparatus 40 of the seventh embodiment has a module configuration including the above-described units (the coordinate system detection unit, the part region detection unit, the calculation unit, the correction unit, and the boundary detection unit).
- As actual hardware, the CPU reads the image processing program from the ROM and executes it, whereby the coordinate system detection unit, the part region detection unit, the calculation unit, the correction unit, and the boundary detection unit are loaded onto the main storage device.
- According to at least one of the embodiments described above, the boundary detection accuracy of the myocardium can be improved.
- the medical image processing apparatuses 10, 20, 30, and 40 of each of the above embodiments can be realized by using, for example, a general-purpose computer as basic hardware. That is, the acquisition unit, the coordinate system detection unit, the part region detection unit, the calculation unit, the correction unit, the boundary detection unit, and the display unit can be realized by causing the processor mounted on the computer to execute a program.
- The medical image processing apparatuses 10, 20, 30, and 40 may be realized by installing the above program in a computer in advance, or the above program may be stored on a storage medium such as a CD-ROM, or distributed via a network, and installed on a computer as appropriate.
- The acquisition unit, the coordinate system detection unit, and the boundary detection unit can be realized by appropriately using a memory built into or external to the computer, a hard disk, or a storage medium such as a CD-R, a CD-RW, a DVD-RAM, or a DVD-R.
- The present invention is not limited to the above-described embodiments as they are; at the implementation stage, it can be embodied by modifying the constituent elements without departing from the scope of the invention.
- Various inventions can be formed by appropriately combining the plurality of constituent elements disclosed in the embodiments. For example, some constituent elements may be deleted from all the constituent elements shown in an embodiment.
- Constituent elements across different embodiments may also be combined as appropriate.
Claims (29)
- 1. A medical image processing apparatus comprising: an acquisition unit that acquires volume data of a heart; a coordinate system detection unit that detects, from the volume data, a three-dimensional left ventricular coordinate system consisting of three axes including at least the left ventricular long axis of the heart; a boundary detection unit that detects a left ventricular boundary from the volume data by deforming a boundary model expressed in the left ventricular coordinate system so as to reduce the error between a boundary pattern obtained by fitting the boundary model to the volume data and a predetermined boundary pattern model; and a display unit that displays the detected left ventricular boundary on a cross-sectional image orthogonal to at least one of the three axes of the left ventricular coordinate system, together with the cross-sectional image.
- 2. The medical image processing apparatus according to claim 1, wherein the coordinate system detection unit detects from the volume data, as an axis of the left ventricular coordinate system in addition to the left ventricular long axis, any one axis that defines a four-chamber cross-sectional image, a three-chamber cross-sectional image, or a two-chamber cross-sectional image of the heart.
- 3. The medical image processing apparatus according to claim 1, wherein the coordinate system detection unit detects from the volume data the left ventricular coordinate system in which the directions of the three axes are mutually orthogonal and whose origin is the position of the left ventricular center of the heart.
- 4. The medical image processing apparatus according to claim 1, wherein the coordinate system detection unit detects the left ventricular coordinate system by matching the volume data against a previously created left ventricular cross-sectional image pattern.
- 5. The medical image processing apparatus according to claim 1, wherein the coordinate system detection unit matches the volume data against a previously learned image pattern around the apex position and a previously learned image pattern around the mitral valve position to obtain the apex position and the mitral valve position of the heart, and detects the left ventricular long axis from the apex position and the mitral valve position.
- 6. The medical image processing apparatus according to claim 5, wherein, after detecting the left ventricular long axis, the coordinate system detection unit detects the left ventricular coordinate system using any one of the position of the right ventricular corner point of the heart, the position of the tricuspid valve, and the position of the left ventricular outflow tract.
- 7. The medical image processing apparatus according to claim 1, wherein, when the left ventricular coordinate system cannot be detected, the coordinate system detection unit notifies the user via the display unit that detection is not possible.
- 8. The medical image processing apparatus according to claim 1, wherein, when the coordinate system detection unit cannot detect the left ventricular coordinate system, the display unit displays the volume data based on the coordinate system of the volume data.
- 9. The medical image processing apparatus according to claim 1, further comprising an evaluation unit that evaluates an error between a predetermined boundary pattern model and a boundary pattern relating to the detected left ventricular boundary, wherein the display unit displays the evaluation result of the error evaluated by the evaluation unit.
- 10. The medical image processing apparatus according to claim 9, wherein the evaluation unit evaluates whether the error is lower than a threshold.
- 11. The medical image processing apparatus according to claim 10, wherein, when the evaluation unit finds the error lower than the threshold, the display unit displays the left ventricular boundary on the cross-sectional image together with the cross-sectional image.
- 12. The medical image processing apparatus according to claim 10, wherein, when the evaluation unit finds the error higher than the threshold, the display unit displays an indication that the left ventricular boundary cannot be detected.
- 13. The medical image processing apparatus according to claim 10, wherein, when the evaluation unit finds the error higher than the threshold, the display unit displays the volume data without the left ventricular boundary.
- 14. The medical image processing apparatus according to claim 10, wherein, when the evaluation unit finds the error higher than the threshold, the coordinate system detection unit detects a left ventricular coordinate system again under predetermined different conditions.
- 15. The medical image processing apparatus according to claim 10, wherein, when the evaluation unit finds the error higher than the threshold, the user inputs a left ventricular coordinate system to the coordinate system detection unit.
- 16. The medical image processing apparatus according to claim 1, wherein the boundary model is a single boundary model in which the inner myocardial boundary and the outer myocardial boundary of the left ventricle are combined.
- 17. The medical image processing apparatus according to claim 1, wherein the boundary model is a boundary model of the inner side of the myocardium of the left ventricle or a boundary model of the outer side of the myocardium.
- 18. The medical image processing apparatus according to claim 1, wherein the volume data is volume data imaged by a CT apparatus or an MR apparatus.
- 19. The medical image processing apparatus according to claim 1, wherein the boundary pattern model is obtained by modeling, through learning, patterns of luminance values of the myocardium and around the myocardial boundary in images stained by a contrast agent, the apparatus further comprising: a calculation unit that calculates, based on the left ventricular coordinate system, a degree of staining indicating the contrast agent concentration of a predetermined part region in the volume data; and a correction unit that performs, using the degree of staining of the part region, a correction that brings the luminance values of the volume data and the luminance values of the boundary pattern model closer to each other in the part region, wherein the boundary detection unit detects the boundary of the myocardium in the volume data using the data corrected by the correction unit.
- 20. The medical image processing apparatus according to claim 19, further comprising a part region detection unit that detects the part region in the volume data using the left ventricular coordinate system, wherein the calculation unit performs the staining degree calculation process using the part region detected by the part region detection unit.
- 21. The medical image processing apparatus according to claim 19, wherein the correction unit performs the correction process using the degree of staining on the boundary pattern model to generate a corrected boundary pattern model, and the boundary detection unit detects the boundary of the myocardium in the volume data using the corrected boundary pattern model.
- 22. The medical image processing apparatus according to claim 19, wherein the correction unit performs the correction process using the degree of staining on the volume data to generate corrected volume data, and the boundary detection unit detects the boundary of the myocardium in the corrected volume data using the boundary pattern model.
- 23. The medical image processing apparatus according to claim 19, wherein the correction unit performs the correction process using the degree of staining by a combination of addition processing, subtraction processing, multiplication processing, division processing, or rounding processing.
- 24. The medical image processing apparatus according to claim 19, wherein the calculation unit calculates a statistical representative value of the sequence of luminance values of the plurality of pixels constituting the part region as the degree of staining of the part region.
- 25. The medical image processing apparatus according to claim 20, wherein the part region detection unit detects a plurality of part regions as the predetermined part region, and the calculation unit calculates the degree of staining of each of the plurality of part regions.
- 26. The medical image processing apparatus according to claim 20, wherein the part region detection unit detects, as the predetermined part region, a region including at least one of a ventricle, an atrium, a left ventricular outflow tract, a valve annulus, a papillary muscle, a myocardium, and a coronary artery.
- 27. The medical image processing apparatus according to claim 20, wherein the part region detection unit detects, as the part region, a predetermined range in the volume data determined by the long axis included in the left ventricular coordinate system.
- 28. The medical image processing apparatus according to claim 20, wherein the coordinate system detection unit further detects a short axis of the heart in the volume data as part of the left ventricular coordinate system, and the part region detection unit detects, as the part region, a predetermined range in the volume data determined by the direction of the long axis and the direction of the short axis.
- 29. A medical image processing method comprising: acquiring volume data of a heart; detecting, from the volume data, a three-dimensional left ventricular coordinate system consisting of three axes including at least the left ventricular long axis of the heart; detecting a left ventricular boundary from the volume data by deforming a boundary model expressed in the left ventricular coordinate system so as to reduce the error between a boundary pattern obtained by fitting the boundary model to the volume data and a predetermined boundary pattern model; and displaying the detected left ventricular boundary on a cross-sectional image orthogonal to at least one of the three axes of the left ventricular coordinate system, together with the cross-sectional image.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201280001301.5A CN102883662B (zh) | 2011-05-11 | 2012-05-11 | Medical image processing apparatus and method thereof
US13/636,442 US9153033B2 (en) | 2011-05-11 | 2012-05-11 | Medical image processing apparatus and method thereof |
JP2012524015A JP5422742B2 (ja) | 2011-05-11 | 2012-05-11 | Medical image processing apparatus and method thereof
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011106223 | 2011-05-11 | ||
JP2011-106223 | 2011-05-11 | ||
JP2012001524 | 2012-01-06 | ||
JP2012-001524 | 2012-01-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012153539A1 true WO2012153539A1 (ja) | 2012-11-15 |
Family
ID=47139025
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/003093 WO2012153539A1 (ja) | 2011-05-11 | 2012-05-11 | 医用画像処理装置とその方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9153033B2 (ja) |
JP (1) | JP5422742B2 (ja) |
CN (1) | CN102883662B (ja) |
WO (1) | WO2012153539A1 (ja) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- WO2014112338A1 (ja) * | 2013-01-16 | 2014-07-24 | Fujifilm Corporation | Medical image processing apparatus, method, and program |
- JP2014151114A (ja) * | 2013-02-13 | 2014-08-25 | Toshiba Corp | Medical image diagnostic apparatus, medical image processing apparatus, and medical image processing method |
- WO2016009957A1 (ja) * | 2014-07-15 | 2016-01-21 | Fujifilm RI Pharma Co., Ltd. | Computer program, image processing apparatus and method |
- JP2016202920A (ja) * | 2015-04-24 | 2016-12-08 | Pie Medical Imaging B.V. | Flow analysis of 4D MR image data |
- JP2017148438A (ja) * | 2016-02-26 | 2017-08-31 | Toshiba Medical Systems Corporation | Medical image processing apparatus, ultrasonic diagnostic apparatus, and medical image processing program |
- JP2017176381A (ja) * | 2016-03-29 | 2017-10-05 | Ziosoft Inc. | Medical image processing apparatus, medical image processing method, and medical image processing program |
- JP2017205142A (ja) * | 2016-05-16 | 2017-11-24 | Konica Minolta Inc. | Dynamic analysis system |
- US10674994B2 (en) | 2015-12-02 | 2020-06-09 | Hitachi, Ltd. | Ultrasonic imaging device and image processing method |
- WO2021141135A1 (ja) * | 2020-01-09 | 2021-07-15 | Tokyo Women's Medical University | Functional ischemia detection technique using coronary CT 4D flow imaging |
US12051192B2 (en) | 2018-01-24 | 2024-07-30 | Pie Medical Imaging B.V. | Flow analysis in 4D MR image data |
Families Citing this family (46)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA3035048C (en) | 2010-12-23 | 2021-05-04 | Mark Deem | System for mitral valve repair and replacement |
EP3964176A1 (en) | 2011-06-21 | 2022-03-09 | Twelve, Inc. | Prosthetic heart valve devices |
US11202704B2 (en) | 2011-10-19 | 2021-12-21 | Twelve, Inc. | Prosthetic heart valve devices, prosthetic mitral valves and associated systems and methods |
US9655722B2 (en) | 2011-10-19 | 2017-05-23 | Twelve, Inc. | Prosthetic heart valve devices, prosthetic mitral valves and associated systems and methods |
AU2012325809B2 (en) | 2011-10-19 | 2016-01-21 | Twelve, Inc. | Devices, systems and methods for heart valve replacement |
US9763780B2 (en) | 2011-10-19 | 2017-09-19 | Twelve, Inc. | Devices, systems and methods for heart valve replacement |
US10016271B2 (en) | 2011-10-19 | 2018-07-10 | Twelve, Inc. | Prosthetic heart valve devices, prosthetic mitral valves and associated systems and methods |
US9039757B2 (en) | 2011-10-19 | 2015-05-26 | Twelve, Inc. | Prosthetic heart valve devices, prosthetic mitral valves and associated systems and methods |
US9579198B2 (en) | 2012-03-01 | 2017-02-28 | Twelve, Inc. | Hydraulic delivery systems for prosthetic heart valve devices and associated methods |
- WO2014084398A1 (ja) | 2012-11-30 | 2014-06-05 | Toshiba Corporation | Medical image diagnostic apparatus |
US10111747B2 (en) | 2013-05-20 | 2018-10-30 | Twelve, Inc. | Implantable heart valve devices, mitral valve repair devices and associated systems and methods |
- CN104240226B (zh) * | 2013-06-20 | 2017-12-22 | Shanghai United Imaging Healthcare Co., Ltd. | Heart image registration method |
US9750475B2 (en) * | 2013-11-05 | 2017-09-05 | Shimadzu Corporation | Contour image generating device and nuclear medicine diagnosis apparatus |
FR3015680B1 (fr) * | 2013-12-19 | 2016-01-15 | Snecma | Procede de caracterisation d'une piece |
- JP5990834B2 (ja) | 2014-03-28 | 2016-09-14 | Hitachi, Ltd. | Diagnostic image generating apparatus and diagnostic image generating method |
- KR101616029B1 (ko) * | 2014-07-25 | 2016-04-27 | Samsung Electronics Co., Ltd. | Magnetic resonance image processing method and apparatus for executing the method |
- KR102329113B1 (ko) * | 2014-10-13 | 2021-11-19 | Samsung Electronics Co., Ltd. | Ultrasound imaging apparatus and control method thereof |
- JP6411183B2 (ja) * | 2014-11-13 | 2018-10-24 | Canon Medical Systems Corporation | Medical image diagnostic apparatus, image processing apparatus, and image processing program |
- JP6513391B2 (ja) * | 2014-12-24 | 2019-05-15 | Canon Medical Systems Corporation | Medical image processing apparatus, image data display method in a medical image processing apparatus, and X-ray CT apparatus |
US9558561B2 (en) * | 2015-01-06 | 2017-01-31 | Varian Medical Systems International Ag | Semiautomatic drawing tool for image segmentation |
CN107920895B (zh) | 2015-08-21 | 2020-06-26 | 托尔福公司 | 可植入心脏瓣膜装置、二尖瓣修复装置以及相关系统和方法 |
US9799102B2 (en) * | 2015-12-02 | 2017-10-24 | Adobe Systems Incorporated | Smoothing images using machine learning |
WO2017109662A1 (en) * | 2015-12-22 | 2017-06-29 | Koninklijke Philips N.V. | Heart model guided coronary artery segmentation |
EP3448316B1 (en) | 2016-04-29 | 2023-03-29 | Medtronic Vascular Inc. | Prosthetic heart valve devices with tethered anchors |
- WO2017193251A1 (zh) * | 2016-05-09 | 2017-11-16 | Shenzhen Mindray Bio-Medical Electronics Co., Ltd. | Method and system for identifying the contour of a region of interest in an ultrasound image |
- JP7080590B2 (ja) * | 2016-07-19 | 2022-06-06 | Canon Medical Systems Corporation | Medical processing apparatus, ultrasonic diagnostic apparatus, and medical processing program |
US11517277B2 (en) * | 2016-12-15 | 2022-12-06 | Koninklijke Philips N.V. | Visualizing collimation errors |
- CN109069114B (zh) | 2017-02-16 | 2021-10-22 | Shenzhen Mindray Bio-Medical Electronics Co., Ltd. | Ultrasonic medical detection device, imaging method, imaging system, and display terminal |
US10433961B2 (en) | 2017-04-18 | 2019-10-08 | Twelve, Inc. | Delivery systems with tethers for prosthetic heart valve devices and associated methods |
US10702378B2 (en) | 2017-04-18 | 2020-07-07 | Twelve, Inc. | Prosthetic heart valve device and associated systems and methods |
US10575950B2 (en) | 2017-04-18 | 2020-03-03 | Twelve, Inc. | Hydraulic systems for delivering prosthetic heart valve devices and associated methods |
US10032281B1 (en) * | 2017-05-03 | 2018-07-24 | Siemens Healthcare Gmbh | Multi-scale deep reinforcement machine learning for N-dimensional segmentation in medical imaging |
US10792151B2 (en) | 2017-05-11 | 2020-10-06 | Twelve, Inc. | Delivery systems for delivering prosthetic heart valve devices and associated methods |
US10646338B2 (en) | 2017-06-02 | 2020-05-12 | Twelve, Inc. | Delivery systems with telescoping capsules for deploying prosthetic heart valve devices and associated methods |
US10709591B2 (en) | 2017-06-06 | 2020-07-14 | Twelve, Inc. | Crimping device and method for loading stents and prosthetic heart valves |
EP3412214A1 (en) * | 2017-06-08 | 2018-12-12 | Koninklijke Philips N.V. | Ultrasound imaging method |
US10786352B2 (en) | 2017-07-06 | 2020-09-29 | Twelve, Inc. | Prosthetic heart valve devices and associated systems and methods |
US10729541B2 (en) | 2017-07-06 | 2020-08-04 | Twelve, Inc. | Prosthetic heart valve devices and associated systems and methods |
US11373404B2 (en) * | 2018-05-18 | 2022-06-28 | Stats Llc | Machine learning for recognizing and interpreting embedded information card content |
US10950016B2 (en) | 2018-06-11 | 2021-03-16 | Shanghai United Imaging Healthcare Co., Ltd. | Systems and methods for reconstructing cardiac images |
- CN108898582B (zh) * | 2018-06-11 | 2021-08-17 | Shanghai United Imaging Healthcare Co., Ltd. | Cardiac image reconstruction method, apparatus, and computer device |
- JP2020141748A (ja) * | 2019-03-04 | 2020-09-10 | FUJIFILM Corporation | Mobile radiography management apparatus, operation method and operation program therefor, data structure, and recording apparatus |
- JP7328156B2 (ja) * | 2020-01-22 | 2023-08-16 | Canon Medical Systems Corporation | Ultrasonic diagnostic apparatus, medical image processing apparatus, and medical image processing program |
- JP7538705B2 (ja) * | 2020-12-08 | 2024-08-22 | FUJIFILM Healthcare Corporation | Ultrasonic diagnostic system and operation support method |
- CN113096238B (zh) * | 2021-04-02 | 2022-05-17 | Hangzhou Lancet Robotics Co., Ltd. | X-ray image simulation method, apparatus, electronic device, and storage medium |
- CN114419032B (zh) * | 2022-03-14 | 2022-06-21 | Shenzhen Keya Medical Technology Co., Ltd. | Method and apparatus for segmenting the endocardium and/or epicardium of the left ventricle of the heart |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2003503136A (ja) * | 1999-04-21 | 2003-01-28 | Auckland UniServices Limited | Method and system for measuring characteristics of an organ |
- JP2006198410A (ja) * | 2005-01-21 | 2006-08-03 | Siemens Aktiengesellschaft | Method for automatically determining the position and orientation of the left ventricle in a 3D image data set of the heart |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5781195A (en) * | 1996-04-16 | 1998-07-14 | Microsoft Corporation | Method and system for rendering two-dimensional views of a three-dimensional surface |
US6106466A (en) * | 1997-04-24 | 2000-08-22 | University Of Washington | Automated delineation of heart contours from images using reconstruction-based modeling |
- JP4373682B2 (ja) * | 2003-01-31 | 2009-11-25 | RIKEN | Tissue region-of-interest extraction method, tissue region-of-interest extraction program, and image processing apparatus |
US20070014452A1 (en) * | 2003-12-01 | 2007-01-18 | Mitta Suresh | Method and system for image processing and assessment of a state of a heart |
US8265363B2 (en) | 2009-02-04 | 2012-09-11 | General Electric Company | Method and apparatus for automatically identifying image views in a 3D dataset |
2012
- 2012-05-11 US US13/636,442 patent/US9153033B2/en active Active
- 2012-05-11 JP JP2012524015A patent/JP5422742B2/ja active Active
- 2012-05-11 CN CN201280001301.5A patent/CN102883662B/zh active Active
- 2012-05-11 WO PCT/JP2012/003093 patent/WO2012153539A1/ja active Application Filing
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10019804B2 (en) | 2013-01-16 | 2018-07-10 | Fujifilm Corporation | Medical image processing apparatus, method, and program |
JP2014135990A (ja) * | 2013-01-16 | 2014-07-28 | Fujifilm Corporation | Medical image processing apparatus, method, and program |
WO2014112338A1 (ja) * | 2013-01-16 | 2014-07-24 | Fujifilm Corporation | Medical image processing apparatus, method, and program |
JP2014151114A (ja) * | 2013-02-13 | 2014-08-25 | Toshiba Corporation | Medical image diagnostic apparatus, medical image processing apparatus, and medical image processing method |
WO2016009957A1 (ja) * | 2014-07-15 | 2016-01-21 | Fujifilm RI Pharma Co., Ltd. | Computer program, image processing apparatus, and method |
JPWO2016009957A1 (ja) * | 2014-07-15 | 2017-04-27 | Fujifilm RI Pharma Co., Ltd. | Computer program, image processing apparatus, and method |
US10102623B2 (en) | 2014-07-15 | 2018-10-16 | Fujifilm Ri Pharma Co., Ltd. | Computer program, and image processing device and method |
JP2016202920A (ja) * | 2015-04-24 | 2016-12-08 | Pie Medical Imaging B.V. | Flow analysis of 4D MR image data |
US10674994B2 (en) | 2015-12-02 | 2020-06-09 | Hitachi, Ltd. | Ultrasonic imaging device and image processing method |
JP2017148438A (ja) * | 2016-02-26 | 2017-08-31 | Toshiba Medical Systems Corporation | Medical image processing apparatus, ultrasound diagnostic apparatus, and medical image processing program |
US10318841B2 (en) | 2016-02-26 | 2019-06-11 | Toshiba Medical Systems Corporation | Medical-image processing apparatus, ultrasonic diagnostic apparatus, and medical-image processing method |
JP2017176381A (ja) * | 2016-03-29 | 2017-10-05 | Ziosoft, Inc. | Medical image processing apparatus, medical image processing method, and medical image processing program |
JP2017205142A (ja) * | 2016-05-16 | 2017-11-24 | Konica Minolta, Inc. | Dynamic analysis system |
US12051192B2 (en) | 2018-01-24 | 2024-07-30 | Pie Medical Imaging B.V. | Flow analysis in 4D MR image data |
WO2021141135A1 (ja) * | 2020-01-09 | 2021-07-15 | Tokyo Women's Medical University | Functional ischemia detection technique using coronary CT 4D flow imaging |
Also Published As
Publication number | Publication date |
---|---|
CN102883662B (zh) | 2015-04-08 |
US9153033B2 (en) | 2015-10-06 |
JP5422742B2 (ja) | 2014-02-19 |
CN102883662A (zh) | 2013-01-16 |
US20140219524A1 (en) | 2014-08-07 |
JPWO2012153539A1 (ja) | 2014-07-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5422742B2 (ja) | Medical image processing apparatus and method | |
US8014578B2 (en) | Method and system for image segmentation using models | |
EP3652747B1 (en) | Methods and systems for guidance in cardiac resynchronization therapy | |
RU2595757C2 (ru) | Устройство совмещения изображений | |
US9384546B2 (en) | Method and system for pericardium based model fusion of pre-operative and intra-operative image data for cardiac interventions | |
Banerjee et al. | A completely automated pipeline for 3D reconstruction of human heart from 2D cine magnetic resonance slices | |
EP3370615B1 (en) | Collateral flow modelling for non-invasive fractional flow reserve (ffr) | |
JP2019504659A (ja) | 自動化された心臓ボリュームセグメンテーション | |
US10019804B2 (en) | Medical image processing apparatus, method, and program | |
WO2015168792A9 (en) | Method and system for analysis of myocardial wall dynamics | |
JP2013022463A (ja) | Moving object contour extraction apparatus, left ventricle image separation apparatus, moving object contour extraction method, and left ventricle image separation method |
US10628963B2 (en) | Automatic detection of an artifact in patient image | |
JP5498989B2 (ja) | Image processing apparatus, image processing method, and image processing program |
US9224188B2 (en) | Image processing device, method and program | |
EP2220618B1 (en) | Apparatus for determining a parameter of a moving object | |
EP3244798B1 (en) | Adaptive segmentation for rotational c-arm computed tomography with a reduced angular range | |
Yang et al. | Automatic left ventricle segmentation based on multiatlas registration in 4D CT images | |
JP2017148438A (ja) | Medical image processing apparatus, ultrasound diagnostic apparatus, and medical image processing program |
Medina et al. | Left ventricle myocardium segmentation in multi-slice computerized tomography | |
EP3667618A1 (en) | Deep partial-angle coronary restoration | |
EP4231234A1 (en) | Deep learning for registering anatomical to functional images | |
Krishnaswamy et al. | Validation of a diffeomorphic registration algorithm using true deformation computed from thin plate spline interpolation | |
Colvert et al. | Novel measurement of LV twist using 4DCT: quantifying accuracy as a function of image noise | |
US20240202919A1 (en) | Medical image processing apparatus, method, and storage medium | |
KR20230021544A (ko) | Non-rigid cardiovascular registration method using hierarchical deformation in CTA images, and recording medium and apparatus for performing the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201280001301.5 Country of ref document: CN |
|
ENP | Entry into the national phase |
Ref document number: 2012524015 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13636442 Country of ref document: US |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12781717 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12781717 Country of ref document: EP Kind code of ref document: A1 |