WO2007023723A1 - Image processing method, image processing program, and image processing device - Google Patents
Image processing method, image processing program, and image processing device
- Publication number
- WO2007023723A1 PCT/JP2006/316147
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- dimensional
- normalized
- fusion
- image processing
- Prior art date
Links
- 238000003672 processing method Methods 0.000 title claims abstract description 24
- 230000004927 fusion Effects 0.000 claims abstract description 93
- 238000010606 normalization Methods 0.000 claims abstract description 29
- 238000000034 method Methods 0.000 claims description 88
- 238000006243 chemical reaction Methods 0.000 claims description 37
- 230000008569 process Effects 0.000 description 27
- 230000009466 transformation Effects 0.000 description 17
- 210000004027 cell Anatomy 0.000 description 13
- 239000011159 matrix material Substances 0.000 description 13
- 238000010586 diagram Methods 0.000 description 12
- 230000000052 comparative effect Effects 0.000 description 8
- 230000008859 change Effects 0.000 description 7
- 238000002603 single-photon emission computed tomography Methods 0.000 description 7
- 230000006870 function Effects 0.000 description 6
- 238000002595 magnetic resonance imaging Methods 0.000 description 6
- 238000002059 diagnostic imaging Methods 0.000 description 5
- 230000003902 lesion Effects 0.000 description 5
- 230000015654 memory Effects 0.000 description 5
- 230000000877 morphologic effect Effects 0.000 description 5
- 238000003325 tomography Methods 0.000 description 5
- 238000004891 communication Methods 0.000 description 3
- 238000003384 imaging method Methods 0.000 description 3
- 238000005457 optimization Methods 0.000 description 3
- 238000002945 steepest descent method Methods 0.000 description 3
- ZCXUVYAZINUVJD-AHXZWLDOSA-N 2-deoxy-2-((18)F)fluoro-alpha-D-glucose Chemical compound 0.000 description 2
- 238000004364 calculation method Methods 0.000 description 2
- 238000003745 diagnosis Methods 0.000 description 2
- 238000009206 nuclear medicine Methods 0.000 description 2
- 230000003936 working memory Effects 0.000 description 2
- 238000005481 NMR spectroscopy Methods 0.000 description 1
- 230000008901 benefit Effects 0.000 description 1
- 230000017531 blood circulation Effects 0.000 description 1
- 210000004556 brain Anatomy 0.000 description 1
- 230000009194 climbing Effects 0.000 description 1
- 238000002939 conjugate gradient method Methods 0.000 description 1
- 230000001054 cortical effect Effects 0.000 description 1
- 238000001514 detection method Methods 0.000 description 1
- 239000003814 drug Substances 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 238000011156 evaluation Methods 0.000 description 1
- 238000002599 functional magnetic resonance imaging Methods 0.000 description 1
- 230000008338 local blood flow Effects 0.000 description 1
- 238000002600 positron emission tomography Methods 0.000 description 1
- 229940121896 radiopharmaceutical Drugs 0.000 description 1
- 239000012217 radiopharmaceutical Substances 0.000 description 1
- 230000002799 radiopharmaceutical effect Effects 0.000 description 1
- 230000011218 segmentation Effects 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 210000004872 soft tissue Anatomy 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4053—Scaling of whole images or parts thereof, e.g. expanding or contracting based on super-resolution, i.e. the output image resolution being higher than the sensor resolution
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/05—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
- A61B5/055—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/35—Determination of transform parameters for the alignment of images, i.e. image registration using statistical methods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/02—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
- A61B6/03—Computed tomography [CT]
- A61B6/037—Emission tomography
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30016—Brain
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30061—Lung
Definitions
- Image processing method, image processing program, and image processing apparatus
- the present invention relates to an image processing method, an image processing program, and an image processing apparatus for generating a fusion image by superposing a pair of three-dimensional tomographic images.
- Image diagnosis uses images such as single-photon emission computed tomography (hereinafter "SPECT"), positron emission tomography (hereinafter "PET"), magnetic resonance imaging (hereinafter "MRI"), and X-ray computed tomography (hereinafter "CT"). These images make it possible to obtain, nondestructively, information on lesions existing in the body of the subject. Diagnostic imaging is therefore indispensable in current diagnostic medicine.
- SPECT: single-photon emission computed tomography
- PET: positron emission tomography
- MRI: magnetic resonance imaging
- CT: X-ray computed tomography
- fMRI: functional magnetic resonance imaging
- Such functional images are images of functional changes in a living body caused by a lesion. Functional images therefore have the advantage of high specificity in lesion detection. On the other hand, functional images have the drawback of lacking anatomical position information about the lesion.
- Fusion images are used to compensate for these drawbacks of functional images.
- a fusion image is an image obtained by superimposing a functional image and a morphological image. According to this fusion image, the anatomical position of the lesion site detected in the functional image can be confirmed on the morphological image. Therefore, fusion images are useful for definitive diagnosis and treatment policy decisions.
- Fusion images can be created not only from images derived from different modalities, that is, images acquired by different devices, but also from images derived from the same modality. For example, a fusion image based on multiple nuclear medicine images obtained by performing the same examination multiple times makes it possible to observe the change in value at the same site, or different blood flow information or receptor distributions for the same site.
- Known alignment methods include the AMIR method (Automatic Multimodality Image Registration method; see Non-Patent Document 1), the AC-PC line alignment method (see Non-Patent Document 2), and the mutual information maximization method (see Non-Patent Document 3).
- Non-Patent Document 1: Babak A. Ardekani et al., "A Fully Automatic Multimodality Image Registration Algorithm", Journal of Computer Assisted Tomography (USA), 1995, 19, 4, p. 615-623
- Non-Patent Document 2: "Dr.View/LINUX User's Manual (Third Edition)", Asahi Kasei Information Systems Co., Ltd., p. 466-470
- Non-Patent Document 3: F. Maes et al., "Multimodality Image Registration by Maximization of Mutual Information", IEEE Transactions on Medical Imaging (USA), 1997, 16, 2, p. 187-198
- fusion images are very useful in the field of diagnostic imaging, and many fusion image creation methods have been developed and put to practical use.
- The AMIR method creates a fusion image by dividing a contour-extracted image into segments and finding the condition under which an evaluation function takes its minimum value. This method is effective for images that can be divided into segments, but it is not suitable for images with unclear edges that are difficult to segment, such as images of soft tissue.
- the AC-PC line alignment method is a method of creating a fusion image by superimposing AC-PC lines determined in the mid-sagittal plane. According to this method, a fusion image can be easily created as long as the AC-PC line in each image to be superimposed is determined.
- However, this method presupposes that the mid-sagittal plane can be determined, the AC-PC line must be determined manually, and the AC-PC line determination operation itself is complicated. Moreover, this method cannot be applied to images other than the head.
- the mutual information maximization method is a method of performing alignment using the information amount of each image. In other words, this method does not require operations such as segmentation and AC-PC line determination. Therefore, the mutual information maximization method is one of the most useful alignment methods at present.
- an object of the present invention is to provide an image processing method, an image processing program, and an image processing apparatus for creating a fusion image automatically and with high overlay accuracy.
- The inventor of the present application has obtained knowledge enabling a fusion image to be created with high accuracy. That is, the inventor found that a fusion image can be created with high accuracy by obtaining the corresponding positions of a pair of three-dimensional images after equalizing both the voxel size and the number of voxels of the pair of three-dimensional images. Conventionally, a pair of three-dimensional images differing in voxel size and number of voxels were input as they were to the arithmetic processing for deriving their corresponding positions.
- An image processing method based on this knowledge includes: (a) a voxel normalization step of generating a first normalized three-dimensional image corresponding to the first three-dimensional image and a second normalized three-dimensional image corresponding to the second three-dimensional image, by equalizing the voxel size and the number of voxels in the effective field of view of a first three-dimensional image based on a plurality of first tomographic images obtained from an arbitrary part of a subject and of a second three-dimensional image based on a plurality of second tomographic images obtained from the same part; and (b) a fusion image generation step of generating a fusion image using the first normalized three-dimensional image and the second normalized three-dimensional image.
- The image processing method of the present invention may further comprise a voxel shape conversion step of generating the first three-dimensional image and the second three-dimensional image by converting the voxels of the first three-dimensional original image composed of the plurality of first tomographic images and of the second three-dimensional original image composed of the plurality of second tomographic images into cubic voxels.
- An image processing program of the present invention causes a computer to execute the above-described voxel normalization step and fusion image generation step.
- The image processing program of the present invention may further cause the computer to execute the above-described voxel shape conversion step.
- An image processing apparatus of the present invention includes: (a) voxel normalization means for generating a first normalized three-dimensional image corresponding to the first three-dimensional image and a second normalized three-dimensional image corresponding to the second three-dimensional image, by equalizing the voxel size and the number of voxels in the effective field of view of a first three-dimensional image based on a plurality of first tomographic images obtained from an arbitrary part of a subject and of a second three-dimensional image based on a plurality of second tomographic images obtained from the same part; and (b) fusion image generating means for generating a fusion image using the first normalized three-dimensional image and the second normalized three-dimensional image.
- The image processing apparatus of the present invention may further include voxel shape conversion means for generating the first three-dimensional image and the second three-dimensional image by converting the voxels of the first three-dimensional original image composed of the plurality of first tomographic images and of the second three-dimensional original image composed of the plurality of second tomographic images into cubic voxels.
- The first normalized three-dimensional image and the second normalized three-dimensional image are preferably generated by a linear interpolation method.
- The first three-dimensional image and the second three-dimensional image are preferably also generated by a linear interpolation method.
- The fusion image is preferably generated by the mutual information maximization method.
- According to the present invention, an image processing method capable of automatically creating a fusion image with high overlay accuracy is provided.
- FIG. 1 is a flowchart of an image processing method according to an embodiment of the present invention.
- FIG. 2 is a flowchart showing an example of processing in the voxel shape conversion step shown in FIG. 1.
- FIG. 3 is a flowchart showing an example of processing in the voxel normalization step shown in FIG. 1.
- FIG. 4 is a flowchart showing an example of processing in the fusion image generation step shown in FIG.
- FIG. 5 is a diagram showing a configuration of an image processing program according to an embodiment of the present invention together with a recording medium.
- FIG. 6 is a diagram showing a hardware configuration of a computer for executing a program stored in a recording medium.
- FIG. 7 is a perspective view of a computer for executing a program stored in a recording medium.
- FIG. 8 is a diagram showing a configuration of an image processing apparatus according to the embodiment of the present invention.
- FIG. 9 is a diagram showing an example of a head SPECT image.
- FIG. 10 is a diagram showing an example of a head CT image in the same subject as in FIG. 9.
- FIG. 11 is a diagram showing a fusion image generated only by the mutual information maximization method using the images shown in FIGS. 9 and 10.
- FIG. 12 is a diagram showing a fusion image generated by the image processing method according to the present invention using the images shown in FIGS. 9 and 10.
- FIG. 13 is a diagram showing an example of a chest SPECT image.
- FIG. 14 is a diagram showing an example of a chest MRI image of the same subject as in FIG. 13.
- FIG. 16 is a view showing a fusion image generated by the image processing method according to the present invention using the images shown in FIGS. 13 and 14.
- 10 ... image processing program, 11 ... main module, 12 ... three-dimensional original image acquisition module, 14 ... voxel shape conversion module, 16 ... voxel normalization module, 18 ... fusion image generation module, 20 ... output module, 30 ... image processing apparatus, 32 ... three-dimensional original image acquisition unit, 34 ... voxel shape conversion unit, 36 ... voxel normalization unit, 38 ... fusion image generation unit, 40 ... output unit, 100 ... recording medium, 110 ... computer, 112 ... reading device, 114 ... working memory, 116 ... memory, 118 ... display device, 120 ... mouse, 122 ... keyboard, 124 ... communication device, 126 ... CPU
- FIG. 1 is a flowchart of an image processing method according to an embodiment of the present invention.
- the image processing method shown in FIG. 1 can be executed, for example, by giving a command of each step described below to the computer.
- First, a first three-dimensional original image and a second three-dimensional original image for creating a fusion image are acquired (step S01).
- The first three-dimensional original image is composed of first tomographic images of a plurality of cross sections obtained from an arbitrary part of the subject.
- The second three-dimensional original image is composed of second tomographic images of a plurality of cross sections obtained from the same part.
- The first tomographic images and the second tomographic images are images acquired by different modalities.
- For example, the first tomographic image is a functional image such as a SPECT image or a PET image,
- and the second tomographic image is a morphological image such as an MRI image or a CT image.
- In the following, the case where a CT image is used as the morphological image and a SPECT image as the functional image is described as an example.
- The first tomographic images and the second tomographic images may also be acquired by the same modality,
- as long as the first tomographic images and the second tomographic images are taken of the same part.
- Examples include PET images or SPECT images differing in radiopharmaceutical or imaging date and time, and MRI images acquired under different imaging conditions.
- The plurality of first tomographic images and the plurality of second tomographic images are acquired from a plurality of cross sections substantially perpendicular to the body axis and continuous in the body axis direction.
- Each of these images can be acquired by a known method.
- the lateral direction is defined as the X-axis direction
- the depth direction is defined as the y-axis direction
- the body axis direction is defined as the z-axis direction.
- the image data of each of the first three-dimensional original image and the second three-dimensional original image need only be stored in a data format that can be read by a computer.
- For example, data in DICOM format can be used.
- These image data are provided in a form stored in a computer-readable storage medium such as a compact disk.
- By loading a storage medium storing the image data into a data reading device provided in the computer, the image data is read into the computer, and the following image processing using these images can be performed on the computer. Alternatively, the data may be obtained directly via a network as a computer data signal superimposed on a carrier wave.
- Next, a voxel shape conversion step is performed (step S02).
- The voxels of the first three-dimensional original image and the second three-dimensional original image, i.e., the three-dimensional original images each composed of a plurality of tomographic images, may have a rectangular-parallelepiped shape elongated in the body axis direction.
- In this case, a process of converting the voxels of each of the first three-dimensional original image and the second three-dimensional original image into a cubic shape is executed.
- When the voxels of the first three-dimensional original image and the second three-dimensional original image already have a cubic shape, this step is not executed; the first three-dimensional original image is used as the first three-dimensional image, and the second three-dimensional original image is used as the second three-dimensional image.
- In the voxel shape conversion step (step S02), the voxel size in the body axis direction is adjusted by a known linear interpolation method such as the bilinear method or the bicubic method.
- FIG. 2 is a flowchart showing an example of the processing in the voxel shape conversion step shown in FIG. 1.
- In the voxel shape conversion step shown in FIG. 2, processing based on the bilinear method is adopted.
- The processing of steps S11 to S13 described below is applied to both the first three-dimensional original image and the second three-dimensional original image to generate the first three-dimensional image and the second three-dimensional image.
- In the following, the first three-dimensional original image and the second three-dimensional original image are collectively referred to as the "three-dimensional original image",
- and the first three-dimensional image and the second three-dimensional image generated by the voxel shape conversion are referred to as the "three-dimensional image".
- In step S11, the number of voxels in the z-axis direction after conversion is determined from the effective field of view.
- The number of voxels in the z-axis direction is calculated by the following equation (1):
- Mz = FOVz / P ... (1)
- where Mz is the number of voxels in the z-axis direction after the voxel shape conversion, FOVz is the effective field of view in the z-axis direction, and P is the length of one side of the voxel in the x-axis and y-axis directions. In this way, the number of cubic voxels with side length P along the z-axis is calculated.
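As a minimal sketch (function and variable names are hypothetical, and the rounding rule is an assumption since the text does not specify one), equation (1) amounts to dividing the z-axis effective field of view by the in-plane voxel side length:

```python
def z_voxel_count(fov_z_mm: float, p_mm: float) -> int:
    """Equation (1): number of cubic voxels of side p_mm spanning the
    z-axis effective field of view. Rounding to the nearest integer
    is an assumption of this sketch."""
    return round(fov_z_mm / p_mm)

# e.g. a 153.6 mm axial field of view with 2.4 mm square in-plane pixels
print(z_voxel_count(153.6, 2.4))  # -> 64
```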
- Next, a new image space for the three-dimensional image after voxel shape conversion is created in memory (step S12).
- This image space stores the pixel values of a number of voxels equal to the product of the number of voxels in the x-axis direction and the number of voxels in the y-axis direction of the three-dimensional original image and Mz.
- In step S13, a pixel value is assigned to each voxel in the image space prepared in step S12 to create the three-dimensional image (step S13).
- The three-dimensional image is created by applying linear interpolation by the bilinear method in the z-axis direction, using the coronal or sagittal images of the three-dimensional original image.
- In the following, the case of performing linear interpolation using coronal images is described as an example.
- In step S13, the four lattice points (j, k), (j+1, k), (j, k+1), and (j+1, k+1) surrounding the point (x, z) are considered,
- and the pixel value g(x, z) at the point (x, z) is calculated from the pixel values of the three-dimensional original image f,
- namely f(j, k), f(j+1, k), f(j, k+1), and f(j+1, k+1) at the lattice points surrounding the point (x, z).
- In this way a new image, i.e., a three-dimensional image g whose voxels are cubic, is formed, and the voxel shape conversion process is completed.
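Steps S11 to S13 can be sketched in NumPy as follows. This is a hedged simplification under stated assumptions: names are hypothetical, the volume is laid out (z, y, x), and since transaxial slices are left unchanged, the bilinear interpolation on coronal images reduces to linear interpolation along z applied to whole slices.

```python
import numpy as np

def make_voxels_cubic(vol: np.ndarray, dz: float, p: float) -> np.ndarray:
    """Resample a (z, y, x) volume by linear interpolation along the
    z-axis so the slice spacing equals the in-plane pixel size p,
    yielding cubic voxels (a sketch of steps S11-S13)."""
    nz = vol.shape[0]
    m_z = round(nz * dz / p)                  # step S11: new z voxel count
    out = np.empty((m_z,) + vol.shape[1:], dtype=float)  # step S12: image space
    for i in range(m_z):                      # step S13: interpolate each slice
        z = i * p / dz                        # position in original slice units
        k = min(int(z), nz - 2)               # lower lattice slice index
        r = z - k                             # fractional offset between slices
        out[i] = (1 - r) * vol[k] + r * vol[k + 1]
    return out
```

For a volume whose values vary linearly along z, the resampled values stay on the same line, which is a quick sanity check of the interpolation.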
- Following the voxel shape conversion step, the voxel normalization step (step S03) is executed.
- In this step, processing is performed so that the first three-dimensional image and the second three-dimensional image have the same values for the voxel size and the number of voxels in the effective field of view.
- Specifically, the voxel size and the number of voxels of the image with the smaller effective field of view are converted to be the same as the voxel size and the number of voxels of the other image.
- In the following description, the voxel size and the number of voxels of the first three-dimensional image are adapted to the voxel size and the number of voxels of the second three-dimensional image.
- Voxels to which no pixel value of the original image corresponds are assigned a Null code (i.e., a value of 0).
- FIG. 3 is a flowchart showing an example of processing in the voxel normalization step shown in FIG. 1. In the following, it is assumed that the second three-dimensional image has a larger effective field of view than the first three-dimensional image, and the voxel normalization step based on the bilinear method is described with reference to FIG. 3.
- First, a three-dimensional image space having the same voxel size and number of voxels as the second three-dimensional image is prepared in the memory of the computer (step S21).
- A first normalized three-dimensional image is then generated by assigning to each voxel in this image space a pixel value obtained by linear interpolation from the first three-dimensional image.
- In the present embodiment, the second three-dimensional image is used directly as the second normalized three-dimensional image.
- Next, linear interpolation by the bilinear method is performed to calculate temporary pixel values, which are assigned to the voxels in the image space (step S22).
- The interpolation processing in step S22 is referred to as the "primary interpolation processing".
- The primary interpolation processing is completed by sequentially processing all voxels row by row and assigning the obtained pixel values to the respective voxels.
- Next, similar interpolation processing is performed on the sagittal or coronal images (step S23).
- The processing in step S23 is referred to as the "secondary interpolation processing".
- The secondary interpolation processing is described below using the example of performing the interpolation on coronal images.
- First, xz coordinates are set on the coronal image.
- Lattice points are assumed on these coordinates, and a three-dimensional image h1 is formed by applying the primary interpolation processing.
- The pixel value h2(x, z) to be calculated is given by the following equation (4):
- h2(x, z) = (1 − r3)(1 − s3)·h1(j3, k3) + r3(1 − s3)·h1(j3+1, k3) + (1 − r3)s3·h1(j3, k3+1) + r3·s3·h1(j3+1, k3+1) ... (4)
- where (j3, k3) is the lattice point adjacent to the point (x, z) and r3 and s3 are the fractional distances of x and z from j3 and k3, respectively. The first normalized three-dimensional image h2 is obtained by assigning a value to every voxel in this way.
- When the first three-dimensional image has the larger effective field of view, the same processing as steps S21 to S23 need only be performed on the second three-dimensional image.
- Alternatively, the number of voxels of the image with the larger effective field of view may be matched to that of the image with the smaller effective field of view.
- In that case, processing that matches the voxel size and the number of voxels of the second three-dimensional image to the voxel size and the number of voxels of the first three-dimensional image can be performed.
- In this case, it is preferable that the part included in the effective field of view of the second three-dimensional image after conversion is substantially the same as the part included in the effective field of view of the first three-dimensional image.
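The voxel normalization step can be illustrated with a minimal sketch, under stated assumptions: the voxel sizes have already been matched, the two fields of view share a common center, and the function name is hypothetical. The smaller-FOV image is placed in an image space with the larger image's voxel count, and voxels with no corresponding data receive the Null code 0.

```python
import numpy as np

def normalize_voxel_count(small: np.ndarray, target_shape: tuple) -> np.ndarray:
    """Embed the smaller-FOV image, centered, in an image space with the
    target voxel count; voxels outside its field of view get the Null
    code (0). Voxel sizes are assumed already equal (sketch only)."""
    out = np.zeros(target_shape, dtype=small.dtype)
    start = [(t - s) // 2 for t, s in zip(target_shape, small.shape)]
    region = tuple(slice(o, o + s) for o, s in zip(start, small.shape))
    out[region] = small
    return out
```

After this step both images have identical voxel grids, which is the precondition the inventor identified for accurate registration.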
- Following the voxel normalization step, the fusion image generation step (step S04) is executed.
- In this step, a fusion image is created by executing superposition processing of the first normalized three-dimensional image and the second normalized three-dimensional image.
- FIG. 4 is a flowchart showing an example of processing in the fusion image generation step shown in FIG.
- In step S31, coordinate transformation of the first normalized three-dimensional image is performed using given coordinate transformation parameters.
- As the coordinate transformation parameters, a total of six parameters are used: parameters for translating the image (Tx, Ty, Tz) and parameters for rotating the image (θx, θy, θz).
- Arbitrarily selected values can be used as initial values of the coordinate transformation parameters; for example, all of them can be set to 0.
- In step S32, the mutual information of the fusion image of the second normalized three-dimensional image and the coordinate-transformed first normalized three-dimensional image is calculated (step S32).
- The value of this mutual information I(A, B) is calculated by the following equations (5) to (8):
- I(A, B) = H(A) + H(B) − H(A, B) ... (5)
- H(A) = −Σi (N_Ai / N)·log(N_Ai / N) ... (6)
- H(B) = −Σi (N_Bi / N)·log(N_Bi / N) ... (7)
- H(A, B) = −Σi,j (N_AiBj / N)·log(N_AiBj / N) ... (8)
- where N_Ai is the number of voxels having pixel value Ai in the second normalized three-dimensional image, N_Bi is the number of voxels having pixel value Bi in the coordinate-transformed first normalized three-dimensional image, N_AiBj is the number of voxels having the pixel-value pair (Ai, Bj) in the fusion image, and N is the total number of voxels.
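Equations (5) to (8) can be sketched directly from the joint histogram of voxel values. This is a hedged illustration, not the patent's implementation: the binning of continuous pixel values into discrete levels is an assumption of the sketch.

```python
import numpy as np

def mutual_information(a: np.ndarray, b: np.ndarray, bins: int = 32) -> float:
    """I(A,B) = H(A) + H(B) - H(A,B), equations (5)-(8), with the
    probabilities N_Ai/N, N_Bi/N, N_AiBj/N estimated from the joint
    histogram of voxel values."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p_ab = joint / joint.sum()                 # joint probabilities
    p_a, p_b = p_ab.sum(axis=1), p_ab.sum(axis=0)  # marginals

    def entropy(p: np.ndarray) -> float:
        p = p[p > 0]                           # 0·log 0 is taken as 0
        return float(-np.sum(p * np.log(p)))

    return entropy(p_a) + entropy(p_b) - entropy(p_ab.ravel())
```

An image compared with itself yields I(A, A) = H(A), while an image compared with a constant image yields 0; the parameter search of steps S33 and S34 exploits the fact that misalignment lowers this value.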
- In the fusion image generation step, this calculation is performed repeatedly while updating the coordinate transformation parameters for the first normalized three-dimensional image (step S34), and the coordinate transformation parameters that maximize the mutual information are extracted (step S33). Then, a fusion image is generated from the second normalized three-dimensional image and the first normalized three-dimensional image coordinate-transformed with the parameters that maximize the mutual information (step S35).
- Updating and optimization of the coordinate transformation parameters can be performed using various known algorithms. For example, direct search methods represented by the simplex method and Powell's method, and gradient methods represented by the steepest descent method (maximum gradient method) and the conjugate gradient method (hill climbing method), can be used (Tomoharu Nagao, "Optimization Algorithms", first edition, Shokodo Co., Ltd., 2000; Frederik Maes et al., IEEE Transactions on Medical Imaging, 1997, 16, 2, p. 187-198).
- the steepest descent method will be described below as an example of an optimization algorithm.
- First, the first normalized three-dimensional image is transformed using arbitrary coordinate transformation parameters (Tx, Ty, Tz, θx, θy, θz),
- and the rate of change between the mutual information calculated using the first normalized three-dimensional image before the transformation and the mutual information calculated using the first normalized three-dimensional image after the transformation is obtained.
- This calculation is repeated while changing the coordinate transformation parameters in various ways, and the combination of transformation parameters that maximizes the rate of change of the mutual information is extracted.
- Next, the rate of change between the mutual information calculated using the first normalized three-dimensional image transformed with the extracted coordinate transformation parameters and the mutual information calculated using the image transformed with arbitrary different coordinate transformation parameters is obtained. The same operation as above is performed to extract the transformation parameters that maximize the rate of change of the mutual information, and the first normalized three-dimensional image is transformed again. This operation is repeated until the rate of change of the mutual information converges to 0.
- The condition under which the rate of change of the mutual information converges to 0 corresponds to the transformation condition (coordinate transformation parameters) that maximizes the mutual information.
- Finally, a fusion image is created from the first normalized three-dimensional image, transformed in position and orientation under this condition, and the second normalized three-dimensional image.
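The parameter search of steps S32 to S34 can be illustrated with a deliberately simplified toy: the patent optimizes six rigid-body parameters with, e.g., the steepest descent method, whereas this hypothetical sketch exhaustively tries integer translations along a single axis and keeps the one maximizing the mutual information.

```python
import numpy as np

def best_z_shift(fixed: np.ndarray, moving: np.ndarray, max_shift: int = 5) -> int:
    """Toy stand-in for steps S32-S34: evaluate the mutual information
    for each candidate integer z translation of the moving image and
    return the shift that maximizes it (one parameter instead of six)."""
    def mi(a, b, bins=16):
        joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
        p = joint / joint.sum()
        pa, pb = p.sum(axis=1), p.sum(axis=0)
        h = lambda q: -np.sum(q[q > 0] * np.log(q[q > 0]))
        return h(pa) + h(pb) - h(p.ravel())

    best_shift, best_value = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):   # candidate transformations
        value = mi(fixed, np.roll(moving, s, axis=0))
        if value > best_value:                   # keep the MI-maximizing one
            best_shift, best_value = s, value
    return best_shift
```

A moving image that is an exact shifted copy of the fixed image is recovered at the shift where the mutual information peaks, which is the core behavior the six-parameter search relies on.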
- FIG. 5 is a diagram showing the configuration of the image processing program according to the embodiment of the present invention together with a recording medium.
- An image processing program 10 shown in FIG. 5 is provided by being stored in a recording medium 100.
- Examples of the recording medium 100 include a flexible disk, a CD-ROM, a DVD, a ROM and other recording media, and a semiconductor memory.
- FIG. 6 is a diagram illustrating the hardware configuration of a computer for executing the program stored in the recording medium.
- FIG. 7 is a perspective view of the computer for executing the program stored in the recording medium.
- The computer 110 includes a reading device 112 such as a flexible disk drive device, a CD-ROM drive device, or a DVD drive device, a working memory (RAM) 114 in which an operating system is resident, and a memory 116 that stores the program stored in the recording medium 100.
- The computer 110 can access the image processing program 10 stored in the recording medium 100 from the reading device 112, and the image processing program 10 enables the computer 110 to operate as the image processing apparatus according to the present invention.
- The image processing program 10 may be provided via a network as a computer data signal 130 superimposed on a carrier wave.
- In this case, the computer 110 can store the image processing program 10 received by the communication device 124 in the memory 116 and execute it.
- The image processing program 10 includes a main module 11 that supervises processing, a three-dimensional original image acquisition module 12, a voxel shape conversion module 14, a voxel normalization module 16, a fusion image generation module 18, and an output module 20.
- The three-dimensional original image acquisition module 12 causes the computer to execute the process of step S01.
- The voxel shape conversion module 14 causes the computer to execute the process of step S02.
- The voxel normalization module 16 causes the computer to execute the process of step S03.
- The fusion image generation module 18 causes the computer to execute the process of step S04.
- The output module 20 causes the computer to output the obtained fusion image to a display device such as a display.
- Preferably, the fusion image is displayed simultaneously in multiple windows, each window showing a different cross-sectional image. In this case, it is preferable to display a coronal image in one window and a transaxial (cross-sectional) image in another window, because the positional information of the diseased part is better reflected.
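Extracting the per-window cross sections from a fused volume is a matter of fixing one index. In this sketch the volume is assumed to be stored in (z, y, x) order; the axis order and function names are assumptions, not stated in the text.

```python
import numpy as np

def transaxial_slice(vol, k):
    """Cross-sectional (axial) image: fix the slice index along z."""
    return vol[k, :, :]

def coronal_slice(vol, j):
    """Coronal image: fix the anterior-posterior index along y."""
    return vol[:, j, :]

vol = np.arange(2 * 3 * 4).reshape(2, 3, 4)   # toy (z, y, x) volume
```

Here `transaxial_slice(vol, 0)` has shape (3, 4), while `coronal_slice(vol, 1)` has shape (2, 4); each window of the display simply renders one such slice.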
- FIG. 8 is a diagram showing the configuration of the image processing apparatus according to the embodiment of the present invention.
- The image processing apparatus 30 shown in FIG. 8 functionally comprises a three-dimensional original image acquisition unit 32, a voxel shape conversion unit 34, a voxel normalization unit 36, a fusion image generation unit 38, and an output unit 40.
- The three-dimensional original image acquisition unit 32 is a part that executes the process of step S01.
- The voxel shape conversion unit 34 is a part that executes the process of step S02.
- The voxel normalization unit 36 is a part that executes the process of step S03.
- The fusion image generation unit 38 is a part that executes the process of step S04.
- The output unit 40 is a part that outputs the obtained fusion image to a display device such as a display.
- Such an image processing apparatus 30 can be realized by a computer that operates according to the image processing program 10 described above.
- The image processing apparatus 30 may also be a device comprising dedicated circuits for executing the processes of the three-dimensional original image acquisition unit 32, the voxel shape conversion unit 34, the voxel normalization unit 36, the fusion image generation unit 38, and the output unit 40.
- Example
- A head FDG-PET image (Fig. 9, Matrix: 128 × 128, Number of slices: 14, Voxel size: 2.00 mm × 2.00 mm × 6.50 mm) was used as the first three-dimensional original image.
- A head MRI image (Fig. 10, Matrix: 256 × 256, Number of slices: 99) was used as the second three-dimensional original image.
- A fusion image was created by the mutual information maximization method (Cost Function 5) using the Corege.exe ver.5 program. That is, the fusion image was generated by the mutual information maximization method alone, without performing the voxel shape conversion and the voxel normalization.
- The following values were used for the various setting parameters of the Corege.exe ver.5 program.
- FIG. 11 shows the created fusion image.
- Images of multiple cross sections in the fusion image are displayed using multiple windows.
- The overlay accuracy of the created fusion image was not good: in each cross section, the pair of images overlapped with a shift relative to each other.
- The second three-dimensional original image (MRI image) was interpolated in the slice direction (i.e., the z-axis direction) and converted into an image of matrix: 256 × 256, number of slices: 167, voxel size: 0.879 mm × 0.879 mm × 0.879 mm, to obtain the second three-dimensional image.
- The first three-dimensional original image was used unchanged as the first three-dimensional image.
- Next, interpolation processing was performed on the cross sections of the first three-dimensional image (PET image) to convert them to matrix: 256 × 256 and pixel size: 0.879 mm × 0.879 mm.
- The first three-dimensional image (PET image) was further converted into an image of matrix: 256 × 256, number of slices: 167, voxel size: 0.879 mm × 0.879 mm × 0.879 mm, to obtain the first normalized three-dimensional image.
- The second three-dimensional image was used unchanged as the second normalized three-dimensional image.
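The slice-direction interpolation performed in the steps above (bringing a coarse slice spacing down toward the in-plane spacing so that voxels become cubic) can be sketched as a linear interpolation along the z axis. The function name and the (z, y, x) axis order are assumptions for illustration.

```python
import numpy as np

def resample_z(vol, src_dz, dst_dz):
    """Linearly interpolate a (z, y, x) volume along the slice axis so
    that the slice spacing src_dz becomes dst_dz."""
    nz = vol.shape[0]
    new_nz = int(round((nz - 1) * src_dz / dst_dz)) + 1
    zs = np.arange(new_nz) * dst_dz / src_dz      # positions in source-slice units
    z0 = np.clip(np.floor(zs).astype(int), 0, nz - 2)
    w = (zs - z0)[:, None, None]                  # fractional weight per new slice
    return (1.0 - w) * vol[z0] + w * vol[z0 + 1]
```

Applying the same idea in-plane as well yields cubic-voxel normalized images like those used in the examples.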
- Figure 12 shows the created fusion image.
- A plurality of cross-sectional images in the fusion image are displayed using a plurality of windows.
- The overlay accuracy of the obtained fusion image was good, showing that a good fusion image can be created automatically by the processing according to the present invention.
- A chest FDG-PET image (Fig. 13, Matrix: 128 × 128, Number of slices: 136, Voxel size: 4.29 mm × 4.29 mm × 4.29 mm) was used as the first three-dimensional original image.
- A chest CT image (Fig. 14, Matrix: 256 × 256, Number of slices: 81, Voxel size: 1.875 mm × 1.875 mm × 5.000 mm) was used as the second three-dimensional original image.
- A fusion image was created by the mutual information maximization method (Cost Function 5) using the Corege.exe ver.5 program included in NEUROSTAT (provided by Prof. Satoshi Minoshima, University of Washington School of Medicine). That is, the fusion image was generated by the mutual information maximization method alone, without performing the voxel shape conversion and the voxel normalization.
- The same values as in Comparative Example 1 were used for the various setting parameters of the Corege.exe ver.5 program.
- Figure 15 shows the created fusion image.
- A plurality of cross-sectional images in the fusion image are displayed using a plurality of windows.
- The overlay accuracy of the created fusion image was not good: in each cross section, the pair of images overlapped with a shift relative to each other.
- The second three-dimensional original image (CT image) was interpolated in the slice direction (i.e., the z-axis direction) and converted into an image of matrix: 256 × 256, number of slices: 312, voxel size: 1.875 mm × 1.875 mm × 1.875 mm, to obtain the second three-dimensional image.
- The first three-dimensional original image was used unchanged as the first three-dimensional image.
- Next, interpolation processing was performed on the cross sections of the first three-dimensional image (PET image) to convert them to matrix: 256 × 256 and pixel size: 1.875 mm × 1.875 mm.
- The image was further converted into an image of matrix: 256 × 256, number of slices: 312, voxel size: 1.875 mm × 1.875 mm × 1.875 mm, to obtain the first normalized three-dimensional image.
- The second three-dimensional image was used unchanged as the second normalized three-dimensional image.
- FIG. 16 shows the created fusion image.
- Images of a plurality of cross sections in the fusion image are displayed using a plurality of windows.
- The overlay accuracy of the obtained fusion image was good, and it was confirmed that a good fusion image can be created automatically by the processing according to the present invention.
- The present invention is useful for automatically and accurately creating fusion images, and can be used in the field of diagnostic imaging apparatuses.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/064,430 US8126243B2 (en) | 2005-08-23 | 2006-02-21 | Image processing method, image processing program, and image processing device |
AU2006282500A AU2006282500A1 (en) | 2005-08-23 | 2006-08-17 | Image processing method, image processing program, and image processing device |
JP2007532076A JP4879901B2 (ja) | 2005-08-23 | 2006-08-17 | 画像処理方法、画像処理プログラム、及び画像処理装置 |
CA002620216A CA2620216A1 (en) | 2005-08-23 | 2006-08-17 | Image processing method, image processing program, and image processing device |
EP06796504A EP1926053A4 (en) | 2005-08-23 | 2006-08-17 | PICTURE PROCESSING, PICTURE PROCESSING PROGRAM AND PICTURE PROCESSING DEVICE |
IL189660A IL189660A0 (en) | 2005-08-23 | 2008-02-21 | Image processing method, image processing program, and image processing device |
NO20081344A NO20081344L (no) | 2005-08-23 | 2008-03-13 | Fremgangsmate, program og anordning ved billedbehandling |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005-241624 | 2005-08-23 | ||
JP2005241624 | 2005-08-23 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2007023723A1 true WO2007023723A1 (ja) | 2007-03-01 |
Family
ID=37771471
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2006/316147 WO2007023723A1 (ja) | 2005-08-23 | 2006-08-17 | 画像処理方法、画像処理プログラム、及び画像処理装置 |
Country Status (12)
Country | Link |
---|---|
US (1) | US8126243B2 (ja) |
EP (1) | EP1926053A4 (ja) |
JP (1) | JP4879901B2 (ja) |
KR (1) | KR20080042140A (ja) |
CN (1) | CN101248461A (ja) |
AU (1) | AU2006282500A1 (ja) |
CA (1) | CA2620216A1 (ja) |
IL (1) | IL189660A0 (ja) |
NO (1) | NO20081344L (ja) |
RU (1) | RU2008110951A (ja) |
TW (1) | TW200729075A (ja) |
WO (1) | WO2007023723A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6092336B1 (ja) * | 2015-09-28 | 2017-03-08 | 国立大学法人 筑波大学 | 画像処理システム、画像処理方法及び画像処理プログラム |
JP2017068308A (ja) * | 2015-09-28 | 2017-04-06 | 国立大学法人 筑波大学 | 画像処理システム、画像処理方法及び画像処理プログラム |
US10155123B2 (en) | 2015-10-29 | 2018-12-18 | Sumitomo Heavy Industries, Ltd. | Neutron capture therapy system |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000040145A (ja) * | 1998-07-23 | 2000-02-08 | Godai Kk | 画像処理装置、画像処理方法及び画像処理プログラムを記録した記録媒体 |
JP2004508856A (ja) * | 2000-09-15 | 2004-03-25 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 尤度最大化を利用した画像位置合わせ(registration)システム及び方法 |
US6904163B1 (en) | 1999-03-19 | 2005-06-07 | Nippon Telegraph And Telephone Corporation | Tomographic image reading method, automatic alignment method, apparatus and computer readable medium |
Non-Patent Citations (1)
Title |
---|
See also references of EP1926053A4 |
Also Published As
Publication number | Publication date |
---|---|
NO20081344L (no) | 2008-05-23 |
CN101248461A (zh) | 2008-08-20 |
AU2006282500A2 (en) | 2008-07-03 |
AU2006282500A1 (en) | 2007-03-01 |
EP1926053A1 (en) | 2008-05-28 |
IL189660A0 (en) | 2008-06-05 |
US8126243B2 (en) | 2012-02-28 |
JPWO2007023723A1 (ja) | 2009-02-26 |
RU2008110951A (ru) | 2009-09-27 |
US20090148019A1 (en) | 2009-06-11 |
JP4879901B2 (ja) | 2012-02-22 |
KR20080042140A (ko) | 2008-05-14 |
EP1926053A4 (en) | 2011-08-10 |
TW200729075A (en) | 2007-08-01 |
CA2620216A1 (en) | 2007-03-01 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 200680031081.5; Country of ref document: CN |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| WWE | Wipo information: entry into national phase | Ref document number: 2007532076; Country of ref document: JP |
| WWE | Wipo information: entry into national phase | Ref document number: 2006282500; Country of ref document: AU |
| WWE | Wipo information: entry into national phase | Ref document number: 566113; Country of ref document: NZ |
| WWE | Wipo information: entry into national phase | Ref document number: 12064430; Country of ref document: US. Ref document number: 189660; Country of ref document: IL |
| ENP | Entry into the national phase | Ref document number: 2620216; Country of ref document: CA |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | Wipo information: entry into national phase | Ref document number: 2006796504; Country of ref document: EP |
| WWE | Wipo information: entry into national phase | Ref document number: 1262/CHENP/2008; Country of ref document: IN |
| ENP | Entry into the national phase | Ref document number: 2006282500; Country of ref document: AU; Date of ref document: 20060817; Kind code of ref document: A |
| WWE | Wipo information: entry into national phase | Ref country code: KR |
| WWE | Wipo information: entry into national phase | Ref document number: 2008110951; Country of ref document: RU |