WO2005009242A1 - Medical image processing apparatus and method - Google Patents
Medical image processing apparatus and method
- Publication number
- WO2005009242A1 (PCT/JP2004/011111)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- value
- medical image
- density
- local
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims description 49
- 230000002159 abnormal effect Effects 0.000 claims description 20
- 238000001514 detection method Methods 0.000 claims description 13
- 238000003672 processing method Methods 0.000 claims 10
- 238000010191 image analysis Methods 0.000 abstract 1
- 238000010586 diagram Methods 0.000 description 42
- 206010028980 Neoplasm Diseases 0.000 description 25
- 201000011510 cancer Diseases 0.000 description 25
- 238000006073 displacement reaction Methods 0.000 description 11
- 210000004204 blood vessel Anatomy 0.000 description 8
- 230000006870 function Effects 0.000 description 7
- 238000003384 imaging method Methods 0.000 description 6
- 230000000694 effects Effects 0.000 description 4
- 238000011946 reduction process Methods 0.000 description 3
- 238000011410 subtraction method Methods 0.000 description 3
- 238000002059 diagnostic imaging Methods 0.000 description 2
- 238000002595 magnetic resonance imaging Methods 0.000 description 2
- 239000003550 marker Substances 0.000 description 2
- 230000002194 synthesizing effect Effects 0.000 description 2
- 238000003325 tomography Methods 0.000 description 2
- 206010058467 Lung neoplasm malignant Diseases 0.000 description 1
- 239000002131 composite material Substances 0.000 description 1
- 230000002708 enhancing effect Effects 0.000 description 1
- 238000000605 extraction Methods 0.000 description 1
- 210000004072 lung Anatomy 0.000 description 1
- 201000005202 lung cancer Diseases 0.000 description 1
- 208000020816 lung neoplasm Diseases 0.000 description 1
- 239000011159 matrix material Substances 0.000 description 1
- 238000005259 measurement Methods 0.000 description 1
- 238000002601 radiography Methods 0.000 description 1
- XLYOFNOQVPJJNP-UHFFFAOYSA-N water Substances O XLYOFNOQVPJJNP-UHFFFAOYSA-N 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30061—Lung
Definitions
- the present invention relates to a medical image processing apparatus and method, and more particularly to a medical image processing apparatus and method for creating a shadow enhanced image and a difference image from a medical image.
- the medical image processing apparatus of the present invention includes: first storage means for storing, as a first medical image, an image of a predetermined part of a subject captured by a medical image diagnostic apparatus; second storage means for storing, as a second medical image, an image of the same part of the same subject captured at a date and time different from that of the first medical image; first setting means for setting, in the first medical image stored by the first storage means, a plurality of first local regions each having at least one pixel as an element; and second setting means for setting, in the second medical image stored by the second storage means, a plurality of second local regions corresponding to the first local regions and equal to or larger than them in size.
- it further includes reference value calculating means for calculating a reference value of the density values in each of the second local regions set by the second setting means, and image creating means for creating an enhanced image for each first local region based on the calculated reference values and the density values of the pixels in the first local regions set by the first setting means.
- it further includes display means for displaying the enhanced image created by said image creating means.
- the reference value referred to here means a local maximum value, a local average value, a local median, a constant multiple of the local maximum or average value, a density gradient, or a value relating to the density gradient, determined according to the characteristics of the medical image.
- using the maximum value is the simplest and most effective.
- using the average value or the median suppresses the noise component of the image better than the maximum value.
- using the density gradient makes it possible to respond to abrupt changes in the values of adjacent and neighboring pixels.
- the emphasized image is most desirably created by the image creating means based on the difference between the reference value and the density value of each pixel in the first local area; however, as long as the function of enhancing only the cancer shadow is fulfilled, any calculation such as addition, multiplication, or division may be used.
- the shadows of the image are emphasized so that abnormal shadows such as cancer can easily be found, even if a positional shift occurs between the image of the day and the past image.
- the image creating unit includes contour image generating means for generating a contour image of the first or second medical image, and image superimposing means for superimposing the contour image and the enhanced image.
- the display means displays the superimposed image produced by the image superimposing means.
- by superimposing the contour on the shadow-enhanced image, even when it is difficult to determine where a shadow emphasized in the shadow-enhanced image is located, the position of an abnormal shadow such as cancer can easily be grasped while the shadow remains emphasized.
- the apparatus further includes a positioning unit that performs positioning between the first medical image and the second medical image.
- the apparatus further includes a density reduction unit that generates a pseudo-shadow reduced image in which pseudo-shadows caused by imperfect registration by the positioning unit are reduced in density compared with the emphasized image created by the image creating unit.
- the display unit displays the pseudo-shadow reduced image generated by the density reduction unit.
- FIG. 1 is a diagram showing a configuration of a medical image processing apparatus according to an embodiment of the present invention
- FIG. 2 is a functional block diagram of FIG.
- FIG. 3 is a diagram showing an operation screen of the medical image processing apparatus
- FIG. 4 is a diagram showing the outline of the difference processing
- FIG. 5 is a diagram showing an example of the image of the day
- FIG. 6 is a diagram showing an example of a simple difference image
- FIG. 7 is a diagram showing an example of the procedure of the difference processing
- FIG. 8 is a diagram showing an example of finding the maximum density gradient direction
- FIG. 9 is a diagram showing another example of finding the maximum density gradient direction
- FIGS. 10(a) and 10(b) are diagrams showing a comparison between simple difference processing and anisotropic difference processing
- FIG. 11 is a diagram showing an example of an image obtained by anisotropic difference processing
- FIG. 12 is a diagram showing an image with an abnormal shadow circled
- FIG. 13 is a diagram showing another example of the image of the day
- FIG. 14 is a diagram showing another example of an image obtained by the difference processing according to the present embodiment
- FIG. 15 is a diagram showing the combination of a separately processed difference image and a contour image
- FIG. 17 is a diagram showing a procedure for synthesizing a difference image and a contour image for each pixel
- FIG. 18 is a diagram showing an example in which the maximum density gradient direction is different from that of FIG. 9
- FIG. 19 is a diagram showing another example of the procedure of the difference processing
- FIG. 20 is a diagram showing another example of finding the maximum density gradient direction
- FIG. 21 is a diagram showing another example of the specific direction area
- FIG. 22 is a diagram showing an example of difference processing on a DR image
- FIG. 23 is a diagram showing the coarse registration of the image of the day and the past image by the simple difference method
- FIG. 24 is a diagram showing the detailed registration of the image of the day and the past image by the simple difference method
- FIG. 25 is a diagram showing the coarse registration of images according to the present embodiment
- FIG. 26 is a diagram showing the detailed registration of images according to the present embodiment
- FIG. 27 is a diagram showing another example of the detailed registration of images according to the present embodiment
- FIG. 28 is a diagram showing an example of a screen of the medical image processing apparatus
- FIG. 29 is a diagram showing the flow of screen operation of the medical image processing apparatus
- FIG. 30 is a diagram showing a density reduction process using a feature amount
- FIG. 31 is a diagram showing a false-positive density reduction process using a correlation image between edge images
- FIG. 32 is a diagram showing another example of a false-positive density reduction process using a correlation image between edge images
- FIG. 33 is a diagram showing the density reduction process using the correlation between edge images in which the edges of DR images are emphasized
- FIG. 34 is a diagram showing the density reduction process by two-point replacement processing
- FIG. 35 is a diagram showing an example of cursor-linked display on the display screen
- FIG. 36 is a diagram showing another example of cursor-linked display on the display screen
- FIG. 37 is a diagram showing an example of how blood vessels are captured in a plurality of images of the same day
- FIG. 38 is a diagram showing an example of how blood vessels are captured in a plurality of past images
- FIG. 39 is a diagram illustrating the difference processing between the image of the day and the past image of FIGS. 37 and 38
- FIG. 40 is a diagram illustrating an example of how to take a local region differently
- FIG. 41 is a diagram showing another example of how to take local regions differently
- FIG. 42 is a diagram showing another example of how to take local regions differently
- FIG. 43 is a diagram showing another example of how to take local regions differently
- FIG. 44 is a diagram showing another example of how to take local regions differently
- FIG. 45 is a diagram showing another example of how to take local regions differently
- FIG. 46 is a diagram showing another processing example of shadow enhancement
- FIG. 47 is a diagram showing a subroutine used in the processing of step 162 in FIG. 46

BEST MODE FOR CARRYING OUT THE INVENTION
- FIG. 1 shows a configuration of a medical image processing apparatus 10 according to an embodiment of the present invention.
- the medical image processing apparatus 10 comprises a central processing unit (hereinafter, CPU) 11, a main memory 12, a magnetic disk 13, a display memory 14, a CRT 15, a controller 16, a mouse 17, and a keyboard 18, which are connected via a common bus 19. Further, the medical imaging device 30 is connected to the medical image processing apparatus 10 via, for example, a LAN 32.
- the medical imaging device 30 may be an X-ray CT (Computed Tomography) apparatus, a PET (Positron Emission Tomography) apparatus, an MRI (Magnetic Resonance Imaging) apparatus, a DR (Digital Radiography) apparatus, an ultrasonic imaging device, a fundus camera, or the like; it is not limited to these, and may be any device that captures a medical image. The medical images to be processed by the medical image processing apparatus 10 include those obtained by the various medical imaging devices described above.
- the main memory 12 is used as an area for temporary storage and processing of images and data; processing results are displayed on the CRT 15 via the display memory 14 and stored on the magnetic disk 13 for redisplay and later reference.
- the magnetic disk 13 stores medical images captured by the medical imaging device 30.
- the first medical image (also referred to as “the image of the day”, since it is taken at a predetermined examination date and time) is obtained by imaging a predetermined part of the subject with the medical imaging device 30.
- the image, together with information on the patient and the imaging site, is transmitted to the magnetic disk 13 via the network (LAN) 32 and the common bus 19.
- the second medical image (here also called the “past image”, since it was taken on a date different from, and usually earlier than, that of the image of the day) is read out from the magnetic disk 13 into the main memory 12 based on the transmitted patient and imaging-region information.
- the image on the day is transmitted to the display memory 14 via the common bus 19 via the network (LAN) 32 and displayed on the CRT 15.
- FIG. 3 shows an operation screen displayed on the CRT 15 of the medical image processing apparatus 10.
- a simple difference image 47 obtained by the simple difference method between the image of the day and the past image, a shadow-enhanced image 48 obtained by the process called “anisotropic difference processing” described in this embodiment, and a marker display image 49 for indicating abnormal shadow candidates can be selected.
- a first local area a is set by the mouse 17 in a part of the display area of the image of the day.
- a plurality of first local regions may be set.
- the CPU 11 sets a second local area A having an area equal to or larger than that of the first local area a.
- the local reference value of the second local area A is calculated by the local reference value calculation unit 111 of the CPU 11.
- the local reference value may be a representative value such as the maximum, average, or median in the second local area A used as it is, a combination of such representative values, or a representative value multiplied by a constant.
- the shadow-enhanced image is created by the image creating unit 112 of the CPU 11 by taking the difference between the density values of the first local area a and the local reference value of the second local area A. It is most desirable to create the shadow-enhanced image from the difference between the reference value and the density value of each pixel of the first local region; however, any calculation such as addition, multiplication, or division may be used, as long as it fulfils the function of enhancing only the cancer shadows.
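The local-reference subtraction described above can be sketched as follows. This is a minimal illustration, assuming grayscale images as NumPy arrays, a square neighbourhood as the second local area, the local maximum as the reference value, and negative results clipped to zero; the function and parameter names are illustrative, not from the patent.

```python
import numpy as np

def shadow_enhance(current, past, half=1, c=1.0):
    """Sketch of the local-reference subtraction: for each pixel of the
    current-day image, a reference value (the local maximum of the past
    image over a (2*half+1)^2 neighbourhood, times a constant c) is
    subtracted.  Negative results are clipped to zero, mirroring the
    note that negative differences may be replaced by a specific value.
    """
    cur = np.asarray(current, dtype=float)
    pst = np.asarray(past, dtype=float)
    h, w = pst.shape
    out = np.zeros_like(cur)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - half), min(h, y + half + 1)
            x0, x1 = max(0, x - half), min(w, x + half + 1)
            ref = c * pst[y0:y1, x0:x1].max()   # local reference value
            out[y, x] = max(cur[y, x] - ref, 0.0)
    return out
```

A shadow present only in the current image survives the subtraction, while structure present in both images (even slightly displaced, as long as it stays inside the neighbourhood) is suppressed.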
- the shadow-enhanced image is transmitted from the image creation unit 112 to the display memory 14 and displayed on the CRT 15.
- the image density of the medical image is represented by a density value of each pixel constituting the medical image.
- the density value includes, for example, but not limited to, a CT value, a pixel value, and the like, and includes all values representing the density, transparency, opacity, lightness, brightness, signal strength, and the like of an image.
- note that the density value may be inverted depending on conditions such as the type of medical imaging device that captured the medical image, the method of calculating and expressing the density value, and whether the image is a negative or a positive image.
- here, when a part A in which living tissue appears is compared with a part B in which air or water appears in a medical image, the density value of part A is defined to be greater than the density value of part B, regardless of how the values are actually encoded. According to this definition, for example, in the image shown in FIG. 4, the density value of a portion displayed in white is larger than that of a portion displayed in black.
- Fig. 4 explains the principle of the present invention in comparison with the prior art.
- Image processing is performed on a CT tomogram of a lung field of a subject by a simple difference method and an anisotropic difference method, respectively.
- the past image 2 is subtracted from the image 1 of the day.
- in the resulting difference image, the density of the cancer shadow area is high and the density of the other areas is low.
- in other words, cancer shadows are emphasized, making them easier to find.
- the image of FIG. 5, in which simulated shadows corresponding to a faint microscopic lung cancer are embedded in the image of the day, is aligned with the past image (not shown) so that the correlation of each part of the images is maximized.
- FIG. 6 shows the result of subtracting the past image from the image of the day after this alignment.
- in this simple difference image, cancer shadows are hard to find.
- FIG. 7 shows this flowchart, which is described here for one representative pixel of the image. In the main memory 12 that stores the difference image, a sufficiently large value is recorded in advance so that the difference result does not become negative.
- in a partial area 52 of the image 51 in FIG. 8, take a small area 54 of, for example, a 3 × 3 image matrix, and find the direction, among the vertical, horizontal, and diagonal directions passing through the center point of the small area 54, in which the density gradient is largest.
- considering a region of 3 × 3 pixels centered on point A22 as the small area 54, the direction of the maximum density gradient at point A22 is found from the maximum of the absolute differences abs(A11−A33), abs(A21−A23), abs(A31−A13), and abs(A12−A32). Alternatively, the maximum among abs(A11−A22), abs(A21−A22), abs(A12−A22), and abs(A31−A22) may be used.
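The pair-wise comparison above can be sketched as follows; the 3×3 block layout (A11 at the top left) and the direction labels are illustrative assumptions.

```python
import numpy as np

def max_gradient_direction(block):
    """Sketch: find the maximum density-gradient direction at the centre
    of a 3x3 block from the absolute differences of the four opposing
    pixel pairs, as in FIG. 8.  Returns one of the illustrative labels
    'diag\\', 'horizontal', 'diag/', 'vertical'."""
    a = np.asarray(block, dtype=float)
    pairs = {
        "diag\\":     abs(a[0, 0] - a[2, 2]),  # A11 - A33
        "horizontal": abs(a[1, 0] - a[1, 2]),  # A21 - A23
        "diag/":      abs(a[2, 0] - a[0, 2]),  # A31 - A13
        "vertical":   abs(a[0, 1] - a[2, 1]),  # A12 - A32
    }
    return max(pairs, key=pairs.get)
```

The alternative variant mentioned in the text would compare each outer pixel against the centre A22 instead of against the opposing pixel.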
- FIG. 9 shows the case where the density gradient in the A11–A33 direction is the largest.
- a region along the maximum density gradient direction is referred to as a "specific direction region (i, j) (or an anisotropic region (i, j))".
- the specific direction area includes A11, A22, and A33; pixels such as B11 and B44 lying further along the same direction may also be included.
- the Bij pixels are included in the subtraction although they lie outside the area in which the density gradient was obtained; this shows that the two areas need not coincide. If the gradient calculation area and the subtraction area are the same, the pixels are all denoted Aij. Accordingly, there may also be a B55 (not shown) outside B44.
- FIG. 9 illustrates the principle of calculating the maximum density gradient of the image 901 of the day, the past image 902, and the difference image 903, and determining whether the maximum directions of the density gradients match or not.
- the procedure is as follows.
- (Step 90) the maximum value max (local density maximum) in the specific direction area (i, j) of the past image 902 shown in FIG. 9 is obtained.
- (Step 91) C × max is subtracted from each pixel (i, j) in the specific direction area (i, j) of the image 901 of the day shown in FIG. 9, and the result is denoted sal. Here C is a constant, which may be 1.0, set according to the image. That is, after the maximum value max of A11, A22, A33, B11, and B44 of the past image 902 is calculated, max × C is subtracted from, for example, pixel A11 of the image 901 of the day, and the result is taken as sal.
- (Step 92) the pixels of the image 901 of the day and the past image 902 corresponding to the same position of the subject are subtracted from each other.
- (Step 93) when the result satisfies “value of pixel (i, j) of difference image SUB > sal”, pixel (i, j) of the difference image is set to sal; that is, the smaller value sal is stored in SUB (the same coordinates as A11). If the subtraction result is negative, it may be replaced with a specific value, for example zero.
- (Step 94) similar processing is performed for the other pixels A22, A33, B11, and B44 in the specific direction area (i, j).
- the processing of steps 90 to 94 is executed for all pixels of the image. At this time, it is preferable that the small areas 54 move while overlapping little by little; in this way the subtraction for one pixel is performed several times, and the smallest difference result is stored.
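Steps 90 to 94 can be sketched as follows, assuming for simplicity a 3-pixel specific direction area (the centre pixel and its two neighbours along the maximum-gradient direction, i.e. the A11–A22–A33 pattern without the extended Bij pixels); all names are illustrative.

```python
import numpy as np

# Offsets (dy, dx) of the 3-pixel specific direction area for each of
# the four directions tested in the 3x3 small area.
DIRS = {
    0: [(-1, -1), (0, 0), (1, 1)],   # A11-A22-A33 diagonal
    1: [(0, -1),  (0, 0), (0, 1)],   # A21-A22-A23 horizontal
    2: [(1, -1),  (0, 0), (-1, 1)],  # A31-A22-A13 diagonal
    3: [(-1, 0),  (0, 0), (1, 0)],   # A12-A22-A32 vertical
}

def anisotropic_difference(current, past, c=1.0):
    """Minimal sketch of steps 90-94: at each interior pixel, find the
    maximum-gradient direction in the past image, take the past-image
    maximum along that direction (step 90), subtract c*max from the
    current-day pixels of the same area (step 91), and keep the smallest
    result per pixel in SUB (steps 92-94)."""
    cur = np.asarray(current, dtype=float)
    pst = np.asarray(past, dtype=float)
    h, w = cur.shape
    sub = np.full((h, w), np.inf)            # "sufficiently large value"
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # direction of maximum density gradient in the past image
            grads = [abs(pst[y + d[0][0], x + d[0][1]] -
                         pst[y + d[2][0], x + d[2][1]])
                     for d in DIRS.values()]
            area = DIRS[int(np.argmax(grads))]
            ref = c * max(pst[y + dy, x + dx] for dy, dx in area)  # step 90
            for dy, dx in area:                                    # steps 91-94
                sal = cur[y + dy, x + dx] - ref
                sub[y + dy, x + dx] = min(sub[y + dy, x + dx], sal)
    sub[~np.isfinite(sub)] = 0.0
    return np.maximum(sub, 0.0)              # negatives replaced with zero
```

Because each pixel receives several candidate values from overlapping small areas and the smallest is kept, a bright spot present in both images is cancelled even when its past-image position is shifted along the gradient direction.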
- FIG. 13 is an image of the day including a real cancer shadow. The past image is subtracted from it according to the procedure of this embodiment, giving the result shown in FIG. 14. When the position of the cancer shadow emphasized by the subtraction processing is difficult to understand, a contour image can be separately extracted and combined with the difference image, as in the example of FIG. 15, so that the positional relationship of the abnormal shadow location becomes easier to understand.
- FIG. 17 shows the procedure for synthesizing the difference image and the contour image for each pixel.
- a threshold determination is made as to whether the contour image can be superimposed on the enhanced image at that pixel; the process branches according to whether the value is within the threshold range.
- the difference result is stored in the memory, and the processing ends.
- the contour density is stored in the memory, and the process ends.
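The per-pixel branching of FIG. 17 can be sketched as follows; the interpretation that a contour pixel "within the threshold range" means the contour value exceeds a threshold is an assumption on our part.

```python
import numpy as np

def combine_contour(diff, contour, threshold=128):
    """Sketch of the FIG. 17 per-pixel procedure: where the contour
    image value is within the threshold range (interpreted here as a
    contour being present), the contour density is stored; otherwise
    the difference result is stored."""
    diff = np.asarray(diff, dtype=float)
    contour = np.asarray(contour, dtype=float)
    # per-pixel branch: contour density where a contour exists,
    # difference result everywhere else
    return np.where(contour >= threshold, contour, diff)
```

The result is a single composite in which the enhanced abnormal shadows are shown against the anatomical outline, as in FIG. 15.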
- FIG. 18 shows the case where the density gradient in the A21–A23 direction is the largest in the above small region. The maximum value of A21, A22, A23, B21, and B24 of the past image 902, multiplied by the constant, is subtracted from pixel A21 of the image 901 of the day; the result is compared with the current SUB (the same coordinates as A21), and the smaller value is stored in SUB (the same coordinates as A21). The same applies to A22, A23, B21, and B24.
- the method of subtraction is not limited to the above example.
- the “function of the angle formed by the gradient maximum direction” is a function of the angle θ formed between the maximum density-gradient direction of the past image and that of the image of the day, and varies similarly to cos θ; however, it need not be exactly cos θ.
- the subtraction can be performed not only by the procedure shown in FIG. 7 but also by the processing shown in FIG. 19. That is:
- the maximum value max of the density value at the center pixel of the small area 54 in the specific direction area (i, j) of the past image is obtained.
- C × max is subtracted from each pixel (i, j) in the specific direction area (i, j) of the image 901 of the day, and the result is denoted sal.
- the method of obtaining the maximum density gradient direction is not limited to that of FIG. 8; as shown in FIG. 20, the density differences between pairs of pixels on lines passing through the circle C may be compared sequentially to determine the maximum density gradient direction.
- in the above, the small area 54 is 3 × 3 pixels, but it may have a different size, such as 4 × 4 or 5 × 5 pixels. Further, the size of the small area 54 may be changed depending on the type of image, such as a CT image or a DR image.
- instead of the local maximum density value, the second or third largest value, or a constant multiple of the average value, may be used; in these cases the shadow enhancement effect is still obtained. If the average value is used, a shadow-enhanced image with less noise than with the local maximum can be obtained.
- in the above, the specific direction area is an area (3 pixels) along the direction of the maximum density gradient.
- the specific direction area may instead be a small rectangular area or a small circular area.
- the shape and size of the specific direction area may be different between the past image 902 and the current day image 901.
- the specific direction area may even have a size of one vertical pixel × one horizontal pixel, that is, a single pixel. In this case, the density value of that pixel is used directly as the local density maximum.
- the present invention can be applied to a DR image as shown in FIG.
- the density value of the part displayed in black is larger than the density value of the part displayed in white.
- the difference result is shown as image 183 in FIG. 22, and the composite as image 184.
- a displacement vector is calculated and the divided image is deformed; this is repeated until a predetermined number of divisions is reached, yielding the final displacement vector.
- an image obtained by deforming the original image (the image before division and deformation) with the final displacement vector is used.
- images a and b are reduced (reduction ratio r) to a1 and b1, and the correlation between a1 and b1 (subtracting to determine where the difference is minimum) is evaluated while translating b1, to obtain image b2. Next, the correlation between a1 and rotated versions of b2 is evaluated to obtain image b3.
- the full-size image b is then translated by dx / reduction ratio r and dy / reduction ratio r and rotated by the angle ang, to obtain image b4.
- a displacement vector for alignment is obtained.
- image a is divided into four to obtain regions A4, and image b4 is divided into four to obtain regions B4.
- the image b4 (the original image before division and deformation) is deformed using the displacement vector N, and the registered image is obtained.
- the number of image divisions is not limited to 32 ⁇ 32 divisions, and different values may be set according to the images.
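The coarse translation search at the heart of the alignment (correlating by subtraction to find the minimum difference) can be sketched as follows; in the full procedure this would run on the reduced images a1 and b1, the result would be scaled back by the reduction ratio, and rotation and block-wise refinement would follow. Names are illustrative.

```python
import numpy as np

def coarse_translation(a, b, search=4):
    """Sketch of the coarse alignment step: brute-force search for the
    integer translation (dx, dy) of image b that minimises the sum of
    absolute differences against image a ("subtract to determine
    whether the difference is minimum")."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    best, best_dxdy = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            # circular shift stands in for translation in this sketch
            shifted = np.roll(np.roll(b, dy, axis=0), dx, axis=1)
            sad = np.abs(a - shifted).sum()
            if sad < best:
                best, best_dxdy = sad, (dx, dy)
    return best_dxdy
```

Dividing the images into blocks and repeating this search per block would yield the per-region displacement vectors used to deform image b4.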
- Fig. 28 and Fig. 29 show the flow of screen operation.
- Step 52: Judge “Next slice display button pressed?” If YES, jump to step 57; if NO, proceed to step 53.
- Step 53: Judge “Previous slice display button pressed?” If YES, jump to step 58; if NO, proceed to step 54.
- Step 54: Judge “Positioning & difference button pressed?” If YES, jump to step 59; if NO, proceed to step 55.
- Step 55: Judge “Difference FP delete button pressed?” If YES, jump to step 60; if NO, proceed to step 56.
- Step 56: Judge “End button pressed?” If YES, end; if NO, return to step 52.
- Step 57: Display the next slice and return to step 52.
- a density reduction method using a shadow feature will be described with reference to FIG. 30. (Step 61)
- the narrowed portion of the binarized region is cut to make an isolated region.
- other determination processing is performed; if YES, jump to step 69 to reduce the density as a false positive.
- a circle is marked as a cancer shadow candidate at a high density portion of the difference image.
- it is determined whether all the shadows have been judged. If YES, the process ends; if NO, the process returns to judge the next shadow.
- image 3 is obtained by performing difference processing between image 1 of the day and past image 2. If there is no cancer shadow in the past image and there is one in the image of the day, the difference image mainly shows the cancer shadow and portions caused by displacement of the entire image. Reducing the density of everything other than the cancer shadow therefore makes cancer shadows easier to find.
- FIG. 32 shows binary images obtained after emphasizing the edges of the CT image of the day and the past image.
- the edge correlation image 20 is obtained by correlating these images.
- in the difference image 3 of FIG. 31, the areas where the value of the edge correlation image 20 is 1 are reduced in density.
- a pseudo-shadow reduced image 21, in which false positives remaining after the difference are reduced in density, is thereby obtained.
- the image 20 can also be obtained by correlating regions whose density exceeds a threshold, without binarizing in advance.
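The edge-correlation reduction can be sketched as follows, taking the "correlation" of the binary edge images to be a pixel-wise logical AND and the amount of density reduction to be a free factor; both are assumptions, since the text only states that matching edge areas are lowered in density.

```python
import numpy as np

def reduce_false_positives(diff, edge_cur, edge_past, factor=0.2):
    """Sketch of the edge-correlation false-positive reduction: binary
    edge images of the current-day and past images are combined by a
    pixel-wise AND (standing in for the correlation image 20), and
    pixels of the difference image where the result is 1 have their
    density reduced."""
    diff = np.asarray(diff, dtype=float)
    corr = (np.asarray(edge_cur) > 0) & (np.asarray(edge_past) > 0)  # image 20
    out = diff.copy()
    out[corr] *= factor          # pseudo-shadow reduced image 21
    return out
```

Edges present in both images are likely anatomy misaligned by registration (pseudo-shadows), so damping the difference there leaves genuinely new shadows standing out.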
- FIG. 33 shows a case where this processing is applied to a DR image.
- an image with reduced false-positive density may still contain shadows suspected of being cancer. Instead of subtraction, it is also possible to combine and display the results of detecting abnormal-shadow candidates in each image alone, and to seek the doctor's judgment.
- p1 to p5 may be replaced with the maximum value of p1 to p5. Even if only p2 to p4 are replaced with the maximum value of p2 to p4, the density lowering effect is obtained.
- the medical image processing apparatus 10 makes false positive shadows (pseudo-shadows) less likely to occur by accurately aligning the image of the day and the past image, and reduces the density of those that do occur, so that abnormal shadows such as cancer can be found easily.
- a CT image 21 by the simple difference method and a shadow-enhanced image 22 by the anisotropic difference method are displayed on the CRT 15.
- the cursor of the mouse 17 is displayed on the CRT 15 in the CT image 21 and the shadow-enhanced image 22, respectively.
- the respective cursors are also linked.
- the position is easy to grasp.
- the image 31 of the day, the past image 32, and the difference image may be displayed with the cursor of the mouse 17 linked among them.
- the method of finding the maximum value in the search area is to scan, point by point (sequential scanning points), the same addresses of the M-th past image that has the highest correlation with the N-th image of the day. During the scanning, a two-dimensional region parallel to the maximum density gradient direction, as shown in FIG. 8, 9, or 18, is assumed.
- the N+1st and N−1st images before and after (or above and below) the N-th image of the day, together with the M+1st and M−1st past images, are used to perform three-dimensional processing.
- the subtraction value at the point (N, X1, Y1) is obtained by subtracting the density value v1 of the past image point (M, x1, y1) from the density value V1 of the image-of-the-day point (N, X1, Y1).
- the subtraction value at the point (N, X2, Y2) is obtained by subtracting the density value v2 of the past image point (M−1, x2, y2) from the density value V2 of the image-of-the-day point (N, X2, Y2).
- the search area is parallel to the direction of the maximum density gradient at the sequential scanning points of the M-th image. If the subtraction value is negative, it may be replaced with zero or with its absolute value. Conversely, the subtraction may be performed by subtracting the current-day image from the past image.
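The subtraction against a search-area maximum and the negative-value handling above might be sketched as follows; the rectangular neighborhood and all names are assumptions made for illustration (the patent's search area can instead follow the density-gradient direction):

```python
def subtract_local_max(current, past, x, y, half=1, negative="zero"):
    """Subtract the maximum past-image density in a small neighborhood of
    (x, y) from the current-day density at (x, y).  A negative result is
    replaced with zero or its absolute value, as the text allows."""
    h, w = len(past), len(past[0])
    neighborhood = [past[j][i]
                    for j in range(max(0, y - half), min(h, y + half + 1))
                    for i in range(max(0, x - half), min(w, x + half + 1))]
    d = current[y][x] - max(neighborhood)
    if d < 0:
        d = 0 if negative == "zero" else abs(d)
    return d
```

Swapping the `current` and `past` arguments gives the reversed subtraction that the text also permits.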
- the search area for finding the maximum value may be a cube including the scanning points.
- the boundary of the search area may be on the interpolated image.
- the boundary of the search area may extend over a plurality of original images.
- FIG. 44 shows a case where a MIP (Maximum Intensity Projection) image is created from at least two images, including a past image, at a slice position corresponding to the current-day image.
- position calculation for blood vessels and the like is not performed; instead, the maximum value of the pixel values near (x, y) in the past image b is subtracted from the pixel value at the small-area center coordinate (x, y) of the current-day image a.
- the principle is the same as that described in FIG.
- the maximum value of the pixel values near (x, y) in the MIP image that includes the past image b is subtracted from the pixel value at (x, y) of the current-day image a.
- the MIP image 81 is created from at least two images including an image b3 corresponding to the image a1.
- the MIP image 81 may instead be created by reconstructing a thick-slice image from the measurement data.
- a MIP image Bn is similarly created.
- the difference image 1 is a difference between the image a1 and the MIP image B1
- the difference image 2 is a difference between the image a2 and the MIP image B2.
- the difference image n is a difference between the image an and the MIP image Bn.
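The MIP construction and the per-slice differences listed above can be sketched with plain nested lists; the helper names and the zero-clamp for negative values are illustrative assumptions:

```python
def mip(slices):
    """Pixelwise maximum over a stack of 2-D slices (Maximum Intensity
    Projection), as used to build MIP image Bn from neighboring past slices."""
    h, w = len(slices[0]), len(slices[0][0])
    return [[max(s[y][x] for s in slices) for x in range(w)] for y in range(h)]

def difference_image(current_slice, mip_slice):
    """Difference image n = current-day image an minus MIP image Bn, with
    negative values clamped to zero (one of the options the text allows)."""
    h, w = len(current_slice), len(current_slice[0])
    return [[max(current_slice[y][x] - mip_slice[y][x], 0)
             for x in range(w)] for y in range(h)]
```

Because the MIP absorbs small slice-to-slice displacement of structures such as blood vessels, the difference image suppresses their residual shadows.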
- FIG. 45 shows, as a modified example of interpolation-image creation, a case where the MIP image Bn is created, in a vertically asymmetric manner, from four images including the image b6 corresponding to the image a4.
- the target part is extracted from a plurality of images including the image b, which was acquired at an earlier date. This step may be omitted when the part is the head.
- a MIP image c is created using the extracted images; if the extraction step was omitted, the MIP image c is created from the images as they are. When the MIP image is created, the range over which the maximum value is sought among adjacent slice images may be restricted to the range of CT values CT1 to CT2. A tomographic image with a large slice thickness may also be reconstructed and used instead of the MIP image c.
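Restricting the maximum search to a CT-value range CT1 to CT2, as mentioned above, might look like this; the fallback to the plain maximum when no value lies in the range is an assumption, since the text does not specify that case:

```python
def max_in_ct_range(values, ct1, ct2):
    """Maximum among candidate CT values restricted to [ct1, ct2]; falls back
    to the unrestricted maximum if no value qualifies (an assumption here)."""
    in_range = [v for v in values if ct1 <= v <= ct2]
    return max(in_range) if in_range else max(values)
```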
- images a and c are aligned. If the average density and the density histogram are also matched as necessary at the time of this alignment, the alignment processing becomes easier.
- initial values are substituted for x and y; these are the start addresses for reading the image.
- the maximum pixel value (or a value obtained by multiplying it by a certain factor) within the small area including (x, y) of the image c is subtracted from the pixel value at point (x, y) of the image a.
- Step 16: it is determined whether the image read address x has reached its maximum value. If it has not, the process goes to Step 16; if it has, the process goes to Step 16A.
- Step 16B: 1 is added to y; that is, the image read address y is updated.
- Step 16C: it is determined whether the image read address y has reached its maximum value. If it has not, the process proceeds to Step 16C; if it has, the process ends.
- x is reset to its initial value; that is, the read address x is set back to the start address while the updated image read address y is retained.
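Steps 16 through 16C describe a raster scan: advance x until it reaches its maximum, then reset x, advance y, and repeat until y reaches its maximum. A sketch combining that loop with the small-area subtraction (the names and the square small area are illustrative assumptions):

```python
def scan_difference(image_a, image_c, half=1, clamp=True):
    """Raster-scan sketch of steps 16-16C: for each (x, y) of image a,
    subtract the maximum pixel value in a small area of image c around
    (x, y), optionally clamping negative results to zero."""
    h, w = len(image_a), len(image_a[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):            # outer loop: y advances after x reaches its maximum
        for x in range(w):        # inner loop: x advances across the row
            lo_y, hi_y = max(0, y - half), min(h, y + half + 1)
            lo_x, hi_x = max(0, x - half), min(w, x + half + 1)
            m = max(image_c[j][i]
                    for j in range(lo_y, hi_y)
                    for i in range(lo_x, hi_x))
            d = image_a[y][x] - m
            out[y][x] = max(d, 0) if clamp else d
    return out
```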
- the detection process (B) using the difference image is performed.
- Step 174: it is determined whether the operator has selected detection-rate priority. If detection-rate priority is selected, the process goes to step 174; otherwise, it goes to step 175.
- the OR processing is performed on the detection points of both A and B, and then the processing ends.
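The priority choice and the OR processing can be sketched as follows; treating the non-priority branch as an AND (intersection, favoring fewer false positives) is an assumption, since the text does not describe step 175:

```python
def combine_detections(points_a, points_b, prefer_detection_rate=True):
    """Combine detection points from process A (original image) and process B
    (difference image).  OR (union) favors the detection rate; AND
    (intersection) is an assumed alternative favoring fewer false positives."""
    a, b = set(points_a), set(points_b)
    return a | b if prefer_detection_rate else a & b
```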
- difference processing need not be performed between every pair of current-day and past images; instead, only the past image corresponding to the image in which an abnormal location was detected by the abnormal-shadow detection process on the current-day image may be automatically retrieved from the database and used for difference processing with the current-day image. In the processing described above, the past image and the current-day image may be exchanged, with a similar result.
- shadows in an image are enhanced, so that an abnormal shadow such as cancer can be found easily.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Quality & Reliability (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
- Image Processing (AREA)
- Ultra Sonic Daignosis Equipment (AREA)
Description
Claims
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005512118A JP4408863B2 (ja) | 2003-07-28 | 2004-07-28 | 医用画像処理装置及び方法 |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003-280987 | 2003-07-28 | ||
JP2003280987 | 2003-07-28 | ||
JP2004-097873 | 2004-03-30 | ||
JP2004097873 | 2004-03-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005009242A1 true WO2005009242A1 (ja) | 2005-02-03 |
Family
ID=34106914
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/011111 WO2005009242A1 (ja) | 2003-07-28 | 2004-07-28 | 医用画像処理装置及び方法 |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP4408863B2 (ja) |
WO (1) | WO2005009242A1 (ja) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006223739A (ja) * | 2005-02-21 | 2006-08-31 | Hitachi Medical Corp | 差分画像生成装置 |
JP2007000428A (ja) * | 2005-06-24 | 2007-01-11 | Ge Medical Systems Global Technology Co Llc | 画像判定装置、画像判定方法およびx線ct装置 |
JP2008259703A (ja) * | 2007-04-12 | 2008-10-30 | Fujifilm Corp | 対象領域表示方法、装置およびプログラム |
WO2010016293A1 (ja) * | 2008-08-08 | 2010-02-11 | コニカミノルタエムジー株式会社 | 医用画像表示装置、医用画像表示方法及びプログラム |
JP2011024763A (ja) * | 2009-07-24 | 2011-02-10 | Hitachi Ltd | 画像処理方法および画像処理装置 |
JP2015503778A (ja) * | 2011-12-24 | 2015-02-02 | エコール ドゥ テクノロジー スペリウールEcole De Technologie Superieure | ノイズに対して堅牢な画像レジストレーション方法およびシステム |
JP2016167233A (ja) * | 2015-03-10 | 2016-09-15 | 富士通株式会社 | 補間画像生成装置、補間画像生成方法及び補間画像生成用コンピュータプログラム |
WO2017098888A1 (ja) * | 2015-12-10 | 2017-06-15 | 株式会社日立製作所 | 画像処理装置及び画像処理方法 |
JP2017142553A (ja) * | 2016-02-08 | 2017-08-17 | コニカミノルタ株式会社 | 表示制御装置、医用画像調整方法、プログラム及び医用画像管理システム |
WO2019026680A1 (ja) * | 2017-08-03 | 2019-02-07 | キヤノン株式会社 | 画像処理装置、画像処理方法、画像処理システム及びプログラム |
JP2020058623A (ja) * | 2018-10-10 | 2020-04-16 | キヤノンメディカルシステムズ株式会社 | 医用画像処理装置、医用画像処理方法及び医用画像処理プログラム |
JPWO2021177157A1 (ja) * | 2020-03-05 | 2021-09-10 | ||
US11941179B2 (en) | 2010-10-06 | 2024-03-26 | Nuvasive, Inc. | Imaging system and method for use in surgical and interventional medical procedures |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001169182A (ja) * | 1999-12-07 | 2001-06-22 | Fuji Photo Film Co Ltd | 画像表示方法および画像表示装置 |
JP2001170035A (ja) * | 1999-12-16 | 2001-06-26 | Fuji Photo Film Co Ltd | 画像表示方法および画像表示装置 |
JP2001325583A (ja) * | 2000-03-08 | 2001-11-22 | Fuji Photo Film Co Ltd | 画像処理方法および画像処理装置 |
JP2003126076A (ja) * | 2001-10-24 | 2003-05-07 | Hitachi Medical Corp | X線ct装置 |
JP2003153082A (ja) * | 2001-08-27 | 2003-05-23 | Fuji Photo Film Co Ltd | 画像の位置合わせ装置および画像処理装置 |
2004
- 2004-07-28 JP JP2005512118A patent/JP4408863B2/ja not_active Expired - Fee Related
- 2004-07-28 WO PCT/JP2004/011111 patent/WO2005009242A1/ja active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001169182A (ja) * | 1999-12-07 | 2001-06-22 | Fuji Photo Film Co Ltd | 画像表示方法および画像表示装置 |
JP2001170035A (ja) * | 1999-12-16 | 2001-06-26 | Fuji Photo Film Co Ltd | 画像表示方法および画像表示装置 |
JP2001325583A (ja) * | 2000-03-08 | 2001-11-22 | Fuji Photo Film Co Ltd | 画像処理方法および画像処理装置 |
JP2003153082A (ja) * | 2001-08-27 | 2003-05-23 | Fuji Photo Film Co Ltd | 画像の位置合わせ装置および画像処理装置 |
JP2003126076A (ja) * | 2001-10-24 | 2003-05-07 | Hitachi Medical Corp | X線ct装置 |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4626984B2 (ja) * | 2005-02-21 | 2011-02-09 | 株式会社日立メディコ | 差分画像生成装置及び方法 |
JP2006223739A (ja) * | 2005-02-21 | 2006-08-31 | Hitachi Medical Corp | 差分画像生成装置 |
JP2007000428A (ja) * | 2005-06-24 | 2007-01-11 | Ge Medical Systems Global Technology Co Llc | 画像判定装置、画像判定方法およびx線ct装置 |
JP2008259703A (ja) * | 2007-04-12 | 2008-10-30 | Fujifilm Corp | 対象領域表示方法、装置およびプログラム |
WO2010016293A1 (ja) * | 2008-08-08 | 2010-02-11 | コニカミノルタエムジー株式会社 | 医用画像表示装置、医用画像表示方法及びプログラム |
JP2011024763A (ja) * | 2009-07-24 | 2011-02-10 | Hitachi Ltd | 画像処理方法および画像処理装置 |
US11941179B2 (en) | 2010-10-06 | 2024-03-26 | Nuvasive, Inc. | Imaging system and method for use in surgical and interventional medical procedures |
JP2015503778A (ja) * | 2011-12-24 | 2015-02-02 | エコール ドゥ テクノロジー スペリウールEcole De Technologie Superieure | ノイズに対して堅牢な画像レジストレーション方法およびシステム |
JP2016167233A (ja) * | 2015-03-10 | 2016-09-15 | 富士通株式会社 | 補間画像生成装置、補間画像生成方法及び補間画像生成用コンピュータプログラム |
WO2017098888A1 (ja) * | 2015-12-10 | 2017-06-15 | 株式会社日立製作所 | 画像処理装置及び画像処理方法 |
JP2017142553A (ja) * | 2016-02-08 | 2017-08-17 | コニカミノルタ株式会社 | 表示制御装置、医用画像調整方法、プログラム及び医用画像管理システム |
WO2019026680A1 (ja) * | 2017-08-03 | 2019-02-07 | キヤノン株式会社 | 画像処理装置、画像処理方法、画像処理システム及びプログラム |
EP3662835A4 (en) * | 2017-08-03 | 2021-04-28 | Canon Kabushiki Kaisha | IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD, IMAGE PROCESSING SYSTEM, AND PROGRAM |
US11302007B2 (en) | 2017-08-03 | 2022-04-12 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, image processing system, and storage medium |
JP7140475B2 (ja) | 2017-08-03 | 2022-09-21 | キヤノン株式会社 | 画像処理装置、画像処理方法、画像処理システム及びプログラム |
JP2019025240A (ja) * | 2017-08-03 | 2019-02-21 | キヤノン株式会社 | 画像処理装置、画像処理方法、画像処理システム及びプログラム |
JP2020058623A (ja) * | 2018-10-10 | 2020-04-16 | キヤノンメディカルシステムズ株式会社 | 医用画像処理装置、医用画像処理方法及び医用画像処理プログラム |
JP7220542B2 (ja) | 2018-10-10 | 2023-02-10 | キヤノンメディカルシステムズ株式会社 | 医用画像処理装置、医用画像処理方法及び医用画像処理プログラム |
JPWO2021177157A1 (ja) * | 2020-03-05 | 2021-09-10 | ||
JP7394959B2 (ja) | 2020-03-05 | 2023-12-08 | 富士フイルム株式会社 | 医用画像処理装置、医用画像処理方法及びプログラム、医用画像表示システム |
Also Published As
Publication number | Publication date |
---|---|
JP4408863B2 (ja) | 2010-02-03 |
JPWO2005009242A1 (ja) | 2006-09-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7653263B2 (en) | Method and system for volumetric comparative image analysis and diagnosis | |
US7916918B2 (en) | Diagnostic system for multimodality mammography | |
US8682054B2 (en) | Method and system for propagation of myocardial infarction from delayed enhanced cardiac imaging to cine magnetic resonance imaging using hybrid image registration | |
US5359513A (en) | Method and system for detection of interval change in temporally sequential chest images | |
CN107886508B (zh) | 差分减影方法和医学图像处理方法及系统 | |
EP1704532B1 (en) | System and method for filtering a medical image | |
US8423124B2 (en) | Method and system for spine visualization in 3D medical images | |
JP5643304B2 (ja) | 胸部トモシンセシスイメージングにおけるコンピュータ支援肺結節検出システムおよび方法並びに肺画像セグメント化システムおよび方法 | |
JP4640845B2 (ja) | 画像処理装置およびそのプログラム | |
KR101028365B1 (ko) | 연속된 컴퓨터 단층촬영의 폐결절 다단계 정합 방법 및 장치 | |
US9675317B2 (en) | Interface identification apparatus and method | |
Neubert et al. | Automated 3D segmentation of vertebral bodies and intervertebral discs from MRI | |
JP4408863B2 (ja) | 医用画像処理装置及び方法 | |
EP2601637B1 (en) | System and method for multi-modality segmentation of internal tissue with live feedback | |
JP2009226043A (ja) | 医用画像処理装置及び異常陰影検出方法 | |
US8306354B2 (en) | Image processing apparatus, method, and program | |
US8577101B2 (en) | Change assessment method | |
JP2002219123A (ja) | 投影変換装置及び方法並びに経時差分画像作成装置及び方法 | |
KR101274530B1 (ko) | 이미지 정합 기반 흉부영상 진단 시스템 및 방법 | |
JP2008029798A (ja) | 胸部x線像とその左右反転像を用いるサブトラクション方法 | |
KR101028798B1 (ko) | 다중 페이즈 간 ct영상들의 정합을 이용한 간세포암 검출방법 | |
JP4082718B2 (ja) | 画像認識方法および画像表示方法および画像認識装置 | |
US8165375B2 (en) | Method and system for registering CT data sets | |
JP6642048B2 (ja) | 医療画像表示システム、医療画像表示プログラム及び医療画像表示方法 | |
JP2015136480A (ja) | 3次元医用画像表示制御装置およびその作動方法並びに3次元医用画像表示制御プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2005512118 Country of ref document: JP |
|
122 | Ep: pct application non-entry in european phase |