WO2006051831A1 - Image creating method and device - Google Patents

Image creating method and device

Info

Publication number
WO2006051831A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
color
pixel
image generation
luminance value
Prior art date
Application number
PCT/JP2005/020564
Other languages
French (fr)
Japanese (ja)
Inventor
Osamu Arai
Original Assignee
Hitachi Medical Corporation
Priority date
Filing date
Publication date
Application filed by Hitachi Medical Corporation filed Critical Hitachi Medical Corporation
Priority to JP2006544930A priority Critical patent/JP4980723B2/en
Publication of WO2006051831A1 publication Critical patent/WO2006051831A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/001 Texturing; Colouring; Generation of texture or colour

Definitions

  • The present invention relates to an image generation apparatus and an image generation method that use image data acquired by imaging, with a medical image diagnostic apparatus such as an X-ray CT apparatus, an MRI apparatus, or an ultrasonic diagnostic apparatus, a subject into which a contrast medium has been injected, and that generate a color image representing the enhancement effect of the contrast agent.
  • Some medical diagnostic imaging apparatuses such as an X-ray CT apparatus, an MRI apparatus, and an ultrasonic diagnostic apparatus, for example, photograph a subject into which a contrast medium for contrasting blood flow is injected.
  • Some of the acquired images show the enhancement effect of the contrast agent, in which the luminance value of a certain tissue becomes higher than it was before the injection of the contrast agent.
  • In Patent Document 1, using an electronic endoscope apparatus, which is one example of a medical image diagnostic apparatus, images of a diseased site that changes with time under a contrast agent are obtained, and the state of change of the diseased site due to the contrast agent is displayed in an easily understandable manner. Specifically, the three-primary-color (R, G, B) image data from the electronic endoscope apparatus is converted into the HSV color system, consisting of hue, saturation, and lightness, which is easy for humans to understand, and then displayed.
  • However, when only one image can be displayed at a time, for example because of the size of the display unit, and the operator selects and displays a single time phase from among the multiple time-phase images, the operator cannot determine whether a tissue with a high luminance value in the selected image owes that value to the enhancement effect or whether its luminance value is originally high.
  • Patent Document 1 does not disclose means or a method for making it easy to understand information on temporal changes in tissue, such as whether a tissue having a high luminance value on one image is enhanced by the effect of a contrast agent or originally has a high luminance value. Patent Document 1: JP-A-63-79632
  • An object of the present invention is to provide an image generation apparatus and an image generation method for generating an image that allows easy understanding of information on temporal changes in tissue caused by a contrast agent.
  • The image generation method of the present invention uses the image data acquired by imaging the subject into which the contrast medium has been injected, and generates an image representing the temporal changes in tissue caused by the contrast medium.
  • The image generating apparatus of the present invention includes input means for inputting a plurality of image data having different time phases acquired by imaging a subject into which a contrast medium has been injected, storage means for storing the plurality of image data having different time phases, and calculation means for generating a medical image using the plurality of image data having different time phases. The calculation means extracts, for each pixel, from the plurality of image data having different time phases, one or more feature amounts representing temporal changes in the tissue due to the contrast agent, converts them into different color information, and generates a color image colored so that, at least for each pixel in a partial region, the information of the temporal change is represented.
  • This makes it easy to understand, on a single image, information on changes in the tissue over time due to the contrast agent, such as whether a tissue having a high luminance value owes that value to the enhancement effect of the contrast agent or whether its luminance value is originally high.
  • FIG. 1 is a schematic configuration diagram of an X-ray CT apparatus employing an image generating apparatus in a medical image diagnostic apparatus according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram showing color composition processing of the X-ray CT apparatus shown in FIG. 1.
  • FIG. 3 is a diagram showing a characteristic curve representing a change in luminance value of each tissue.
  • FIG. 4 is a diagram showing a characteristic curve of the blood vessel A shown in FIG. 3.
  • FIG. 5 is a flowchart showing a color composition processing operation of the X-ray CT apparatus shown in FIG. 1.
  • FIG. 6 is a flowchart showing a synthesis process that is a main part shown in FIG.
  • FIG. 7 is a polar coordinate plane showing an example of calculating time T when the luminance value becomes maximum (minimum).
  • FIG. 8 is another polar coordinate plane showing an example of calculating time T when the luminance value becomes maximum (minimum).
  • FIG. 9 is a front view showing a display example of the image generation device in the medical diagnostic imaging apparatus shown in FIG. 1.
  • FIG. 10 is an explanatory diagram showing a main part of the image generation apparatus in the medical diagnostic imaging apparatus shown in FIG. 9.
  • FIG. 11 is an explanatory diagram showing another main part of the image generation apparatus in the medical image diagnostic apparatus shown in FIG. 9.
  • FIG. 12 is an explanatory view showing another display example in the image generating device in the medical diagnostic imaging apparatus shown in FIG. 1.
  • FIG. 13 is a schematic configuration diagram of an ultrasonic apparatus employing an image generating apparatus in a medical diagnostic imaging apparatus according to a second embodiment of the present invention.
  • FIG. 14 is a diagram showing an example of displaying an ultrasonic image and a color composite image in parallel.
  • FIG. 1 shows an embodiment of the present invention and is a schematic configuration diagram of an example in which the image generation apparatus of the present invention is used in an X-ray CT apparatus as a medical image diagnostic apparatus.
  • The scanner 1, rotatably supported by the scanner control unit 17, has an opening 1a formed in its center, in which a bed 2 on which the subject is placed is provided so that its position can be adjusted by the bed control unit 16.
  • The X-ray source 3 and the multi-row X-ray detector 4 are arranged opposite each other across the subject. When an X-ray beam is emitted from the X-ray source 3 under the control of the high-voltage switching unit 13, the high-voltage generator 14, and the X-ray control unit 15, it passes through the subject and is detected by the multi-row X-ray detector 4.
  • The output from the multi-row X-ray detector 4 is digitized by an A/D converter via an amplifier and input to the image processing device 5 of the processing device 18, where image reconstruction processing is performed.
  • the reconstructed image data is stored in a storage device (not shown) in the image processing device 5, for example.
  • the reconstructed image is displayed on the display unit 11 on the console 12 by the display processing device 6.
  • An image generating device 19 and an image display device 6 are connected to the image processing device 5.
  • The image generation device 19 includes a luminance value feature quantity acquisition unit 7 that acquires a feature quantity corresponding to the CT value (that is, the luminance value), an enhancement effect degree feature quantity acquisition unit 8 that acquires a feature quantity corresponding to the degree of the enhancement effect, and an enhancement effect time feature quantity acquisition unit 9 that acquires a feature quantity corresponding to the time at which the enhancement effect appears.
  • It further includes a preprocessing arithmetic unit 20 that performs operations common to these three feature quantity acquisition units 7 to 9, and a color composite image generation unit 10 that generates a color image using the feature amounts from the units 7 to 9.
  • Each of the feature amount acquisition units 7 to 9 is connected to the image processing device 5 via the preprocessing arithmetic unit 20; image data is input from the image processing device 5, each feature amount is acquired based on that image data, and the results are output to the color composite image generation unit 10.
  • the preprocessing arithmetic unit 20 has an input unit (not shown) for inputting image data.
  • the image display device 6 is connected to the image processing device 5, the color composite image generation unit 10, and the display unit 11.
  • The reconstructed image from the image processing device 5 and the color image from the color composite image generation unit 10 are each displayed on the display unit 11.
  • the image generating device 19 may be configured to include the image processing device 5, the display processing device 6, the display unit 11, and the console 12.
  • FIG. 2 is a block explanatory diagram showing the color image generation processing operation in the image generation device 19 described above.
  • Whereas a conventional CT image is a grayscale representation based only on CT values (that is, luminance values), the image generation device 19 of the present invention uses the luminance value (CT value) data 19 obtained by the luminance value feature quantity acquisition unit 7 as the lightness (Value) component, the enhancement effect time data 20 obtained by the enhancement effect time feature quantity acquisition unit 9 from the time at which the enhancement effect due to the contrast agent appears as the hue (Hue) component, and the enhancement effect degree data 21 obtained by the enhancement effect degree feature quantity acquisition unit 8 as the saturation (Saturation) component, and the color composite image generation unit 10 combines these components into a single color composite image.
  • Alternatively, the luminance value feature quantity may be assigned to the lightness component and one of the enhancement effect time feature quantity and the enhancement effect degree feature quantity to either the hue component or the saturation component, generating a color image with only two components including the lightness component. A color image may also be generated using only the hue and saturation components, or only one of the components. In either case, the unassigned component is fixed to an arbitrary constant value when the color image is generated.
  • FIG. 3 shows characteristic curves representing temporal changes in the luminance values of blood vessel 47, blood vessel 48, liver 49, and bone 50, respectively.
  • The temporal change of the blood vessel 47 is indicated by the characteristic curve 23, the temporal change of the blood vessel 48 and the liver 49 by the characteristic curve 24, and the temporal change of the bone 50 by the straight line 25.
  • the blood vessel 47 is enhanced by the contrast agent
  • the entire liver 49 and the blood vessel 48 are enhanced by the contrast agent.
  • the CT value of the bone 50 remains constant regardless of time.
  • FIG. 4 shows the characteristic curve 23 representing the temporal change in the luminance value of the blood vessel 47 in FIG. 3. From this characteristic curve 23, the time T47 at which the luminance value in the blood vessel 47 becomes maximum, the maximum luminance value I47, and the luminance value change amount D47 can each be acquired. Similarly, for each organ or pixel, the time Tn at which the luminance value of that organ or pixel becomes maximum, the maximum luminance value In, and the luminance value change amount Dn can be obtained from the characteristic curve representing the temporal change in its luminance value.
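As a sketch of this per-pixel feature extraction (the function name and the sample curve are illustrative, not from the patent), the three feature amounts Tn, In, and Dn can be read off a time-luminance curve as follows:

```python
import numpy as np

def curve_features(values, times):
    """From one organ/pixel time-luminance curve, extract the three
    feature amounts described above:
      Tn - time at which the luminance value becomes maximum,
      In - the maximum luminance value,
      Dn - the luminance value change amount (here taken as max - min).
    """
    values = np.asarray(values, dtype=float)
    times = np.asarray(times, dtype=float)
    k = int(np.argmax(values))          # index of the peak
    return times[k], values[k], values.max() - values.min()

# Example: a curve like that of blood vessel 47, peaking at time phase 3.
Tn, In, Dn = curve_features([100, 180, 250, 200, 150], [1, 2, 3, 4, 5])
```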
  • The luminance value feature quantity acquisition unit 7 shown in FIG. 2 acquires, as the feature quantity of the luminance value I, the maximum (minimum) luminance value In, which is the maximum (minimum) value of the CT value. Alternatively, the CT value of any one tomographic image may be used as the maximum (minimum) luminance value In.
  • The enhancement effect degree feature quantity acquisition unit 8 acquires, as the feature quantity corresponding to the degree D of the enhancement effect, the luminance value change amount Dn, which is the change amount of the CT value.
  • The enhancement effect time feature quantity acquisition unit 9 acquires, as the feature quantity corresponding to the time T at which the enhancement effect appears, the time Tn at which the luminance value (that is, the CT value) becomes maximum (minimum). Alternatively, the time Tn at which the amount of change in the luminance value becomes equal to or greater than a predetermined threshold may be used.
  • the threshold value in this case can be set to 1/2 of the maximum (minimum) change amount of the luminance value.
  • Alternatively, the time Tn at which the gradient of the luminance value change exceeds a predetermined threshold may be used.
  • the threshold in this case can be set to 1/2 of the maximum (minimum) gradient of the luminance value.
  • Although acquisition of three feature amounts, the luminance value feature amount, the enhancement effect degree feature amount, and the enhancement effect time feature amount, has been described, at least one of them may be used, or four or more feature amounts may be acquired and two or three of them selected.
  • After that, the color composite image generation unit 10 generates a color image that displays the feature amounts simultaneously. By comparing the displayed feature values, the operator can determine whether the luminance value of a tissue with a high luminance value is due to the enhancement effect of the contrast agent or whether the CT value is originally large. To generate such a color image, the color composite image generation unit 10 composes a color image displayed in color by, for example, the HSV color system. That is, the color composite image generation unit 10 assigns the time T at which the enhancement effect appears to the hue (Hue) component, the degree D of the enhancement effect to the saturation (Saturation) component, and the CT value I to the lightness (Value) component.
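A minimal sketch of this HSV assignment (the normalization ranges are assumptions for illustration; Python's standard `colorsys` module performs the HSV-to-RGB conversion):

```python
import colorsys

def compose_color(T, D, I, T_max=10.0, D_max=200.0, I_max=255.0):
    """Map the three feature amounts onto HSV as described above:
    enhancement time T -> hue, enhancement degree D -> saturation,
    CT value I -> value (lightness). Returns RGB in [0, 1]."""
    h = min(max(T / T_max, 0.0), 1.0)
    s = min(max(D / D_max, 0.0), 1.0)
    v = min(max(I / I_max, 0.0), 1.0)
    return colorsys.hsv_to_rgb(h, s, v)
```

With D = 0 (no enhancement) the saturation is zero and the pixel degenerates to the conventional grayscale display driven only by the CT value.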
  • When the luminance value change amount Dn is used as the feature amount representing the degree D of the enhancement effect, the change amount D50 is small for a tissue without an enhancement effect, such as the bone 50, whereas for the blood vessel 47 the change amount D47 is large. This luminance value change amount Dn is therefore acquired for each organ or pixel and associated with the saturation component of the HSV color system.
  • The color composite image generation unit 10 compares the acquired luminance value change amount Dn with a preset threshold value; if Dn is equal to or less than the threshold, it determines that there is no enhancement effect and displays the pixel in grayscale, as in the conventional case. If, on the other hand, Dn is equal to or greater than the threshold and the unit determines that there is an enhancement effect, Dn is assigned to the saturation component for color display in the HSV color system.
  • the threshold value in this case can be set to 1/2 of the maximum (minimum) value of the luminance value change amount Dn.
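The threshold rule above can be sketched as follows (a hypothetical helper; the patent only fixes the threshold at half the maximum change amount):

```python
def saturation_for(Dn, Dn_max):
    """Gate the saturation by the change amount Dn: at or below the
    threshold (half of the maximum change amount, as suggested above)
    the pixel is treated as un-enhanced and gets zero saturation,
    i.e. a grayscale display; above it, Dn is mapped linearly into
    [0, 1] as the saturation component."""
    threshold = Dn_max / 2.0
    if Dn <= threshold:
        return 0.0                      # no enhancement effect: grayscale
    return min(Dn / Dn_max, 1.0)
```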
  • the operator can easily identify that the tissue displayed in color by the saturation component is due to the enhancement effect.
  • Portions without an enhancement effect are displayed in grayscale by the lightness component, just as with conventional CT values. The operator can therefore easily determine that such a portion has no enhancement effect and can obtain the same information as was previously available from the CT value display, even without being familiar with color images.
  • FIG. 5 is a flowchart showing a specific processing operation of the image generation apparatus 19 described above.
  • In step S1, a total of N volume data sets, obtained by imaging the subject at time phases 1, 2, 3, ..., N after injection of the contrast agent, are read from the storage device in the image processing device 5 into the preprocessing arithmetic unit 20.
  • In step S2, the volume data sets are aligned with one another so that corresponding tissues in the N volume data sets are located at the same position. This corrects for the position of the subject varying between time phases, for example when the subject moves during dynamic CT imaging or when the body moves with respiration. Specifically, taking any one of the N volume data sets as a reference, at least three reference points are set on each of the N volume data sets; the amount of translation and rotation of the tissue is calculated from the variation of their positions relative to the reference volume data, and all volume data sets are matched to the reference volume data by applying the inverse transformation.
  • the processing in step S2 is performed by, for example, the preprocessing arithmetic unit 20.
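The alignment from at least three reference points can be sketched with a standard rigid (Kabsch/Procrustes) fit; this 2D version is illustrative only, since the patent works on volume data and does not specify the estimator:

```python
import numpy as np

def rigid_align_2d(ref_pts, mov_pts):
    """Estimate the rotation R and translation t that map the
    reference points observed in a moving volume (mov_pts) back
    onto the reference volume (ref_pts), i.e. R @ m + t ~ r."""
    ref = np.asarray(ref_pts, float)
    mov = np.asarray(mov_pts, float)
    rc, mc = ref.mean(axis=0), mov.mean(axis=0)
    H = (mov - mc).T @ (ref - rc)       # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, d]) @ U.T  # proper rotation (no reflection)
    t = rc - R @ mc
    return R, t
```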
  • In step S3, a desired tomographic image is cut out from each volume data set. Specifically, the position and angle of the cross section to be cut out are specified using any one volume data set, and based on this specification a tomographic image containing the desired cross section is cut out from every volume data set. The images cut out in this way are images of the same cross section with different time phases.
  • the processing in step S3 is performed by, for example, the preprocessing arithmetic unit 20.
  • If the data read in step S1 are not volume data but tomographic images already containing the desired cross section, the tomographic images are aligned with each other in step S2, and step S3 is unnecessary.
  • In step S4, color composition processing in the time axis direction, that is, a coloring calculation, is performed for at least some of the organs or for each pixel in at least a partial region.
  • the processing in step S4 is performed by the feature amount acquisition units 7, 8, 9 and the color composite image generation unit 10.
  • In step S5, the color image generated in step S4 is displayed on the display unit 11 of the medical image display device. Note that a step of actually imaging the subject into which the contrast medium has been injected, acquiring volume data of different time phases, and storing them in, for example, the storage device in the image processing device 5 may be provided before step S1.
  • The color composition process in step S4 comprises a step S4a of extracting the luminance value of the same organ or the same pixel for at least some of the organs or for every pixel in at least a partial region, steps of acquiring the maximum (minimum) luminance value In, the change amount Dn, and the time Tn at which the luminance value becomes maximum (minimum), and a step S4e of coloring the organ or pixel from these values. Steps S4b, S4c, S4d, and S4e may be repeated independently while changing the organ or pixel.
  • FIG. 6 shows only the case of pixels and maximum values.
  • the organ or region is designated on an arbitrary tomographic image using a mouse or the like before step S4a.
  • each step of step S4 will be described in detail.
  • step S4a the luminance value of the same organ or the same pixel is extracted for each tomographic image having a different time phase, and the temporal change in the luminance value of the organ or pixel is acquired.
  • the characteristic curves shown in Fig. 3 and Fig. 4 are obtained.
  • The characteristic curve data extracted in step S4a is used in common in the subsequent steps S4b to S4d, and this extraction is performed by, for example, the preprocessing arithmetic unit 20 in FIG. 1.
  • The process of step S4b is performed by the luminance value feature quantity acquisition unit 7. Note that the luminance value of any one tomographic image may be used as the maximum luminance value In.
  • The processing of step S4c is performed by the enhancement effect degree feature quantity acquisition unit 8.
  • step S4d a time Tn at which the change of the luminance value in the time axis direction becomes the maximum (minimum) value is acquired.
  • The processing of step S4d is performed by the enhancement effect time feature quantity acquisition unit 9. Note that the time Tn at which the change amount of the luminance value becomes equal to or greater than a predetermined threshold may also be used, or the time Tn at which the gradient of the luminance value change becomes equal to or greater than a predetermined threshold.
  • For step S4d, a method of acquiring the time T at which the luminance value becomes maximum (minimum) with relatively good resolution even when the number N of volume data sets is small will now be described.
  • Taking N = 3 as an example, V1, V2, and V3 are the luminance values at time 1, time 2, and time 3, regarded as vectors on a polar coordinate plane. The composite vector of these vectors is calculated, the argument Theta of this composite vector is obtained, and converting this angle into time gives the time T at which the luminance value becomes maximum (minimum).
  • As shown in Equation 1, the maximum (minimum) luminance value In(x, y), the luminance value change amount Dn(x, y), and the time Tn(x, y) at which the luminance value of the pixel (x, y) becomes maximum (minimum) can be obtained from the maximum value Max(x, y), the minimum value Min(x, y), and the maximum (minimum) time Theta of the pixel (x, y).
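A sketch of this composite-vector computation (assuming the N samples are placed at evenly spaced angles around the polar plane, which the patent's N = 3 example suggests; the function name is illustrative):

```python
import math

def peak_time_angle(values):
    """Treat the N luminance samples as vectors at evenly spaced
    angles on a polar plane (for N = 3: 0, 120 and 240 degrees),
    sum them, and take the argument Theta of the composite vector.
    Converting Theta back to a (fractional) time index gives a
    sub-sample estimate of the time of maximum luminance."""
    n = len(values)
    x = sum(v * math.cos(2.0 * math.pi * k / n) for k, v in enumerate(values))
    y = sum(v * math.sin(2.0 * math.pi * k / n) for k, v in enumerate(values))
    theta = math.atan2(y, x) % (2.0 * math.pi)  # argument of the sum
    return theta * n / (2.0 * math.pi)          # angle -> time index
```

With samples (0, 10, 0) the composite vector points at the second sample's angle, so the estimate is index 1; intermediate peaks yield fractional indices, which is what gives the method better resolution than simply picking the largest of N samples.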
  • In step S4e, for each organ or pixel, the time Tn at which the luminance value is maximum (minimum) as the feature amount representing the time T at which the enhancement effect appears, the luminance value change amount Dn as the feature amount representing the degree D of the enhancement effect, and the maximum (minimum) luminance value In (the CT value) as the feature amount representing the tissue emphasized by the enhancement effect are converted, as shown in FIG. 2, into the hue component, the saturation component, and the lightness component of the HSV color system, respectively, and the organ or pixel is colored accordingly.
  • Steps S4a to S4e described above are performed for at least some of the organs or in at least some of the regions.
  • at least some organs or at least some areas are colored.
  • Different coloring is performed on organs or pixels having different temporal changes in luminance values.
  • different coloring is performed on pixels with different temporal changes in luminance values.
  • the same coloring or different coloring is performed on each pixel in different organs.
  • The color image synthesized based on the HSV color system is displayed on the display unit 11, for example in step S5, as shown in FIG. 9.
  • FIG. 9 is a diagram showing a display example on the display unit 22 of the image display device.
  • A color bar 27 is placed near the bottom of the color image 26, displaying the color gradation as the hue is changed from 0 to 255. The time phase corresponding to each hue is marked as a scale, making it possible to understand the relationship between the hue and the time T at which the enhancement effect appears. In this case, the saturation S and the lightness V of the color bar should be constant, for example set to the saturation and lightness of a point on the image specified by the operator.
  • This color bar 27 is not limited to expressing the time T of the enhancement effect; it can also express the degree D of the enhancement effect or the CT value (that is, the luminance value) I representing the tissue emphasized by the enhancement effect.
  • The assignment need not be the time T at which the enhancement effect appears to the hue component of the HSV color system, the degree D of the enhancement effect to the saturation component, and the CT value I to the lightness component. Even if the time T, the degree D, and the CT value I are assigned to different components, identification remains easy; for example, the time T at which the enhancement effect appears may be assigned to the saturation component, the degree D of the enhancement effect to the lightness component, and the CT value I to the hue component.
  • Other color systems may also be used, assigning each of the time T at which the enhancement effect appears, the degree D, and the luminance value I to a different component of that color system.
  • For example, the time T at which the enhancement effect appears may be assigned to the R (red) component, the degree D of the enhancement effect to the G (green) component, and the luminance value I to the B (blue) component.
  • A display window (Window) for displaying each feature amount can be set and adjusted arbitrarily. The operator selects each of the three feature quantities, the CT value I, the enhancement effect time T, and the enhancement effect degree D, with the radio buttons 28a, 28b, and 28c.
  • The operator can independently set the window level of each feature amount with the scroll bar 30a serving as the window level adjustment means 30, and the window width of each feature amount with the scroll bar 31a serving as the window width adjustment means 31.
  • the display window of each feature value is adjusted.
  • The window level is the center value of the display window, and the window width is the width of the display window, that is, the difference between its maximum and minimum values.
  • the saturation component of the pixel in which the enhancement effect appears at time T outside the display window is set to zero.
  • When the display window is adjusted, only the organs or pixels falling inside the display window are displayed in color and those outside are displayed in grayscale, so a color image suitable for viewing can be obtained.
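The display-window gating can be sketched as follows (hypothetical helper names; the window semantics follow the level/width definitions above):

```python
def window_bounds(level, width):
    """The window level is the center of the display window and the
    window width its extent, so the bounds are level +/- width / 2."""
    return level - width / 2.0, level + width / 2.0

def displayed_saturation(T, s, level, width):
    """Zero the saturation of a pixel whose enhancement time T falls
    outside the display window, so it reverts to grayscale; pixels
    inside the window keep their color."""
    lo, hi = window_bounds(level, width)
    return s if lo <= T <= hi else 0.0
```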
  • the color bar 27 is displayed in color only on the inside of the display window and grayscale on the outside.
  • The conversion from the enhancement effect time T to the hue may also be performed as in the second conversion method 33 shown in FIG. 11.
  • the intercept and slope of the conversion formula are made constant.
  • the position of the display window is adjusted by the window level, and the width of the display window is adjusted by the window width.
  • The display window only moves along the conversion line indicated by the dotted line, so the hue of each tissue does not change when the display window is adjusted, unlike in the first conversion method 32.
  • The color gradation on the color bar 27 and the position of the time scale are displayed fixed.
  • As in the first conversion method 32, the saturation component of a pixel whose enhancement effect appears at a time T outside the display window is set to zero; thus only structures showing the enhancement effect at times inside the display window are displayed in color, while tissues showing the enhancement effect at other times are displayed in grayscale.
  • FIG. 12 shows moving images 35a to 35e obtained when the window level is changed from 34a to 34e while the window width for the time T is kept narrow. With such moving images 35a to 35e, the operator can easily observe how the contrast agent flows from moment to moment.
  • the window width may be changed with the window level set to a constant value, or a movie may be generated by changing both the window level and the window width.
  • Whereas a conventional CT image expresses only the CT value I, here not only the CT value I but also the information on the time T at which the enhancement effect of the contrast agent appears and on the degree D of the enhancement effect is newly incorporated into the CT image and displayed in color, so that all of this information is represented on a single image.
  • Conventional X-ray CT systems have displayed the CT value I in grayscale; the image generation apparatus of the present invention adds the information on the time T at which the enhancement effect appears and on the degree D of the enhancement effect to that grayscale information without changing the grayscale display.
  • the three feature amounts are displayed in color using the HSV color system.
  • The hue component is assigned the time T at which the enhancement effect appears.
  • The saturation component is assigned the degree D of the enhancement effect.
  • the brightness component is assigned the luminance value (ie CT value) I.
  • When the luminance value change amount Dn is used as the feature amount representing the degree of the enhancement effect, a tissue with no enhancement effect, such as bone, has a small change amount Dn and is displayed in grayscale, while a tissue with an enhancement effect, such as a blood vessel, has a large change amount Dn and is displayed in color. This makes it easy for the operator to distinguish between the two and grasp the presence or absence of blood flow.
  • the maximum (minimum) luminance value In is adopted as a feature value representing the organization emphasized by the Enhans effect
  • the time Tn at which the luminance value becomes maximum (minimum) is adopted as the feature value representing the time T at which the enhancement effect appears; each feature is converted into color information and reflected in the image display. For this reason, the hue changes through red, yellow, green, and blue depending on the time at which the enhancement effect appears, so the moment-to-moment progress of the contrast agent is displayed on the image. This makes it easier for the operator to grasp the blood circulation state.
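The three feature amounts named above (Tn, In, Dn) can be read off one pixel's luminance-versus-time curve roughly as follows (a minimal sketch under the assumption that the first sample serves as the pre-contrast baseline; all names and numbers are illustrative):

```python
def extract_features(times, values):
    """Return (Tn, In, Dn) for one pixel's luminance curve.

    Tn: time at which the luminance value is maximal
    In: maximum luminance value
    Dn: luminance value change amount (maximum minus baseline)
    """
    baseline = values[0]          # assumed pre-contrast sample
    In = max(values)
    Tn = times[values.index(In)]
    Dn = In - baseline
    return Tn, In, Dn

# A hypothetical enhancing vessel: the CT value peaks at t = 20 s.
print(extract_features([0, 10, 20, 30, 40], [50, 90, 180, 120, 80]))
# -> (20, 180, 130)
```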
  • FIG. 13 shows another embodiment of the present invention: a schematic block diagram of an example in which the image generation apparatus of the present invention is used in an ultrasonic diagnostic apparatus as the medical image diagnostic apparatus.
  • the ultrasonic diagnostic apparatus 38 receives the volume data described in the first embodiment from a medical image diagnostic apparatus 37, such as an X-ray CT apparatus or an MRI apparatus, via a network or a portable storage medium, and stores it in the data storage unit 43.
  • the main parts of the ultrasonic diagnostic apparatus 38 are roughly classified into a system for reconstructing an ultrasonic image and a system for reconstructing a tomographic image corresponding to the reconstructed ultrasonic image.
  • the former includes an ultrasonic image acquisition unit 42 that reconstructs an ultrasonic image based on the reflected echo signal output from the probe 39.
  • the latter includes the data storage unit 43; various detection means 40 comprising, as main components, a magnetic-field-detecting position sensor and a body position sensor attached to the probe 39, together with a magnetic field generator and a respiration sensor cooperating with them; a position associating unit 44; and a tomographic image acquisition unit 45.
  • the ultrasonic image and the color tomographic image of the same cross section, output from the ultrasonic image acquisition unit 42 and the tomographic image acquisition unit 45, are drawn on the display unit 11 by the display processing device 46, where the ultrasonic image and the tomographic image are displayed side by side on the same screen.
  • the image generation apparatus includes at least the data storage unit 43 and the tomographic image acquisition unit 45, and may further include a display processing unit 46 and the display unit 11.
  • the probe 39 transmits and receives ultrasonic waves to and from the subject 41, and includes a plurality of transducers that generate ultrasonic waves and receive reflected echoes.
  • the ultrasonic image acquisition unit 42 receives the reflected echo signal output from the probe 39, converts it into a digital signal, and generates, for example, a black and white tomographic image (B-mode image) or a color flow mapping image (CFM image) of the diagnostic region.
  • B-mode image: black and white tomographic image
  • CFM image: color flow mapping image
  • the position associating unit 44 acquires various pieces of information from the various detection means 40 in order to obtain a tomographic image including substantially the same cross section as the ultrasonic image.
  • the various detection means 40 acquire additional information for correcting a shift in the coordinate system between the tomographic image and the ultrasonic image caused by a change in the posture of the subject 41, and a change in the coordinate system of the subject 41 caused by movement of the internal organs due to respiration.
  • the tomographic image acquisition unit 45 obtains, from the volume data in the data storage unit 43 and based on the various types of information acquired by the position associating unit 44, a tomographic image including substantially the same cross section as the ultrasonic image. Various calculation methods for obtaining a tomographic image having the same cross section as the ultrasonic image can be employed (for example, WO2004/098414).
  • the tomographic image acquisition unit 45, as shown in FIG. 2, generates the color image data 22 described in the first embodiment from the luminance value (that is, CT value) data 19 in the data storage unit 43, the data 20 of the time T at which the enhancement effect of the contrast agent appears, and the data 21 of the degree D of the enhancement effect.
  • FIG. 14 is a diagram showing a display example on the display unit 11.
  • An ultrasonic image 70 and a CT color image 71 of the same cross section are displayed.
  • a 3D body mark 72 is displayed by superimposing the current ultrasonic scan plane on a 3D image that visualizes the body surface and internal organs of the subject.
  • the 3D image is generated by the volume rendering method using the volume data of the color image data 22.
  • the volume rendering method is a typical process for generating a three-dimensional visualization image; the display target can be selected and the internal structure can be seen through by adjusting the opacity (for details, see, for example, Barthold Lichtenbelt, et al., "Introduction to Volume Rendering", Hewlett-Packard Professional Books).
  • the present image generation apparatus is configured so that the opacity can be set according to at least one feature amount among the time T at which the enhancement effect appears, the luminance value I, and the degree D of the enhancement effect.
  • the opacity O is defined as a function of the time T at which the enhancement effect appears, the degree D of the enhancement effect, and the CT value I.
  • the opacity O may be a function of one or two of the time T at which the enhancement effect appears, the degree D of the enhancement effect, and the CT value I.
  • the step of obtaining the opacity O is inserted, for example, between step S4d and step S4e in FIG. 6 described above, and the opacity is obtained for each organ or each pixel. Alternatively, it may be inserted between step S4 and step S5 to obtain the opacity O for each organ or each pixel. After the coloring and opacity of all the organs or pixels are obtained, a color 3D composite image is generated using the volume rendering method. Thereafter, in the display step S5 described above, the color 3D image is displayed on the display unit 11.
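The opacity step described in this bullet can be sketched as follows (illustrative only: the patent requires only that the opacity O be some function of T, D, and/or I, so the weighted-sum form, the weights, and the simple front-to-back compositing loop are assumptions):

```python
def opacity(t_norm, d_norm, i_norm, w=(0.5, 0.3, 0.2)):
    """Opacity O as a function of enhancement time T, degree D, and CT value I."""
    o = w[0] * t_norm + w[1] * d_norm + w[2] * i_norm
    return min(max(o, 0.0), 1.0)  # clamp to [0, 1]

def composite_ray(colors, opacities):
    """Front-to-back compositing of one ray, as in volume rendering."""
    acc_color, acc_alpha = 0.0, 0.0
    for c, a in zip(colors, opacities):
        acc_color += (1.0 - acc_alpha) * a * c
        acc_alpha += (1.0 - acc_alpha) * a
        if acc_alpha >= 0.99:  # early ray termination
            break
    return acc_color

# An opaque front voxel hides everything behind it:
print(composite_ray([1.0, 0.0], [1.0, 1.0]))  # 1.0
```

Lowering the opacity of voxels with a given enhancement time makes those tissues transparent, which is how the operator "sees through" structures by editing the opacity curve.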
  • an opacity curve 73 is displayed, in which the relationship between the opacity O and the time T at which the enhancement effect appears is shown as a graph.
  • the operator can change the opacity curve using a mouse. As a result, the operator can select a display target or see through the internal structure according to the time at which the enhancement effect appears.
  • the opacity curve 73 may instead display the relationship between the opacity O and the degree D of the enhancement effect, or the relationship between the opacity O and the luminance value I. The display is switched in response to the operator's selection of radio buttons 28a, 28b, 28c.
  • an ultrasonic image and a tomographic image having the same cross section can be displayed simultaneously. Further, as described in the first embodiment, the tomographic image is emphasized by color display, so that the operator can more easily make a comprehensive diagnosis while comparing the two.
  • for a tissue with no enhancement effect, such as bone, the luminance value change amount Dn is small and the tissue is displayed in grayscale, whereas for a tissue with an enhancement effect, such as a blood vessel, the luminance value change amount Dn is large and the tissue is displayed in color. Because of this difference, it is easier for the operator to distinguish between the two and grasp the presence or absence of blood flow.
  • the maximum (minimum) value In of the luminance value is adopted as the feature value representing the tissue emphasized by the enhancement effect, and the time Tn at which the luminance value becomes maximum (minimum) is adopted as the feature value representing the time at which the enhancement effect appears; these are reflected in the color image, so the hue is displayed as red, yellow, green, or blue depending on the time at which the enhancement effect appears, making it possible to display the moment-to-moment progress of the contrast medium on a single image.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Ultra Sonic Daignosis Equipment (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)
  • Image Processing (AREA)

Abstract

A method and a device for creating an image enabling a user to easily comprehend information on the temporal variation of a tissue caused by a contrast medium. A color image in which at least a partial area is colored is created by repeating the following steps while changing the pixel: a step of acquiring a plurality of image data of different time phases and then acquiring the luminance value of a pixel at the same position from each of the plurality of image data of different time phases; a step of acquiring one or more feature amounts representing the temporal variation in the luminance value caused by the contrast medium, based on the temporal variation in the luminance value; a step of converting the one or more feature amounts into different color information; and a step of coloring the pixel so as to represent the information on the temporal variation in the luminance value, based on the one or more pieces of color information.

Description

Specification
Image generation method and image generation apparatus
Technical field
[0001] The present invention relates to an image generation apparatus and an image generation method that use image data acquired by imaging a subject into which a contrast medium has been injected by a medical image diagnostic apparatus such as an X-ray CT apparatus, an MRI apparatus, or an ultrasonic diagnostic apparatus, and that generate a color image representing the enhancement effect of the contrast agent.
Background art
[0002] Some medical image diagnostic apparatuses, such as X-ray CT apparatuses, MRI apparatuses, and ultrasonic diagnostic apparatuses, display, in an image acquired by imaging a subject into which a contrast agent for giving contrast to the blood flow has been injected, the enhancement effect of the contrast agent, whereby the luminance value of a certain tissue becomes higher than before the injection of the contrast agent.
[0003] For example, in Patent Document 1, using an electronic endoscope apparatus, which is an example of a medical image diagnostic apparatus, an image acquired by imaging a diseased site that changes with time due to a contrast agent is displayed so that the change of the diseased site caused by the contrast agent can be grasped easily. Specifically, the three-primary-color (R, G, B) image data from the electronic endoscope apparatus is converted into the HSV color system, consisting of hue, saturation, and value, which is easy for humans to comprehend, and displayed.
[0004] In this way, a plurality of images having different time phases after the injection of the contrast agent are acquired, and color conversion is performed on each image so that it can be grasped easily by humans. By comparing the enhancement effects of the contrast agent among these images, the operator can easily grasp how the diseased site changes due to the contrast agent, as well as the blood circulation state or blood flow dynamics.
[0005] However, in a conventional medical image diagnostic apparatus, when only one image can be displayed because of the size limitation of the display unit, and the operator selects and displays the image of one time phase from among the images of plural time phases, the operator cannot determine whether a tissue with a high luminance value in the selected image is high because of the enhancement effect or because its luminance value is originally high.
[0006] Patent Document 1 does not disclose any means or method for making it possible to easily understand, on a single image, information on the temporal change of a tissue, such as whether a tissue with a high luminance value is high because of the enhancement effect of the contrast agent or because its luminance value is originally high.
Patent Document 1: JP-A-63-79632
Disclosure of the invention
[0007] An object of the present invention is to provide an image generation apparatus and an image generation method that generate an image from which information on the temporal change of a tissue due to a contrast agent can be easily understood.
[0008] In order to achieve the above object, the image generation method of the present invention is a method of generating a medical image representing the temporal change of a tissue due to a contrast agent, using image data acquired by imaging a subject into which the contrast agent has been injected, the method comprising:
(a) acquiring a plurality of the image data having different time phases;
(b) obtaining the luminance value of a pixel at the same position from each of the plurality of image data having different time phases;
(c) obtaining, based on the temporal change in the luminance value, one or more feature amounts representing the temporal change in the luminance value due to the contrast agent;
(d) converting the one or more feature amounts into respectively different color information;
(e) coloring the pixel, based on the one or more pieces of color information, so as to represent the information on the temporal change of the luminance value; and
(f) repeating steps (b) to (e) for different pixels in at least a partial region to generate a color image in which at least the partial region is colored.
[0009] Further, in order to achieve the above object, the image generation apparatus of the present invention comprises input means for inputting a plurality of image data having different time phases acquired by imaging a subject into which a contrast agent has been injected, storage means for storing the plurality of image data having different time phases, and arithmetic means for generating a medical image using the plurality of image data having different time phases, wherein the arithmetic means extracts, for each pixel, one or more feature amounts representing the temporal change of a tissue due to the contrast agent from the plurality of image data having different time phases, converts them into respectively different color information, and generates a color image colored, for each pixel in at least a partial region, so as to represent the information on the temporal change.
[0010] According to the image generation method and image generation apparatus of the present invention described above, it becomes easy to understand, on a single image, information on the temporal change of a tissue due to a contrast agent, such as whether a tissue with a high luminance value is high because of the enhancement effect of the contrast agent or because its luminance value is originally high.
Brief Description of Drawings
[0011]
[FIG. 1] A schematic configuration diagram of an X-ray CT apparatus employing the image generation apparatus in the medical image diagnostic apparatus according to the first embodiment of the present invention.
[FIG. 2] A block diagram showing the color composition processing of the X-ray CT apparatus shown in FIG. 1.
[FIG. 3] A diagram showing characteristic curves representing the change in the luminance value of each tissue.
[FIG. 4] A diagram showing the characteristic curve of blood vessel A shown in FIG. 3.
[FIG. 5] A flowchart showing the color composition processing operation of the X-ray CT apparatus shown in FIG. 2.
[FIG. 6] A flowchart showing the composition processing that is the main part shown in FIG. 5.
[FIG. 7] A polar coordinate plane showing an example of calculating the time T at which the luminance value becomes maximum (minimum).
[FIG. 8] Another polar coordinate plane showing an example of calculating the time T at which the luminance value becomes maximum (minimum).
[FIG. 9] A front view showing a display example of the image generation apparatus in the medical image diagnostic apparatus shown in FIG. 1.
[FIG. 10] An explanatory diagram showing a main part of the image generation apparatus in the medical image diagnostic apparatus shown in FIG. 9.
[FIG. 11] An explanatory diagram showing another main part of the image generation apparatus in the medical image diagnostic apparatus shown in FIG. 9.
[FIG. 12] An explanatory diagram showing another display example in the image generation apparatus in the medical image diagnostic apparatus shown in FIG. 9.
[FIG. 13] A schematic configuration diagram of an ultrasonic apparatus employing the image generation apparatus in the medical image diagnostic apparatus according to the second embodiment of the present invention.
[FIG. 14] A diagram showing an example of displaying an ultrasonic image and a color composite image in parallel.
Best Mode for Carrying Out the Invention
[0012] Hereinafter, embodiments of the image generation method and image generation apparatus of the present invention will be described with reference to the drawings.
[0013] (First embodiment)
FIG. 1 is a schematic configuration diagram of one embodiment of the present invention, an example in which the image generation apparatus of the present invention is used in an X-ray CT apparatus as a medical image diagnostic apparatus.
The scanner 1, rotatably supported by the scanner control unit 17, has an opening 1a formed in its center, in which a bed 2 carrying the subject is arranged so that its position can be adjusted by the bed control unit 16; the X-ray source 3 and the multi-row X-ray detector 4 are arranged opposite each other across the subject on the bed 2. When the X-ray beam from the X-ray source 3 is emitted by means of the high-voltage switching unit 13, the high-voltage generator 14, and the X-ray control unit 15, it is detected by the multi-row X-ray detector 4 after passing through the subject. The output of the multi-row X-ray detector 4 is digitized by an A/D converter via an amplifier and input to the image processing device 5 of the processing device 18, where image reconstruction processing is performed. The reconstructed image data is stored, for example, in a storage device (not shown) in the image processing device 5, and the reconstructed image is displayed by the display processing device 6 on the display unit 11 on the console 12.
[0014] The image generation device 19 and the image display device 6 are connected to the image processing device 5. The image generation device 19 includes a luminance value feature amount acquisition unit 7 that acquires a feature amount corresponding to the CT value, which is the luminance value; an enhancement effect degree feature amount acquisition unit 8 that acquires a feature amount corresponding to the degree of the enhancement effect; an enhancement effect time feature amount acquisition unit 9 that acquires a feature amount corresponding to the time of the enhancement effect; a preprocessing arithmetic unit 20 that performs calculations common to these three feature amount acquisition units 7 to 9; and a color composite image generation unit 10 that generates a color image using the feature amounts from these three feature amount acquisition units 7 to 9. Each of the feature amount acquisition units 7 to 9 is connected to the image processing device 5 via the preprocessing arithmetic unit 20, receives image data from the image processing device 5, acquires the respective feature amount based on the image data, and outputs it to the color composite image generation unit 10.
The preprocessing arithmetic unit 20 also has an input unit (not shown) for inputting the image data.
[0015] The image display device 6 is connected to the image processing device 5, the color composite image generation unit 10, and the display unit 11; it receives the reconstructed image from the image processing device 5 and the color image from the color composite image generation unit 10, and displays each on the display unit 11.
The image generation device 19 may also be configured to include the image processing device 5, the display processing device 6, the display unit 11, and the console 12.
[0016] FIG. 2 is a block explanatory diagram showing the color image generation processing operation in the image generation device 19 described above.
Whereas a conventional CT image is a grayscale representation of an image based only on the CT value (that is, the luminance value), the image generation device 19 of the present invention uses the luminance value (CT value) data 19 acquired by the luminance value feature amount acquisition unit 7 as the Value (lightness) component, the enhancement effect time data 20 acquired by the enhancement effect time feature amount acquisition unit 9 from the time at which the enhancement effect of the contrast agent appeared as the Hue component, and the enhancement effect degree data 21 acquired by the enhancement effect degree feature amount acquisition unit 8 as the Saturation component; using each of these components, the color composite image generation unit 10 generates one color composite image (hereinafter referred to as a unit color image) as data 22, which is displayed in color in the HSV color system.
[0017] Alternatively, the luminance value feature amount may be assigned to the lightness component, and one of the enhancement effect time feature amount and the enhancement effect degree feature amount may be assigned to either the hue component or the saturation component, so that a color image is generated with only two components including the lightness component. Alternatively, a color image may be generated with only the hue and saturation components, or with only one of the two. In either case, any component left unassigned is fixed to an arbitrary constant value when the color image is generated.
[0018] FIG. 3 shows characteristic curves representing the temporal changes in the luminance values of blood vessel 47, blood vessel 48, liver 49, and bone 50.
With the luminance value (CT value) on the vertical axis and time on the horizontal axis, the temporal change of blood vessel 47 is shown by characteristic curve 23, the temporal change of blood vessel 48 and liver 49 by characteristic curve 24, and that of bone 50 by straight line 25. First, at time 1, blood vessel 47 is enhanced by the contrast agent; thereafter, at time 2, the entire liver 49 and blood vessel 48 are enhanced by the contrast agent. Since the contrast agent does not flow into bone 50, the CT value of bone 50 remains constant regardless of time.
[0019] FIG. 4 shows the characteristic curve 23 representing the temporal change in the luminance value of blood vessel 47 in FIG. 3. From this characteristic curve 23, the time T47 at which the luminance value of blood vessel 47 becomes maximum, the maximum luminance value I47, and the luminance value change amount D47 can be obtained. Similarly, for each organ or each pixel, the time Tn at which the luminance value of that organ or pixel becomes maximum, the maximum luminance value In, and the luminance value change amount Dn can be obtained from the characteristic curve representing the temporal change in the luminance value of that organ or pixel.
[0020] In the examples of FIGS. 2, 3, and 4, the case where the luminance value of an organ is enhanced by the contrast agent so that a maximum luminance value exists has been described. When the luminance value of an organ is suppressed by the contrast agent so that a minimum luminance value exists (for example, in MR perfusion imaging), the same explanation applies if "maximum" above is replaced with "minimum". Hereinafter, only the case where a maximum value exists is described, but the same applies when a minimum value exists.
[0021] Accordingly, the luminance value feature amount acquisition unit 7 shown in FIG. 2, which acquires the feature amount corresponding to the CT value, acquires as the feature amount of the luminance value I the maximum (minimum) luminance value In, which is the maximum (minimum) value of the CT value. Alternatively, the CT value of an arbitrary single tomographic image may be used as the maximum (minimum) luminance value In. The enhancement effect degree feature amount acquisition unit 8, which acquires the feature amount corresponding to the degree of the enhancement effect, acquires as the feature amount corresponding to the degree D of the enhancement effect the luminance value change amount Dn, which is the amount of change in the CT value. The enhancement effect time feature amount acquisition unit 9, which acquires the feature amount corresponding to the time at which the enhancement effect appears, acquires as the feature amount corresponding to the time T at which the enhancement effect appears the time Tn at which the luminance value (that is, the CT value) becomes maximum (minimum). Alternatively, the time Tn at which the amount of change in the luminance value becomes equal to or greater than a predetermined threshold may be used; the threshold in this case can be set to 1/2 of the maximum (minimum) change amount of the luminance value. Alternatively, the time Tn at which the gradient of the change in the luminance value becomes equal to or greater than a predetermined threshold may be used; the threshold in this case can be set to 1/2 of the maximum (minimum) gradient of the luminance value.
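The two threshold-based alternatives for the time Tn described in paragraph [0021] can be sketched as follows (an illustrative sketch; the half-maximum thresholds follow the text, while the discrete-difference gradient, the baseline choice, and the function name are assumptions):

```python
def enhancement_time(times, values, use_gradient=False):
    """Time Tn at which the enhancement effect is taken to appear.

    use_gradient=False: first time the change from baseline reaches
        1/2 of the maximum change.
    use_gradient=True: first time the slope reaches 1/2 of the
        maximum slope.
    """
    if use_gradient:
        slopes = [(values[k + 1] - values[k]) / (times[k + 1] - times[k])
                  for k in range(len(values) - 1)]
        threshold = max(slopes) / 2.0
        for k, s in enumerate(slopes):
            if s >= threshold:
                return times[k]
    changes = [v - values[0] for v in values]
    threshold = max(changes) / 2.0
    for k, c in enumerate(changes):
        if c >= threshold:
            return times[k]

# A hypothetical enhancement curve peaking at t = 20 s:
print(enhancement_time([0, 10, 20, 30, 40], [50, 90, 180, 120, 80]))        # 20
print(enhancement_time([0, 10, 20, 30, 40], [50, 90, 180, 120, 80], True))  # 10
```

Note that the gradient criterion fires earlier than the change criterion, since the slope peaks while the luminance is still rising.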
[0022] In the above description, three feature amounts are acquired; however, at least one of the enhancement effect degree feature amount and the enhancement effect time feature amount may suffice, or four or more feature amounts may be acquired and two or three of them selected.
[0023] Thereafter, the color composite image generation unit 10 generates a color image that displays the feature amounts simultaneously. By comparing and weighing the display of the feature amounts, the operator becomes able to determine whether the luminance value of a tissue with a high luminance value is due to the enhancement effect of the contrast agent or due to the CT value being originally large.
[0024] To generate a color image that displays the feature amounts simultaneously, the color composite image generation unit 10 composes a color image displayed in color, for example, in the HSV color system. That is, the color composite image generation unit 10 assigns the time T at which the enhancement effect appears to the Hue component, the degree D of the enhancement effect to the Saturation component, and the CT value I to the Value (lightness) component.
[0025] When the luminance value change amount Dn is adopted as the feature quantity representing the degree D of the enhancement effect, a tissue with no enhancement effect, such as bone 50, yields a small change amount D50, while a tissue with an enhancement effect, such as blood vessel 47, yields a large change amount D47. This change amount Dn can therefore be acquired for each organ or each pixel and mapped to the saturation component of the HSV color system.
[0026] For example, the color composite image generation unit 10 compares the acquired luminance value change amount Dn with a preset threshold. If Dn is at or below the threshold, it judges that there is no enhancement effect and displays the pixel in grayscale, as in conventional displays. If Dn exceeds the threshold, it judges that there is an enhancement effect and assigns Dn to the saturation component of the HSV color display. The threshold in this case can be set, for example, to 1/2 of the maximum (minimum) value of the luminance value change amount Dn.
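The per-pixel mapping of paragraphs [0024]–[0026] can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, the normalization by the maxima, and the use of Python's standard `colorsys` module are assumptions; only the HSV assignment and the half-maximum threshold rule come from the text.

```python
import colorsys

def compose_pixel(tn, dn, in_val, t_max, dn_max, in_max):
    """Map one pixel's three feature values to an RGB color.

    tn     : time of the luminance peak                -> Hue
    dn     : luminance change (enhancement degree)     -> Saturation
    in_val : maximum luminance (CT value)              -> Value
    Pixels whose change dn is below half of dn_max are treated as
    non-enhanced and rendered in grayscale (saturation = 0).
    """
    hue = tn / t_max                                  # 0..1 around the color wheel
    sat = dn / dn_max if dn >= dn_max / 2 else 0.0    # threshold rule of [0026]
    val = in_val / in_max
    return colorsys.hsv_to_rgb(hue, min(sat, 1.0), min(val, 1.0))
```

A non-enhanced pixel (small dn) comes out with equal R, G and B, i.e. grayscale, exactly as the paragraph describes.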
[0027] The operator can therefore easily identify that a tissue displayed in color via the saturation component owes its luminance to the enhancement effect. Conversely, regions without an enhancement effect are displayed in grayscale via the lightness component, just like conventional CT values, so the operator can easily recognize them as non-enhanced and obtain the same information as was previously available from the CT value display of existing image display devices. Even an operator unfamiliar with color images can do this easily.
[0028] Similarly, because the time Tn at which the luminance value becomes maximum (minimum) is assigned, as the time T at which the enhancement effect appears, to the hue component of the HSV color display, the operator can easily tell whether the enhancement effect appeared at the time of the currently displayed image, before it, or after it.
[0029] Fig. 5 is a flowchart showing the specific processing of the image generation apparatus 19 described above.

In step S1, a total of N volume data sets, acquired by imaging the subject at time 1, time 2, time 3, ..., time N after contrast agent injection, are read into the preprocessing operation unit 20 from, for example, a storage device in the image processing apparatus 5.
[0030] In step S2, the volume data sets are registered to one another so that corresponding tissues in the N volume data sets occupy the same position. This corrects for tissue positions varying between time phases, which occurs when the subject's posture changes during dynamic CT imaging or internal organs shift due to respiration. Specifically, one of the N volume data sets is taken as the reference, at least three reference points are placed in each volume data set, and the translation and rotation of the tissue are calculated from the deviation of those points from the reference volume data; applying the inverse transformation aligns all volume data sets with the reference. Step S2 is performed, for example, by the preprocessing operation unit 20.
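Step S2 estimates a rotation and translation from at least three corresponding reference points. The patent does not specify the estimation algorithm; the sketch below uses one standard choice, a least-squares rigid fit, shown in 2-D for brevity (the function name and 2-D restriction are illustrative assumptions).

```python
import math

def rigid_fit_2d(ref_pts, mov_pts):
    """Estimate the rotation angle and translation that map mov_pts onto
    ref_pts in the least-squares sense (>= 3 reference points)."""
    n = len(ref_pts)
    rcx = sum(p[0] for p in ref_pts) / n; rcy = sum(p[1] for p in ref_pts) / n
    mcx = sum(p[0] for p in mov_pts) / n; mcy = sum(p[1] for p in mov_pts) / n
    # cross-covariance terms of the centered point sets
    sxx = sxy = syx = syy = 0.0
    for (rx, ry), (mx, my) in zip(ref_pts, mov_pts):
        ax, ay, bx, by = mx - mcx, my - mcy, rx - rcx, ry - rcy
        sxx += ax * bx; sxy += ax * by; syx += ay * bx; syy += ay * by
    angle = math.atan2(sxy - syx, sxx + syy)   # optimal 2-D rotation angle
    tx = rcx - (mcx * math.cos(angle) - mcy * math.sin(angle))
    ty = rcy - (mcx * math.sin(angle) + mcy * math.cos(angle))
    return angle, (tx, ty)
```

The recovered angle and translation would then be inverted and applied to the whole volume, aligning it with the reference volume as the paragraph describes.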
[0031] In step S3, a desired tomographic image is cut out from each volume data set. Specifically, the position and angle of the desired cross section are specified on any one volume data set, and a tomographic image containing that cross section is cut out from every volume data set accordingly. The images cut out in this way are images of the same cross section at different time phases. Step S3 is performed, for example, by the preprocessing operation unit 20.
If the data read in step S1 are not volume data but tomographic images already containing the desired cross section, step S2 registers the tomographic images to one another and step S3 becomes unnecessary.
[0032] In step S4, using these tomographic images of different time phases, a color composition (coloring) operation is performed along the time axis for at least some organs, or for each pixel within at least a partial region, as described later. Step S4 is performed by the feature quantity acquisition units 7, 8 and 9 and the color composite image generation unit 10.
[0033] In step S5, the color image generated in step S4 is displayed on the display unit 11 of the medical image display device.

A step may also be provided before step S1 in which the subject injected with the contrast agent is actually imaged, volume data of different time phases are acquired, and the data are stored, for example, in a storage device in the image processing apparatus 5.
[0034] As shown in Fig. 6, the color composition process of step S4 repeats, for each of at least some organs or for each pixel in at least a partial region, the following steps: step S4a, extracting the luminance values of the same organ or same pixel; step S4b, acquiring the maximum (minimum) luminance value In; step S4c, acquiring the luminance value change amount Dn; step S4d, acquiring the time Tn at which the luminance value becomes maximum (minimum); and step S4e, coloring the organ or pixel from In, Dn and Tn. Alternatively, steps S4b, S4c, S4d and S4e may each be repeated independently over organs or pixels. Fig. 6 shows only the per-pixel, maximum-value case.

When a target organ or region is to be colored, it is designated beforehand, prior to step S4a, on an arbitrary tomographic image with a mouse or the like. Each step of S4 is described in detail below.
[0035] In step S4a, the luminance value of the same organ or same pixel is extracted from each tomographic image of a different time phase, yielding the temporal change of the luminance value of that organ or pixel, i.e. characteristic curves such as those in Figs. 3 and 4. The characteristic curve data extracted in step S4a are used in common by the subsequent steps S4b to S4d; the extraction is performed, for example, by the preprocessing operation unit 20 in Fig. 1.
[0036] In step S4b, the maximum (minimum) and minimum (maximum) of the luminance value along the time axis are obtained, and the maximum (minimum) luminance value is set to In = [maximum (minimum) value]. Step S4b is performed by the luminance value feature quantity acquisition unit 7. Alternatively, the luminance value of any one tomographic image may be used as In.
[0037] In step S4c, the maximum (minimum) and minimum (maximum) of the luminance value along the time axis are obtained, and the luminance value change amount is set to Dn = [maximum value] - [minimum value]. Step S4c is performed by the enhancement effect degree feature quantity acquisition unit 8.
[0038] In step S4d, the time Tn at which the luminance value reaches its maximum (minimum) along the time axis is acquired. Step S4d is performed by the enhancement effect time feature quantity acquisition unit 9. Alternatively, Tn may be the time at which the amount of change of the luminance value exceeds a predetermined threshold, or the time at which the gradient of the luminance value change exceeds a predetermined threshold.
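For the maximum-value variant, steps S4b–S4d reduce each pixel's time–luminance curve to the three feature values; a minimal sketch (the function name and list-based curve representation are illustrative):

```python
def extract_features(curve):
    """Steps S4b-S4d for one pixel: curve holds the luminance value
    sampled at times 0..N-1 (the characteristic curve of step S4a)."""
    i_n = max(curve)                  # S4b: maximum luminance value In
    d_n = max(curve) - min(curve)     # S4c: luminance value change Dn
    t_n = curve.index(max(curve))     # S4d: time Tn of the maximum
    return i_n, d_n, t_n
```

A flat curve yields Dn = 0, which the threshold test of paragraph [0026] would then classify as non-enhanced.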
[0039] For step S4d, a method is described here that obtains the time T of the maximum (minimum) luminance value with comparatively good resolution even when the number N of volume data sets is small. On a polar coordinate plane as in Fig. 7, consider vectors that divide the angular range into N equal parts (N = 3 is shown as an example), with magnitudes V1, V2, V3, where V1, V2, V3 are the luminance values at time 1, time 2 and time 3. As shown in Fig. 8, the resultant of these vectors is calculated, the argument Theta of the resultant vector is obtained, and Theta is converted back into a time to give the time T at which the luminance value is maximum (minimum). In this way, the maximum (minimum) luminance value In(x, y), the luminance value change amount Dn(x, y) and the peak time Tn(x, y) of pixel (x, y) shown in Equation 1 are obtained from the pixel's maximum value Max(x, y), minimum value Min(x, y) and resultant angle Theta.
(Equation 1)
In(x, y) = Max(x, y);
Dn(x, y) = Max(x, y) - Min(x, y);
Tn(x, y) = Theta;
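The resultant-vector estimate of paragraph [0039] can be sketched as below. The exact angular convention is an assumption: the N samples are placed evenly over a full turn of the polar plane, as suggested by the N = 3 example of Figs. 7 and 8, and the returned time is a fractional sample index.

```python
import math

def peak_time_theta(values):
    """Estimate the peak time with sub-sample resolution.

    Each of the N samples becomes a vector at angle 2*pi*k/N with
    magnitude equal to the luminance value; the argument Theta of the
    resultant vector is converted back into a (fractional) time index.
    """
    n = len(values)
    x = sum(v * math.cos(2 * math.pi * k / n) for k, v in enumerate(values))
    y = sum(v * math.sin(2 * math.pi * k / n) for k, v in enumerate(values))
    theta = math.atan2(y, x) % (2 * math.pi)   # resultant angle in [0, 2*pi)
    return theta * n / (2 * math.pi)           # back to time units
```

With only three samples, a curve peaking at the second sample is still resolved to time index 1.0 exactly, illustrating the claimed resolution benefit for small N.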
[0040] In step S4e, for each organ or each pixel, the time Tn at which the luminance value is maximum (minimum), representing the time T at which the enhancement effect appears, the luminance value change amount Dn, representing the degree D of the enhancement effect, and the maximum (minimum) luminance value In (a CT value), representing the tissue emphasized by the enhancement effect, are converted into the hue, saturation and lightness components of the HSV color system respectively, as shown in Fig. 2, and the organ or pixel is colored accordingly.
For example, if two different organs or pixels reach their maximum (minimum) luminance value at the same time, they are given the same hue; but if their luminance value change amounts differ, at least one of them is given a different saturation. The conversion from the time Tn to the hue component, from the change amount Dn to the saturation component, and from the maximum (minimum) luminance value In to the lightness component is described later.
[0041] By repeating steps S4a to S4e for each of at least some organs or for each pixel in at least a partial region, those organs or regions are colored. As a result, organs or pixels whose luminance values change differently over time are colored differently; even within the same organ, pixels with different temporal luminance changes are colored differently; and pixels in different organs may be colored the same or differently.
The color image composed on the basis of the HSV color system is then displayed, for example, on the display unit 11 in step S5, as shown in Fig. 5.
[0042] Fig. 9 shows a display example on the display unit 22 of the image display device.
A color bar 27 is arranged near the bottom of the color image 26. It shows the color gradation obtained by holding the saturation S and lightness V constant (for example, S = 255 and V = 255) while varying the hue component H from 0 to 255. Below it, the time phases (times) corresponding to the hues are marked as a scale, so that the color bar 27 reveals the relationship between hue and the time T at which the enhancement effect appears. The saturation S and lightness V need only be constants; for example, the saturation and lightness of a point on the image designated by the operator may be adopted. Furthermore, the color bar 27 is not limited to expressing the enhancement effect time T; it may instead display the degree D of the enhancement effect, or the CT value (luminance value) I representing the tissue emphasized by the enhancement effect. For example, when the operator selects one of the radio buttons 28a, 28b and 28c, corresponding respectively to the three feature quantities CT value I, enhancement effect time T and enhancement effect degree D, the display switches to the color bar for the selected feature quantity. Nor is the display limited to a one-dimensional color bar: a two-dimensional or three-dimensional color bar combining these three feature quantities may also be shown.
Displaying the color image 26 and the color bar 27 side by side in this way makes it easy for the operator to interpret what each display color in the color image 26 means.
[0043] The assignment is not limited to the enhancement effect time T on the hue component, the degree D on the saturation component and the CT value I on the lightness component of the HSV color system; assigning T, D and I to different components also keeps identification easy. For example, T may be assigned to the saturation component, D to the lightness component and I to the hue component. Alternatively, instead of the hue component arrangement of the HSV color system, the time T at which the enhancement effect appears may be subdivided into a plurality of periods, with a plurality of colors prepared in advance and the correspondence between each color and each period predetermined.
[0044] Furthermore, not only the HSV color system but also the RGB color system or another color system may be used, with the enhancement effect time T, the enhancement effect degree D, or the luminance value I assigned to any of its components. For example, with the RGB color system, the time T may be assigned to the R (red) component, the degree D to the G (green) component and the luminance value I to the B (blue) component.
[0045] When the color image 26 is displayed on the display unit 22 of the image display device, the display window of each feature quantity can be adjusted arbitrarily to make a specific region, organ or time easier to see. For example, as shown in Fig. 9, the operator selects one of the three feature quantities CT value I, enhancement effect time T and enhancement effect degree D with the radio buttons 28a, 28b and 28c, and then adjusts the window level of each feature quantity independently with the scroll bar 30a serving as window level adjustment means 30, and the window width of each feature quantity with the scroll bar 31a serving as window width adjustment means 31. The display window of each feature quantity is adjusted in this way. Here, the window level is the center value of the display window, and the window width is the width of the display window, i.e. the difference between its maximum and minimum values.
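With the definitions just given (window level = center value, window width = maximum minus minimum), the visible range of a window and a clamp-and-scale through it are simply (illustrative sketch; function names are assumptions):

```python
def window_bounds(level, width):
    """Visible range of a display window given its center (window level)
    and its width (window width = max - min of the window)."""
    return level - width / 2.0, level + width / 2.0

def apply_window(value, level, width):
    """Clamp a feature value into the window and scale it to 0..1."""
    lo, hi = window_bounds(level, width)
    if value <= lo:
        return 0.0
    if value >= hi:
        return 1.0
    return (value - lo) / (hi - lo)
```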
[0046] Next, the image processing method relating to the display window is described.
When a display window is set for a feature quantity, the conversion from the enhancement effect time T to hue is performed by the first conversion method 32 shown in Fig. 10 as follows. Adjusting the window level 30 (the center position of the display window, indicated by the interval between the two dash-dot lines parallel to the vertical axis) shifts the intercept of the conversion function H = fH(T) (the intersection of the function with the vertical axis moves up or down), and adjusting the window width 31 (the width of the display window) changes the slope of H = fH(T). In other words, the first conversion method 32 reshapes the conversion function H = fH(T) according to the display window. When the time T lies inside the display window, H is proportional to T; when T is smaller than the window, H = 0; and when T is larger than the window, H = 255.

As a result, tissue in which the enhancement effect appears at a time inside the display window is displayed with a red-to-green-to-blue color gradation, tissue enhanced before the window is displayed in red, and tissue enhanced after the window is displayed in blue. As the relationship between hue H and time T changes, the position of the time-phase scale on the color bar is shifted accordingly; alternatively, the scale position is fixed and the color gradation itself is changed.
[0047] The conversion from the luminance value change amount Dn (the feature quantity representing the enhancement effect degree D) to saturation, and from the luminance value I (the feature quantity representing the tissue emphasized by the enhancement effect) to lightness, are each performed on the basis of their own predetermined conversion functions, in the same way as the conversion from the enhancement effect time T to hue described above.
[0048] In the conversion from the enhancement effect time T to hue described above, tissue enhanced before the display window and tissue enhanced after it are displayed in colors of different hues. Alternatively, only tissue enhanced at a time T inside the display window may be displayed in color, with tissue enhanced at other times displayed in grayscale. In that case, the following processing shown in Equation 2 is performed in addition to the processing above.
(Equation 2)
If T = (outside the display window)
Then Saturation(S) = 0;
[0049] That is, the saturation component of a pixel whose enhancement effect appears at a time T outside the display window is set to zero. As a result, when the display window is adjusted, only the organs or pixels corresponding to the inside of the window are displayed in color and the outside is displayed in grayscale, giving a color image suited to observing the enhancement effect at a specific time. The color bar 27 may likewise be displayed in color only inside the window and in grayscale outside it.

[0050] In the image processing method relating to the display window, the conversion from the enhancement effect time T to hue may also be performed by the second conversion method 33 shown in Fig. 11. This method changes the position and width of the display window on a fixed conversion function H = fH(T): adjusting the window width and window level leaves the intercept and slope of the function unchanged, with the window level setting the position of the window and the window width its width. The display window therefore merely moves along the conversion function indicated by the dotted line, so, unlike the first conversion method 32, the hue of each tissue does not change when the window is adjusted; the color gradation on the color bar 27 and the position of the time-phase scale are likewise displayed fixed. As shown in Equation 2, by setting to zero the saturation component of pixels enhanced at a time T outside the display window, only tissue enhanced inside the window can be displayed in color and tissue enhanced at other times in grayscale, as in the first conversion method 32.
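Equation 2's grayscale gating can be sketched as follows; a minimal illustration assuming a simple linear Dn-to-saturation mapping (the linear mapping and function name are assumptions, the zero-outside-the-window rule is Equation 2 itself):

```python
def saturation_with_gate(t, dn, level, width, dn_max):
    """Equation (2): a pixel whose enhancement time T lies outside the
    display window gets saturation 0, i.e. is shown in grayscale;
    inside the window, saturation scales with the change amount Dn."""
    lo = level - width / 2.0
    hi = level + width / 2.0
    if t < lo or t > hi:
        return 0.0                # outside the window: grayscale
    return min(dn / dn_max, 1.0)  # inside: color strength follows Dn
```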
[0051] As a further application of the display window, if the window level of the enhancement effect time T is varied while its window width is held constant, the temporal progression of the enhanced regions can be seen as a moving image. Fig. 12 shows moving images 35a to 35e obtained by changing the window level through 34a to 34e with the window width at time T narrowed. With such moving images 35a to 35e, the operator can easily observe the contrast agent flowing from moment to moment.
Alternatively, the window width may be varied with the window level held constant, or both the window level and the window width may be varied to generate the moving image.
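The sweep of paragraph [0051] amounts to generating a sequence of window levels at a fixed width, with one frame colored per level (sketch; the frame rendering itself is omitted and the function name is an assumption):

```python
def animation_levels(t_start, t_end, width, n_frames):
    """Window levels for a Fig. 12 style animation: the window width is
    held constant while the level sweeps from t_start to t_end, so each
    frame shows in color only tissue enhanced inside that narrow
    time window."""
    step = (t_end - t_start) / (n_frames - 1)
    return [t_start + k * step for k in range(n_frames)]
```

Each returned level would be fed, together with the fixed width, into the hue and saturation conversions to produce one frame such as 35a to 35e.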
[0052] When the moving images 35a to 35e are displayed, a specific region or organ enhanced at a specific time can also be made visually easier to see by increasing its brightness for emphasis. For example, in Fig. 12, when the hue of a specific region in the moving image 35a at window level 34a satisfies 0 < Hue < 0.3 on the color bar 27, corresponding to the red part of the HSV color system, the lightness of the regions displayed with hues in this range can be increased for emphasis.
[0053] In the image generation apparatus of the medical image diagnostic apparatus described above, whereas a conventional CT image expresses only the CT value I, information on the time T at which the enhancement effect of the contrast agent appears and on the degree D of the enhancement effect is newly incorporated into the CT image alongside the CT value I and displayed in color, so that all of this information is expressed on a single image. Conventionally, X-ray CT apparatuses have displayed the CT value I in grayscale. The image generation apparatus of the present invention keeps this grayscale display unchanged and adds the information on the enhancement effect time T and degree D to it in color, so that new, easily distinguishable information can be displayed while inheriting the operator's accumulated experience with the CT value I.
[0054] In particular, to express the three feature quantities described above, they are displayed in color using the HSV color system: the time T at which the enhancement effect appears is assigned to the hue component, the degree D of the enhancement effect to the saturation component, and the luminance value (CT value) I to the lightness component. This yields an image display device presenting a color image that is easier for the operator to view and interpret.
[0055] Furthermore, because the image generation apparatus of the medical diagnostic imaging apparatus described above uses the luminance value change Dn as the feature quantity representing the degree of the enhancement effect, tissue with no enhancement effect, such as bone, has a small Dn and is displayed in grayscale, whereas tissue with an enhancement effect, such as blood vessels, has a large Dn and is displayed in color; the operator can therefore distinguish the two and easily grasp the presence or absence of blood flow. In addition, the maximum (or minimum) luminance value In is adopted as the feature quantity representing tissue emphasized by the enhancement effect, and the time Tn at which the luminance value becomes maximum (or minimum) is adopted as the feature quantity representing the time T at which the enhancement effect appeared; each is converted into color information and reflected in the displayed image. As a result, the hue on the image changes from red to yellow to green to blue according to the time at which the enhancement effect appeared, so that the moment-by-moment progress of the contrast agent is displayed on a single image. This makes it easier for the operator to grasp the state of blood flow.
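The three per-pixel feature quantities In, Tn, and Dn can be extracted from the phase-by-phase luminance series along the following lines. Treating the first phase as the pre-contrast baseline is an assumption of this sketch, not something the text specifies.

```python
def enhancement_features(series, times):
    """Extract the three feature quantities for one pixel from its
    luminance time series: the maximum value In, the time Tn at which
    that maximum occurs, and the change Dn relative to the first
    (assumed pre-contrast) phase.
    """
    baseline = series[0]          # assumption: phase 0 is pre-contrast
    i_n = max(series)             # In: maximum luminance across phases
    t_n = times[series.index(i_n)]  # Tn: time at which the maximum occurs
    d_n = i_n - baseline          # Dn: small for bone, large for vessels
    return i_n, t_n, d_n
```

A minimum-seeking variant (for contrast agents that reduce the luminance value) would use `min` in place of `max` and negate the change.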
[0056] (Second Embodiment)
FIG. 13 is a schematic configuration diagram of another embodiment of the present invention, in which the image generation apparatus of the present invention is applied to an ultrasonic diagnostic apparatus serving as the medical diagnostic imaging apparatus.
The ultrasonic diagnostic apparatus 38 receives, via a network or a portable storage medium, the volume data described in the first embodiment from a medical diagnostic imaging apparatus 37 such as an X-ray CT apparatus or an MRI apparatus, and stores it in a data storage unit 43. The main part of the ultrasonic diagnostic apparatus 38 is roughly divided into a system that reconstructs an ultrasonic image and a system that reconstructs a tomographic image corresponding to the reconstructed ultrasonic image. The former comprises an ultrasonic image acquisition unit 42 that reconstructs an ultrasonic image based on the reflected echo signals output from a probe 39. The latter comprises the data storage unit 43; various detection means 40, consisting of a magnetic-field-detecting position sensor and a body-position sensor attached to the probe 39, together with the magnetic field generator that cooperates with them and a respiration sensor; a position associating unit 44; and a tomographic image acquisition unit 45. The ultrasonic image output from the ultrasonic image acquisition unit 42 and the color tomographic image of the same cross section output from the tomographic image acquisition unit 45 are drawn on the display unit 11 by a display processing device 46; here, the ultrasonic image and the tomographic image are displayed side by side on the same screen.
The image generation apparatus of this embodiment comprises at least the data storage unit 43 and the tomographic image acquisition unit 45, and may further include the display processing device 46 and the display unit 11.
[0057] The probe 39 transmits and receives ultrasonic waves to and from a subject 41, and contains a plurality of transducers that generate ultrasonic waves and receive the reflected echoes. The ultrasonic image acquisition unit 42 receives the reflected echo signals output from the probe 39, converts them into digital signals, and generates, for example, a monochrome tomographic image (B-mode image) or a color flow mapping image (CFM image) of the diagnostic region.
[0058] The position associating unit 44 acquires various pieces of information from the various detection means 40 in order to obtain a tomographic image containing substantially the same cross section as the ultrasonic image. The various detection means 40 supply information for correcting misalignment between the coordinate systems of the tomographic image and the ultrasonic image, which arises because the coordinate system of the subject 41 continually shifts with changes in the subject's posture and with internal organ motion due to respiration. The tomographic image acquisition unit 45 uses the volume data in the data storage unit 43, based on the information acquired by the position associating unit 44, to obtain a tomographic image containing substantially the same cross section as the ultrasonic image. Various methods are conceivable for acquiring a tomographic image of the same cross section as the ultrasonic image; the calculation methods presented in various proposals by the same applicant as the present application, and various other calculation methods, can be adopted (for example, WO2004/098414).
[0059] The tomographic image acquisition unit 45 is also configured to generate the color image data 22 described in the first embodiment from the data 19 of the luminance value (that is, the CT value) I, the data 20 of the time T at which the enhancement effect of the contrast agent appeared, and the data 21 of the degree D of the enhancement effect, all held in the data storage unit 43 as shown in FIG. 2.
[0060] FIG. 14 shows a display example on the display unit 11. An ultrasonic image 70 and a CT color image 71 of the same cross section are displayed. In addition, a 3D body mark 72 is displayed in which the current ultrasonic scan plane is superimposed on a 3D image visualizing the body surface and internal organs of the subject.
[0061] The 3D image is generated by the volume rendering method using the volume data of the color image data 22. Volume rendering is a standard process for generating three-dimensional visualization images; by adjusting the opacity, the display target can be selected and the internal structure can be seen through. (For details, see, for example, Barthold Lichtenbelt et al., "Introduction to Volume Rendering", Hewlett-Packard Professional Books.)
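The core accumulation step of volume rendering is front-to-back compositing of colored, semi-transparent samples along each viewing ray. The following is a minimal sketch of that step only, ignoring sampling, interpolation, and shading; the early-termination threshold is a common optimization and an assumption here, not something the text prescribes.

```python
def composite_ray(samples):
    """Front-to-back alpha compositing of (rgb, opacity) samples along
    one viewing ray. Each sample's contribution is weighted by the
    transmittance remaining after the samples in front of it.
    """
    acc_rgb = [0.0, 0.0, 0.0]
    acc_a = 0.0
    for rgb, a in samples:
        w = (1.0 - acc_a) * a              # remaining transmittance times opacity
        acc_rgb = [c + w * s for c, s in zip(acc_rgb, rgb)]
        acc_a += w
        if acc_a >= 0.99:                  # early ray termination (assumed threshold)
            break
    return acc_rgb, acc_a
```

Because weight falls to zero once the accumulated opacity reaches one, a fully opaque sample hides everything behind it, which is what allows the opacity adjustment described above to select what is displayed and what is seen through.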
[0062] The present image generation apparatus is configured so that the opacity can be set according to at least one of the feature quantities: the time T at which the enhancement effect appeared, the luminance value I, and the degree D of the enhancement effect. For example, as in (Equation 3), the opacity O is defined as a function of the time T at which the enhancement effect appeared, the degree D of the enhancement effect, and the CT value I. Alternatively, the opacity O may be a function of any one or two of the time T, the degree D, and the CT value I.
(Equation 3)
O = f(T, D, I)
[0063] The step of obtaining this opacity O is inserted, for example, between step S4d and step S4e in FIG. 6 described above, so that the opacity is obtained for each organ or each pixel. Alternatively, it may be inserted between step S4 and step S5 to obtain the opacity O for each organ or each pixel. After the coloring and opacity of every organ or pixel have been obtained, a color 3D composite image is generated using the volume rendering method described above; the color 3D image is then displayed on the display unit 11 in the display step S5 described above. [0064] The display unit 11 displays, for example, an opacity curve 73 that graphs the relationship between the opacity O and the time T at which the enhancement effect appeared. The operator can modify the opacity curve using a mouse or the like. This allows the operator to select the display target and to see through the internal structure for each time at which the enhancement effect appeared.
[0065] For example, if the opacity value of time phase 2 is set large and the opacity values of the other time phases are set small, a 3D image is generated that extracts the regions in which the enhancement effect was obtained at time phase 2.
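One way to realize such a time-phase extraction is an opacity function O = f(T, D, I) that is high only for voxels whose enhancement time T lies near the chosen phase, scaled by the degree of enhancement D. The specific functional form below is an illustrative assumption, not given in the text, and the `width` parameter is invented for the sketch.

```python
def opacity(t, d, i, target_phase, width=0.5):
    """Opacity O = f(T, D, I) that highlights one time phase: voxels
    whose enhancement time T falls within `width` of `target_phase`
    get opacity proportional to D; all others become transparent.
    (I is unused in this particular form but kept in the signature
    to match Equation 3.)
    """
    near = 1.0 if abs(t - target_phase) <= width else 0.0
    return near * min(d, 1.0)
```

Setting `target_phase = 2` reproduces the example in the text: only the regions enhanced at time phase 2 remain opaque, and the rest of the volume is rendered transparent.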
[0066] The opacity curve 73 may instead show the relationship between the opacity O and the degree D of the enhancement effect, or between the opacity O and the luminance value I. The display is switched according to the operator's selection of the radio buttons 28a, 28b, and 28c.
[0067] According to such an image generation apparatus in an ultrasonic diagnostic apparatus, an ultrasonic image and a tomographic image of the same cross section can be displayed simultaneously. Moreover, as described in the first embodiment, the tomographic image is emphasized by the color display, so the operator can compare the two and perform a comprehensive diagnosis even more easily.
[0068] Furthermore, as in the first embodiment, the luminance value change Dn is used as the feature quantity representing the degree D of the enhancement effect, so tissue with no enhancement effect, such as bone, has a small Dn and is displayed in grayscale, whereas tissue with an enhancement effect, such as blood vessels, has a large Dn and is displayed in color. Because of this difference, the operator can distinguish the two and easily grasp the presence or absence of blood flow. In addition, the maximum (or minimum) luminance value In is adopted as the feature quantity representing tissue emphasized by the enhancement effect, and the time Tn at which the luminance value becomes maximum (or minimum) is adopted as the feature quantity representing the time at which the enhancement effect appeared, and both are reflected in the color image. As a result, the hue changes from red to yellow to green to blue according to the time at which the enhancement effect appeared, so that the moment-by-moment progress of the contrast agent can be displayed on a single image.

Claims

[1] A method for generating, using image data acquired by imaging a subject into which a contrast agent has been injected, a medical image representing temporal change of tissue caused by the contrast agent, the method comprising the steps of:
(a) acquiring a plurality of the image data at different time phases;
(b) acquiring a luminance value of a pixel at the same position from each of the plurality of image data at different time phases;
(c) acquiring, based on the temporal change of the luminance value, one or more feature quantities representing the temporal change of the luminance value caused by the contrast agent;
(d) converting the one or more feature quantities into respectively different color information;
(e) coloring the pixel, based on the one or more pieces of color information, so as to represent the information on the temporal change of the luminance value; and
(f) repeating steps (b) to (e) for each different pixel of at least a partial region to generate a color image in which the at least partial region is colored.
[2] The image generation method according to claim 1, wherein, in the coloring step, a first pixel is colored a first color based on first information on the temporal change of the luminance value at the first pixel, and a second pixel is colored a second color based on second information on the temporal change of the luminance value at the second pixel.
[3] The image generation method according to claim 2, wherein, in the coloring step, at least one of the first pixel and the second pixel is colored a third color based on the first information and the second information.
[4] The image generation method according to claim 2, wherein the first pixel and the second pixel are included in a single organ.
[5] The image generation method according to claim 2, wherein the first pixel is included in a first organ and the second pixel is included in a second organ, and the first organ is colored the first color and the second organ is colored the second color.
[6] The image generation method according to claim 1, wherein, in the coloring step, a first pixel and a second pixel that differ in the information on the temporal change of the luminance value referenced to a desired time phase are colored in respectively different colors.
[7] The image generation method according to claim 2, wherein, in the converting step, at least one of a color type and a color vividness is used as the color information, and, in the coloring step, the first pixel and the second pixel are colored with colors that differ in at least one of the color type and the color vividness.
[8] The image generation method according to claim 1, wherein, in the feature quantity acquiring step, at least one of a feature quantity representing the time at which the enhancement effect of the contrast agent appeared and a feature quantity representing the enhancement effect of the contrast agent is acquired as the one or more feature quantities.
[9] The image generation method according to claim 8, wherein the feature quantity representing the enhancement effect of the contrast agent is the amount of increase or decrease of the luminance value of the pixel caused by the contrast agent.
[10] The image generation method according to claim 8, wherein the feature quantity representing the time at which the enhancement effect of the contrast agent appeared is the time at which the luminance value of the pixel became maximum or minimum among the plurality of image data.
[11] The image generation method according to claim 8, wherein, in the converting step, an HSV color system consisting of a hue component, a saturation component, and a value (brightness) component is used, the feature quantity representing the time at which the enhancement effect of the contrast agent appeared is converted into the hue component, and the feature quantity representing the enhancement effect of the contrast agent is converted into the saturation component, and wherein, in the color image generating step, the color image is generated using at least one of the hue component and the saturation component.
[12] The image generation method according to claim 11, wherein, in the feature quantity acquiring step, a feature quantity based on the luminance value is further acquired as one of the one or more feature quantities; in the converting step, the feature quantity based on the luminance value is converted into the value component; and, in the color image generating step, the color image is further generated using the value component.
[13] The image generation method according to claim 12, wherein the feature quantity based on the luminance value is the maximum or minimum luminance value of the same pixel among the plurality of image data.
[14] The image generation method according to claim 1, further comprising, after the color image generating step, the steps of: displaying the color image; and adjusting the coloring in the color image, wherein, in the adjusting step, the coloring in the color image is adjusted by adjusting at least one of the range and the level of the feature quantity converted into the color information.
[15] The image generation method according to claim 14, wherein, in the adjusting step, the color image is generated as a moving image by continuously changing at least one of the range and the level of the feature quantity converted into the color information.
[16] The image generation method according to claim 1, wherein the image data acquiring step includes the steps of: reading a plurality of volume data at different time phases; and aligning the plurality of volume data at different time phases; wherein, in the luminance value acquiring step, the luminance value of a pixel at the same position is acquired from each of the plurality of volume data at different time phases; the method further comprising, between the coloring step and the color image generating step, the step of determining an opacity of the pixel based on at least one of the one or more feature quantities; and wherein the color image generating step includes the step of generating, based on the one or more pieces of color information and the opacity, a three-dimensional color image in which the at least partial region is colored.
[17] An image generation apparatus comprising: input means for inputting a plurality of image data at different time phases acquired by imaging a subject into which a contrast agent has been injected; storage means for storing the plurality of image data at different time phases; and computing means for generating a medical image using the plurality of image data at different time phases, wherein the computing means extracts, for each pixel, one or more feature quantities representing temporal change of tissue caused by the contrast agent from the plurality of image data at different time phases, converts them into respectively different color information, and generates a color image colored so as to represent the information on the temporal change for each pixel of at least a partial region.
[18] An image generation apparatus comprising: input means for inputting a plurality of image data at different time phases acquired by imaging a subject into which a contrast agent has been injected; storage means for storing the plurality of image data at different time phases; and computing means for generating a medical image using the plurality of image data at different time phases, wherein the computing means extracts, for each pixel, one or more feature quantities representing temporal change of tissue caused by the contrast agent from the plurality of image data at different time phases, converts them into respectively different color information, and generates a color image colored so as to represent the information on the temporal change for each pixel of at least a partial region.
[19] The image generation apparatus according to claim 18, further comprising display means for displaying the color image, wherein the display means displays the color image and a color bar representing the correspondence used in converting the feature quantity into the color information.
[20] The image generation apparatus according to claim 18, wherein the input means inputs first tomographic image data acquired by imaging a desired cross section of the subject, the apparatus further comprising means for acquiring position information of the desired cross section, and wherein the computing means acquires, based on the position information, a plurality of second tomographic image data at different time phases containing the desired cross section from the plurality of image data at different time phases, and generates the color image using the plurality of second tomographic image data at different time phases.
PCT/JP2005/020564 2004-11-10 2005-11-10 Image creating method and device WO2006051831A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2006544930A JP4980723B2 (en) 2004-11-10 2005-11-10 Image generation method and image generation apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2004326043 2004-11-10
JP2004-326043 2004-11-10

Publications (1)

Publication Number Publication Date
WO2006051831A1 true WO2006051831A1 (en) 2006-05-18

Family

ID=36336508

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2005/020564 WO2006051831A1 (en) 2004-11-10 2005-11-10 Image creating method and device

Country Status (2)

Country Link
JP (1) JP4980723B2 (en)
WO (1) WO2006051831A1 (en)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008005923A (en) * 2006-06-27 2008-01-17 Olympus Medical Systems Corp Medical guide system
JP2008073301A (en) * 2006-09-22 2008-04-03 Toshiba Corp Medical imaging diagnostic apparatus and medical image processor
JP2009056108A (en) * 2007-08-31 2009-03-19 Toshiba Corp X-ray computed tomographic instrument and its method
JP2010512888A (en) * 2006-12-21 2010-04-30 ブラッコ インターナショナル ベスローテン フエンノートシャップ Detection of immobilized contrast agent detachment in medical imaging applications
JP2010158360A (en) * 2009-01-07 2010-07-22 Toshiba Corp Medical image processor, ultrasonic diagnostic apparatus, and medical image processing program
CN101884545A (en) * 2009-05-13 2010-11-17 株式会社东芝 Nuclear medical imaging apparatus, image processing apparatus and image processing method
CN102048552A (en) * 2009-10-30 2011-05-11 西门子公司 Beam hardening correction for ct perfusion measurements
JP2011224354A (en) * 2010-03-30 2011-11-10 Toshiba Corp Ultrasonic diagnostic apparatus, ultrasonic image processor, and medical image diagnostic apparatus
JP2012020174A (en) * 2011-10-17 2012-02-02 Toshiba Corp Medical diagnostic imaging apparatus and medical image processor
WO2012063831A1 (en) * 2010-11-10 2012-05-18 東芝メディカルシステムズ株式会社 Image-processing apparatus and x-ray diagnostic apparatus
JPWO2011114830A1 (en) * 2010-03-19 2013-06-27 株式会社日立メディコ Ultrasonic diagnostic apparatus and ultrasonic image display method
JP2014008147A (en) * 2012-06-28 2014-01-20 Ge Medical Systems Global Technology Co Llc Ultrasonic diagnostic apparatus, and control program for the same
JP2014054565A (en) * 2009-09-30 2014-03-27 Toshiba Corp Magnetic resonance imaging apparatus and display processing system
JPWO2012137431A1 (en) * 2011-04-05 2014-07-28 コニカミノルタ株式会社 Ultrasonic diagnostic apparatus and output method of ultrasonic diagnostic image
WO2014136294A1 (en) * 2013-03-07 2014-09-12 Eizo株式会社 Color adjustment device, image display device, and color adjustment method
JP2015112232A (en) * 2013-12-11 2015-06-22 株式会社東芝 Image analysis apparatus and x-ray diagnostic apparatus
JP2015126868A (en) * 2013-11-29 2015-07-09 株式会社東芝 Medical image processing apparatus, x-ray diagnostic apparatus and medical image processing program
US20160015348A1 (en) * 2013-04-01 2016-01-21 Kabushiki Kaisha Toshiba Medical image processing apparatus, x-ray diagnostic apparatus, medical image processing method and x-ray diagnostic method
JP2016067568A (en) * 2014-09-29 2016-05-09 株式会社東芝 Medical image processing apparatus and x-ray diagnostic apparatus
US9607378B2 (en) 2013-04-05 2017-03-28 Panasonic Corporation Image region mapping device, 3D model generating apparatus, image region mapping method, and image region mapping program
US9750474B2 (en) 2013-04-05 2017-09-05 Panasonic Corporation Image region mapping device, 3D model generating apparatus, image region mapping method, and image region mapping program
CN108567443A (en) * 2017-03-09 2018-09-25 通用电气公司 Color visualization system and method for CT images
JP2021078847A (en) * 2019-11-20 2021-05-27 コニカミノルタ株式会社 Medical image display device, area display method, and area display program

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110827241B (en) * 2019-10-21 2022-08-12 国家广播电视总局广播电视规划院 Low-brightness enhanced picture full-reference method based on color distortion and contrast enhancement

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS63242236A (en) * 1987-03-30 1988-10-07 株式会社島津製作所 Time information color-coded display device for medical images
JP2000005133A (en) * 1998-06-19 2000-01-11 Toshiba Corp Picture image display device for medical use
JP2004141612A (en) * 2002-08-30 2004-05-20 Hitachi Medical Corp Method and apparatus for image processing


Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008005923A (en) * 2006-06-27 2008-01-17 Olympus Medical Systems Corp Medical guide system
JP2008073301A (en) * 2006-09-22 2008-04-03 Toshiba Corp Medical imaging diagnostic apparatus and medical image processor
JP2010512888A (en) * 2006-12-21 2010-04-30 ブラッコ インターナショナル ベスローテン フエンノートシャップ Detection of immobilized contrast agent detachment in medical imaging applications
US8512249B2 (en) 2006-12-21 2013-08-20 Bracco International Bv Detection of the detachment of immobilized contrast agent in medical imaging applications
JP2009056108A (en) * 2007-08-31 2009-03-19 Toshiba Corp X-ray computed tomographic instrument and its method
JP2010158360A (en) * 2009-01-07 2010-07-22 Toshiba Corp Medical image processor, ultrasonic diagnostic apparatus, and medical image processing program
CN101884545A (en) * 2009-05-13 2010-11-17 株式会社东芝 Nuclear medical imaging apparatus, image processing apparatus and image processing method
JP2014054565A (en) * 2009-09-30 2014-03-27 Toshiba Corp Magnetic resonance imaging apparatus and display processing system
CN102048552A (en) * 2009-10-30 2011-05-11 西门子公司 Beam hardening correction for ct perfusion measurements
CN102048552B (en) * 2009-10-30 2015-02-11 西门子公司 Beam hardening correction for ct perfusion measurements
JPWO2011114830A1 (en) * 2010-03-19 2013-06-27 株式会社日立メディコ Ultrasonic diagnostic apparatus and ultrasonic image display method
JP2011224354A (en) * 2010-03-30 2011-11-10 Toshiba Corp Ultrasonic diagnostic apparatus, ultrasonic image processor, and medical image diagnostic apparatus
US9380999B2 (en) 2010-03-30 2016-07-05 Kabushiki Kaisha Toshiba Ultrasonic diagnostic apparatus, ultrasonic image processing apparatus, and medical diagnostic imaging apparatus
US8965085B2 (en) 2010-11-10 2015-02-24 Toshiba Medical Systems Corporation Image processing apparatus and X-ray diagnosis apparatus
CN102858245A (en) * 2010-11-10 2013-01-02 东芝医疗系统株式会社 Image-processing apparatus and x-ray diagnostic apparatus
WO2012063831A1 (en) * 2010-11-10 2012-05-18 東芝メディカルシステムズ株式会社 Image-processing apparatus and x-ray diagnostic apparatus
CN102858245B (en) * 2010-11-10 2016-01-06 东芝医疗系统株式会社 Image processing apparatus and radiographic apparatus
JPWO2012137431A1 (en) * 2011-04-05 2014-07-28 コニカミノルタ株式会社 Ultrasonic diagnostic apparatus and method for outputting ultrasonic diagnostic images
JP2012020174A (en) * 2011-10-17 2012-02-02 Toshiba Corp Medical diagnostic imaging apparatus and medical image processor
JP2014008147A (en) * 2012-06-28 2014-01-20 Ge Medical Systems Global Technology Co Llc Ultrasonic diagnostic apparatus, and control program for the same
WO2014136294A1 (en) * 2013-03-07 2014-09-12 Eizo株式会社 Color adjustment device, image display device, and color adjustment method
RU2611599C1 (en) * 2013-03-07 2017-02-28 ЭЙЗО Корпорайшн Image colour adjustment device, image display device and chromaticity control method
CN109938759A (en) * 2013-04-01 2019-06-28 佳能医疗系统株式会社 Medical image-processing apparatus and radiographic apparatus
US20160015348A1 (en) * 2013-04-01 2016-01-21 Kabushiki Kaisha Toshiba Medical image processing apparatus, x-ray diagnostic apparatus, medical image processing method and x-ray diagnostic method
US9607378B2 (en) 2013-04-05 2017-03-28 Panasonic Corporation Image region mapping device, 3D model generating apparatus, image region mapping method, and image region mapping program
US9750474B2 (en) 2013-04-05 2017-09-05 Panasonic Corporation Image region mapping device, 3D model generating apparatus, image region mapping method, and image region mapping program
JP2015126868A (en) * 2013-11-29 2015-07-09 株式会社東芝 Medical image processing apparatus, x-ray diagnostic apparatus and medical image processing program
CN109938760A (en) * 2013-12-11 2019-06-28 东芝医疗系统株式会社 Image analysis apparatus and radiographic apparatus
US9833211B2 (en) 2013-12-11 2017-12-05 Toshiba Medical Systems Corporation Image analysis device and X-ray diagnostic apparatus
CN109938760B (en) * 2013-12-11 2024-03-12 东芝医疗系统株式会社 Image analysis device and X-ray diagnosis device
US10130324B2 (en) 2013-12-11 2018-11-20 Toshiba Medical Systems Corporation Image analysis device and X-ray diagnostic apparatus
JP2015112232A (en) * 2013-12-11 2015-06-22 株式会社東芝 Image analysis apparatus and x-ray diagnostic apparatus
JP2016067568A (en) * 2014-09-29 2016-05-09 株式会社東芝 Medical image processing apparatus and x-ray diagnostic apparatus
JP2018183567A (en) * 2017-03-09 2018-11-22 ゼネラル・エレクトリック・カンパニイ System and method for color visualization of CT images
JP7098356B2 (en) 2017-03-09 2022-07-11 ゼネラル・エレクトリック・カンパニイ Systems and methods for color visualization of CT images
CN108567443B (en) * 2017-03-09 2023-12-01 通用电气公司 Color visualization system and method for CT images
CN108567443A (en) * 2017-03-09 2018-09-25 通用电气公司 Color visualization system and method for CT images
JP2021078847A (en) * 2019-11-20 2021-05-27 コニカミノルタ株式会社 Medical image display device, area display method, and area display program
JP7424003B2 (en) 2019-11-20 2024-01-30 コニカミノルタ株式会社 Medical image display device, area display method, and area display program

Also Published As

Publication number Publication date
JPWO2006051831A1 (en) 2008-05-29
JP4980723B2 (en) 2012-07-18

Similar Documents

Publication Publication Date Title
JP4980723B2 (en) Image generation method and image generation apparatus
US8355775B2 (en) Image diagnosing support method and image diagnosing support apparatus
JP4558645B2 (en) Image display method and apparatus
JP5442530B2 (en) Image processing apparatus, image display apparatus, program, and X-ray CT apparatus
JP3833282B2 (en) Ultrasonic diagnostic equipment
US7817828B2 (en) Image processor for medical treatment support
JP5191167B2 (en) Medical guide system
US20090010519A1 (en) Medical image processing apparatus and medical image diagnosis apparatus
US8160315B2 (en) Ultrasonic imaging apparatus and projection image generating method
JP2006020800A (en) Ultrasonic diagnostic apparatus and image processor
US20090198130A1 (en) Ultrasonic diagnostic apparatus
US20160078621A1 (en) Image processing device and x-ray diagnostic apparatus
CN107885476A (en) Medical image display method and device
WO2014167325A1 (en) Methods and apparatus for quantifying inflammation
US20100033501A1 (en) Method of image manipulation to fade between two images
JP2004057411A (en) Method for preparing visible image for medical use
CN100577107C (en) Functional image display method and device
CN103140174B (en) Diagnostic ultrasound equipment, image processing apparatus
JP4996128B2 (en) Medical image processing apparatus and medical image processing method
JP5042533B2 (en) Medical image display device
JP5305635B2 (en) Medical image display device
CN104248450A (en) Ultrasonic processing apparatus and method
CN106028943A (en) Ultrasonic virtual endoscopic imaging system and method, and apparatus thereof
JP4268695B2 (en) Diagnostic imaging apparatus and ultrasonic diagnostic apparatus
JP4686279B2 (en) Medical diagnostic apparatus and diagnostic support apparatus

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KN KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2006544930

Country of ref document: JP

122 Ep: pct application non-entry in european phase

Ref document number: 05805982

Country of ref document: EP

Kind code of ref document: A1