WO2006030731A1 - Ultrasonic imaging apparatus and projection image generation method - Google Patents
Ultrasonic imaging apparatus and projection image generation method
- Publication number
- WO2006030731A1 (PCT/JP2005/016745; application JP2005016745W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- projection
- image data
- projection image
- dimensional image
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/08—Volume rendering
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/06—Measuring blood flow
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/13—Tomography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8906—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
- G01S15/8979—Combined Doppler and pulse-echo imaging systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8906—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
- G01S15/8993—Three dimensional imaging systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52053—Display arrangements
- G01S7/52057—Cathode ray tube displays
- G01S7/52071—Multicolour displays; using colour coding; Optimising colour or information content in displays, e.g. parametric imaging
Definitions
- the present invention relates to an ultrasonic imaging apparatus and a projection image generation method, and more particularly to a technique suitable for generating a three-dimensional ultrasonic image.
- An ultrasonic imaging apparatus transmits and receives ultrasonic waves to and from a subject via an ultrasonic probe, and reconstructs and displays ultrasound images of the subject (for example, gray-scale images and color blood-flow images) based on the reflected echo signals output from the ultrasonic probe. Based on such an ultrasound image, non-invasive, real-time diagnosis of the imaged region is possible.
- tomographic volume data, i.e., three-dimensional tomographic image data
- Doppler image volume data, i.e., three-dimensional Doppler image data
- The tissue projection image generated from the acquired tomographic volume data and the Doppler projection image generated from the Doppler image volume data are combined and displayed.
- By recognizing the state of the nutrient vessels of cancer tissue, a judgment is made as to whether the cancer tissue is primary or metastatic (for example, see Patent Document 1).
- Patent Document 1: US Pat. No. 6,280,387
- An object of the present invention is to generate a three-dimensional ultrasonic image capable of accurately grasping the positional relationship between tissues.
- An ultrasonic imaging apparatus of the present invention includes an ultrasonic probe (10) that transmits and receives ultrasonic waves to and from a subject, and a transmission unit that supplies a drive signal to the ultrasonic probe (10).
- Signal processing means (
- A projection image generation method of the present invention includes a first accumulation step of accumulating a plurality of first two-dimensional image data; a first three-dimensional step of generating first volume data from the plurality of first two-dimensional image data; a second accumulation step of accumulating a plurality of second two-dimensional image data; a second three-dimensional step of generating second volume data from the plurality of second two-dimensional image data; a first projection step of rendering the first volume data to generate a first projection image; and a second projection step of rendering the second volume data to generate a second projection image.
- The first projection image is generated based on the first volume data and at least a part of the second volume data, and the second projection image is generated based on the second volume data and at least a part of the first volume data.
- Since the first projection image is generated by adding the information of the second three-dimensional image data to the first three-dimensional image data, the overlap with the second three-dimensional image data is reflected in the brightness of each pixel of the first projection image.
- Therefore, by referring to the shading of the first projection image, the degree of overlap with the second three-dimensional image data can be grasped, and a three-dimensional, effective diagnosis can be easily performed.
- Likewise, since the second projection image is generated by adding the information of the first three-dimensional image data to the second three-dimensional image data, the overlap with the first three-dimensional image data is reflected in the brightness of each pixel of the second projection image.
- Therefore, by referring to the shading of the second projection image, the degree of overlap with the first three-dimensional image data can be grasped, and a three-dimensional, effective diagnosis can be easily performed.
- FIG. 1 is a configuration diagram of an ultrasonic imaging apparatus of the present embodiment to which the present invention is applied.
- FIG. 2 is a configuration diagram of a projection image generation unit 28 in FIG.
- FIG. 3 is a diagram for explaining an operation for generating a composite image.
- FIG. 4 is an explanatory diagram of normal volume rendering processing.
- FIG. 5 is a display example of a composite image 69 configured using a phantom.
- FIG. 6 is a diagram showing the relationship between voxel value and opacity.
- FIG. 7 is a flowchart showing a processing flow for obtaining a projected image and a composite image.
- FIG. 8 is a diagram showing an example in which a plurality of different color mapping tables are displayed and a composite image is generated based on the selected table.
- FIG. 1 is a diagram showing an embodiment of an ultrasonic imaging apparatus and a projection image generation method to which the present invention is applied.
- FIG. 1 is a configuration diagram of the ultrasonic imaging apparatus of the present embodiment.
- The ultrasonic imaging apparatus is broadly divided into an imaging processing system that acquires three-dimensional ultrasonic image volume data of a subject, and a display processing system that displays the acquired three-dimensional ultrasonic image volume data.
- the imaging processing system includes an ultrasonic probe 10 in which a plurality of transducers that transmit and receive ultrasonic waves to and from a subject are two-dimensionally arranged, and a transmission unit 12 that supplies a drive signal to the ultrasonic probe 10.
- It further includes a receiving unit 14 that receives the reflected echo signal output from the ultrasonic probe 10, and a phasing addition unit 16 that performs phasing addition of the reflected echo signal output from the receiving unit 14. In addition, a tomographic image signal processing unit 18 and a tomographic volume data creation unit 19 are provided as means for acquiring tomographic volume data based on the reflected echo signal output from the phasing addition unit 16.
- A Doppler image system signal processing unit 20 and a Doppler image volume data creation unit 21 are provided as means for acquiring Doppler image volume data based on the reflected echo signal output from the phasing addition unit 16.
- control commands (dotted lines in FIG. 1) are sent to the ultrasonic probe 10, the transmission unit 12, the reception unit 14, the phasing addition unit 16, the tomographic image signal processing unit 18, and the Doppler image system signal processing unit 20. It has a control unit 22 for outputting.
- the display processing system includes a communication port 24 that captures tomographic volume data and Doppler image volume data output from the imaging processing system, and a volume data storage unit 26 that stores volume data output from the communication port 24.
- It also includes a projection image generation unit 28 that generates a projection image based on the volume data read from the volume data storage unit 26, and a display unit 32 that displays the projection image generated by the projection image generation unit 28 on its screen via the video memory 30.
- A central processing unit (hereinafter referred to as CPU) 34 is provided that outputs control commands to the communication port 24, the volume data storage unit 26, the projection image generation unit 28, the video memory 30, and the display unit 32.
- the communication port 24, the volume data storage unit 26, the projection image generation unit 28, the video memory 30, the display unit 32, and the CPU 34 are connected to each other via a shared bus 36.
- A magnetic disk device 27 can be provided as an auxiliary storage device for the volume data storage unit 26. However, not only the magnetic disk device 27 but also other storage devices such as a DVD-R drive may be provided.
- A console 38 is connected to the imaging processing system and the display processing system.
- The console 38 has an input device such as a keyboard or a mouse, and outputs commands entered via the input device to the control unit 22 of the imaging processing system, or to the CPU 34 of the display processing system via the shared bus.
- FIG. 2 is a configuration diagram of the projection image generation unit 28 of FIG.
- The projection image generation unit 28 includes a tissue image rendering unit 40 that corrects the information belonging to each voxel of the tomographic volume data read from the volume data storage unit 26 based on the information belonging to each voxel of the Doppler image volume data, and generates a gray-scale (for example, black-and-white) tissue projection image from the corrected tomographic volume data.
- A memory 42 is provided that stores a correction coefficient (blend coefficient) R (or R1, R2 described later) to be given to the tissue image rendering unit 40.
- A Doppler image rendering unit 44 is provided that corrects the information belonging to each voxel of the Doppler image volume data based on the information belonging to each voxel of the tomographic volume data, and generates a color Doppler projection image from the corrected Doppler image volume data.
- A memory 46 likewise stores a correction coefficient (blend coefficient) S (or S1, S2 described later) to be given to the Doppler image rendering unit 44.
- The correction coefficients R, S (or R1, R2, S1, S2 described later) are variably set in the range of "0" to "1" by a command from the console 38. Fixed values may also be used.
- A compositing unit 48 is provided that generates a composite image by superimposing the tissue projection image generated by the tissue image rendering unit 40 and the Doppler projection image generated by the Doppler image rendering unit 44, and displays the generated composite image on the display unit 32. A memory 50 is also provided that stores a color mapping table for composition, which gives color data to the composite image.
- the ultrasonic probe 10 is brought into contact with the body surface of the subject.
- a drive signal for tissue imaging is supplied from the transmitter 12 to the ultrasonic probe 10.
- the supplied drive signal is input to a predetermined transducer group selected according to a command from the control unit 22.
- ultrasonic waves are emitted from each transducer to which the drive signal is input to the subject.
- The reflected echo signal generated within the subject is received by each transducer and then output from the ultrasonic probe 10.
- The reflected echo signal output from the ultrasonic probe 10 is subjected to amplification processing and analog-to-digital conversion processing by the receiving unit 14.
- the reflected echo signal output from the receiving unit 14 is subjected to processing such as detection by the tomographic signal processing unit 18 to obtain black and white tomographic image data based on the signal intensity of the reflected echo signal.
- a plurality of tomographic image data corresponding to each scan plane is acquired.
- Each acquired tomogram data is input to the tomogram volume data creation unit 19.
- The plurality of input tomographic image data are constructed as tomographic volume data by the tomographic volume data creation unit 19, which adds position data (for example, coordinate data of each scan plane) to each voxel.
- the constructed tomographic volume data is stored in the volume data storage unit 26 via the communication port 24.
- The Doppler image system signal processing unit 20 calculates the Doppler shift (for example, the frequency change or phase change of the reflected echo signal). Color Doppler image data such as blood flow velocity, reflection intensity, and variance are then acquired from the calculated Doppler shift.
- a plurality of Doppler image data corresponding to each scan plane is acquired.
- Each acquired Doppler image data is input to the Doppler image volume data creation unit 21.
- The plurality of input Doppler image data are constructed as Doppler image volume data by the Doppler image volume data creation unit 21, which adds position data (for example, coordinate data of each scan plane) to each voxel.
- The constructed Doppler image volume data is stored in the volume data storage unit 26 via the communication port 24.
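The construction of volume data from per-plane 2D image data described above can be sketched as follows. This is a minimal illustration, not the apparatus's actual implementation; the array shapes and the helper name `build_volume` are assumptions.

```python
import numpy as np

def build_volume(slices, z_coords):
    """Stack 2D slice images (P1..Pn or Q1..Qn) into volume data,
    keeping each scan plane's position so every voxel can be located
    in 3D.  `slices` is a list of equally shaped 2D arrays and
    `z_coords` gives each scan plane's coordinate along the sweep axis."""
    volume = np.stack(slices, axis=0)      # shape: (n_planes, rows, cols)
    z = np.asarray(z_coords, dtype=float)  # position data per scan plane
    return volume, z

# Two tomographic slices of 2x2 pixels acquired at z = 0.0 and 0.5 (units arbitrary)
p1 = np.array([[10, 20], [30, 40]])
p2 = np.array([[11, 21], [31, 41]])
vol, z = build_volume([p1, p2], [0.0, 0.5])
print(vol.shape)  # (2, 2, 2)
print(z[1])       # 0.5
```

The same sketch applies unchanged to the Doppler image data Q1 to Qn, since both volumes are built from per-plane images plus scan-plane coordinates.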
- The tomographic volume data and the Doppler image volume data stored in the volume data storage unit 26 are read according to a command from the CPU 34 and input to the projection image generation unit 28. Based on the input tomographic volume data, a black-and-white tissue projection image is generated by the projection image generation unit 28. Further, based on the read Doppler image volume data, a color Doppler projection image is generated by the projection image generation unit 28. The generated tissue projection image and Doppler projection image are superimposed so that the position data of each pixel coincide, and displayed on the display unit 32 as a composite image. When the tomographic volume data and Doppler image volume data are stored in the magnetic disk device 27, each volume data may be read from the magnetic disk device 27 and input to the projection image generation unit 28.
- FIG. 3 is a diagram for explaining an operation of generating each projection image and a composite image thereof.
- tomographic volume data 50 is constructed based on a plurality of tomographic image data Pl to Pn (FIG. 3A) (FIG. 3B).
- Similarly, Doppler image volume data 52 is constructed based on a plurality of Doppler image data Q1 to Qn (FIG. 3(X)) (FIG. 3(Y)).
- the tomographic volume data 50 is subjected to volume rendering processing by the tissue image rendering unit 40 based on the observation direction (gaze direction) set via the console 38.
- a tissue projection image 54 is generated (FIG. 3C).
- At this time, the tissue image rendering unit 40 corrects the information belonging to each voxel of the tomographic volume data 50 based on the information belonging to each voxel of the Doppler image volume data 52, and generates the tissue projection image 54 based on the tomographic volume data 50 and the corrected information.
- Specifically, the tissue image rendering unit 40 corrects the attenuation of each voxel, determined by the opacity belonging to that voxel of the tomographic volume data 50, based on the opacity of the corresponding voxel of the Doppler image volume data 52 and the correction coefficient R variably set from the console 38, and generates the tissue projection image 54 based on the tomographic volume data 50, the opacity, and the corrected attenuation.
- the Doppler image volume data 52 is subjected to volume rendering processing by the Doppler image rendering unit 44 based on the observation direction set through the console 38.
- A Doppler projection image 56 is generated (FIG. 3(Z)).
- At this time, the Doppler image rendering unit 44 corrects the information belonging to each voxel of the Doppler image volume data 52 based on the information belonging to each voxel of the tomographic volume data 50, and generates the Doppler projection image 56 based on the Doppler image volume data 52 and the corrected information.
- Specifically, the Doppler image rendering unit 44 corrects the attenuation of each voxel, determined by the opacity belonging to that voxel of the Doppler image volume data 52, based on the opacity of the corresponding voxel of the tomographic volume data 50 and the correction coefficient S variably set from the console 38, and generates the Doppler projection image 56 based on the Doppler image volume data 52, the opacity, and the corrected attenuation.
- a composite image 58 is generated by superimposing the tissue projection image 54 and the Doppler projection image 56 so that the coordinate data of each pixel is the same (FIG. 3 (K)).
- the color mapping table 59 is used.
- This color mapping table 59 is a two-dimensional map that gives the luminance value of the composite image 58 corresponding to the luminance value of the tissue projection image 54 set on the horizontal axis and the luminance value of the Doppler projection image 56 set on the vertical axis. For example, when the luminance value of the tissue projection image 54 is a and the luminance value of the Doppler projection image 56 is b, the value of the point (a, b) on the color mapping table 59 gives the value (color and brightness) of the composite image 58.
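The two-dimensional lookup described above can be sketched as follows. The 256x256 table contents here are invented toy values purely to illustrate the (a, b) lookup; the real table 59 is whatever map is stored in the memory 50.

```python
import numpy as np

# Hypothetical 256x256 color mapping table: axis 0 indexes the tissue
# luminance a (horizontal axis of the figure), axis 1 the Doppler
# luminance b (vertical axis); each entry holds an (R, G, B) value.
table = np.zeros((256, 256, 3), dtype=np.uint8)
# Toy contents: the gray level follows the tissue luminance, and the red
# channel is boosted with the Doppler luminance (a table "with many
# color components" in the sense of FIG. 8).
for a in range(256):
    for b in range(256):
        table[a, b] = (min(255, a + b), a, a)

def composite_pixel(a, b, cmap):
    """Look up the composite color for tissue luminance a and Doppler
    luminance b, i.e. the point (a, b) on the color mapping table."""
    return tuple(int(v) for v in cmap[a, b])

print(composite_pixel(100, 0, table))   # pure tissue pixel -> (100, 100, 100)
print(composite_pixel(100, 80, table))  # vessel overlap -> (180, 100, 100)
```

Re-synthesizing the composite image after the operator selects a different table (as described below for FIG. 8) amounts to repeating this lookup per pixel with a different `cmap`.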
- FIG. 8 shows an example in which the selected or non-selected state of each table is indicated by a mark displayed in the vicinity of that table.
- the color mapping table 82 is a table for emphasizing a black and white tissue projection image, and is a table with few color components in the entire table.
- the color mapping table 84 is a table that emphasizes the color Doppler projection image, and is a table with many color components in the entire table.
- A composite image is generated based on the selected color mapping table.
- After the composite image has been generated, changing the selection of the color mapping table causes the composite image to be re-synthesized based on the newly selected color mapping table.
- Fig. 8 shows an example in which two different color mapping tables are displayed. The number of displayed color mapping tables may be three or more.
- the composite image is generated as described above, at least one of the tissue projection image 54, the Doppler projection image 56, and the generated composite image 58 is displayed on the display screen of the display unit 32.
- the composite image is displayed with priority.
- a tissue projection image and a Doppler projection image are generated together and a composite image is generated by combining these two projection images.
- Only one of the projection images may be generated.
- In that case, the amount of information in each volume data may be taken into account, and the projection image may be generated from the volume data having the larger amount of information.
- For example, the tomographic volume data generally has more information than the Doppler image volume data, so one may choose to generate only the tissue projection image, which reflects the overlap with the blood flow image.
- Alternatively, one projection image may be generated in the same manner as in the above-described embodiment, the other projection image generated based only on its own volume data as in the prior art, and these two projection images combined.
- For example, a tissue projection image may be generated in the same manner as in the above-described embodiment, a Doppler projection image generated based only on the Doppler image volume data, and a composite image generated by combining these two projection images.
- Conversely, the tissue projection image may be generated based only on the tomographic volume data as in the prior art, the Doppler projection image generated as in the above-described embodiment, and the composite image generated by combining these two projection images.
- FIG. 4 shows a typical volume rendering process when a fetus is the imaging subject, and is cited from the literature (Kazunori Baba, Yuko Io: Obstetrics and Gynecology 3D Ultrasound. Medical View, Tokyo, 2000).
- FIG. 4A is a diagram showing the concept of volume rendering.
- Volume rendering is a method of determining the brightness of each point on the projection plane 62 by performing a predetermined calculation on the luminance values of the voxels lying on a line 60 passing through the three-dimensional volume data.
- the line 60 is parallel to the observation direction (gaze direction) set via the console 38.
- FIG. 4B is a diagram showing a concept of a calculation method for determining the luminance of each point on the projection plane 62.
- The output light amount Cout of the voxel V(x) is expressed by the following equation (1):
- Cout = Cin × (1 − α(x)) + C(x) × α(x) … (1)
- That is, the output light amount Cout of the voxel V(x) is determined by attenuating the incident light amount Cin to the voxel V(x) according to the attenuation degree (1 − α(x)) and adding the self-emission amount C(x) × α(x) of the voxel V(x).
- The opacity α(x) is a value in the range of "0" to "1". The closer to "0", the more transparent the voxel V(x); the closer to "1", the more opaque the voxel V(x).
- This opacity α(x) can be determined from the voxel value V(x).
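Equation (1), applied voxel by voxel along a line 60, can be sketched as follows. This is a minimal illustration; the helper name `render_ray` and the choice of Cin = 0 at the start of the line are assumptions.

```python
def render_ray(emissions, opacities):
    """Apply equation (1) voxel by voxel along one line 60:
        Cout = Cin * (1 - alpha(x)) + C(x) * alpha(x)
    The light amount leaving the last voxel becomes the brightness of
    the corresponding point on the projection plane 62."""
    c = 0.0  # initial incident light amount Cin (taken as 0 here)
    for emission, alpha in zip(emissions, opacities):
        c = c * (1.0 - alpha) + emission * alpha
    return c

# A fully opaque voxel (alpha = 1) hides everything behind it:
print(render_ray([50.0, 200.0], [0.3, 1.0]))  # -> 200.0
# A fully transparent voxel (alpha = 0) contributes nothing:
print(render_ray([50.0, 200.0], [1.0, 0.0]))  # -> 50.0
```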
- An example is shown in FIG. 6.
- The example in FIG. 6(a) is an example in which the opacity α(x) is set to a large value where the voxel value V(x) is low; as a result, the shape of the subject surface on the incident-light side is strongly reflected in the projected image.
- The example in FIG. 6(b) is an example in which the opacity α(x) is set to a large value where the voxel value V(x) is high; as a result, the shape of the parts having a large V(x) is strongly reflected in the projected image.
- The example in FIG. 6(c) is an example in which the opacity α(x) is set in proportion to the voxel value V(x).
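The three opacity curves of FIG. 6 can be sketched as follows. The exact curve shapes and the threshold values are illustrative assumptions; voxel values are taken as normalized to [0, 1].

```python
def opacity_fig6a(v, thresh=0.3):
    """FIG. 6(a): voxels with LOW values are made opaque, so the subject
    surface on the incident-light side dominates the projected image.
    (Step curve and threshold are illustrative assumptions.)"""
    return 1.0 if v <= thresh else 0.0

def opacity_fig6b(v, thresh=0.7):
    """FIG. 6(b): voxels with HIGH values are made opaque, so the parts
    with large V(x) dominate the projected image."""
    return 1.0 if v >= thresh else 0.0

def opacity_fig6c(v):
    """FIG. 6(c): opacity simply proportional to the voxel value,
    clamped to the valid range "0" to "1"."""
    return max(0.0, min(1.0, v))

print(opacity_fig6a(0.1), opacity_fig6b(0.9), opacity_fig6c(0.5))  # -> 1.0 1.0 0.5
```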
- The tissue image rendering unit 40 corrects the attenuation degree of each voxel Vbw(x) of the tomographic volume data 50 based on the opacity belonging to the corresponding voxel Vc(x) of the Doppler image volume data 52 and the variably set correction coefficient R, and generates the tissue projection image 54 based on the tomographic volume data 50, the opacity, and the corrected attenuation degree.
- Let Cbw(x) be the self-emission amount of the voxel Vbw(x) in the tomographic volume data 50, α_bw(x) its opacity, and α_c(x) the opacity of the corresponding voxel Vc(x) in the Doppler image volume data 52.
- Then the output light amount Cout for the voxel Vbw(x) of the tomographic volume data 50 is expressed by the following equation (2):
- Cout = Cin × (1 − (α_bw(x) + α_c(x) × R)) + Cbw(x) × α_bw(x) … (2)
- That is, the original attenuation degree (1 − α_bw(x)) of the voxel Vbw(x) is corrected to the new attenuation degree (1 − (α_bw(x) + α_c(x) × R)).
- Similarly, the Doppler image rendering unit 44 corrects the attenuation degree of each voxel Vc(x) of the Doppler image volume data 52 based on the opacity belonging to the corresponding voxel Vbw(x) of the tomographic volume data 50 and the variably set correction coefficient S, and generates the Doppler projection image 56 based on the Doppler image volume data 52, the opacity, and the corrected attenuation degree.
- Let Cc(x) be the self-emission amount of the voxel Vc(x) in the Doppler image volume data 52 and α_c(x) its opacity. Then the output light amount Cout for the voxel Vc(x) of the Doppler image volume data 52 is expressed by the following equation (3):
- Cout = Cin × (1 − (α_c(x) + α_bw(x) × S)) + Cc(x) × α_c(x) … (3)
- That is, the original attenuation degree (1 − α_c(x)) of the voxel Vc(x) is corrected to the new attenuation degree (1 − (α_c(x) + α_bw(x) × S)).
- The output light amount Cout for the voxel Vbw(x) of the tomographic volume data 50 can also be obtained with a correction amount added, as shown in the following equation (4):
- Cout = Cin × (1 − (α_bw(x) + α_c(x) × R1)) + Cbw(x) × α_bw(x) + Cc(x) × α_c(x) × R2 … (4)
- Likewise, the output light amount Cout for the voxel Vc(x) of the Doppler image volume data 52 can be obtained as shown in the following equation (5):
- Cout = Cin × (1 − (α_c(x) + α_bw(x) × S1)) + Cc(x) × α_c(x) + Cbw(x) × α_bw(x) × S2 … (5)
- In equation (5), the last term (Cbw(x) × α_bw(x) × S2) is the correction amount; in equation (4), the correction amount is the corresponding last term (Cc(x) × α_c(x) × R2).
- the brightness of each point on the projection plane is determined, whereby the tissue projection image 54 is generated.
- a Doppler projection image 56 is generated based on the output light amount Cout of the Doppler image volume data 52.
- The tissue projection image 54 is generated by adding the information belonging to each voxel of the Doppler image volume data 52 to the tomographic volume data 50.
- Therefore, in the tissue projection image 54, the degree of overlap with the blood vessels is reflected in the brightness of each pixel. By referring to the shading of the tissue projection image 54, the degree of blood vessel overlap in the tissue can be easily grasped, and a three-dimensional, effective diagnosis can be performed.
- In particular, by obtaining the projection image with the correction amount added as shown in equation (4), the overlap of blood vessels in the tissue can be grasped more clearly than with a projection image using equation (2) without the correction amount.
- In other words, among the voxels V(x) of the tomographic volume data 50, the attenuation of the voxels V(x) where the Doppler images overlap is corrected. Therefore, when the tissue projection image 54 is generated based on the tomographic volume data 50 and the corrected attenuation, the generated tissue projection image 54 reflects the overlap with the blood flow image as a shadow for each pixel (FIG. 3(C)). As a result, by referring to the shades of shadow in the tissue projection image 54, the degree of blood vessel overlap in the tissue can be easily grasped.
- Similarly, the Doppler projection image 56 is generated by adding the information belonging to each voxel V(x) of the tomographic volume data 50 to the Doppler image volume data 52. Therefore, in the Doppler projection image 56, the overlap with the tomographic image is reflected in the brightness of each pixel, and by referring to the Doppler projection image 56, the tissue overlap around the blood vessel can be easily grasped. In particular, by obtaining the projection image with the correction amount added as shown in equation (5), the tissue overlap around the blood vessel can be grasped more clearly than with a projection image using equation (3) without the correction amount.
- In other words, among the voxels V(x) of the Doppler image volume data 52, the attenuation of the voxels V(x) where the tomographic images overlap is corrected. Therefore, when the Doppler projection image 56 is generated based on the Doppler image volume data 52 and the corrected attenuation, the generated Doppler projection image 56 reflects the overlap with the tomographic image as a shadow for each pixel. As a result, by referring to the shading of the Doppler projection image 56, the tissue overlap around the blood vessel can be easily grasped.
- Since the overlap is reflected as a shadow in both the tissue projection image 54 and the Doppler projection image 56, combining the tissue projection image 54 and the Doppler projection image 56 yields a composite image 58 in which the three-dimensional positional relationship between the blood vessel and the tissue around the blood vessel is accurately displayed on the display unit 32. Therefore, the three-dimensional positional relationship between the blood vessel and the surrounding tissue can be easily grasped by referring to the displayed composite image 58.
- Since the correction coefficients R, S can be varied through the console 38 as necessary, the shading appearing in the tissue projection image 54 or the Doppler projection image 56 can be adjusted.
- Since the composite image can thereby be adjusted according to, for example, the tissue characteristics of the imaged region, the visibility of the composite image indicating the three-dimensional positional relationship between the blood vessel and the surrounding tissue can be improved.
- FIG. 5 is a display example of a composite image 69 configured using a phantom.
- FIG. 5(A) shows a display example of a composite image 69 configured with both correction coefficients R, S (or R1, R2, S1, S2) set to "0".
- FIG. 5(B) shows a display example of a composite image 69 configured with the correction coefficient R (or R1, R2) set to "1" and S (or S1, S2) set to "0.1".
- FIG. 5(C) is a display example of a composite image 69 configured with both correction coefficients R, S (or R1, R2, S1, S2) set to "1"; the way the blood vessel gradually fades as it enters the back of the tissue (that is, how the two images overlap along the projection direction) can be grasped accurately.
- Generating a composite image with both correction coefficients R and S set to "0" is equivalent to the prior art, whereas a composite image generated with the correction coefficients R and S set to values other than "0" is a composite image according to the present invention.
- In the composite image 69 in FIG. 5(A), the image on the diagnostician's viewpoint side is displayed with priority, so a part of the Doppler projection image 70 is hidden by the tissue projection image 72. Therefore, it is difficult to accurately grasp the three-dimensional positional relationship between the blood vessel and the tissue around the blood vessel (for example, the penetration state of the object in the Doppler projection image 70 into the object in the tissue projection image 72).
- This is because the tissue projection image 72 and the Doppler projection image 70 generated with both correction coefficients R and S (or R1, R2, S1, S2) set to "0" do not take each other's volume data into account.
- A composite image 69 is generated from the tissue projection image 72 and the Doppler projection image 70 thus obtained, based on a predetermined composition ratio. By varying the composition ratio, either the tissue projection image 72 or the Doppler projection image 70 can be displayed with emphasis.
- However, because this changes the brightness of every pixel of the tissue projection image 72 or the Doppler projection image 70 uniformly, it remains difficult to grasp the three-dimensional positional relationship between the blood vessel and the surrounding tissue from a composite image 69 displayed in this way.
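The uniform composition-ratio blending described above can be sketched as follows. This is an illustrative reconstruction (the function name and the fixed-weight formula are our assumptions; the document does not give the compositing equation):

```python
import numpy as np

def blend_projections(tissue_img, doppler_img, ratio=0.5):
    """Blend two projection images with a single uniform composition ratio.

    Every pixel gets the same fixed weights, so varying `ratio`
    brightens one image and dims the other uniformly -- which is why
    depth relationships cannot be read from the result.
    """
    tissue = np.asarray(tissue_img, dtype=float)
    doppler = np.asarray(doppler_img, dtype=float)
    return ratio * tissue + (1.0 - ratio) * doppler
```

For instance, `blend_projections(t, d, ratio=0.8)` would emphasize the tissue projection image at the expense of the Doppler projection image.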
- In Figs. 5(B) and 5(C), by contrast, the Doppler projection image 70 and the tissue projection image 72 are displayed translucently, so
- the three-dimensional positional relationship between the Doppler projection image 70 and the tissue projection image 72 can be confirmed visually, accurately, and easily.
- FIG. 7 shows the flow of processing from the acquisition of the tomographic image volume data and the Doppler image volume data, through the generation of each projection image, to the generation of a composite image from those projection images.
- Each processing step in this flowchart is described below. Since the details of each step have already been given above, only an outline is provided.
- In step S701, an observation direction (line-of-sight direction) for generating the projection images is set.
- The plane perpendicular to the observation direction is the projection plane.
- In step S702, the first line parallel to the observation direction set in step S701 is selected.
- In step S703, the first voxel on the line selected in step S702 is selected in each of the two volume data sets.
- In step S704, the initial value of the input light amount Cin is set for each of the two volume data sets; it can be, for example, "0".
- In step S705, the voxel value Vbw(x) of the tomographic image volume data is used to obtain the self-emission amount Cbw(x) of this voxel. Likewise, the voxel value Vc(x) of the Doppler image volume data is used to obtain the self-emission amount Cc(x) of that voxel.
- In step S706, the opacity α_bw(x) and attenuation (1 − α_bw(x)) of the voxel are obtained from the voxel value Vbw(x). Similarly, the opacity α_c(x) and attenuation (1 − α_c(x)) are obtained from the voxel value Vc(x).
- In step S707, the attenuations of the voxels Vbw(x) and Vc(x) are corrected. For example,
- the attenuation (1 − α_bw(x)) is corrected using the opacity α_c(x) and the correction coefficient R to give the corrected attenuation (1 − (α_bw(x) + α_c(x) × R)).
- Similarly, the attenuation (1 − α_c(x)) is corrected using the opacity α_bw(x) and the correction coefficient S to give the corrected attenuation (1 − (α_c(x) + α_bw(x) × S)).
- In this way, Equations (4) and (5) determine the correction amount to be added when calculating the output light amount Cout.
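As a minimal sketch of the step-S707 correction, assuming the corrected attenuation takes the form 1 − (α_self + α_other × k) (our reading of the expressions above; the clamping behavior is also an assumption, since the document does not state how out-of-range values are handled):

```python
def corrected_attenuations(alpha_bw, alpha_c, R, S):
    """Cross-corrected attenuation terms for one voxel position x.

    The tissue-ray transmittance is additionally reduced by the Doppler
    opacity scaled by R, and vice versa with S; at R = S = 0 both reduce
    to the conventional (1 - alpha) transmittance.
    """
    att_bw = 1.0 - (alpha_bw + alpha_c * R)  # tissue-ray transmittance
    att_c = 1.0 - (alpha_c + alpha_bw * S)   # Doppler-ray transmittance
    # Clamp to [0, 1] so a strong correction cannot go negative (assumption).
    clamp = lambda v: max(0.0, min(1.0, v))
    return clamp(att_bw), clamp(att_c)
```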
- In step S708, the output light amount Cout for the voxel Vbw(x) is obtained using Equation 2 or Equation 4 described above.
- Similarly, the output light amount Cout for the voxel Vc(x) is obtained using Equation 3 or Equation 5.
- In step S709, the output light amount Cout obtained in step S708 is set as the input light amount Cin of the next voxel.
- In step S710, it is checked whether the current voxel is the last voxel on the line. If it is, the process proceeds to step S711; otherwise, it proceeds to step S713.
- In step S711, the output light amount Cout of the last voxel Vbw(x) is set as the pixel value of the tissue projection image for that line.
- Similarly, the output light amount Cout of the last voxel Vc(x) is set as the pixel value of the Doppler projection image for that line.
- In step S712, it is checked whether the current line is at the last position. If it is the last line, all pixel values of the tissue projection image and the Doppler projection image have been obtained, so the process proceeds to step S715. Otherwise, it proceeds to step S714.
- In step S713, the adjacent voxel on the line is selected in each of the tomographic image volume data and the Doppler image volume data, and the process proceeds to step S705.
- In step S714, the position of the line parallel to the observation direction is changed in the tomographic image volume data and the Doppler image volume data, and the process proceeds to step S703.
- In step S715, the tissue projection image and the Doppler projection image are combined based on the color mapping table to obtain a composite image.
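The loop of steps S701–S715 (minus the final color-mapped compositing) can be sketched as a ray march over two aligned volumes. This is an illustrative reconstruction only: the mappings from voxel value to self-emission and opacity, the marching direction, and the clamping are our simplifying assumptions, since Equations 2–5 are not reproduced in this section.

```python
import numpy as np

def render_projections(v_bw, v_c, R=1.0, S=1.0, axis=0):
    """Generate tissue and Doppler projection images with cross-corrected
    attenuation (steps S703-S713 for every line of the projection plane).

    Both volumes are assumed normalized to [0, 1], with self-emission
    C(x) = V(x) and opacity alpha(x) = V(x) -- assumptions, not the
    document's Equations 2-5.
    """
    v_bw = np.moveaxis(np.asarray(v_bw, dtype=float), axis, 0)
    v_c = np.moveaxis(np.asarray(v_c, dtype=float), axis, 0)
    c_bw = np.zeros(v_bw.shape[1:])  # S704: initial input light Cin = 0
    c_c = np.zeros(v_c.shape[1:])
    for a_bw, a_c in zip(v_bw, v_c):  # march voxel by voxel along each line
        # S707: cross-corrected attenuation of each ray by the other volume
        att_bw = np.clip(1.0 - (a_bw + a_c * R), 0.0, 1.0)
        att_c = np.clip(1.0 - (a_c + a_bw * S), 0.0, 1.0)
        # S708/S709: Cout = Cin * attenuation + self-emission * opacity
        c_bw = c_bw * att_bw + a_bw * a_bw
        c_c = c_c * att_c + a_c * a_c
    return c_bw, c_c  # S711: last Cout per line = projection pixel values
```

With R = S = 0 this reduces to two independent conventional volume renderings; with nonzero R and S each ray is also attenuated by the other volume's opacity, which is what produces the mutual translucency seen in Fig. 5(C).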
- Although the present invention has been described based on the embodiments, the present invention is not limited to the above embodiments.
- For example, instead of the ultrasonic probe 10 in which a plurality of transducers are arranged two-dimensionally, an ultrasonic probe with a position sensor may be used,
- as long as the position data of each scan plane can be acquired and applied to the corresponding tomographic image volume data or Doppler volume data.
- The drive signal for tissue imaging and the drive signal for blood flow imaging can be supplied from the transmission unit 12 to the ultrasonic probe 10 in a time-division manner in a predetermined order.
- In this way, the tomographic image volume data 50 and the Doppler image volume data 52 can be created almost simultaneously.
- However, the present invention is not limited to this time-division supply; any mode in which both tomographic image volume data and Doppler image volume data can be acquired may be used.
- As the drive signal for tissue imaging, it is preferable to use a signal corresponding to a single pulse wave in order to improve the image resolution of the tissue tomogram.
- As the drive signal for blood flow imaging, it is preferable to use a signal in which a plurality of (for example, eight) single pulse waves are concatenated, in order to facilitate detection of the Doppler shift.
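As a rough illustration of these two drive-signal types, the following sketch assumes a single-cycle sinusoid as the elementary pulse; the actual pulse shape, center frequency, and sampling rate are not specified in the text:

```python
import numpy as np

def single_pulse(freq_hz=5e6, fs_hz=100e6, cycles=1):
    """One short sinusoidal burst: a short pulse has wide bandwidth,
    which favors axial resolution in the tissue tomogram."""
    t = np.arange(0, cycles / freq_hz, 1.0 / fs_hz)
    return np.sin(2.0 * np.pi * freq_hz * t)

def blood_flow_drive(n_pulses=8, **pulse_kwargs):
    """Concatenate several identical pulses: the longer, narrower-band
    signal makes the Doppler shift easier to detect."""
    return np.tile(single_pulse(**pulse_kwargs), n_pulses)
```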
- A region of interest can be set via the console 38 so as to surround a display target (for example, cancer tissue or a fetus).
- Furthermore, the present invention is not limited to tomographic image volume data and Doppler image volume data; the projection image generation method of the present invention can be applied to any two different sets of volume data.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Acoustics & Sound (AREA)
- Surgery (AREA)
- Veterinary Medicine (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Radiology & Medical Imaging (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- General Physics & Mathematics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Pathology (AREA)
- Computer Networks & Wireless Communication (AREA)
- Theoretical Computer Science (AREA)
- Computer Graphics (AREA)
- Hematology (AREA)
- Ultra Sonic Daignosis Equipment (AREA)
Abstract
Description
Claims
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006535872A JP4847334B2 (ja) | 2004-09-13 | 2005-09-12 | 超音波撮像装置及び投影像生成方法 |
US11/575,166 US8160315B2 (en) | 2004-09-13 | 2005-09-12 | Ultrasonic imaging apparatus and projection image generating method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-265158 | 2004-09-13 | ||
JP2004265158 | 2004-09-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006030731A1 true WO2006030731A1 (ja) | 2006-03-23 |
Family
ID=36059984
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/016745 WO2006030731A1 (ja) | 2004-09-13 | 2005-09-12 | 超音波撮像装置及び投影像生成方法 |
Country Status (3)
Country | Link |
---|---|
US (1) | US8160315B2 (ja) |
JP (1) | JP4847334B2 (ja) |
WO (1) | WO2006030731A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009011711A (ja) * | 2007-07-09 | 2009-01-22 | Toshiba Corp | 超音波診断装置 |
JP2010505575A (ja) * | 2006-10-13 | 2010-02-25 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | グレイスケール反転を用いる3d超音波カラーフローイメージング |
WO2011099410A1 (ja) * | 2010-02-09 | 2011-08-18 | 株式会社 日立メディコ | 超音波診断装置及び超音波画像表示方法 |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007107925A2 (en) * | 2006-03-21 | 2007-09-27 | Koninklijke Philips Electronics, N.V. | Optimization of velocity scale for color tissue doppler imaging |
EP1959391A1 (de) * | 2007-02-13 | 2008-08-20 | BrainLAB AG | Bestimmung des dreidimensionalen Verlaufs des Randes einer anatomischen Struktur |
JP4636338B2 (ja) * | 2007-03-28 | 2011-02-23 | ソニー株式会社 | 表面抽出方法、表面抽出装置及びプログラム |
US20100130860A1 (en) * | 2008-11-21 | 2010-05-27 | Kabushiki Kaisha Toshiba | Medical image-processing device, medical image-processing method, medical image-processing system, and medical image-acquiring device |
JP5316118B2 (ja) | 2009-03-12 | 2013-10-16 | オムロン株式会社 | 3次元視覚センサ |
JP5245938B2 (ja) * | 2009-03-12 | 2013-07-24 | オムロン株式会社 | 3次元認識結果の表示方法および3次元視覚センサ |
JP5714232B2 (ja) * | 2009-03-12 | 2015-05-07 | オムロン株式会社 | キャリブレーション装置および3次元計測のためのパラメータの精度の確認支援方法 |
JP2010210585A (ja) * | 2009-03-12 | 2010-09-24 | Omron Corp | 3次元視覚センサにおけるモデル表示方法および3次元視覚センサ |
JP5282614B2 (ja) * | 2009-03-13 | 2013-09-04 | オムロン株式会社 | 視覚認識処理用のモデルデータの登録方法および視覚センサ |
CN103220980B (zh) * | 2010-10-28 | 2015-05-20 | 株式会社日立医疗器械 | 超声波诊断装置以及超声波图像显示方法 |
US9486291B2 (en) | 2012-06-21 | 2016-11-08 | Rivanna Medical Llc | Target region identification for imaging applications |
CN103006263B (zh) * | 2012-12-19 | 2014-09-10 | 华南理工大学 | 一种基于线性扫描的医学超声三维成像的位置标定方法 |
US11147536B2 (en) | 2013-02-28 | 2021-10-19 | Rivanna Medical Llc | Localization of imaging target regions and associated systems, devices and methods |
WO2014134188A1 (en) * | 2013-02-28 | 2014-09-04 | Rivanna Medical, LLC | Systems and methods for ultrasound imaging |
US11134921B2 (en) * | 2013-04-12 | 2021-10-05 | Hitachi, Ltd. | Ultrasonic diagnostic device and ultrasonic three-dimensional image generation method |
KR101851221B1 (ko) * | 2013-07-05 | 2018-04-25 | 삼성전자주식회사 | 초음파 영상 장치 및 그 제어 방법 |
US10548564B2 (en) | 2015-02-26 | 2020-02-04 | Rivanna Medical, LLC | System and method for ultrasound imaging of regions containing bone structure |
US10019784B2 (en) * | 2015-03-18 | 2018-07-10 | Toshiba Medical Systems Corporation | Medical image processing apparatus and method |
EP3588438A4 (en) * | 2017-03-07 | 2020-03-18 | Shanghai United Imaging Healthcare Co., Ltd. | METHOD AND SYSTEM FOR PRODUCING COLORED MEDICAL IMAGES |
US20220095891A1 (en) * | 2019-02-14 | 2022-03-31 | Dai Nippon Printing Co., Ltd. | Color correction device for medical apparatus |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07213522A (ja) * | 1994-01-26 | 1995-08-15 | Toshiba Corp | 超音波診断装置 |
JPH09262236A (ja) * | 1996-03-22 | 1997-10-07 | Advanced Technol Lab Inc | 超音波診断3次元画像処理方法及び装置 |
US5857973A (en) * | 1997-09-30 | 1999-01-12 | Siemens Medical Systems, Inc. | Fuzzy logic tissue flow determination system |
JP2001017428A (ja) * | 1999-07-06 | 2001-01-23 | Ge Yokogawa Medical Systems Ltd | オパシティ設定方法、3次元像形成方法および装置並びに超音波撮像装置 |
US6280387B1 (en) * | 1998-05-06 | 2001-08-28 | Siemens Medical Systems, Inc. | Three-dimensional tissue/flow ultrasound imaging system |
JP2005143733A (ja) * | 2003-11-13 | 2005-06-09 | Toshiba Corp | 超音波診断装置、3次元画像データ表示装置及び3次元画像データ表示方法 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5860924A (en) * | 1996-11-26 | 1999-01-19 | Advanced Technology Laboratories, Inc. | Three dimensional ultrasonic diagnostic image rendering from tissue and flow images |
JP2000201925A (ja) * | 1999-01-12 | 2000-07-25 | Toshiba Corp | 3次元超音波診断装置 |
JP4610011B2 (ja) * | 2003-07-22 | 2011-01-12 | 株式会社日立メディコ | 超音波診断装置及び超音波画像表示方法 |
-
2005
- 2005-09-12 JP JP2006535872A patent/JP4847334B2/ja active Active
- 2005-09-12 US US11/575,166 patent/US8160315B2/en active Active
- 2005-09-12 WO PCT/JP2005/016745 patent/WO2006030731A1/ja active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07213522A (ja) * | 1994-01-26 | 1995-08-15 | Toshiba Corp | 超音波診断装置 |
JPH09262236A (ja) * | 1996-03-22 | 1997-10-07 | Advanced Technol Lab Inc | 超音波診断3次元画像処理方法及び装置 |
US5857973A (en) * | 1997-09-30 | 1999-01-12 | Siemens Medical Systems, Inc. | Fuzzy logic tissue flow determination system |
US6280387B1 (en) * | 1998-05-06 | 2001-08-28 | Siemens Medical Systems, Inc. | Three-dimensional tissue/flow ultrasound imaging system |
JP2001017428A (ja) * | 1999-07-06 | 2001-01-23 | Ge Yokogawa Medical Systems Ltd | オパシティ設定方法、3次元像形成方法および装置並びに超音波撮像装置 |
JP2005143733A (ja) * | 2003-11-13 | 2005-06-09 | Toshiba Corp | 超音波診断装置、3次元画像データ表示装置及び3次元画像データ表示方法 |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010505575A (ja) * | 2006-10-13 | 2010-02-25 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | グレイスケール反転を用いる3d超音波カラーフローイメージング |
JP2009011711A (ja) * | 2007-07-09 | 2009-01-22 | Toshiba Corp | 超音波診断装置 |
WO2011099410A1 (ja) * | 2010-02-09 | 2011-08-18 | 株式会社 日立メディコ | 超音波診断装置及び超音波画像表示方法 |
US8988462B2 (en) | 2010-02-09 | 2015-03-24 | Hitachi Medical Corporation | Ultrasonic diagnostic apparatus and ultrasonic image display method |
JP5774498B2 (ja) * | 2010-02-09 | 2015-09-09 | 株式会社日立メディコ | 超音波診断装置 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2006030731A1 (ja) | 2008-05-15 |
JP4847334B2 (ja) | 2011-12-28 |
US8160315B2 (en) | 2012-04-17 |
US20080260227A1 (en) | 2008-10-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4847334B2 (ja) | 超音波撮像装置及び投影像生成方法 | |
US9943288B2 (en) | Method and system for ultrasound data processing | |
JP3187148B2 (ja) | 超音波診断装置 | |
JP5495357B2 (ja) | 画像表示方法及び医用画像診断システム | |
US6951543B2 (en) | Automatic setup system and method for ultrasound imaging systems | |
JP5236655B2 (ja) | グレイスケール反転を用いる3d超音波カラーフローイメージング | |
US6048312A (en) | Method and apparatus for three-dimensional ultrasound imaging of biopsy needle | |
US5911691A (en) | Ultrasound image processing apparatus and method of forming and displaying ultrasound images by the apparatus | |
JP5848709B2 (ja) | 超音波診断装置及び超音波画像表示方法 | |
JP5774498B2 (ja) | 超音波診断装置 | |
JP2013536720A (ja) | 2次元超音波画像の3次元表示 | |
CN101791229A (zh) | 超声波诊断装置、图像处理装置及方法、图像显示方法 | |
US20180206825A1 (en) | Method and system for ultrasound data processing | |
JP2005095278A (ja) | 超音波診断装置 | |
JP3936450B2 (ja) | 投影画像生成装置及び医用画像装置 | |
JP2001128982A (ja) | 超音波画像診断装置および画像処理装置 | |
US7346228B2 (en) | Simultaneous generation of spatially compounded and non-compounded images | |
JP4297561B2 (ja) | オパシティ設定方法、3次元像形成方法および装置並びに超音波撮像装置 | |
JP6169911B2 (ja) | 超音波画像撮像装置及び超音波画像表示方法 | |
JP7078457B2 (ja) | 血流画像処理装置及び方法 | |
JP6879039B2 (ja) | 超音波診断装置、合成画像の表示方法及びプログラム | |
JPH0938084A (ja) | 超音波3次元画像形成方法および装置 | |
JP5182932B2 (ja) | 超音波ボリュームデータ処理装置 | |
EP0809119A2 (en) | Ultrasound image processing apparatus and method of forming and displaying ultra sound images by the apparatus | |
JP2005006718A (ja) | 超音波診断装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2006535872 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase | ||
WWE | Wipo information: entry into national phase |
Ref document number: 11575166 Country of ref document: US |