WO2012039192A1 - Ultrasonic diagnostic apparatus and ultrasonic image display method
- Publication number: WO2012039192A1 (international application PCT/JP2011/067190)
- Authority: WIPO (PCT)
- Prior art keywords: image, elastic, dimensional, elasticity, value
Classifications
- A—HUMAN NECESSITIES; A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B—DIAGNOSIS; SURGERY; IDENTIFICATION; A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/14—Echo-tomography
- A61B8/4461—Features of the scanning mechanism, e.g. for moving the transducer within the housing of the probe
- A61B8/461—Displaying means of special interest
- A61B8/463—Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
- A61B8/466—Displaying means of special interest adapted to display 3D data
- A61B8/467—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means
- A61B8/469—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means for selection of a region of interest
- A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
- A61B8/485—Diagnostic techniques involving measuring strain or elastic properties
- A61B8/5207—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of raw data to produce diagnostic data, e.g. for generating an image
- A61B8/5223—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data
- A61B8/54—Control of the diagnostic device
- G—PHYSICS; G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS; G16H—HEALTHCARE INFORMATICS; G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for calculating health indices or for individual health risk assessment
Description
- The present invention relates to an ultrasonic diagnostic apparatus that displays an ultrasonic image of a diagnostic site in a subject using ultrasound, and in particular to an ultrasonic diagnostic apparatus capable of displaying an elasticity image showing the hardness of living tissue of the subject as a three-dimensional image.
- an ultrasonic diagnostic apparatus transmits ultrasonic waves into a subject, receives ultrasonic reflected echo signals of biological tissue from the reflected waves, and performs signal processing.
- a black and white tomographic image (B-mode image) of the diagnostic region with the ultrasonic reflectance as luminance is generated and displayed.
- Patent Document 2 discloses a method of obtaining a black-and-white three-dimensional image of the diagnostic site viewed from an arbitrary line-of-sight direction by performing three-dimensional coordinate conversion from black-and-white tomographic images and their acquisition positions to obtain three-dimensional volume data in which a plurality of tomographic image data are arranged three-dimensionally, and volume-rendering this data.
- In the rendering, an opacity is assigned to each voxel according to the luminance value of the voxels that make up the three-dimensional volume data, and the luminance values of the voxels arranged along the line of sight are accumulated sequentially until the cumulative opacity of those voxels reaches 1; the accumulated value is used as the pixel value on the two-dimensional projection plane.
- Patent Documents 1 and 2 also disclose a method for obtaining an elasticity image of the diagnostic site.
- Two frames of black-and-white tomographic images are selected, the displacement of each point on the image between the two frames is obtained by block matching or the like, and a known calculation is applied to the obtained displacement to derive an elasticity value (strain, elastic modulus, etc.) representing the hardness of each point on the image.
- A two-dimensional elasticity image representing the elasticity values in hue is thereby obtained.
- Patent Document 1 discloses a technique for solving the problem that, in a three-dimensional grayscale image obtained from black-and-white tomographic images, a lesion on the inside cannot be imaged because it is hidden by outer voxels with high opacity: volume rendering is performed by giving each voxel of the three-dimensional volume data an opacity according to the magnitude of its elasticity value. As a result, a three-dimensional grayscale image is obtained in which hard tissues are made more opaque and soft tissues more transparent.
- In Patent Document 2, a plurality of two-dimensional elasticity images are coordinate-transformed to obtain three-dimensional volume data, which is volume-rendered to generate a three-dimensional elasticity image.
- Since the normal rendering method of simply accumulating the voxel values accumulates the elasticity values and yields a three-dimensional elasticity image in which the elastic characteristics of the living tissue are distorted, the voxel with the highest contribution on the line of sight is found and its elasticity value is used as the elasticity value on the two-dimensional projection plane.
- As for display methods, Patent Document 1 superimposes a (two-dimensional) black-and-white tomographic image and a two-dimensional elasticity image. Patent Document 2 discloses generating an image in which a black-and-white three-dimensional image and a three-dimensional elasticity image are superimposed, and displaying it side by side on a single screen together with black-and-white tomographic images in three directions.
- Patent Document 3 discloses displaying a black-and-white tomographic image and a two-dimensional elasticity image side by side on one screen, and also superimposing them.
- In Patent Document 1, an opacity is given to each voxel of the three-dimensional volume data obtained from the black-and-white tomographic images according to the magnitude of the elasticity value, and volume rendering is performed to generate a three-dimensional image.
- An object of the present invention is to provide an ultrasonic diagnostic apparatus capable of generating a three-dimensional image of a subject for a tissue region having an elastic value range desired by an operator and easily grasping the region.
- In order to achieve this object, the ultrasonic diagnostic apparatus of the present invention includes: a tomographic image construction unit that generates a tomographic image of a subject using signals received by transmitting ultrasonic waves into the subject; a two-dimensional elasticity image construction unit that processes the signals to generate a two-dimensional elasticity image of elasticity values representing elasticity; a rendering unit that generates volume data composed of a plurality of the two-dimensional elasticity images and, by selecting and rendering the elasticity value data of the volume data included in a desired elasticity value range, generates a three-dimensional elasticity image of the elasticity value data in that range; and a display unit that displays the three-dimensional elasticity image together with at least one of a two-dimensional elasticity image and a tomographic image showing the region corresponding to the elasticity value range.
- the display unit may be configured to display a mask on a region outside the elastic value range of the two-dimensional elastic image. For example, a composite image in which the addition ratio of the two-dimensional elastic image and the tomographic image is different between the region within the elastic value range and the region covered with the mask is displayed. In this case, it is also possible to display a composite image in which the addition ratio of the two-dimensional elastic image of the masked region is zero.
- the display unit can also display an image in which a region within the elastic value range of a two-dimensional elastic image or a region covered with a mask is filled with a single hue.
- the single hue is set according to the elasticity value of the area to be filled with the two-dimensional elastic image.
- the display unit may be configured to display a line indicating the contour of the elastic value range on the two-dimensional elastic image.
- a configuration further including an operation unit for the operator to set the elastic value range.
- The display unit can also be configured to display the region corresponding to the elasticity value range on at least one of the two-dimensional elasticity image and the tomographic image only for a predetermined time.
- The display unit can also be configured to add, to the color map indicating the relationship between elasticity value and hue, a display indicating the elasticity value range. Alternatively, the elasticity value range can be indicated by giving the numerical value showing the elasticity value range a hue corresponding to that range.
- According to the present invention, the following ultrasonic image display method is also provided. A tomographic image of a subject is generated using signals received by transmitting ultrasonic waves into the subject; the signals are processed to generate a two-dimensional elasticity image representing elasticity; volume data is generated from a plurality of the two-dimensional elasticity images; by selecting and rendering the elasticity value data of the volume data included in a desired elasticity value range, a three-dimensional elasticity image of the elasticity value data in that range is generated; and the three-dimensional elasticity image is displayed together with at least one of a two-dimensional elasticity image and a tomographic image showing the region corresponding to the elasticity value range.
- the region of the subject tissue corresponding to the range of the elasticity value displayed as the three-dimensional elasticity image can be visually displayed on at least one of the tomographic image and the two-dimensional elasticity image. Therefore, the operator can easily grasp the area. Therefore, it is possible to easily obtain a three-dimensional elastic image of a desired region by setting the elastic value range.
- FIG. 1 is a block diagram showing the configuration of an ultrasonic diagnostic apparatus according to an embodiment.
- FIG. 2(a) is an explanatory diagram showing an example of a screen on which a black-and-white tomographic image 111, a color two-dimensional elasticity image 112 and a three-dimensional elasticity image are displayed, FIG. 2(b) is an enlarged view of the color map 104 of that screen example, and FIG. 2(c) is an enlarged view of the display area of the elasticity value range of FIG. 2(a).
- FIG. 3(a) is a perspective view showing the external appearance of the image display unit 13 and the operation panel 120 of the ultrasonic diagnostic apparatus of FIG. 1, FIG. 3(b) is an explanatory diagram illustrating an example of the screen when the elasticity value α is set to 72 with the toggle switch 121, and FIG. 3(c) is an explanatory diagram illustrating an example of the screen when the elasticity value α is set to 50 with the toggle switch 121, in Embodiment 1.
- FIG. 5 is a flowchart showing the operation of the mode of the first embodiment.
- FIG. 7 is an explanatory diagram showing an example of a screen generated in Embodiment 2.
- FIG. 10 is an explanatory diagram showing an example of a screen generated in the fifth embodiment.
- FIG. 11 is an explanatory diagram showing an example of a screen generated in Embodiment 6.
- In the following, a B-mode image showing the distribution of ultrasonic reflectance of the tissue in a predetermined cross section of the subject is called a black-and-white tomographic image, and a two-dimensional projection image obtained by rendering volume data composed of black-and-white tomographic data is called a three-dimensional image.
- As shown in FIG. 1, the ultrasonic diagnostic apparatus includes a probe 2 used in contact with the subject 1, a transmission unit 3 that repeatedly transmits ultrasonic waves at predetermined time intervals through the probe 2 to a diagnostic site in the subject 1, a reception unit 4 that receives reflected echo signals from the subject 1 in time series, an ultrasonic transmission/reception control unit 5, and a phasing and adding unit 6 that phases and adds the received reflected echoes.
- the probe 2 includes a plurality of transducers arranged in rows or sectors, and transmits and receives ultrasonic waves from the transducers to the subject 1.
- the probe 2 has a function of mechanically scanning a plurality of transducers in a direction (short axis direction) orthogonal to the arrangement direction, and can transmit and receive ultrasonic waves in three dimensions. Note that, as the probe 2, a configuration in which a plurality of transducers are two-dimensionally arranged may be used to transmit and receive ultrasonic waves three-dimensionally without mechanically shaking the transducers.
- the transmission unit 3 generates a transmission pulse for driving the probe 2 to generate ultrasonic waves.
- the phase of the transmission signal delivered to each transducer of the probe 2 is controlled, and the convergence point of the transmitted ultrasonic wave is set to a certain depth.
- the receiving unit 4 also amplifies the reflected echo signal received by each transducer of the probe 2 with a predetermined gain to generate an RF signal, that is, a received signal.
- the ultrasonic transmission / reception control unit 5 controls the transmission unit 3 and the reception unit 4.
- The phasing and adding unit 6 aligns the phases of the RF signals and adds them to form an ultrasonic beam focused at one or more convergence points, and generates RF signal frame data (corresponding to RAW data).
- the ultrasonic diagnostic apparatus also includes an interface unit 43 that accepts settings from an operator, an image display unit 13, a switching addition unit 12 that switches the type of image displayed on the image display unit 13, and an image system control unit 44. And a short-axis scanning position control unit 46.
- the short-axis scanning position control unit 46 performs a three-dimensional transmission / reception within a predetermined range by controlling an operation in which the probe 2 mechanically scans a plurality of transducers in a direction orthogonal to the arrangement direction.
- As a configuration for generating, from the RF signal frame data, a black-and-white tomographic image and a three-dimensional image of the diagnostic region with the ultrasonic reflectance as luminance, the ultrasonic diagnostic apparatus includes a tomographic image construction unit 7, a tomographic image storage unit 35, a tomographic volume data creation unit 36, a volume rendering unit 38, and a multi-frame construction unit (tomographic image) 46.
- As a configuration for generating two-dimensional and three-dimensional elasticity images, it further includes an RF signal frame data storage unit 27, an RF signal frame data selection unit 28, a displacement measurement unit 30, an elasticity information calculation unit 32, a two-dimensional elasticity image construction unit 34, a two-dimensional elasticity image storage unit 39, an elasticity image volume data creation unit 40, a volume rendering unit 42, a multi-frame construction unit (elasticity image) 48, and a rendering range setting unit 51.
- The tomographic image construction unit 7 performs gain correction, log compression, detection, contour enhancement, filter processing and the like on the RF signal frame data generated by the phasing and adding unit 6, and generates a black-and-white tomographic image (B-mode image) of the diagnostic site in which the reflectance is expressed as luminance (brightness).
- the image system control unit 44 receives a monochrome tomographic image generation condition from the operator via the interface unit 43 and controls the tomographic image forming unit 7.
- the tomographic image storage unit 35 stores the black and white tomographic image formed by the tomographic image forming unit 7 in association with the acquisition position.
- the acquisition position here is the amount of movement in the short axis direction under the control of the short axis scanning position control unit 46.
- The volume data creation unit 36 creates three-dimensional volume data by performing a coordinate transformation that rearranges the plurality of black-and-white tomographic images (one volume's worth) stored in the tomographic image storage unit 35 according to the amount of movement in the short-axis direction.
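- The rearrangement step can be pictured with the following minimal Python/NumPy sketch, which simply stacks already scan-converted frames into a volume at integer slice indices derived from the short-axis movement; the even spacing and the function and variable names are illustrative assumptions, not the actual coordinate transformation of this apparatus.

```python
import numpy as np

def frames_to_volume(frames, positions):
    """Place each black-and-white tomographic frame into a 3-D volume at the slice
    index given by its short-axis acquisition position (assumed integer indices)."""
    frames = np.asarray(frames, dtype=float)            # (n_frames, depth, width)
    n_slices = int(max(positions)) + 1
    volume = np.zeros((n_slices,) + frames.shape[1:], dtype=float)
    for frame, pos in zip(frames, positions):
        volume[int(pos)] = frame                         # one slice per acquisition position
    return volume
```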
- The volume rendering unit 38 volume-renders the three-dimensional volume data created by the volume data creation unit 36 based on luminance and opacity using the following Equations (1) to (3), thereby constructing a three-dimensional image of the diagnostic site of the subject (a two-dimensional projection image of the three-dimensional volume data).
- the projection direction (gaze direction) is received from the operator by the image system control unit 44 via the interface 43.
- Cout(i) = Cout(i-1) + (1 - Aout(i-1))・A(i)・C(i)・S(i) ... Equation (1)
- Aout(i) = Aout(i-1) + (1 - Aout(i-1))・A(i) ... Equation (2)
- A(i) = Bopacity[C(i)] ... Equation (3)
- In Equation (1), Cout(i) is the value output as a pixel value on the two-dimensional projection plane, and C(i) is the luminance value of the i-th voxel (i = 0 to N-1) on the line of sight when the three-dimensional image is viewed from a point on the two-dimensional projection plane.
- A voxel here refers to the position of an individual luminance datum constituting the three-dimensional volume data.
- When N voxels are lined up on the line of sight, the luminance value Cout(N-1), obtained by accumulating the luminance values of voxels i = 0 to N-1 according to Equation (1), becomes the finally output pixel value; Cout(i-1) denotes the accumulated value up to the (i-1)-th voxel.
- a (i) in equation (1) is the opacity of the i-th voxel on the line of sight, and is a value between 0 and 1.0.
- The opacity A(i) is determined, as shown in Equation (3), by substituting the luminance value C(i) into a predefined table or function Bopacity[C(i)] that defines the relationship between luminance value and opacity (Opacity).
- That is, the opacity is determined according to the magnitude of the luminance value; for example, a large opacity is given to a voxel having a large luminance value.
- The opacity thereby determines the contribution of the luminance value C(i) of each voxel to the luminance value Cout(N-1) finally output on the two-dimensional projection plane.
- Aout (i) in Expression (2) is a value obtained by integrating the opacity A (i) given by Expression (3) according to the right side of Expression (2) up to the i-th voxel.
- the integrated value Aout (i ⁇ 1) of opacity up to the (i ⁇ 1) -th voxel calculated as in equation (2) is used.
- Aout (i) is integrated and converges to 1.0 each time it passes through the voxel.
- S (i) in equation (1) is a weighting component for shading, and is calculated from the gradient of the luminance value obtained from the luminance value C (i) and the surrounding luminance values. For example, when the normal of the surface (luminance value gradient) centered on the i-th voxel coincides with the optical axis of a predetermined light source, the light is reflected most strongly. Based on this, 1.0 is assigned to voxel i as S (i), and 0.0 is assigned to S (i) when the light source and the normal line are orthogonal. As a result, the obtained two-dimensional projection image is shaded to give an enhancement effect.
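- As a concrete illustration of Equations (1) to (3), the following is a minimal Python/NumPy sketch of the front-to-back accumulation along a single line of sight; the linear opacity table, the optional shading weights and the early termination threshold are simplified assumptions rather than the exact implementation of this apparatus.

```python
import numpy as np

def render_ray(C, opacity_table, S=None, stop=0.999):
    """Accumulate luminance along one line of sight per Equations (1)-(3).

    C             : 1-D array of voxel luminance values (0-255) along the ray
    opacity_table : 256-entry lookup giving opacity A for each luminance value
    S             : optional shading weights per voxel (defaults to 1.0)
    """
    if S is None:
        S = np.ones_like(C, dtype=float)
    c_out, a_out = 0.0, 0.0                              # Cout(-1), Aout(-1)
    for i in range(len(C)):
        a_i = opacity_table[int(C[i])]                   # Equation (3): A(i) = Bopacity[C(i)]
        c_out += (1.0 - a_out) * a_i * C[i] * S[i]       # Equation (1)
        a_out += (1.0 - a_out) * a_i                     # Equation (2)
        if a_out >= stop:                                # cumulative opacity has converged to ~1
            break
    return c_out

# Example: higher luminance -> higher opacity, as described in the text.
table = np.linspace(0.0, 1.0, 256)
ray = np.array([10, 40, 200, 220, 30])
print(render_ray(ray, table))
```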
- the multi-frame construction unit (tomographic image) 46 generates a black and white tomographic image of an arbitrary cross section from the three-dimensional volume data created by the volume data creation unit 36.
- the position of the arbitrary cross section is received from the operator by the interface unit 43, and is set in the multi-frame configuration unit (tomographic image) 46 via the image system control unit 44.
- a plurality of arbitrary slice positions can be set, and the multi-frame configuration unit 46 generates a monochrome tomographic image for each of the plurality of arbitrary slice positions.
- These configurations form a three-dimensional image of the diagnostic region of the subject 1 and a black and white tomographic image of an arbitrary cross section.
- the RF signal frame data storage unit 27 sequentially stores the RF signal frame data generated by the phasing addition unit 6.
- the RF signal frame data storage unit 27 sequentially stores the RF signal data generated from the phasing addition unit 6 based on the time series, that is, the frame rate of the image, in the frame memory.
- the RF signal frame data selection unit 28 selects one set, that is, two RF signal frame data from a plurality of RF signal frame data stored in the RF signal frame data storage unit 27.
- In response to a command from the image system control unit 44, the RF signal frame data selection unit 28 selects the most recently stored RF signal frame data (N) as the first data, and selects one RF signal frame data (X) from the group of RF signal frame data stored earlier in time (N-1, N-2, N-3, ..., N-M).
- N, M, and X are index numbers assigned to the RF signal frame data, and are natural numbers.
- the displacement measuring unit 30 obtains the displacement of the living tissue from one set of RF signal frame data.
- the displacement measuring unit 30 has a one-dimensional or two-dimensional correlation with one set of data selected by the RF signal frame data selecting unit 28, that is, the RF signal frame data (N) and the RF signal frame data (X). Processing is performed to obtain a one-dimensional or two-dimensional displacement distribution related to the displacement or movement vector (direction and size of displacement) in the living tissue corresponding to each point of the tomographic image (two-dimensional reflectance image).
- a block matching method is used to detect the movement vector.
- In the block matching method, the tomographic image is divided into blocks of, for example, N×N pixels, attention is paid to a block in the region of interest, the block that most closely approximates the reflectance distribution of the block of interest is searched for in the previous frame, and the displacement (movement vector) is determined with reference to this block by predictive coding, that is, from the difference.
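- The block matching step can be pictured with the sketch below, which, for each block of the current frame, searches a small window in the previous frame for the best match and returns the offset as the movement vector; the block size, search range and the sum-of-absolute-differences criterion are illustrative assumptions.

```python
import numpy as np

def block_matching(curr, prev, block=8, search=4):
    """Estimate a displacement vector for each block of `curr` relative to `prev`."""
    H, W = curr.shape
    vectors = np.zeros((H // block, W // block, 2), dtype=int)
    for by in range(H // block):
        for bx in range(W // block):
            y0, x0 = by * block, bx * block
            ref = curr[y0:y0 + block, x0:x0 + block]
            best, best_dxy = np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y1, x1 = y0 + dy, x0 + dx
                    if y1 < 0 or x1 < 0 or y1 + block > H or x1 + block > W:
                        continue
                    cand = prev[y1:y1 + block, x1:x1 + block]
                    sad = np.abs(ref.astype(float) - cand).sum()   # matching criterion
                    if sad < best:
                        best, best_dxy = sad, (dy, dx)
            vectors[by, bx] = best_dxy                   # movement vector of this block
    return vectors
```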
- the elasticity information calculation unit 32 calculates the elasticity value by performing a predetermined calculation based on the displacement and the movement vector obtained by the displacement measurement unit 30, and outputs it as time-series elasticity frame data.
- the elasticity value here may be a value representing the elasticity of the tissue of the subject 1, and examples thereof include strain, elastic modulus, displacement, viscosity, strain ratio, and the like. When strain is used as the elastic value, it can be calculated by spatially differentiating the amount of movement of the living tissue, for example, displacement.
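- When strain is used as the elasticity value, it is obtained by spatially differentiating the displacement, as stated above; a minimal sketch, with the sample pitch as an assumed parameter:

```python
import numpy as np

def strain_from_displacement(displacement, pitch_mm=0.1):
    """Strain = spatial derivative of axial displacement (axis 0 = depth).

    `displacement` is a 2-D map of axial displacement per pixel; `pitch_mm`
    is an assumed sample spacing in the depth direction.
    """
    return np.gradient(displacement, pitch_mm, axis=0)
```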
- The two-dimensional elasticity image construction unit 34 includes a frame memory and an image processing unit; it stores the elasticity frame data output in time series from the elasticity information calculation unit 32 in the frame memory, processes the stored frame data in the image processing unit, and thereby generates a two-dimensional elasticity image showing the two-dimensional distribution of elasticity values in the diagnostic region of the subject.
- The two-dimensional elasticity image is a color image obtained by converting the elasticity values into hue information based on a predetermined color conversion table. For example, as the elasticity value changes from a predetermined small value to a large value, a hue is assigned that changes stepwise from blue (B) through green (G) to red (R) over 255 gradations (1 to 255).
- The elasticity value of the hardest part is 1, and the elasticity value of the softest part is 255.
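- The blue-to-green-to-red assignment over 255 gradations described here can be sketched as a simple lookup table; the exact shape of the color conversion table is an assumption.

```python
import numpy as np

def build_color_map(n=255):
    """Hue table: index 1 (hardest) -> blue, middle -> green, index 255 (softest) -> red."""
    t = np.linspace(0.0, 1.0, n)                 # 0 = hardest, 1 = softest
    r = np.clip(2.0 * t - 1.0, 0.0, 1.0)         # ramps up in the soft half
    b = np.clip(1.0 - 2.0 * t, 0.0, 1.0)         # ramps down in the hard half
    g = 1.0 - r - b                              # peaks in the middle
    return (np.stack([r, g, b], axis=1) * 255).astype(np.uint8)

def elasticity_to_rgb(elasticity_image, cmap):
    """Map elasticity values 1..255 to RGB via the color conversion table."""
    idx = np.clip(elasticity_image.astype(int), 1, len(cmap)) - 1
    return cmap[idx]
```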
- the two-dimensional elastic image storage unit 39 stores the two-dimensional elastic image generated by the elastic image constructing unit 34 in association with the acquisition position.
- the volume data creation unit 40 performs a three-dimensional coordinate transformation based on the two-dimensional elastic image stored in the two-dimensional elastic image storage unit 39 and its acquisition position, thereby obtaining a plurality of spatially continuous two-dimensional elastic images. Generate 3D volume data arranged in 3D.
- The volume rendering unit 42 renders only those voxels of the three-dimensional volume data whose elasticity values lie within the elasticity value range desired by the operator. As a result, a three-dimensional elasticity image of the tissue in the elasticity value range set by the operator is obtained.
- the setting of the elastic value range is received by the rendering range setting unit 51 from the operator via the interface unit 43.
- The elasticity value range is specified within the 255 gradations from 1 to 255. For example, the range from one value α set by the operator down to 1, the hardest elasticity value (1 to α), may be used as the elasticity value range for rendering; the range from one value α set by the operator up to 255, the softest elasticity value (α to 255), may be used; or the range between two elasticity values set by the operator may be used as the rendering range.
- In the following description, an example is described in which rendering is performed over the range from one value set by the operator down to 1, the value representing the hardest elasticity.
- FIG. 2 (c) shows an enlarged view of the state where the elastic value ⁇ set by the operator is displayed in the area 107.
- The volume rendering unit 42 selects, from the elasticity volume data, the voxels whose elasticity value data fall within the range set in the rendering range setting unit 51, and performs volume rendering only on these voxels to construct a three-dimensional elasticity image. As a result, a three-dimensional elasticity image of the voxels in the elasticity value range desired by the operator is obtained, so that the three-dimensional shape of a hard or soft tissue lying inside (behind) voxels of high opacity can be grasped.
- As the method of selecting the voxels in the elasticity value range, for example, a method of replacing the elasticity value data outside the range set in the rendering range setting unit 51 with zero, or a method of applying a mask to the voxels whose elasticity value data are outside the set range, is used.
- volume rendering is performed based on the elasticity value and opacity using the following equations (4) to (6).
- the projection direction (gaze direction) is received from the operator by the image system control unit 44 via the interface 43.
- Eout(i) = Eout(i-1) + (1 - Aout(i-1))・A(i)・E(i)・S(i) ... Equation (4)
- Aout(i) = Aout(i-1) + (1 - Aout(i-1))・A(i) ... Equation (5)
- A(i) = Eopacity[E(i)] ... Equation (6)
- In Equation (4), Eout(i) is the value output as a pixel value on the projection plane, and E(i) is the elasticity value of the i-th voxel (i = 0 to N-1) on the line of sight when the three-dimensional elasticity image is viewed from a point on the two-dimensional projection plane.
- When the method of replacing out-of-range data with zero is used to select the voxels in the elasticity value range, the out-of-range elasticity value data are replaced with E(i) = 0 so that elasticity value data outside the predetermined elasticity value range are not rendered.
- When the elasticity values of N voxels are lined up on the line of sight, the value Eout(N-1), obtained by accumulating the elasticity values of voxels i = 0 to N-1 according to Equation (4), becomes the finally output pixel value; Eout(i-1) denotes the accumulated value up to the (i-1)-th voxel.
- a (i) in equation (4) is the opacity of the i-th voxel on the line of sight, and is a value between 0 and 1.0.
- The opacity A(i) is determined, as shown in Equation (6), by substituting the elasticity value E(i) into a predefined table or function Eopacity[E(i)] that defines the relationship between elasticity value and opacity (Opacity); that is, it is determined according to the magnitude of the elasticity value.
- Aout (i) in Expression (5) is a value obtained by integrating the opacity A (i) given by Expression (6) according to the right side of Expression (5) up to the i-th voxel.
- the integrated value Aout (i ⁇ 1) of opacity up to the (i ⁇ 1) th voxel calculated as in equation (5) is used.
- Aout (i) is integrated and converges to 1.0 each time it passes through the voxel.
- S (i) in equation (4) is a weighting component for shading, and is calculated from the gradient of the elastic value obtained from the elastic value E (i) and the surrounding elastic values. For example, if the normal of the surface (elasticity gradient) centered on the i-th voxel coincides with the optical axis of the predetermined light source, the light is reflected most strongly. Based on this, 1.0 is assigned to voxel i as S (i), and 0.0 is assigned to S (i) when the light source and the normal line are orthogonal. As a result, the obtained two-dimensional projection image is shaded to give an enhancement effect.
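- A minimal sketch of Equations (4) to (6) combined with the zero-replacement method for restricting rendering to the set elasticity value range; the opacity lookup and the range bounds 1 to 72 are illustrative assumptions.

```python
import numpy as np

def render_elasticity_ray(E, opacity_table, lo=1, hi=72, S=None, stop=0.999):
    """Front-to-back accumulation of elasticity values, rendering only voxels whose
    elasticity value lies in [lo, hi]; out-of-range data are replaced with zero."""
    E = np.where((E >= lo) & (E <= hi), E, 0)            # zero-replacement outside the range
    if S is None:
        S = np.ones_like(E, dtype=float)
    e_out, a_out = 0.0, 0.0
    for i in range(len(E)):
        a_i = opacity_table[int(E[i])] if E[i] > 0 else 0.0   # Equation (6), zeroed voxels skipped
        e_out += (1.0 - a_out) * a_i * E[i] * S[i]            # Equation (4)
        a_out += (1.0 - a_out) * a_i                          # Equation (5)
        if a_out >= stop:
            break
    return e_out
```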
- The three-dimensional elasticity image is colored with reference to the color map 105 for the three-dimensional elasticity image shown in FIG. 2(a).
- The color map 105 for the three-dimensional elasticity image gives a single hue corresponding to the set elasticity value range, and varies the luminance (brightness) according to the magnitude of the value Eout(i).
- From Equation (4), Eout(i) becomes smaller the more the shading weight S(i) reduces the contribution (that is, the smaller S(i) is), so such pixels are set closer to black (lower brightness).
- The hue corresponding to the set elasticity value range is, for example, blue if the average value of the set elasticity value range is smaller than the middle elasticity value 127, and red if it is larger. Alternatively, for example, blue may be used when a hard range is selected with the soft/hard selection switch 122 described later and red when a soft range is selected, or the hue corresponding to the average value of the set elasticity value range may be used.
- the multi-frame component (elastic image) 48 generates a two-dimensional elastic image in an arbitrary cross section from the elastic volume data.
- the designation of an arbitrary cross section is received by the interface unit 43 from the operator, and transferred to the multi-frame configuration unit 48 via the image system control unit 44.
- a plurality of arbitrary cross-sectional positions can be set, and the multi-frame configuration unit 48 generates a two-dimensional elastic image for each of the plurality of arbitrary cross-sectional positions.
- the switching addition unit 12 includes a frame memory, an image processing unit, and an image selection unit.
- The frame memory stores the black-and-white tomographic images generated by the tomographic image construction unit 7 and the multi-frame construction unit 46, the color two-dimensional elasticity images generated by the two-dimensional elasticity image construction unit 34 and the multi-frame construction unit 48, the three-dimensional image generated by the volume rendering unit 38, and the three-dimensional elasticity image generated by the volume rendering unit 42.
- the switching addition unit 12 generates a composite image obtained by adding a color two-dimensional elastic image to a black and white tomographic image at a predetermined ratio in accordance with an instruction from the operator.
- As the black-and-white tomographic image, the black-and-white tomographic image generated by the tomographic image construction unit 7 or the black-and-white tomographic image of an arbitrary cross section generated by the multi-frame construction unit 46 is used.
- As the color two-dimensional elasticity image, the two-dimensional elasticity image generated by the two-dimensional elasticity image construction unit 34 or the two-dimensional elasticity image of an arbitrary cross section generated by the multi-frame construction unit 48 is used.
- The known composite image generation method described in Patent Document 1 is briefly described below.
- First, the black-and-white tomographic image is converted into a color tomographic image.
- Denoting the red (R), green (G) and blue (B) output values of the pixel at coordinates (x, y) as C(R)(x,y), C(G)(x,y) and C(B)(x,y), the conversion is performed by setting the luminance value C(x,y) of the black-and-white tomographic image into each of them according to the following Equations (7) to (9).
- C(R)(x,y) = C(x,y) ... Equation (7)
- C(G)(x,y) = C(x,y) ... Equation (8)
- C(B)(x,y) = C(x,y) ... Equation (9)
- Denoting the red (R), green (G) and blue (B) output values of the pixel at coordinates (x, y) of the color two-dimensional elasticity image as E(R)(x,y), E(G)(x,y) and E(B)(x,y), and the composition ratio set by the operator as r (0 < r ≤ 1), the output values D(R)(x,y), D(G)(x,y) and D(B)(x,y) of the pixel of the composite image are obtained by the following Equations (10) to (12):
- D(R)(x,y) = E(R)(x,y)×r + C(R)(x,y) ... Equation (10)
- D(G)(x,y) = E(G)(x,y)×r + C(G)(x,y) ... Equation (11)
- D(B)(x,y) = E(B)(x,y)×r + C(B)(x,y) ... Equation (12)
- The generated composite image is stored in the frame memory.
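- A minimal sketch of the composition of Equations (7) to (12): the B-mode luminance is replicated into R, G and B and the color elasticity image is added at the ratio r; the array shapes and 8-bit value ranges are assumptions.

```python
import numpy as np

def composite(bmode_gray, elastic_rgb, r=0.5):
    """Equations (7)-(12): replicate the B-mode luminance into RGB and add the
    color two-dimensional elasticity image at the composition ratio r (0 < r <= 1)."""
    C = np.repeat(bmode_gray[..., None].astype(float), 3, axis=2)   # Equations (7)-(9)
    D = elastic_rgb.astype(float) * r + C                           # Equations (10)-(12)
    return np.clip(D, 0, 255).astype(np.uint8)
```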
- In accordance with an instruction from the operator received via the interface unit 43, the switching addition unit 12 selects, from among the black-and-white tomographic images, color two-dimensional elasticity images, three-dimensional image, three-dimensional elasticity image and composite images stored in the frame memory, the image or images to be displayed on the image display unit 13, and transfers them. The image display unit 13 displays the delivered image or images in a predetermined arrangement on the screen.
- the image system control unit 44 controls each unit related to image generation.
- the interface unit 43 also accepts the setting of the hue of the elastic image (the hue of the color map), the setting of the ROI (region of interest), the frame rate, and the like from the operator.
- The image system control unit 44 also displays the elasticity value range set for generating the three-dimensional elasticity image, the color maps for the elasticity images, the values of the various parameters that have been set, and the like at predetermined positions on the screen of the image display unit 13.
- FIG. 2 (a) is an example of an image displayed on the image display unit 13.
- In the left region of the screen 100, a composite image of the black-and-white tomographic image 111 and the two-dimensional elasticity image 112 is displayed. Since the two-dimensional elasticity image 112 is generated only for the ROI 101 set by the operator, it is combined only in the central portion of the display region of the black-and-white tomographic image 111.
- A color map 104 indicating the relationship between hue and elasticity value for the two-dimensional elasticity image 112 is also displayed. In the color map 104, as shown enlarged in FIG. 2(b), a red hue is assigned to soft regions with large elasticity values, a blue hue to hard regions with small elasticity values, and a green hue to intermediate regions; the hue changes stepwise with the elasticity value, over a total of 255 gradations.
- a three-dimensional elastic image 103 is displayed in the right area of the screen 100.
- the three-dimensional elastic image 103 is an image obtained by rendering only the voxels within the range of the elastic value set by the operator.
- a three-dimensional elastic image color map 105 showing the relationship between the hue and the elastic value of the three-dimensional elastic image 103 is displayed.
- a parameter display area 106 in which values of various parameters received from the operator are displayed is displayed.
- the parameter display area includes an area 107 for displaying an elastic value range set as a rendering range by the operator.
- the displayed three-dimensional elastic image 103 is generated by the rendering unit 42 rendering the voxels in the elastic value range set as the rendering range.
- As shown in FIG. 3(a), the ultrasonic diagnostic apparatus of FIG. 1 includes an operation panel 120 arranged at the bottom of the image display unit 13.
- the operation panel 120 is provided with a toggle switch 121 for setting an elastic value and a soft / hard selection switch 122.
- The soft/hard selection switch 122 is configured as shown in FIG. 4(a). When the button labeled "hard" is selected, the volume rendering unit 42 selects, as shown in FIG. 4(b), voxels in the range 1 to α, that is, harder than the elasticity value α set with the toggle switch 121, and a three-dimensional elasticity image 103 of hard tissue is generated and displayed.
- When the button labeled "soft" is selected, the volume rendering unit 42 selects, as shown in FIG. 4(c), voxels in the range α to 255, that is, softer than the elasticity value α, and a three-dimensional elasticity image 103 of soft tissue with elasticity values larger than α is generated and displayed.
- As described above, the ultrasonic diagnostic apparatus of the present Embodiment 1 includes: a tomographic image construction unit 7 that generates a tomographic image of the subject using signals received by transmitting ultrasonic waves into the subject; a two-dimensional elasticity image construction unit 34 that processes the signals to generate a two-dimensional elasticity image of elasticity values representing elasticity; a rendering unit 42 that generates volume data composed of a plurality of the two-dimensional elasticity images and, by selecting and rendering the elasticity value data of the volume data included in a desired elasticity value range, generates a three-dimensional elasticity image of the elasticity value data in that range; and a display unit 13 that displays the three-dimensional elasticity image together with at least one of a two-dimensional elasticity image and a tomographic image showing the region corresponding to the elasticity value range.
- In the corresponding ultrasonic image display method, a tomographic image of the subject is generated using signals received by transmitting ultrasonic waves into the subject; the signals are processed to generate a two-dimensional elasticity image of elasticity values representing elasticity; volume data is generated from a plurality of the two-dimensional elasticity images; a three-dimensional elasticity image of the elasticity value data in a desired elasticity value range is generated by selecting and rendering the elasticity value data of the volume data included in that range; and the three-dimensional elasticity image is displayed together with at least one of a two-dimensional elasticity image and a tomographic image showing the region corresponding to the elasticity value range.
- The ultrasonic diagnostic apparatus of the present embodiment has a mode in which the operator sets the elasticity value range (rendering range) to be rendered and an image is displayed that allows the operator to easily grasp the region of that elasticity value range.
- This mode is realized by the image system control unit 44 controlling each unit according to the flow of FIG. 5.
- When the operator selects this mode using the operation panel 120, the image system control unit 44 performs the following control by reading and executing a program stored in its built-in memory.
- In step 61, the switching addition unit 12 combines the black-and-white tomographic image 111 generated by the tomographic image construction unit 7 or the multi-frame construction unit 46 and the two-dimensional elasticity image 112 generated by the two-dimensional elasticity image construction unit 34 or the multi-frame construction unit 48 according to the above Equations (7) to (12), generating a composite image. This is displayed in the left region of the screen 100 of the image display unit 13 as shown in FIG. 2(a).
- the rendering range setting unit 51 sets a predetermined elastic value range as an initial value, thereby causing the volume rendering unit 42 to generate a three-dimensional elastic image for the voxel in the initial elastic value range. This is displayed in the right area of the screen 100.
- a predetermined initial elastic value range is displayed as shown in FIG. 2 (c) in the area 107 displaying the elastic value range at the bottom of the screen 100.
- In the illustrated example, the initial elasticity value α is 72 and the "hard" button of the soft/hard selection switch 122 is selected as the initial value, so the elasticity value range is 1 to 72 and a three-dimensional elasticity image 103 of hard tissue is displayed.
- Next, in step 62, the image system control unit 44 displays, on the composite image in the left region of the screen 100, the tissue region corresponding to the set elasticity value range (initial value). That is, by applying a mask to the two-dimensional elasticity image 112 of the composite image and making the composition ratio of the masked part of the two-dimensional elasticity image 112 lower than the ratio r set in Equations (10) to (12), it is shown which region on the two-dimensional elasticity image corresponds to the set elasticity value range 1 to 72. This process is described concretely below.
- In step 62, the rendering range setting unit 51 outputs the elasticity value range 1 to α (initial value) to the two-dimensional elasticity image construction unit 34 or the multi-frame construction unit 48 that generated the displayed two-dimensional elasticity image, and a mask (M1) 110 is generated from the elasticity value E(x,y) of each pixel according to Equations (13) and (14):
- When E(x,y) ≤ α: M1(x,y) = 1 ... Equation (13)
- When α < E(x,y): M1(x,y) = 0 ... Equation (14)
- When combining the black-and-white tomographic image 111 and the two-dimensional elasticity image 112, the switching addition unit 12 uses the mask (M1) 110 to make the composition ratio of the two-dimensional elasticity image 112 in the masked region lower than the set ratio r, generates the masked composite image, and displays it as shown in FIG. 6(b).
- Denoting the red (R), green (G) and blue (B) output values of the pixel at coordinates (x, y) of the color two-dimensional elasticity image as E(R)(x,y), E(G)(x,y) and E(B)(x,y), those of the color-converted tomographic image as C(R)(x,y), C(G)(x,y) and C(B)(x,y), and those of the masked composite image as D(R)(x,y), D(G)(x,y) and D(B)(x,y), and with r being the composition ratio set by the operator (0 < r ≤ 1) and w a predetermined weight between 0 and 1, the masked composite image is obtained by the following Equations (15) to (20):
- When M1(x,y) = 1:
- D(R)(x,y) = E(R)(x,y)×r + C(R)(x,y) ... Equation (15)
- D(G)(x,y) = E(G)(x,y)×r + C(G)(x,y) ... Equation (16)
- D(B)(x,y) = E(B)(x,y)×r + C(B)(x,y) ... Equation (17)
- When M1(x,y) = 0:
- D(R)(x,y) = E(R)(x,y)×r×w + C(R)(x,y) ... Equation (18)
- D(G)(x,y) = E(G)(x,y)×r×w + C(G)(x,y) ... Equation (19)
- D(B)(x,y) = E(B)(x,y)×r×w + C(B)(x,y) ... Equation (20)
- As a result, the color two-dimensional elasticity image 112 is displayed in deep color in the region 108 where the mask (M1) 110 is not applied, whereas in the masked region the two-dimensional elasticity image 112 is displayed faintly and the black-and-white tomographic image 111 appears more strongly. The operator can therefore intuitively grasp that the region corresponding to the three-dimensional elasticity image 103 on the right side of the screen 100 is the deeply colored region of the color two-dimensional elasticity image 112.
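- The masked composition of step 62 can be sketched as follows, assuming a "hard" range 1 to α: pixels whose elasticity value exceeds α receive mask value 0 and their elasticity contribution is attenuated by the weight w, following Equations (13) to (20); the parameter values are illustrative.

```python
import numpy as np

def masked_composite(bmode_gray, elastic_rgb, elasticity, alpha=72, r=0.5, w=0.2):
    """Equations (13)-(20): within the range (E <= alpha) composite at ratio r,
    outside the range attenuate the elasticity image by the weight w."""
    C = np.repeat(bmode_gray[..., None].astype(float), 3, axis=2)
    M1 = (elasticity <= alpha).astype(float)              # mask M1: 1 in range, 0 outside
    ratio = r * np.where(M1 == 1.0, 1.0, w)                # r inside, r*w outside
    D = elastic_rgb.astype(float) * ratio[..., None] + C
    return np.clip(D, 0, 255).astype(np.uint8)
```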
- When the operator changes the elasticity value α, for example to 50, with the toggle switch 121 (step 63), the volume rendering unit 42 renders the voxels in the elasticity value range 1 to α (that is, 1 to 50) set as the rendering range and generates a three-dimensional elasticity image (step 64), which is displayed in the right region of the screen 100 as shown in FIG. 3(c). In the display example of FIG. 3(c), a three-dimensional elasticity image of only harder tissue (a range of smaller elasticity values) is displayed compared with the display example of FIG. 3(b).
- In step 65, a mask (M1) 110 is generated in the same manner as in step 62 for the elasticity value range 1 to α (that is, 1 to 50) received in step 63, and the mask (M1) is applied so that the addition ratio of the two-dimensional elasticity image 112 in the region outside the elasticity value range set as the rendering range is reduced by the weight w, generating a composite image in which the black-and-white tomographic image is relatively emphasized; this is displayed in the left region of the screen 100.
- In FIG. 3(c), the region 108 where the mask (M1) 110 is not applied is smaller than in FIG. 3(b). It can therefore be grasped intuitively that the tissue corresponding to the three-dimensional elasticity image 103 on the right side of the screen 100 is the deeply colored region 108 of the color two-dimensional elasticity image 112.
- It can also be seen how the deeply colored region 108 of the color two-dimensional elasticity image 112 has changed (in the example of FIGS. 3(b) and 3(c), the smaller inner region is the harder region).
- The image system control unit 44 then returns to step 63, and when it next receives a change of the elasticity value range from the operator, it performs steps 64 and 65 again.
- As described above, in the present embodiment a three-dimensional elasticity image obtained by rendering only the voxels in the elasticity value range desired by the operator is displayed side by side with the two-dimensional composite image, and the region corresponding to the elasticity value range of the three-dimensional elasticity image can be indicated on the two-dimensional composite image by means of the mask. It is therefore possible to provide an ultrasonic diagnostic apparatus that displays a three-dimensional elasticity image only of the region in the desired elasticity value range and allows the position of that region to be easily grasped.
- a process for reducing the composition ratio outside the elastic value range set as the rendering range may be added inside the switching addition unit 12 without creating a mask.
- In the above description, the composite image is masked and displayed in steps 62 and 65, that is, both when the initial value is set and when the elasticity value range is changed; however, it is also possible not to perform step 62, so that the composite image is not masked while the elasticity value range remains at its initial value, and to mask the composite image in step 65 only when a setting of the elasticity value range is received from the operator in step 63.
- The period during which the masked composite image is displayed in step 65 may also be limited to the time during which the toggle switch 121 is being operated, or to a predetermined limited time after the toggle switch 121 is operated. That is, while the operator is turning the toggle switch 121, or for a limited time thereafter, a composite image masked according to the value of the toggle switch 121 is displayed, whereas while the toggle switch 121 is not being turned (before and after the setting of the elasticity value range), a composite image generated by the normal Equations (10) to (12) as shown in FIG. 2(a) may be displayed without applying the mask.
- In the present embodiment, the two-dimensional elasticity image 112 of the composite image is masked to show the elasticity value range set as the rendering range; however, instead of displaying the two-dimensional elasticity image, the mask generated from the two-dimensional elasticity image may be applied to the black-and-white tomographic image so that the rendering range is shown on the black-and-white tomographic image.
- It is also possible not to combine the black-and-white tomographic image, and to display the masked two-dimensional elasticity image side by side with the three-dimensional elasticity image.
- In Embodiment 2, as shown in FIG. 7, the region outside the elasticity value range set as the rendering range is made transparent in the two-dimensional elasticity image of the composite image displayed in the left region of the screen 100.
- In other words, the two-dimensional elasticity image 112 is displayed only in the region corresponding to the elasticity value range set as the rendering range, and no two-dimensional elasticity image is displayed outside that range, yielding a composite image in which only the black-and-white tomographic image appears there. Using the mask (M1) of Embodiment 1, the composite image is obtained by the following Equations (21) to (26):
- When M1(x,y) = 1:
- D(R)(x,y) = E(R)(x,y)×r + C(R)(x,y) ... Equation (21)
- D(G)(x,y) = E(G)(x,y)×r + C(G)(x,y) ... Equation (22)
- D(B)(x,y) = E(B)(x,y)×r + C(B)(x,y) ... Equation (23)
- When M1(x,y) = 0:
- D(R)(x,y) = C(R)(x,y) ... Equation (24)
- D(G)(x,y) = C(G)(x,y) ... Equation (25)
- D(B)(x,y) = C(B)(x,y) ... Equation (26)
- In Embodiment 3, in steps 62 and 65 of FIG. 5 of Embodiment 1, a composite image is generated in which the region of the elasticity value range set as the rendering range is filled with a predetermined single color corresponding to the elasticity value of that region.
- The hardness (elasticity value) representative of the elasticity value range set as the rendering range is calculated from the elasticity values E(i) of the voxels of the elasticity volume data included in that range; for example, the average value, maximum value or minimum value is calculated and used.
- Denoting the R, G and B values of the fill color as K(R)(x,y), K(G)(x,y) and K(B)(x,y), the composite image is obtained by the following Equations (27) to (32):
- When M1(x,y) = 1:
- D(R)(x,y) = K(R)(x,y) ... Equation (27)
- D(G)(x,y) = K(G)(x,y) ... Equation (28)
- D(B)(x,y) = K(B)(x,y) ... Equation (29)
- When M1(x,y) = 0:
- D(R)(x,y) = E(R)(x,y)×r×w + C(R)(x,y) ... Equation (30)
- D(G)(x,y) = E(G)(x,y)×r×w + C(G)(x,y) ... Equation (31)
- D(B)(x,y) = E(B)(x,y)×r×w + C(B)(x,y) ... Equation (32)
- As a result, the region 108 of the elasticity value range set as the rendering range is filled with a single color that is not used in the color map 105, so that the shape and size of the region 108 can be easily grasped.
- The timing of filling the region 108 with the predetermined color can also be limited, for example to the time during which the operator is operating the toggle switch 121.
- This is particularly effective because the two-dimensional elasticity image can then be checked in its normal display while the switch is not being operated.
- Instead of weighting the region outside the elasticity value range set as the rendering range by Equations (30) to (32), it is also possible to make that region of the two-dimensional elasticity image transparent according to Equations (24) to (26) of Embodiment 2.
- In the above, the region 108 of the elasticity value range set as the rendering range is filled with one color, but a configuration is also possible in which, using Equations (33) to (38), the region outside the set elasticity value range is filled with one color instead.
- In that configuration, the region 108 of the elasticity value range set as the rendering range is weighted by the value w.
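- A sketch of the Embodiment 3 display: the in-range region is filled with a single color K, here taken from the color map at the mean in-range elasticity value, and the out-of-range elasticity image is attenuated by w, following Equations (27) to (32); the use of the mean and the reuse of the hypothetical color map sketched earlier are assumptions.

```python
import numpy as np

def fill_composite(bmode_gray, elastic_rgb, elasticity, cmap, alpha=72, r=0.5, w=0.2):
    """Equations (27)-(32): fill the in-range region with one color K and weight the
    out-of-range elasticity image by w. `cmap` is a 255-entry RGB lookup table."""
    C = np.repeat(bmode_gray[..., None].astype(float), 3, axis=2)
    in_range = elasticity <= alpha
    rep = int(np.clip(elasticity[in_range].mean(), 1, 255)) if in_range.any() else 1
    K = cmap[rep - 1].astype(float)                       # single fill color from the color map
    D = elastic_rgb.astype(float) * (r * w) + C           # out-of-range: Equations (30)-(32)
    D[in_range] = K                                       # in-range: Equations (27)-(29)
    return np.clip(D, 0, 255).astype(np.uint8)
```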
- Embodiment 4 will be described with reference to FIGS. 9 (a) and 9 (b).
- a line 116 indicating the outline of the rendering range (elastic value range) is displayed on the two-dimensional elastic image 112 of the composite image.
- a composite image is generated by synthesizing the line 116 indicating the outer periphery of the region of the elastic value range set as the rendering range.
- the rendering range setting unit 51 outputs the initial value or the elastic value ⁇ set by the operator using the toggle switch 121 to the two-dimensional elastic image forming unit 34 or the multiframe forming unit 48.
- the two-dimensional elastic image constructing unit 34 or the multi-frame constructing unit 48 generates a mask (M2) 115 shown in FIG. 9 (a) according to the equations (39) and (40).
- Here, T is a predetermined value; it can be determined in advance, or a value received from the operator via the interface unit 43 can be used.
- As a result, the outline of the region 108 of the rendering range can be drawn as a black line 116 as shown in FIG. 9(b), so that it is easy to grasp on the composite image which region corresponds to the rendering range (elasticity value range) of the three-dimensional elasticity image.
- The hue of the line 116 can be determined in advance, or can be received from the operator via the interface unit 43.
- The timing of drawing the contour line 116 can also be limited, for example to the time during which the operator is operating the toggle switch 121.
- This is particularly effective because the two-dimensional elasticity image can then be checked in its normal display while the switch is not being operated.
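- Equations (39) and (40), which define the contour mask M2, are not reproduced in this excerpt; the following is only a generic sketch, under the assumption that the line 116 is the boundary of the in-range region and that T controls its thickness.

```python
import numpy as np

def contour_mask(elasticity, alpha=72, T=1):
    """Generic boundary-of-region sketch (not the patent's Equations (39)-(40)):
    mark in-range pixels that have an out-of-range neighbour within T pixels."""
    in_range = elasticity <= alpha
    eroded = in_range.copy()
    for _ in range(T):                        # shrink the in-range region T times
        e = eroded
        eroded = e.copy()
        eroded[1:, :] &= e[:-1, :]
        eroded[:-1, :] &= e[1:, :]
        eroded[:, 1:] &= e[:, :-1]
        eroded[:, :-1] &= e[:, 1:]
    return in_range & ~eroded                 # True on the contour corresponding to line 116
```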
- Embodiment 5 will be described with reference to FIG.
- the background color of the area 107 on the screen 100 where the elasticity value ⁇ set by the operator as the rendering range is displayed is displayed in a hue corresponding to the elasticity value ⁇ on the color map 104.
- This display control is performed by the rendering range setting unit 51.
- While turning the toggle switch 121, the operator sees not only the numerical value displayed in the area 107 but also its background color, and can therefore easily grasp which region on the two-dimensional elasticity image 112 corresponds to the elasticity value α set as the upper or lower limit of the rendering range.
- the fifth embodiment can be performed together with the display of the composite image of any of the first to fourth embodiments.
- the operator can grasp the elasticity value range set as the rendering range by both the composite image and the background color of the area 107.
- the hue of the background color of the region 107 is displayed with a hue corresponding to the elastic value ⁇ together with the display of the contour line 116 of the fourth embodiment.
- The timing of displaying the background color or character color of the area 107 in the hue corresponding to the elasticity value α can also be limited, for example to the time during which the operator operates the toggle switch 121. This is effective because, while the switch is not being operated, attention can be concentrated on the display other than the area 107.
- Embodiment 6 will be described with reference to FIG.
- In Embodiment 6, the area of the color map 104 outside the elasticity value range set as the rendering range is covered with a mask 117 that reduces its opacity or luminance, or the masked area of the color map 104 is made transparent and not displayed. This display control is performed by the rendering range setting unit 51.
- the sixth embodiment is performed together with the display of the composite image of any of the first to fourth embodiments.
- the operator can grasp the range of the elasticity value set as the rendering range by both the synthesized image and the display of the color map 104. It is also possible to perform the display together with the display of the background color or character color of the area 107 in the fifth embodiment.
- FIG. 11 shows a screen 100 in which the color map 104 is covered with the mask 117 together with the composite image display of the second embodiment.
- the operator can easily grasp the elasticity value range set as the rendering range by viewing the color map 104 as well as the elasticity value displayed in the area 107 while turning the toggle switch 121.
- the process of applying the mask 117 may be performed not only on the color map 104 for the two-dimensional elastic image but also on the color map 105 for the three-dimensional elastic image.
- the operator can set a plurality of elasticity value ranges, and the plurality of elasticity value ranges can be displayed by simultaneously applying the mask 117 on the color map 104 or the like.
- the timing for masking the color map 104 or the like can be limited to the time during which the operator operates the toggle switch 121. As a result, since the mask is not put on when the operation is not performed, the entire color map 104 can be visually recognized.
- Embodiment 7 will be described with reference to FIGS. 12 (a) and 12 (b).
- In Embodiment 7, a bar 118 is displayed on the color map 104 at the position of the elasticity value α set as the upper or lower limit of the rendering range, thereby presenting the value of the elasticity value α to the operator. This display control is performed by the rendering range setting unit 51.
- the seventh embodiment is performed together with the display of the composite image of any of the first to fourth embodiments. Thereby, the operator can grasp the range of the elasticity value set as the rendering range both by the composite image and the display of the bar 118 of the color map 104. It is also possible to perform the display together with the display of the background color or character color of the area 107 in the fifth embodiment.
- FIG. 12 (a) shows a screen 100 in which a bar 118 is displayed on the color map 104 together with the display of the composite image of the third embodiment.
- The bar may be displayed not only on the color map 104 for the two-dimensional elasticity image; a bar 118 may also be displayed on the color map 105 for the three-dimensional elasticity image.
- It is also possible to adopt a configuration in which the operator sets both the upper and lower limits of the elasticity value range without using the soft/hard selection switch 122, and two bars 118 and 119 indicating the upper and lower limits of the elasticity value range are displayed as shown in FIG. 12(b). In this case, the operator may be allowed to set a plurality of elasticity value ranges, and the upper and lower limits of the plurality of ranges may be displayed simultaneously by a plurality of sets of bars 118 and 119, with the color of each set of bars 118 and 119 changed for each elasticity value range.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- Pathology (AREA)
- General Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Veterinary Medicine (AREA)
- Physics & Mathematics (AREA)
- Animal Behavior & Ethology (AREA)
- Biophysics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Surgery (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computer Graphics (AREA)
- General Engineering & Computer Science (AREA)
- Physiology (AREA)
- Epidemiology (AREA)
- Databases & Information Systems (AREA)
- Data Mining & Analysis (AREA)
- Primary Health Care (AREA)
- Ultra Sonic Diagnosis Equipment (AREA)
Abstract
Description
Aout(i) = Aout(i-1) + (1 - Aout(i-1))・A(i)   ... Formula (2)
A(i) = Bopacity[C(i)]   ... Formula (3)
In Formula (1), Cout(i) is the value output as the pixel value on the two-dimensional projection plane. C(i) is the luminance value of the i-th voxel (where i = 0 to N-1) on the line of sight when the three-dimensional image is viewed from a given point on the two-dimensional projection plane. A voxel here refers to the position of each item of luminance data constituting the three-dimensional volume data. When N voxels are lined up along the line of sight, the luminance value Cout(N-1), obtained by accumulating the luminance values of the voxels from i = 0 to N-1 according to Formula (1), becomes the pixel value that is finally output. Cout(i-1) indicates the accumulated value up to the (i-1)-th voxel.
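For illustration, the front-to-back accumulation of Formulas (1) to (3) can be sketched in Python as follows. Formula (1) itself is not reproduced in this excerpt, so the sketch assumes the usual compositing form Cout(i) = Cout(i-1) + (1 - Aout(i-1))・A(i)・C(i); the lookup table `b_opacity` and the function name are illustrative stand-ins for Bopacity[·] of Formula (3).

```python
import numpy as np

def render_ray(c, b_opacity):
    """Accumulate the luminance values c[0..N-1] of one line of sight.

    c         : 1-D array of voxel luminance values C(i) along the ray
    b_opacity : lookup table mapping a luminance value to an opacity A(i)
                (stand-in for Bopacity[.] of Formula (3))
    Returns the final pixel value Cout(N-1).
    """
    c_out = 0.0  # accumulated luminance, Cout
    a_out = 0.0  # accumulated opacity, Aout
    for ci in c:
        a_i = b_opacity[int(ci)]            # Formula (3)
        c_out += (1.0 - a_out) * a_i * ci   # assumed form of Formula (1)
        a_out += (1.0 - a_out) * a_i        # Formula (2)
        if a_out >= 1.0:                    # fully opaque: later voxels cannot contribute
            break
    return c_out

# Example with an 8-bit luminance ray and a linear (hypothetical) opacity table
ray = np.array([10.0, 80.0, 200.0, 40.0])
opacity_table = np.linspace(0.0, 1.0, 256)
print(render_ray(ray, opacity_table))
```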
FIG. 2(c) shows, in an enlarged view, the state in which the elasticity value α set by the operator is displayed in the area 107.
Aout(i) = Aout(i-1) + (1 - Aout(i-1))・A(i)   ... Formula (5)
A(i) = Eopacity[E(i)]   ... Formula (6)
In Formula (4), Eout(i) is the value output as the pixel value on the projection plane. E(i) is the elasticity value of the i-th voxel (where i = 0 to N-1) on the line of sight when the three-dimensional elastic image is viewed from a given point on the two-dimensional projection plane. When the method of selecting the voxels within the elasticity value range is to replace elasticity value data outside the set range with zero, the out-of-range elasticity value data E(i) is replaced with E(i) = 0 so that elasticity value data outside the predetermined elasticity value range is not rendered. When the elasticity values of N voxels are lined up along the line of sight, the accumulated value Eout(N-1), obtained by accumulating the elasticity values from i = 0 to N-1 according to Formula (4), becomes the pixel value that is finally output. Eout(i-1) indicates the accumulated value up to the (i-1)-th voxel.
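The range selection described above (replacing out-of-range elasticity data with zero before the accumulation of Formulas (4) to (6)) can be sketched as follows; the function name, parameter names and array layout are illustrative assumptions, not part of the description.

```python
import numpy as np

def select_range(e_volume, alpha_low, alpha_high):
    """Zero out elasticity data outside the selected range [alpha_low, alpha_high]
    so that it does not contribute when accumulated according to Formula (4)."""
    return np.where((e_volume >= alpha_low) & (e_volume <= alpha_high), e_volume, 0.0)
```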
C(G)(x,y) = C(x,y)   ... Formula (8)
C(B)(x,y) = C(x,y)   ... Formula (9)
When the red (R), green (G), and blue (B) output values of the pixel at coordinates (x, y) of the color two-dimensional elastic image are denoted E(R)(x,y), E(G)(x,y) and E(B)(x,y), and the composition ratio set by the operator is r (0 < r ≤ 1), the output values D(R)(x,y), D(G)(x,y) and D(B)(x,y) of the corresponding pixel of the composite image are obtained by Formulas (10) to (12) below.
D(G)(x,y) = E(G)(x,y)×r + C(G)(x,y)   ... Formula (11)
D(B)(x,y) = E(B)(x,y)×r + C(B)(x,y)   ... Formula (12)
The generated composite image is stored in the frame memory.
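The blending of Formulas (10) to (12) can be sketched in Python as below; Formula (10) for the red channel is not reproduced in this excerpt and is assumed to be analogous to Formulas (11) and (12), and the array shapes are hypothetical.

```python
import numpy as np

def composite(e_rgb, c_gray, r):
    """Blend the color 2-D elastic image onto the monochrome tomographic image.

    e_rgb  : (H, W, 3) array of E(R/G/B)(x, y)
    c_gray : (H, W) array of C(x, y); per Formulas (8)-(9) the same value
             is used for every color channel
    r      : composition ratio set by the operator, 0 < r <= 1
    """
    c_rgb = np.repeat(c_gray[..., None], 3, axis=2)   # C(R) = C(G) = C(B) = C
    return e_rgb * r + c_rgb                          # D = E*r + C, Formulas (10)-(12)
```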
In the ultrasonic diagnostic apparatus of Embodiment 1, the apparatus includes: a tomographic image constructing unit 7 that generates a tomographic image of the object using signals obtained by transmitting ultrasonic waves into the object and receiving them; a two-dimensional elastic image constructing unit 34 that processes the signals to generate a two-dimensional elastic image of elasticity values representing elasticity; a rendering unit 42 that generates volume data composed of a plurality of the two-dimensional elastic images and generates a three-dimensional elastic image of the elasticity value data within a desired elasticity value range by selecting and rendering the elasticity value data of the volume data included in that range; and a display unit 13 that displays the three-dimensional elastic image and at least one of the two-dimensional elastic image and the tomographic image showing the region corresponding to the elasticity value range. In the method of displaying an ultrasonic image, a tomographic image of the object is generated using signals obtained by transmitting ultrasonic waves into the object and receiving them; the signals are processed to generate a two-dimensional elastic image of elasticity values representing elasticity; volume data is generated from a plurality of the two-dimensional elastic images; a three-dimensional elastic image of the elasticity value data within a desired elasticity value range is generated by selecting and rendering the elasticity value data of the volume data included in that range; and the three-dimensional elastic image and at least one of the two-dimensional elastic image and the tomographic image showing the region corresponding to the elasticity value range are displayed.
When α < E(x,y): M1(x,y) = 0   ... Formula (14)
When combining the monochrome tomographic image 111 and the two-dimensional elastic image 112, the switching/addition unit 12, based on the mask (M1) 110, adds them with the composition ratio of the two-dimensional elastic image 112 in the masked area set lower than the set ratio r, thereby generating a masked composite image, which is displayed as the composite image as shown in FIG. 6(b).
D(R)(x,y) = E(R)(x,y)×r + C(R)(x,y)   ... Formula (15)
D(G)(x,y) = E(G)(x,y)×r + C(G)(x,y)   ... Formula (16)
D(B)(x,y) = E(B)(x,y)×r + C(B)(x,y)   ... Formula (17)
When M1(x,y) = 0:
D(R)(x,y) = E(R)(x,y)×r×w + C(R)(x,y)   ... Formula (18)
D(G)(x,y) = E(G)(x,y)×r×w + C(G)(x,y)   ... Formula (19)
D(B)(x,y) = E(B)(x,y)×r×w + C(B)(x,y)   ... Formula (20)
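A minimal sketch of the masked addition of Formulas (15) to (20): the condition line preceding Formula (15) (presumably M1(x,y) = 1) is not reproduced in this excerpt, and the weighting factor w, assumed here to satisfy 0 ≤ w < 1, is defined elsewhere in the description.

```python
import numpy as np

def masked_composite(e_rgb, c_gray, m1, r, w):
    """Blend with ratio r where M1(x,y) = 1 and with the reduced ratio r*w
    where M1(x,y) = 0, per Formulas (15)-(20).

    m1 : (H, W) array of 0/1 mask values produced from the elasticity data.
    """
    c_rgb = np.repeat(c_gray[..., None], 3, axis=2)
    weight = np.where(m1[..., None] == 1, r, r * w)   # per-pixel blend ratio
    return e_rgb * weight + c_rgb
```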
In Embodiment 2, in the two-dimensional elastic image of the composite image displayed in the left-side area of the screen 100 as shown in FIG. 7, the area outside the elasticity value range set as the rendering range is made transparent.
When M1(x,y) = 1:
D(R)(x,y) = E(R)(x,y)×r + C(R)(x,y)   ... Formula (21)
D(G)(x,y) = E(G)(x,y)×r + C(G)(x,y)   ... Formula (22)
D(B)(x,y) = E(B)(x,y)×r + C(B)(x,y)   ... Formula (23)
When M1(x,y) = 0:
D(R)(x,y) = C(R)(x,y)   ... Formula (24)
D(G)(x,y) = C(G)(x,y)   ... Formula (25)
D(B)(x,y) = C(B)(x,y)   ... Formula (26)
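A sketch of Formulas (21) to (26) under the same assumed array layout: inside the rendering range the elastic image is blended with ratio r, and outside it only the tomographic image is kept, i.e. the elastic image is fully transparent there.

```python
import numpy as np

def transparent_composite(e_rgb, c_gray, m1, r):
    """Formulas (21)-(26): blend where M1 = 1, show only the tomographic
    image where M1 = 0 (the elastic image is transparent outside the range)."""
    c_rgb = np.repeat(c_gray[..., None], 3, axis=2)
    blended = e_rgb * r + c_rgb
    return np.where(m1[..., None] == 1, blended, c_rgb)
```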
In Embodiment 3, in steps 62 and 65 of FIG. 5 of Embodiment 1, a composite image is generated in which the area of the elasticity value range set as the rendering range is filled with a single predetermined color corresponding to the elasticity values of that area.
When M1(x,y) = 1:
D(R)(x,y) = K(R)(x,y)   ... Formula (27)
D(G)(x,y) = K(G)(x,y)   ... Formula (28)
D(B)(x,y) = K(B)(x,y)   ... Formula (29)
When M1(x,y) = 0:
D(R)(x,y) = E(R)(x,y)×r×w + C(R)(x,y)   ... Formula (30)
D(G)(x,y) = E(G)(x,y)×r×w + C(G)(x,y)   ... Formula (31)
D(B)(x,y) = E(B)(x,y)×r×w + C(B)(x,y)   ... Formula (32)
D(R)(x,y) = E(R)(x,y)×r×w + C(R)(x,y)   ... Formula (33)
D(G)(x,y) = E(G)(x,y)×r×w + C(G)(x,y)   ... Formula (34)
D(B)(x,y) = E(B)(x,y)×r×w + C(B)(x,y)   ... Formula (35)
When M1(x,y) = 0:
D(R)(x,y) = K(R)(x,y)   ... Formula (36)
D(G)(x,y) = K(G)(x,y)   ... Formula (37)
D(B)(x,y) = K(B)(x,y)   ... Formula (38)
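The single-color fill of Formulas (27) to (38) can be sketched as below. Two assumptions are made: the condition line preceding Formula (33) is taken to be M1(x,y) = 1 (so that Formulas (33)-(38) describe the converse variant in which the out-of-range area is filled), and K is supplied as one RGB triple, although the description allows K(R/G/B)(x,y) to be chosen according to the elasticity values of the filled area.

```python
import numpy as np

def fill_composite(e_rgb, c_gray, m1, r, w, k_rgb, fill_in_range=True):
    """fill_in_range=True  -> Formulas (27)-(32): paint the in-range area with
                              the single color K, blend the rest with ratio r*w.
    fill_in_range=False -> Formulas (33)-(38): blend the in-range area with
                              ratio r*w, paint the out-of-range area with K.
    k_rgb : length-3 RGB value of the fill color K (simplifying assumption).
    """
    c_rgb = np.repeat(c_gray[..., None], 3, axis=2)
    blended = e_rgb * (r * w) + c_rgb
    filled = np.broadcast_to(np.asarray(k_rgb, dtype=float), c_rgb.shape)
    in_range = m1[..., None] == 1
    if fill_in_range:
        return np.where(in_range, filled, blended)
    return np.where(in_range, blended, filled)
```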
Next, Embodiment 4 will be described with reference to FIGS. 9(a) and 9(b). In Embodiment 4, a line 116 indicating the contour of the rendering range (elasticity value range) is displayed on the two-dimensional elastic image 112 of the composite image.
When E(x,y) < α - T or α + T < E(x,y): M2(x,y) = 0   ... Formula (40)
Here, T is a predetermined value, and 2T is the line width of the line 116 indicating the contour of the rendering range. For example, by setting T = 2, the line width of the line 116 becomes 4 in terms of elasticity value.
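A sketch of the contour mask of Formula (40); the complementary formula that sets M2(x,y) = 1 inside the band of width 2T around the boundary value α is not reproduced in this excerpt and is assumed here.

```python
import numpy as np

def contour_mask(e, alpha, t=2):
    """M2(x,y) = 1 where |E(x,y) - alpha| <= T, 0 elsewhere (Formula (40) and
    its assumed counterpart); 2*T is the width of contour line 116 in
    elasticity-value units."""
    return np.where(np.abs(e - alpha) <= t, 1, 0)
```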
Next, Embodiment 5 will be described with reference to FIG. 10.
In Embodiment 5, the background color of the area 107 on the screen 100, in which the elasticity value α set by the operator as the rendering range is displayed, is shown in the hue corresponding to the elasticity value α on the color map 104. This display control is performed by the rendering range setting unit 51.
Next, Embodiment 6 will be described with reference to FIG. 11.
In Embodiment 6, the area of the color map 104 outside the elasticity value range set as the rendering range is covered with a mask 117 that reduces its opacity or luminance, or the masked area of the color map 104 is made transparent and not displayed. This display control is performed by the rendering range setting unit 51.
Next, Embodiment 7 will be described with reference to FIGS. 12(a) and 12(b).
In Embodiment 7, as shown in FIG. 12(a), a bar 118 is displayed on the color map 104 at the position of the elasticity value α set as the upper or lower limit of the rendering range, indicating the value of the elasticity value α to the operator. This display control is performed by the rendering range setting unit 51.
Claims (11)
- An ultrasonic diagnostic apparatus comprising: a tomographic image constructing unit that generates a tomographic image of an object using signals obtained by transmitting ultrasonic waves into the object and receiving them; a two-dimensional elastic image constructing unit that processes the signals to generate a two-dimensional elastic image of elasticity values representing elasticity;
a rendering unit that generates volume data composed of a plurality of the two-dimensional elastic images and generates a three-dimensional elastic image of the elasticity value data within a desired elasticity value range by selecting and rendering the elasticity value data of the volume data included in that range; and
a display unit that displays the three-dimensional elastic image and at least one of the two-dimensional elastic image and the tomographic image showing the region corresponding to the elasticity value range. - The ultrasonic diagnostic apparatus according to claim 1, wherein the display unit displays the two-dimensional elastic image with a mask applied to the region outside the elasticity value range.
- The ultrasonic diagnostic apparatus according to claim 2, wherein the display unit displays a composite image in which the ratio at which the two-dimensional elastic image and the tomographic image are added differs between the region within the elasticity value range and the masked region.
- The ultrasonic diagnostic apparatus according to claim 3, wherein the display unit displays a composite image in which the addition ratio of the two-dimensional elastic image in the masked region is set to zero.
- The ultrasonic diagnostic apparatus according to claim 2, wherein the display unit displays an image in which the region of the two-dimensional elastic image within the elasticity value range, or the masked region, is filled with a single hue.
- The ultrasonic diagnostic apparatus according to claim 5, wherein the single hue is set according to the elasticity values of the filled region of the two-dimensional elastic image.
- The ultrasonic diagnostic apparatus according to claim 1, wherein the display unit displays, on the two-dimensional elastic image, a line indicating the contour of the elasticity value range.
- The ultrasonic diagnostic apparatus according to claim 1, further comprising an operation unit for the operator to set the elasticity value range,
wherein, when the operation unit receives a setting of the elasticity value range, the display unit displays the region corresponding to the elasticity value range on at least one of the two-dimensional elastic image and the tomographic image only for a predetermined period of time. - The ultrasonic diagnostic apparatus according to claim 1, wherein the two-dimensional elastic image is a color image to which different hues are assigned according to the elasticity value, and the display unit displays a color map indicating the relationship between elasticity values and hues with an indication of the elasticity value range added to it.
- The ultrasonic diagnostic apparatus according to claim 1, wherein the two-dimensional elastic image is a color image to which different hues are assigned according to the elasticity value, and the display unit displays a numerical indication of the elasticity value range with the hue corresponding to the elasticity value range added to it.
- A method of displaying an ultrasonic image, comprising: generating a tomographic image of an object using signals obtained by transmitting ultrasonic waves into the object and receiving them;
processing the signals to generate a two-dimensional elastic image of elasticity values representing elasticity;
generating volume data from a plurality of the two-dimensional elastic images, and generating a three-dimensional elastic image of the elasticity value data within a desired elasticity value range by selecting and rendering the elasticity value data of the volume data included in that range; and
displaying the three-dimensional elastic image and at least one of the two-dimensional elastic image and the tomographic image showing the region corresponding to the elasticity value range.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP11826638.6A EP2620102A4 (en) | 2010-09-21 | 2011-07-28 | ULTRASONIC DIAGNOSIS DEVICE AND METHOD FOR DISPLAYING ULTRASONIC IMAGES |
US13/820,559 US9107634B2 (en) | 2010-09-21 | 2011-07-28 | Ultrasonic diagnostic apparatus and method of displaying ultrasonic image |
JP2012534956A JP5882217B2 (ja) | 2010-09-21 | 2011-07-28 | 超音波診断装置および超音波画像の表示方法 |
CN201180044986.7A CN103108593B (zh) | 2010-09-21 | 2011-07-28 | 超声波诊断装置及超声波图像的显示方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-211162 | 2010-09-21 | ||
JP2010211162 | 2010-09-21 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012039192A1 true WO2012039192A1 (ja) | 2012-03-29 |
Family
ID=45873687
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/067190 WO2012039192A1 (ja) | 2010-09-21 | 2011-07-28 | 超音波診断装置および超音波画像の表示方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9107634B2 (ja) |
EP (1) | EP2620102A4 (ja) |
JP (1) | JP5882217B2 (ja) |
CN (1) | CN103108593B (ja) |
WO (1) | WO2012039192A1 (ja) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9311704B2 (en) * | 2012-10-18 | 2016-04-12 | Hitachi Aloka Medical, Ltd. | Ultrasonic diagnosis apparatus and image display method |
JP6457105B2 (ja) * | 2015-09-29 | 2019-01-23 | 富士フイルム株式会社 | 音響波診断装置およびその制御方法 |
US11704767B2 (en) * | 2020-07-31 | 2023-07-18 | Spot Vision Llc | Texture extraction |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4384625B2 (ja) * | 1996-04-15 | 2009-12-16 | オリンパス株式会社 | 超音波画像診断装置 |
JP3802462B2 (ja) * | 2001-08-06 | 2006-07-26 | アロカ株式会社 | 超音波診断装置 |
US6780155B2 (en) * | 2001-12-18 | 2004-08-24 | Koninklijke Philips Electronics | Method and system for ultrasound blood flow imaging and volume flow calculations |
JP3932482B2 (ja) * | 2002-10-18 | 2007-06-20 | 株式会社日立メディコ | 超音波診断装置 |
US7766836B2 (en) * | 2005-01-04 | 2010-08-03 | Hitachi Medical Corporation | Ultrasound diagnostic apparatus, program for imaging an ultrasonogram, and method for imaging an ultrasonogram |
EP1864612A4 (en) * | 2005-03-30 | 2009-10-28 | Hitachi Medical Corp | ULTRASOUND DEVICE |
JP4693465B2 (ja) * | 2005-04-06 | 2011-06-01 | 株式会社東芝 | 3次元超音波診断装置及びボリュームデータ表示領域設定方法 |
JP5130529B2 (ja) * | 2005-08-01 | 2013-01-30 | 国立大学法人 奈良先端科学技術大学院大学 | 情報処理装置およびプログラム |
US7678051B2 (en) * | 2005-09-27 | 2010-03-16 | Siemens Medical Solutions Usa, Inc. | Panoramic elasticity ultrasound imaging |
WO2007046272A1 (ja) * | 2005-10-19 | 2007-04-26 | Hitachi Medical Corporation | 弾性画像を生成する超音波診断装置 |
US20070167784A1 (en) * | 2005-12-13 | 2007-07-19 | Raj Shekhar | Real-time Elastic Registration to Determine Temporal Evolution of Internal Tissues for Image-Guided Interventions |
JP4892732B2 (ja) * | 2007-03-28 | 2012-03-07 | 国立大学法人岐阜大学 | 血管画像化方法、血管画像化システム及び血管画像化プログラム |
WO2010020921A2 (en) * | 2008-08-20 | 2010-02-25 | Koninklijke Philips Electronics N.V. | Blanking of image regions |
EP2319416A4 (en) * | 2008-08-25 | 2013-10-23 | Hitachi Medical Corp | ULTRASONIC DIAGNOSTIC APPARATUS AND METHOD FOR DISPLAYING AN ULTRASONIC IMAGE |
WO2010026823A1 (ja) | 2008-09-08 | 2010-03-11 | 株式会社 日立メディコ | 超音波診断装置及び超音波画像表示方法 |
JP5479353B2 (ja) * | 2008-10-14 | 2014-04-23 | 株式会社日立メディコ | 超音波診断装置 |
JP5147656B2 (ja) * | 2008-11-20 | 2013-02-20 | キヤノン株式会社 | 画像処理装置、画像処理方法、プログラム、及び記憶媒体 |
JP5461845B2 (ja) * | 2009-02-05 | 2014-04-02 | 株式会社東芝 | 超音波診断装置及び超音波診断装置の制御プログラム |
JP5484826B2 (ja) * | 2009-08-26 | 2014-05-07 | ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー | 超音波診断装置 |
WO2011059632A1 (en) * | 2009-10-29 | 2011-05-19 | The Board Of Trustees Of The University Of Illinois | Non-invasive optical imaging for measuring pulse and arterial elasticity in the brain |
2011
- 2011-07-28 CN CN201180044986.7A patent/CN103108593B/zh active Active
- 2011-07-28 WO PCT/JP2011/067190 patent/WO2012039192A1/ja active Application Filing
- 2011-07-28 US US13/820,559 patent/US9107634B2/en active Active
- 2011-07-28 EP EP11826638.6A patent/EP2620102A4/en not_active Withdrawn
- 2011-07-28 JP JP2012534956A patent/JP5882217B2/ja active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000060853A (ja) | 1998-08-20 | 2000-02-29 | Hitachi Medical Corp | 超音波診断装置 |
WO2005048847A1 (ja) | 2003-11-21 | 2005-06-02 | Hitachi Medical Corporation | 超音波診断装置 |
JP2008259605A (ja) | 2007-04-11 | 2008-10-30 | Hitachi Medical Corp | 超音波診断装置 |
JP2008284287A (ja) * | 2007-05-21 | 2008-11-27 | Hitachi Medical Corp | 超音波診断装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2620102A4 * |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015519183A (ja) * | 2012-06-13 | 2015-07-09 | セノ メディカル インストルメンツ,インク. | 光音響データのパラメータマップを生成するための方法およびシステム |
CN104883978A (zh) * | 2012-12-06 | 2015-09-02 | 日立阿洛卡医疗株式会社 | 超声波诊断装置以及超声波图像显示方法 |
CN104883978B (zh) * | 2012-12-06 | 2017-03-08 | 株式会社日立制作所 | 超声波诊断装置以及超声波图像显示方法 |
JP2017225819A (ja) * | 2016-06-20 | 2017-12-28 | 東芝メディカルシステムズ株式会社 | 医用画像診断装置及び医用画像処理装置 |
JP7171168B2 (ja) | 2016-06-20 | 2022-11-15 | キヤノンメディカルシステムズ株式会社 | 医用画像診断装置及び医用画像処理装置 |
JP2021132869A (ja) * | 2020-02-27 | 2021-09-13 | カシオ計算機株式会社 | 電子装置、電子装置の制御方法及び電子装置の制御プログラム |
US11800989B2 (en) | 2020-02-27 | 2023-10-31 | Casio Computer Co., Ltd. | Electronic device, control method for the electronic device, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
JP5882217B2 (ja) | 2016-03-09 |
CN103108593A (zh) | 2013-05-15 |
US20130158400A1 (en) | 2013-06-20 |
EP2620102A1 (en) | 2013-07-31 |
JPWO2012039192A1 (ja) | 2014-02-03 |
US9107634B2 (en) | 2015-08-18 |
EP2620102A4 (en) | 2016-12-07 |
CN103108593B (zh) | 2015-11-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5882217B2 (ja) | 超音波診断装置および超音波画像の表示方法 | |
JP5479353B2 (ja) | 超音波診断装置 | |
JP5770189B2 (ja) | 超音波診断装置 | |
JP5264097B2 (ja) | 超音波診断装置 | |
JP4657106B2 (ja) | 超音波診断装置 | |
JP5730196B2 (ja) | 超音波診断装置、超音波画像処理装置、超音波画像生成方法 | |
JP5774498B2 (ja) | 超音波診断装置 | |
US20060052702A1 (en) | Ultrasound diagnostics device | |
US8941646B2 (en) | Ultrasonic diagnostic apparatus and ultrasonic image display method | |
JPWO2005122906A1 (ja) | 超音波診断装置 | |
CN102711625B (zh) | 超声波诊断装置以及超声波图像显示方法 | |
JP5815541B2 (ja) | 超音波診断装置、超音波画像表示方法、および、プログラム | |
WO2010024023A1 (ja) | 超音波診断装置及び超音波画像表示方法 | |
JP5882218B2 (ja) | 超音波診断装置および超音波画像の表示方法 | |
JP2010012311A (ja) | 超音波診断装置 | |
JP2012213545A (ja) | 超音波診断装置 | |
JP5653045B2 (ja) | 超音波診断装置 | |
JP5653146B2 (ja) | 超音波診断装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180044986.7 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11826638 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012534956 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13820559 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011826638 Country of ref document: EP |