WO2014050601A1 - Ultrasonic diagnostic apparatus and ultrasonic three-dimensional image creation method - Google Patents
Ultrasonic diagnostic apparatus and ultrasonic three-dimensional image creation method
- Publication number
- WO2014050601A1 (PCT/JP2013/074740)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light source
- illuminance
- data
- volume data
- unit
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0866—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving foetal diagnosis; pre-natal or peri-natal diagnosis of the baby
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/466—Displaying means of special interest adapted to display 3D data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
- G06T15/506—Illumination models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/41—Medical
Definitions
- the present invention relates to an ultrasonic diagnostic apparatus, and more particularly to an ultrasonic diagnostic apparatus that generates a three-dimensional projection image from ultrasonic luminance volume data.
- the ultrasonic diagnostic apparatus transmits ultrasonic waves into the subject using an ultrasonic probe, receives ultrasonic reflection echo signals corresponding to the structure of the living tissue from inside the subject, and constructs and displays a tomographic image, such as a B-mode image, for diagnosis.
- three-dimensional data obtained by scanning the probe in the short-axis direction, automatically or manually, is subjected to coordinate transformation, and the ultrasound image data is then projected in the line-of-sight direction; the technique of observing the surface of an object by reconstructing such a three-dimensional image has become common.
- real-time 3D (4D) imaging, which performs this signal processing in real time and displays a moving three-dimensional image, is also common.
- the present invention has been made to solve these conventional problems. An object of the present invention is to provide an ultrasonic diagnostic apparatus that, by expressing the behavior of light in tissue (transmission, absorption, scattering, reflection, and the like), reproduces the local shadows that occur behind tissue or in folds of the skin, and creates a three-dimensional image expressing the shading effects of light transmission and absorption.
- An ultrasonic diagnostic apparatus of the present invention is an ultrasonic diagnostic apparatus that displays a three-dimensional image of an object based on luminance volume data, and comprises: a light source information setting unit that sets light source data representing the characteristics of a light source placed in a three-dimensional space; an optical characteristic setting unit that sets a weighting coefficient representing the optical characteristics of the luminance volume data with respect to the light source; an illuminance calculation unit that calculates the illuminance at positions corresponding to the coordinates of the luminance volume data based on the light source data and the weighting coefficient, and creates illuminance volume data from the calculated illuminance; and a projection processing unit that creates the three-dimensional image from the luminance volume data and the illuminance volume data.
- the present invention calculates the illuminance at positions corresponding to the coordinates of the luminance volume data based on the light source data and the weighting coefficient, and creates illuminance volume data from the calculated illuminance, thereby making it possible to create a three-dimensional image that expresses the shading effects of light transmission and absorption.
- the drawings include a conceptual diagram of the illuminance calculation according to a modification of the present embodiment and a block diagram showing an example of the structure of the illuminance correction unit.
- FIG. 1 is a block diagram showing an example of an ultrasonic diagnostic apparatus according to the present embodiment.
- an ultrasonic diagnostic apparatus 0001 includes a control unit 0003, an operation unit 0004, a transmission unit 0005, a reception unit 0006, a transmission / reception control unit 0007, a phasing addition unit 0008, a display unit 0009, and a tomographic information calculation unit 0011.
- An ultrasonic probe 0002 is connected to the ultrasonic diagnostic apparatus 0001.
- the ultrasonic probe 0002 is used in contact with the subject 0010.
- the ultrasound probe 0002 is formed by arranging a plurality of transducers, and has a function of transmitting and receiving ultrasound to and from the subject 0010 via the transducers.
- the ultrasonic probe 0002 is composed of a plurality of rectangular or fan-shaped transducers; by mechanically oscillating or manually moving the transducers in a direction orthogonal to their arrangement direction, ultrasonic waves can be transmitted and received three-dimensionally.
- the ultrasonic probe 0002 may be one in which a plurality of transducers are arranged two-dimensionally and ultrasonic transmission / reception can be controlled electronically.
- the control unit 0003 controls each component of the ultrasonic diagnostic apparatus 0001 and the ultrasonic probe 0002.
- the operation unit 0004 makes various inputs to the control unit 0003.
- the operation unit 0004 includes a keyboard, a trackball, and the like.
- the transmitting unit 0005 repeatedly transmits ultrasonic waves to the subject 0010 via the ultrasonic probe 0002 at a predetermined time interval.
- the transmission unit 0005 generates a transmission pulse for driving the transducer of the ultrasonic probe 0002 to generate an ultrasonic wave.
- the transmission unit 0005 has a function of setting a convergence point of transmitted ultrasonic waves to a certain depth.
- the receiving unit 0006 receives a reflected echo signal reflected from the subject 0010.
- the reception unit 0006 amplifies the reflected echo signal received by the ultrasonic probe 0002 with a predetermined gain to generate an RF signal, that is, a reception signal.
- the transmission / reception control unit 0007 controls the transmission unit 0005 and the reception unit 0006.
- the phasing addition unit 0008 performs phasing addition of the reflected echo received by the reception unit 0006.
- the phasing addition unit 0008 controls the phase of the RF signal amplified by the receiving unit 0006 and forms an ultrasonic beam at one or more convergence points to generate RF signal frame data (equivalent to RAW data).
- the tomographic information calculation unit 0011 configures a tomographic image based on the RF signal frame data generated by the phasing addition unit 0008.
- the three-dimensional data storage unit 0012 stores a plurality of tomographic images formed by the tomographic information calculation unit 0011.
- the arbitrary cross-sectional image creation unit 0013 creates an arbitrary cross-sectional image based on the acquired shape of the tomographic image.
- the three-dimensional coordinate conversion unit 0014 performs three-dimensional coordinate conversion based on the acquired shape of the tomographic image, generates luminance volume data, and stores it in the volume data storage unit 0015.
- the three-dimensional image processing unit 0016 creates illuminance volume data using the luminance volume data stored in the volume data storage unit 0015.
- the gradient calculation unit 0019 creates gradient volume data using the luminance volume data stored in the volume data storage unit 0015.
- the projection processing unit 0018 performs a rendering process using the illuminance volume data, the luminance volume data, and the gradient volume data, and generates a three-dimensional image. Further, the projection processing unit 0018 may create a three-dimensional image from the luminance volume data and the illuminance volume data.
- the image composition unit 0017 synthesizes the three-dimensional image generated by the projection processing unit 0018 and the arbitrary cross-sectional image created by the arbitrary cross-sectional image creation unit 0013.
- the display unit 0009 displays the display image created by the image composition unit 0017.
- the ultrasonic probe 0002 can measure along two axes, θ and φ, for example, by switching the transmission/reception direction two-dimensionally while transmitting and receiving ultrasonic waves.
- the tomographic information calculation unit 0011 receives the RF signal frame data output from the phasing addition unit 0008 and, based on the setting conditions in the control unit 0003, performs gain correction, log compression, detection, contour enhancement, smoothing, and the like to construct two-dimensional tomographic data.
- the three-dimensional data storage unit 0012 has a function of storing a plurality of two-dimensional tomographic data, the output of the tomographic information calculation unit 0011, based on the transmission/reception direction corresponding to the acquisition position. For example, two-dimensional tomographic images, each created from time-series ultrasound data sampled in the depth direction while transmitting and receiving in the θ direction, are acquired repeatedly while the probe is driven in the φ direction orthogonal to the θ direction; the plurality of two-dimensional tomographic data thus acquired and associated with φ is stored as three-dimensional tomographic data.
- the three-dimensional coordinate conversion unit 0014 performs three-dimensional coordinate conversion of the three-dimensional tomographic data stored in the three-dimensional data storage unit 0012 into spatial coordinates based on the acquisition position (depth, θ, φ), generates luminance volume data, and stores it in the volume data storage unit 0015.
- the arbitrary cross-sectional image creation unit 0013 uses the three-dimensional tomographic data stored in the three-dimensional data storage unit 0012 to create, based on the acquisition position (depth, θ, φ), an arbitrary cross-sectional image in an arbitrary plane in three-dimensional space set by the control unit 0003 and the operation unit 0004.
- the three-dimensional image processing unit 0016 creates illuminance volume data based on the luminance volume data stored in the volume data storage unit 0015. Based on the luminance volume data stored in the volume data storage unit 0015, the gradient calculation unit 0019 creates volume data in which the gradient in the line-of-sight direction at each voxel coordinate is calculated.
- the three-dimensional image processing unit 0016 is a processing unit characteristic of the ultrasonic diagnostic apparatus 0001 according to the present embodiment; it creates illuminance volume data from the luminance volume data stored in the volume data storage unit 0015, based on the light source in the three-dimensional space set by the control unit 0003 and the operation unit 0004.
- FIG. 2 is a block diagram illustrating an example of the three-dimensional image processing unit 0016.
- the three-dimensional image processing unit 0016 includes a light source information setting unit 0021, an optical characteristic setting unit 0022, and an illuminance calculation unit 0023.
- the ultrasonic diagnostic apparatus 0001 according to the present embodiment displays a three-dimensional image of an object based on luminance volume data, and includes: the light source information setting unit 0021, which sets light source data representing the characteristics of a light source placed in a three-dimensional space; the optical characteristic setting unit 0022, which sets a weighting coefficient representing the optical characteristics of the luminance volume data with respect to the light source; the illuminance calculation unit 0023, which calculates the illuminance at positions corresponding to the coordinates of the luminance volume data and creates illuminance volume data from the calculated illuminance; and the projection processing unit 0018, which creates the three-dimensional image from the luminance volume data and the illuminance volume data.
- the ultrasonic three-dimensional image creation method is a method for displaying a three-dimensional image of an object based on luminance volume data, in which:
- light source data representing the characteristics of a light source set in a three-dimensional space is set,
- a weighting coefficient representing the optical characteristics of the luminance volume data with respect to the light source is set,
- the illuminance at positions corresponding to the coordinates of the luminance volume data is calculated,
- illuminance volume data is created based on the calculated illuminance, and
- the three-dimensional image is created from the luminance volume data and the illuminance volume data.
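As a rough, hedged illustration of these steps, the method can be sketched end to end in Python; the concrete choices below (a uniform initial light, a mean-based scattering stand-in, and a plain sum along the ray) are assumptions for readability, not the patent's implementation:

```python
import numpy as np

def create_3d_image(luminance_volume):
    """Toy end-to-end sketch of the claimed method: (1) set light source
    data, (2) set weighting coefficients, (3) compute per-voxel illuminance,
    (4) assemble the illuminance volume, (5) create the image from the
    luminance and illuminance volumes. All concrete values are illustrative."""
    light = np.ones(luminance_volume.shape[1:])        # (1) light source data
    a, b = 0.8, 0.2                                    # (2) weighting coefficients
    illum = np.empty_like(luminance_volume, dtype=float)
    for k in range(luminance_volume.shape[0]):         # (3), (4) illuminance volume
        light = a * light + b * light.mean()           # crude scattering stand-in
        illum[k] = light
    return (luminance_volume * illum).sum(axis=0)      # (5) projection
```

The point of the sketch is the data flow: the illuminance volume is a second volume, computed once per light configuration, that the projection step consumes alongside the luminance volume.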
- the light source information setting unit 0021 sets (generates) light source data representing the characteristics of the light source placed in the three-dimensional space of the three-dimensional image. For example, the light source information setting unit 0021 sets light source data representing the intensity of the light source. The light source information setting unit 0021 can also set light source data by adjusting at least one of the intensity, the position in the three-dimensional space, the direction, the color tone, and the shape of the light source.
- the optical characteristic setting unit 0022 sets the optical characteristic of the luminance volume data set by the control unit 0003 and the operation unit 0004.
- the optical characteristic setting unit 0022 sets a weighting coefficient representing the optical characteristic of the luminance volume data for the light source.
- the illuminance calculation unit 0023 calculates the illuminance over the luminance volume data based on the light source data set by the light source information setting unit 0021 and the optical characteristics set by the optical characteristic setting unit 0022, and creates the illuminance volume data. That is, the illuminance calculation unit 0023 calculates the illuminance at positions corresponding to the coordinates of the luminance volume data based on the light source data and the weighting coefficient, and creates illuminance volume data from the calculated illuminance.
- the light source information set by the light source information setting unit 0021, the optical characteristics set by the optical characteristic setting unit 0022, and the method of creating illuminance volume data in the illuminance calculation unit 0023 will now be described.
- FIG. 3 is a conceptual diagram schematically showing the positional relationship between the luminance volume data and the light source.
- the light source (parallel light source) 0302 is set in the light source direction 0303 with respect to the luminance volume data 0301 in the volume data storage unit 0015 by the control unit 0003 and the operation unit 0004.
- the position of the light source 0302 in the three-dimensional space, the light source direction 0303, and the light source data are generated by the light source information setting unit 0021.
- a plane 0304 is the position where a plane orthogonal to the light source direction 0303 first intersects (touches) the luminance volume data 0301, and indicates the illuminance calculation start position.
- a plane 0305 is the position where a plane orthogonal to the light source direction 0303 last intersects (touches) the luminance volume data 0301, and indicates the illuminance calculation end position.
- the illuminance calculation unit 0023 performs the illuminance calculation on planes orthogonal to the light source direction 0303. In FIG. 3, the illuminance calculation unit 0023 performs the illuminance calculation over the range from plane 0304 to plane 0305; for example, in the illuminance calculation for the sample 0306 located along the light source direction 0303, the calculation is performed on plane 0307.
- the illuminance calculation unit 0023 includes an illuminance volume data storage unit 0401, a light source data holding unit 0402, a two-dimensional convolution processing unit 0403, and a weighted addition unit 0404.
- the illuminance calculation unit 0023 includes the two-dimensional convolution processing unit 0403, which generates two-dimensional convolution data by performing a two-dimensional convolution on the light source data, and the weighted addition unit 0404, which creates the illuminance volume data by performing weighted addition of the light source data and the two-dimensional convolution data based on the weighting coefficient.
- the illuminance calculation unit 0023 also includes the light source data holding unit 0402, which holds the initial value of the light source data, and thereafter the result of each weighted addition by the weighted addition unit 0404, as input light source data; while stepping through the voxel luminances from the illuminance calculation start position to the illuminance calculation end position in the luminance volume data, it generates two-dimensional convolution data from the input light source data and creates the illuminance volume data by performing weighted addition of the input light source data and the two-dimensional convolution data based on the weighting coefficient.
- the light source data holding unit 0402 receives the light source data generated by the light source information setting unit 0021 and holds it as an initial value.
- the light source data held by the light source data holding unit 0402 is hereinafter referred to as “input light source data”.
- the two-dimensional convolution processing unit 0403 generates two-dimensional convolution data by performing a two-dimensional convolution on the input light source data.
- the two-dimensional convolution is a convolution on a two-dimensional plane, implemented as a convolution of the input light source data with a convolution kernel representing the characteristics of scattering.
- the convolution kernel is a two-dimensional matrix and is set by the control unit 0003.
- the weighted addition unit 0404 receives the two-dimensional convolution data output by the two-dimensional convolution processing unit 0403 and the input light source data held by the light source data holding unit 0402.
- the weighted addition unit 0404 creates illuminance volume data by performing weighted addition of the input light source data and the two-dimensional convolution data based on the weighting coefficient.
- the weighting factor used by the weighted addition unit 0404 is set by the optical property setting unit 0022 as the optical property of the luminance volume data for the light source.
- the weighted addition result created by the weighted addition unit 0404 is hereinafter referred to as “output illuminance data”.
- the output illuminance data is stored at a position corresponding to the voxel coordinates of the illuminance volume data storage unit 0401.
- the output illuminance data is input to the light source data holding unit 0402 and stored (held) as input light source data. That is, the light source data holding unit 0402 holds the initial value of the light source data and the result of weighted addition by the weighted addition unit 0404 as input light source data.
- the initial input light source data is the light source data set in the light source information setting unit 0021, and is input and set (held) in the light source data holding unit 0402 before the illuminance calculation unit 0023 starts the illuminance calculation.
- the illuminance calculation unit 0023 steps through the voxel luminances from the illuminance calculation start position (plane 0304) to the illuminance calculation end position (plane 0305) in the luminance volume data, generating two-dimensional convolution data and performing weighted addition of the input light source data and the two-dimensional convolution data based on the weighting coefficient at each step, thereby creating the illuminance volume data.
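As a concrete illustration, the slice-marching loop described above might look like the following sketch. The volume is assumed to be already resampled so that axis 0 points along the light source direction 0303, the kernel and coefficient values are illustrative, and `weights(slice)` stands in for the two-dimensional weighting-coefficient table; the function names are hypothetical, not the patent's:

```python
import numpy as np

def conv2d_same(img, kernel):
    """Two-dimensional convolution with edge padding (for the symmetric
    scattering kernels used here, correlation and convolution coincide)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros_like(img, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * padded[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def propagate_illuminance(luminance, kernel, weights, initial_light):
    """Marches from the illuminance calculation start plane to the end plane:
    at each slice the input light source data is convolved with a scattering
    kernel (unit 0403), weighted-added with the unscattered light using the
    coefficients (a, b) looked up from the slice luminance (unit 0404), then
    stored as illuminance and fed back as the next input light source data
    (unit 0402)."""
    illuminance = np.empty_like(luminance, dtype=float)
    light = initial_light.astype(float)          # initial input light source data
    for k in range(luminance.shape[0]):
        scattered = conv2d_same(light, kernel)   # two-dimensional convolution
        a, b = weights(luminance[k])             # weighting-coefficient lookup
        light = a * light + b * scattered        # weighted addition
        illuminance[k] = light                   # store at the voxel coordinates
    return illuminance
```

The feedback of each slice's output into the next slice's input is what lets light attenuate and spread progressively through the volume, as the text describes.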
- the two-dimensional weighting coefficient table 0501 holds the weighting coefficients set by the control unit 0003.
- the two-dimensional weighting coefficient table 0501 is a two-dimensional table in which weighting coefficients are stored and referred to using two indices: the luminance of the luminance volume data and the distance from the body surface (or tissue surface). That is, the weighting coefficient is defined in the two-dimensional weighting coefficient table 0501 with the luminance of the luminance volume data and the distance from the surface of the object as indices.
- the optical characteristic setting unit 0022 sets the weighting coefficient according to the luminance of the luminance volume data and the distance from the surface of the object.
- the optical characteristics in this embodiment are defined by weighting coefficients set so as to reproduce the behavior of light based on the optical characteristics of the tissue, and are set by the optical characteristic setting unit 0022.
- the optical characteristic setting unit 0022 sets a two-dimensional weighting coefficient table 0501 including weighting coefficients as the optical characteristics of the luminance volume data.
- a case will be described in which two weighting coefficients, a and b, are referred to from the two-dimensional weighting coefficient table 0501 based on the two indices of luminance and distance from the body surface (or tissue surface).
- the weighting coefficient applied to the input light source data is a, and the weighting coefficient applied to the two-dimensional convolution data is b.
- by adjusting the magnitudes of a and b, the behavior of light (such as the degree of scattering) can be set easily.
- the weighted sum of the input light source data and the two-dimensional convolution data, using the weighting coefficients a and b, is output to the illuminance volume data storage unit 0401.
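A two-dimensional table lookup of this kind might be sketched as follows; the bin counts, bin edges, and coefficient values are assumptions for illustration, not values from the patent:

```python
import numpy as np

LUM_BINS, DIST_BINS = 8, 8
# Illustrative table of coefficient a (applied to the unscattered input light):
# rows indexed by a luminance bin, columns by a distance-from-surface bin.
# b (applied to the scattered light) takes up the remainder here.
table_a = np.linspace(0.9, 0.3, LUM_BINS)[:, None] * np.linspace(1.0, 0.5, DIST_BINS)[None, :]
table_b = 1.0 - table_a

def lookup_weights(luminance, distance, lum_max=255.0, dist_max=64.0):
    """Index the 2D weighting-coefficient table by luminance and by distance
    from the body surface (or tissue surface); returns per-voxel (a, b)."""
    li = np.clip((luminance / lum_max * LUM_BINS).astype(int), 0, LUM_BINS - 1)
    di = np.clip((distance / dist_max * DIST_BINS).astype(int), 0, DIST_BINS - 1)
    return table_a[li, di], table_b[li, di]
```

Because distance is a second index, a superficial voxel and a deep voxel of the same luminance land in different table cells and so receive different optical behavior, which is the effect the text attributes to the table.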
- the two-dimensional weighting coefficient table 0501 includes the luminance and the distance from the body surface (or tissue surface) as two reference indices.
- the luminance that reflects the acoustic impedance of the tissue can be useful information that reflects the characteristics of the biological tissue.
- the luminance in the ultrasonic data reflects the amplitude of the reflected wave obtained by reflecting the emitted ultrasonic wave from the scatterer, and is usually attenuated as the ultrasonic wave propagates to the deep part. Therefore, it is difficult to classify tissues only with luminance in ultrasonic data. Therefore, by adding the distance from the body surface (or tissue surface) of the object as an index, it is possible to classify the tissue in the ultrasonic data.
- for example, the intensity of ultrasound reflected from the bone (hard tissue) of a fetal arm should be high.
- however, the luminance at the moment the ultrasound reaches the surface of the arm can be as high as that of the bone, because the arm surface is soft tissue and little attenuation has yet occurred.
- therefore, the distance from the body surface of the object is added as an index. Since bone lies inside the fetal tissue, the tissue can be discriminated by setting the tissue characteristics using both the distance from the body surface (or tissue surface) and the luminance.
- regarding the distance from the body surface (or tissue surface): when the luminance of a voxel is higher than a preset threshold, the voxel is judged to lie within tissue, and a distance corresponding to one voxel is added to the distance value.
- when the luminance of a voxel is lower than the preset threshold, it is judged not to lie within tissue, and the distance value from the body surface (or tissue surface) at that voxel is initialized.
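The threshold-based distance accumulation just described might be sketched as follows; treating axis 0 as the traversal (depth) direction is an assumption:

```python
import numpy as np

def distance_from_surface(luminance, threshold):
    """Per-ray distance from the tissue surface, as described above: walking
    along each ray, the distance grows by one voxel while the luminance stays
    above the threshold, and is initialized to zero wherever it falls below."""
    dist = np.zeros(luminance.shape, dtype=int)
    running = np.zeros(luminance.shape[1:], dtype=int)
    for k in range(luminance.shape[0]):
        inside = luminance[k] > threshold
        running = np.where(inside, running + 1, 0)   # accumulate or initialize
        dist[k] = running
    return dist
```

A voxel's value in `dist` is then usable directly as the second index of the weighting-coefficient table.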
- by using the distance from the body surface (or tissue surface) as an index of the weighting coefficient, when high-luminance soft tissue exists at the tissue surface, such as on a fetal arm, and bone of the same luminance exists deeper within the tissue, different weighting coefficients can be set according to the distance from the body surface (or tissue surface) even for the same luminance, so that different optical effects are given to different tissues.
- in this way, a two-dimensional weighting coefficient table reflecting the characteristics of the tissue is set without performing complex calculations, and the behavior of light (such as the degree of scattering) is adjusted based on the table.
- an optical effect within the tissue can thus be imparted easily and arbitrarily, and a three-dimensional image with improved realism can be created according to the characteristics of the tissue (for example, its hardness).
- the illuminance calculation unit 0023 repeatedly performs the above illuminance calculation process while switching the voxel luminance referred to by the weighted addition unit 0404 from the illuminance calculation start position (plane 0304) to the illuminance calculation end position (plane 0305).
- the illuminance calculation unit 0023 performs the calculation up to the illuminance calculation end position, then creates illuminance volume data in which the illuminance arranged on the luminance volume data is calculated, and stores it in the illuminance volume data storage unit 0401.
- the behavior of light varies with the wavelength of the light source, according to the laws of nature. Therefore, to further improve realism in accordance with the laws of nature, the illuminance calculation is performed for each wavelength of the light source; in this case, the weighting coefficient differs for each wavelength.
- the light source information setting unit 0021 sets light source data corresponding to a plurality of wavelengths of the light source.
- the optical property setting unit 0022 sets a weighting factor for each of a plurality of wavelengths.
- the illuminance calculation unit 0023 performs the illuminance calculation for each of a plurality of wavelengths of the light source 0302 to create illuminance volume data. For example, when the light source 0302 has the seven colors of visible light, the illuminance calculation unit 0023 sets seven types of weighting coefficients (or two-dimensional weighting coefficient tables) and generates seven types of illuminance volume data. When the light source 0302 uses the three additive primary colors, the illuminance calculation unit 0023 sets three types of weighting coefficients (or two-dimensional weighting coefficient tables) corresponding to the wavelengths of the R, G, and B elements and generates three types of illuminance volume data.
- in this way, the light source information setting unit 0021 sets light source data for a plurality of wavelengths of the light source, the optical characteristic setting unit 0022 sets a weighting coefficient for each of the plurality of wavelengths, and the illuminance calculation unit 0023 creates illuminance volume data for each of the plurality of wavelengths.
- in the present embodiment, the light source 0302 is an additive mixture of the three primary colors, three types of weighting coefficients (or two-dimensional weighting coefficient tables) are set, and three types of illuminance volume data are generated.
- An initial value of light source data is set for each wavelength of the light source 0302. That is, the light source information setting unit 0021 sets initial values of the same number of light source data as the number of effective wavelengths. Therefore, in this embodiment, three types of light source data corresponding to the wavelengths of the R element, G element, and B element are set, and are held by the light source data holding unit 0402 as independent input light source data.
- the initial values of the three types of light source data may be initial values selected by the operator via the operation unit 0004, or may be initial values set using an image.
- the illuminance calculation unit 0023 calculates the illuminance arranged on the luminance volume data based on the three types of light source data and the three types of optical characteristics (weighting factor or two-dimensional weighting factor table), and calculates the three types of illuminance volume. Create data.
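One way to picture the per-wavelength calculation is to run the same marching update once per channel, each with its own initial light and its own (a, b) pair. The coefficient values and the simple `scatter` stand-in below are illustrative assumptions, not values from the patent:

```python
import numpy as np

# Hypothetical per-wavelength weighting coefficients (a, b), one pair per
# R/G/B element; chosen only to show the per-channel structure.
weights_rgb = {"r": (0.90, 0.10), "g": (0.80, 0.20), "b": (0.70, 0.30)}

def channel_illuminance(initial_light, n_slices, a, b, scatter):
    """March one wavelength channel through n_slices; `scatter(light)` stands
    in for the two-dimensional convolution (e.g. a blur), supplied by the
    caller."""
    light = initial_light.astype(float)
    slices = []
    for _ in range(n_slices):
        light = a * light + b * scatter(light)
        slices.append(light.copy())
    return np.stack(slices)

# Three independent illuminance volumes, one per wavelength:
init = np.ones((4, 4))
vols = {ch: channel_illuminance(init, 5, a, b, scatter=lambda x: 0.5 * x)
        for ch, (a, b) in weights_rgb.items()}
```

Because each channel attenuates at its own rate, the three resulting volumes diverge with depth, which is what produces wavelength-dependent coloring in the final image.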
- the projection processing unit 0018 creates a three-dimensional image based on the illuminance of the illuminance volume data and the opacity referenced by the luminance of the luminance volume data.
- the projection processing unit 0018 creates a three-dimensional image from the three types of illuminance volume data created by the illuminance calculation unit 0023 and the luminance volume data stored in the volume data storage unit 0015.
- for each wavelength (R element, G element, B element), the projection processing unit 0018 generates the three-dimensional image based on the illuminance (voxel value) of the illuminance volume data L_r[k], L_g[k], L_b[k], the luminance (voxel value) C of the luminance volume data, the opacity table α referenced by the luminance C, and the gradient volume data S[k]. That is, for each sample, the opacity term obtained from the opacity table α referenced by the luminance C of the luminance volume data, the illuminance value in L_r[k], L_g[k], or L_b[k] for each wavelength, and the value of the gradient volume data S[k] are multiplied, and the products are accumulated in the line-of-sight direction as in equations (1) to (3). Here k represents the voxel coordinate in the line-of-sight direction.
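The sum-product accumulation described above (OUT[K] = Σ_k (L[k]·S[k])·α[C[k]]·Π_{m=k+1..K}(1-α[C[m]])) can be evaluated per ray with a simple recurrence. The sketch below is an illustration of that form for one wavelength; the concrete opacity table and sample values are invented for the example:

```python
# Sketch of the per-wavelength projection sum of equations (1)-(3):
#   OUT[K] = sum_k (L[k]*S[k]) * alpha[C[k]] * prod_{m=k+1..K} (1 - alpha[C[m]])
# evaluated with the equivalent recurrence
#   out <- out*(1 - alpha[C[k]]) + L[k]*S[k]*alpha[C[k]],  k = 0..K,
# where k is the voxel coordinate along the line of sight.
def project(L, S, C, alpha):
    out = 0.0
    for k in range(len(C)):
        a = alpha[C[k]]
        out = out * (1.0 - a) + L[k] * S[k] * a
    return out

# Illustrative two-sample ray (alpha indexed by luminance value).
alpha = {0: 0.0, 1: 0.5, 2: 1.0}
L = [0.8, 0.6]   # illuminance values along the ray (one wavelength)
S = [1.0, 1.0]   # gradient term; set to 1.0 when the gradient unit is omitted
C = [1, 1]       # luminance voxel values
out = project(L, S, C, alpha)
```

Expanding the recurrence reproduces the sum-product form directly: the k = 0 contribution 0.8·0.5 is attenuated by (1 − 0.5) from the sample in front of it, and the k = 1 contribution 0.6·0.5 is not attenuated, matching the Π term that runs only over m > k.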
- the line-of-sight direction, that is, the direction in which the ultrasonic image is observed, is set by the operation unit 0004 via the control unit 0003.
- the three-dimensional image created by the three-dimensional image processing unit 0016 is arranged on the same screen as an arbitrary cross-sectional image by the image synthesis unit 0017 and displayed on the display unit 0009.
- the ultrasonic diagnostic apparatus 0001 includes the gradient calculation unit 0019, but the gradient calculation unit 0019 may be omitted.
- in that case, the gradient volume data term S[k] is removed from equations (1) to (3) (or treated as 1.0) and does not contribute to the created three-dimensional image.
- the three-dimensional image 0601 in FIG. 6 is a three-dimensional image constructed by the method of the present embodiment, while the three-dimensional image 0602 is a three-dimensional image constructed by a general volume rendering method typified by the Levoy method.
- the conventional three-dimensional image 0602 has only a faint, thin shadow 0604 along the contour of the fetal face.
- in the present embodiment, the shadow 0603 along the contour of the face is emphasized, sharpening the contour and clarifying the boundary.
- in the conventional image, the inner eye angle of the fetus is represented only by a thin outline 0606, whereas in the present embodiment it is displayed emphasized by a deep shadow 0605 and the boundary is clear. In this way, clarifying boundaries by enhancing shadows yields a natural image with improved realism in the volume rendering method.
- FIG. 7 is a diagram showing a display example in the present embodiment. As shown in FIG. 7, three orthogonal cross sections 0701, 0702, and 0703 and a three-dimensional image 0704 are displayed simultaneously. As described above, the three-dimensional image created by the three-dimensional image processing unit 0016 is arranged on the same screen as the three orthogonal cross sections (or arbitrary cross-sectional images) 0701, 0702, and 0703 by the image synthesis unit 0017 and displayed on the display unit 0009. Inspection accuracy and efficiency can be improved by observing the surface in the three-dimensional image while referring to each cross section.
- FIG. 8 is a block diagram showing a modification of the present embodiment.
- FIG. 9 is a diagram illustrating a conceptual diagram of illuminance calculation according to a modification of the present embodiment.
- the ultrasound diagnostic apparatus 0001 may include an illuminance correction unit 0080, a correction optical characteristic setting unit 0081, and a correction light source information setting unit 0082 after the illuminance calculation unit 0023.
- the ultrasonic diagnostic apparatus 0001 according to the present embodiment sets the direction opposite to the line-of-sight direction in the three-dimensional space as the correction light source direction, and sets correction light source data representing the characteristics of the correction light source that emits light in the correction light source direction.
- the illuminance calculation by the illuminance calculation unit 0023 computes the distribution of light intensity in the direction from the proximal side to the distal side of the light source 0302.
- thus, to the illuminance observed from the observer's viewpoint 0900, the illuminance resulting from light propagating from the distal side toward the proximal side along the observer's line-of-sight direction 0901 can be added.
- the corrected light source information setting unit 0082 sets the corrected light source on the opposite side of the viewpoint 0900 and sets the corrected light source direction 0902 opposite to the line-of-sight direction 0901. That is, the correction light source information setting unit 0082 sets the direction opposite to the line-of-sight direction 0901 in the three-dimensional space as the correction light source direction 0902, and sets correction light source data representing the characteristics of the correction light source that emits light in the correction light source direction 0902.
- the corrected optical characteristic setting unit 0081 sets a weighting factor in the direction opposite to the line-of-sight direction 0901 (corrected light source direction 0902). That is, the correction optical characteristic setting unit 0081 sets a weighting coefficient that represents the optical characteristic of the luminance volume data for the correction light source.
- the illuminance correction unit 0080 performs illuminance correction calculation in order to create corrected illuminance volume data in which the illuminance volume data is corrected from the distal direction to the proximal direction in the line-of-sight direction. That is, the illuminance correction unit 0080 calculates the illuminance at a position corresponding to the coordinates of the luminance volume data based on the corrected light source data and the weighting coefficient, and creates corrected illuminance volume data based on the calculated corrected illuminance.
- the light source 0302 and the light source direction 0303 are set for the luminance volume data 0301 as in FIG.
- the correction light source information setting unit 0082 sets the correction light source on the opposite side of the viewpoint 0900 and sets the correction light source direction 0902 in the direction opposite to the line-of-sight direction.
- a plane 0904 is the plane orthogonal to the corrected light source direction 0902 that first intersects (touches) the luminance volume data 0301, i.e., the plane containing the first voxel in the corrected light source direction 0902; it indicates the illuminance calculation start position.
- a plane 0905 is the plane orthogonal to the corrected light source direction 0902 that last intersects (touches) the luminance volume data 0301, i.e., the plane containing the last voxel in the corrected light source direction 0902; it indicates the illuminance calculation end position.
- the illuminance correction unit 0080 performs illuminance correction on planes orthogonal to the corrected light source direction 0902. As illustrated in FIG. 9, the illuminance correction unit 0080 performs illuminance correction over the range from the plane 0904 to the plane 0905. For example, the illuminance correction calculation for the sample 0906 located along the corrected light source direction 0902 is performed on the plane 0903.
- the illuminance correction unit 0080 includes an addition unit 1001, a corrected illuminance volume data storage unit 1002, an illuminance volume data storage unit 0401, a light source data holding unit 0402, a two-dimensional convolution processing unit 0403, and a weighted addition unit 0404.
- the components denoted by the same reference numerals in FIGS. 4 and 10 have the same functions unless otherwise specified.
- the illuminance correction unit 0080 includes the addition unit 1001, which adds the corrected light source data and the illuminance volume data; the two-dimensional convolution processing unit 0403, which generates two-dimensional convolution integration data by performing two-dimensional convolution integration on the added value of the corrected light source data and the illuminance volume data; and the weighted addition unit 0404, which creates the corrected illuminance volume data by performing weighted addition on the corrected light source data and the two-dimensional convolution integration data based on the weighting factor.
- for the input light source data (corrected light source data) stored in the light source data holding unit 0402, the illuminance correction unit 0080 reads the output illuminance data of the corresponding coordinates from the illuminance volume data storage unit 0401 and adds the input light source data to the output illuminance data. That is, the addition unit 1001 adds the corrected light source data and the illuminance volume data.
- the light source data holding unit 0402 of the illuminance correction unit 0080 differs from the light source data holding unit 0402 of the illuminance calculation unit 0023 in that it does not have an initial value.
- the addition unit 1001 adds the input light source data stored in the light source data holding unit 0402 and the output illuminance data of the corresponding coordinates read from the illuminance volume data storage unit 0401, and updates and holds the result as the input light source data.
- the two-dimensional convolution processing unit 0403 performs two-dimensional convolution integration on the input light source data held via the addition unit 1001. That is, the two-dimensional convolution processing unit 0403 generates two-dimensional convolution integration data by performing two-dimensional convolution integration on the added value of the corrected light source data and the illuminance volume data stored in the illuminance volume data storage unit 0401.
- the weighted addition unit 0404 receives the two-dimensional convolution integration data that is the output result of the two-dimensional convolution processing unit 0403, and inputs the updated input light source data held by the addition unit 1001.
- the weighted addition unit 0404 performs weighted addition on the output result of the two-dimensional convolution processing unit 0403 and the updated input light source data held in the addition unit 1001. That is, the weighted addition unit 0404 creates corrected illuminance volume data by performing weighted addition on the corrected light source data and the two-dimensional convolution integration data based on the weighting coefficient.
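The chain of the addition unit 1001, the two-dimensional convolution processing unit 0403, and the weighted addition unit 0404 can be sketched for a single slice as follows. The 3×3 blur kernel, the scalar mixing weight, and the helper `convolve2d` are illustrative assumptions, not details taken from the specification:

```python
# Sketch of one slice of the illuminance correction pipeline:
#   addition unit 1001:        held      = corrected light + illuminance slice
#   2D convolution unit 0403:  conv      = 2D convolution of held
#   weighted addition 0404:    corrected = w*held + (1-w)*conv
def convolve2d(img, kernel):
    # minimal same-size 2D convolution with zero padding (illustrative helper)
    h, w = len(img), len(img[0])
    kh, kw = len(kernel), len(kernel[0])
    oy, ox = kh // 2, kw // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            out[y][x] = sum(img[y + j - oy][x + i - ox] * kernel[j][i]
                            for j in range(kh) for i in range(kw)
                            if 0 <= y + j - oy < h and 0 <= x + i - ox < w)
    return out

def correct_slice(light_slice, illum_slice, kernel, weight):
    held = [[l + i for l, i in zip(lr, ir)]
            for lr, ir in zip(light_slice, illum_slice)]      # addition unit 1001
    conv = convolve2d(held, kernel)                           # unit 0403
    corrected = [[weight * hv + (1.0 - weight) * cv
                  for hv, cv in zip(hr, cr)]
                 for hr, cr in zip(held, conv)]               # unit 0404
    return corrected  # stored as corrected illuminance; also fed back as input light

blur = [[1 / 9.0] * 3 for _ in range(3)]   # illustrative scattering kernel
light = [[0.0] * 3 for _ in range(3)]      # corrected light source data (empty here)
illum = [[0.0] * 3 for _ in range(3)]
illum[1][1] = 9.0                          # one bright illuminance sample
out = correct_slice(light, illum, blur, 0.5)
```

The weighted mix keeps half of the direct term and spreads the other half to neighboring positions, which is the behavioral role the convolution unit plays in the correction pass.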
- the weighting coefficient used by the weighted addition unit 0404 is set by the correction optical characteristic setting unit 0081.
- the weighting factor may be referred to from a two-dimensional table indexed by the luminance of the luminance volume data and the distance from the body surface (or tissue surface).
- the corrected illuminance volume data storage unit 1002 stores the result of the weighted addition unit 0404 together with position information corresponding to the voxel coordinates.
- the result of the weighted addition unit 0404 is input to the light source data holding unit 0402 and stored (held) as input light source data.
- in this way, the illuminance observed from the observer's viewpoint 0900 takes into account the illuminance resulting from the propagation of light from the distal side to the proximal side along the observer's line-of-sight direction 0901.
- corrected illuminance volume data in which the illuminance from two directions of the light source direction 0303 and the corrected light source direction 0902 is calculated can be created.
- the corrected illuminance volume data storage unit 1002 and the illuminance volume data storage unit 0401 are configured independently, but a configuration using a common memory area is also possible.
- as in the above-described embodiment, the illuminance calculation (or illuminance correction calculation) may be performed for each wavelength of the light source when realism is to be further improved.
- illuminance calculation (or illuminance correction calculation) is repeatedly performed for each set wavelength, and corrected illuminance volume data for each set wavelength is created.
- in that case, the illuminance correction unit 0080 sets three types of weighting factors (or two-dimensional weighting factor tables) corresponding to the wavelengths of the R, G, and B elements, and three types of corrected illuminance volume data are generated.
- the projection processing unit 0018 creates a three-dimensional image from the three types of corrected illuminance volume data created by the illuminance correction unit 0080 and the luminance volume data stored in the volume data storage unit 0015. That is, the projection processing unit 0018 creates a three-dimensional image from the luminance volume data and the corrected illuminance volume data.
- the two-dimensional convolution processing unit 0403 has a characteristic structure and is therefore described mainly below.
- the two-dimensional convolution processing unit 0403 reads the input light source data from the light source data holding unit 0402 in the illuminance calculation unit 0023, performs two-dimensional convolution integration processing, and outputs the two-dimensional convolution integration data to the weighted addition unit 0404.
- the two-dimensional convolution processing unit 0403 generates two or more two-dimensional convolution integration data. That is, the two-dimensional convolution processing unit 0403 generates a plurality of two-dimensional convolution integration data by performing a plurality of two-dimensional convolution integrations on the light source data.
- the weighted addition unit 0404 performs weighted addition processing on the input light source data read from the light source data holding unit 0402 and the plurality of two-dimensional convolution integration data generated by the two-dimensional convolution processing unit 0403, and outputs illuminance. Data is created and stored in the corresponding voxel coordinates of the illuminance volume data storage unit 0401.
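With N convolution units, the weighted addition generalizes to a weighted sum over the unconvolved input light and N differently convolved copies. The two kernels and the weights below are illustrative stand-ins (for example, narrow versus wide scattering); the helper `convolve2d` is likewise an assumption for the sketch:

```python
# Sketch: N two-dimensional convolution units (0403-1 .. 0403-N), each with its
# own kernel, combined by the weighted addition unit 0404:
#   output = w0*light + sum_i w_i * conv_i(light)
def convolve2d(img, kernel):
    # minimal same-size 2D convolution with zero padding (illustrative helper)
    h, w = len(img), len(img[0])
    kh, kw = len(kernel), len(kernel[0])
    oy, ox = kh // 2, kw // 2
    return [[sum(img[y + j - oy][x + i - ox] * kernel[j][i]
                 for j in range(kh) for i in range(kw)
                 if 0 <= y + j - oy < h and 0 <= x + i - ox < w)
             for x in range(w)] for y in range(h)]

def weighted_combination(light, kernels, weights):
    # weights[0] applies to the unconvolved input light source data
    out = [[weights[0] * v for v in row] for row in light]
    for kernel, wgt in zip(kernels, weights[1:]):
        conv = convolve2d(light, kernel)
        for y in range(len(out)):
            for x in range(len(out[0])):
                out[y][x] += wgt * conv[y][x]
    return out

narrow = [[0.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 0.0]]  # no spread
wide = [[1 / 9.0] * 3 for _ in range(3)]                      # strong spread
light = [[0.0] * 3 for _ in range(3)]
light[1][1] = 9.0
out = weighted_combination(light, [narrow, wide], [0.25, 0.25, 0.5])
```

Choosing different kernels per unit is what lets the apparatus express several shading effects at once, since each kernel models a different spread of light.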
- the configuration of the two-dimensional convolution processing unit 0403 shown in FIG. 11 will be described.
- the two-dimensional convolution processing unit 0403 includes two or more two-dimensional convolution processing units.
- the two-dimensional convolution processing units 0403-1 to 0403-N output different two-dimensional convolution integration data for the input light source data, and output the data to the weighted addition unit 0404, respectively.
- the weighted addition unit 0404 holds weighting coefficients for the input light source data and for the plurality of two-dimensional convolution integration data created by the two-dimensional convolution processing units 0403-1 to 0403-N.
- a different weighting factor for each output result of the two-dimensional convolution processing unit 0403 (0403-1 to 0403-N) may be referred to from the two-dimensional table and used in the weighted addition unit 0404.
- since the ultrasound diagnostic apparatus 0001 includes the plurality of two-dimensional convolution processing units 0403-1 to 0403-N, a plurality of shading effects corresponding to the behavior of light can be expressed.
- a three-dimensional image in which the illuminance is calculated based on more natural light behavior (for example, scattering) can be created.
- the display unit 0009 may display a color map indicating the hue obtained from the luminance and the distance from the body surface. That is, the display unit 0009 displays a color map that corresponds to the two-dimensional weighting coefficient table defining the weighting coefficient and is indexed by the luminance of the luminance volume data and the distance from the surface of the object.
- FIG. 12 is a diagram showing a display example with a color map in the present embodiment. As shown in FIG. 12 and as in FIG. 7, three mutually orthogonal cross sections 0701 to 0703 and a three-dimensional image 0704 are displayed simultaneously, together with a color map 1201. The color map 1201 is a pseudo color map for visually recognizing the hue of the three-dimensional image realized by the two-dimensional weighting coefficient table 0501.
- the color map 1201 places luminance voxel values on the vertical axis. As described with reference to FIGS. 4 and 5, the horizontal axis represents the number of repetitions of the illuminance calculation performed based on the two-dimensional weighting coefficient table 0501, which corresponds to the distance from the tissue surface. As described above, the color map 1201 is a reference image indicating the hue obtained from the luminance and the distance from the body surface.
- the operator can recognize what hue is assigned to the three-dimensional image (illumination three-dimensional image) 0704 by checking the color map 1201. For example, it is possible to recognize whether the displayed region is a bone or a soft tissue. It is also possible to transpose the axis direction of the color map 1201 or to invert the axis.
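A color map of this kind can be sketched as a small 2D image whose rows index luminance and whose columns index distance from the surface. The `hue` function below is an invented stand-in for the hues actually realized by the two-dimensional weighting coefficient table 0501:

```python
# Sketch: build a color-map reference image. Rows = luminance voxel value,
# columns = distance from the tissue surface (number of illuminance-calculation
# repetitions). hue() is an illustrative stand-in, not the patent's table.
def make_color_map(n_lum, n_depth, hue):
    return [[hue(l / (n_lum - 1), d / (n_depth - 1))
             for d in range(n_depth)]
            for l in range(n_lum)]

# Illustrative hue: redder with depth (absorption), brighter with luminance.
def hue(lum, depth):
    r = lum
    g = lum * (1.0 - 0.5 * depth)
    b = lum * (1.0 - depth)
    return (r, g, b)

cmap = make_color_map(4, 4, hue)
# cmap[0][0] is black (zero luminance); cmap[3][0] is white (bright, at the
# surface); deeper columns of the bright row shift toward red.
```

Transposing or inverting the axes, as mentioned above, amounts to reindexing this array, so the same table serves every orientation the operator chooses.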
- the color map 1201 may be selected from a plurality of color maps.
- the display unit 0009 may selectively display a plurality of color maps corresponding to a plurality of two-dimensional weighting factor tables that define the weighting factors according to the tissue of the target object (e.g., fetal face region or bone region).
- FIG. 13 is a diagram showing a method for selecting the color map 1201. As shown in FIG. 13, the color map 1201 can be selected by operating a graphical interface 1300 for region selection, displayed on the display unit 0009, with the operation unit 0004 (pointer, trackball, encoder, etc.).
- the selection screen 1301 in FIG. 13 is an example of a button displayed at the time of inspection, and a button corresponding to a target can be selected. For example, by selecting a button corresponding to a target from a target 1 button 1302, a target 2 button 1303,..., A target N button 1304, a three-dimensional image corresponding to the target can be created.
- when the target 1 button 1302 designates the face of a fetus, a weighting coefficient (two-dimensional weighting coefficient table 0501) suitable for the fetal face is selected and set in the optical characteristic setting unit 0022 or the correction optical characteristic setting unit 0081, and a color map 1312 corresponding to the target 1 button 1302 is displayed.
- when the target 2 button 1303 designates a fetal bone, a weighting factor (two-dimensional weighting factor table 0501) suitable for bone is selected and set in the optical characteristic setting unit 0022 or the correction optical characteristic setting unit 0081, and a color map 1313 corresponding to the target 2 button 1303 is displayed.
- the area selection graphical interface 1300 can also display the selected color maps 1312-1314.
- the color maps 1312 to 1314 are color maps created based on the two-dimensional weighting coefficient table 0501 selected by the respective target buttons 1302 to 1304, and are displayed simultaneously with the target buttons 1302 to 1304, so that the operator can select an appropriate color map without hesitation.
- a selection screen 1305 may be displayed on the graphical interface 1300 for region selection.
- the selection screen 1305 is a different example of buttons displayed at the time of inspection, and includes a target display area 1306 for displaying the name of a selected target, an upper target selection button 1307, and a lower target selection button 1308.
- a plurality of targets can be switched using the upper target selection button 1307 and the lower target selection button 1308. Therefore, when the operator operates the upper target selection button 1307 or the lower target selection button 1308, a three-dimensional image corresponding to the target can be created.
- for example, the target 1 (target 1 button) specifies the face of the fetus and the target 2 (target 2 button) specifies the bone of the fetus
- the target is switched sequentially with the upper target selection button 1307 or the lower target selection button 1308.
- when the fetal face is selected, a weighting factor suitable for the face of the fetus is selected, set in the optical characteristic setting unit 0022 or the correction optical characteristic setting unit 0081, and the color map 1201 is switched accordingly.
- the input light source data can also be switched depending on the target, and the input light source data corresponding to the target can be set in the light source information setting unit 0021.
- a target selection screen is prepared for each of the weighting factor and the input light source data, and the weighting factor and the input light source data can be selected (or controlled) independently.
- as described above, the ultrasonic diagnostic apparatus has the effect of being able to create a three-dimensional image that expresses shadow effects due to light leakage and absorption, and is useful as an ultrasonic diagnostic apparatus that generates a three-dimensional projection image from ultrasonic luminance volume data.
Description
OUT_R[K] = Σ_{k=0..K} (L_r[k]·S[k]) · α[C[k]] · Π_{m=k+1..K} (1 − α[C[m]])   …(1)
OUT_G[K] = Σ_{k=0..K} (L_g[k]·S[k]) · α[C[k]] · Π_{m=k+1..K} (1 − α[C[m]])   …(2)
OUT_B[K] = Σ_{k=0..K} (L_b[k]·S[k]) · α[C[k]] · Π_{m=k+1..K} (1 − α[C[m]])   …(3)
0002 Ultrasound probe
0003 Control unit
0004 Operation unit
0005 Transmission unit
0006 Reception unit
0007 Transmission/reception control unit
0008 Phasing addition unit
0009 Display unit
0011 Tomographic information calculation unit
0012 Three-dimensional data storage unit
0013 Arbitrary cross-sectional image creation unit
0014 Three-dimensional coordinate conversion unit
0015 Volume data storage unit
0016 Three-dimensional image processing unit
0017 Image synthesis unit
0018 Projection processing unit
0019 Gradient calculation unit
0021 Light source information setting unit
0022 Optical characteristic setting unit
0023 Illuminance calculation unit
0080 Illuminance correction unit
0081 Correction optical characteristic setting unit
0082 Correction light source information setting unit
0401 Illuminance volume data storage unit
0402 Light source data holding unit
0403 Two-dimensional convolution processing unit
0404 Weighted addition unit
1001 Addition unit
1002 Corrected illuminance volume data storage unit
1201, 1312, 1313, 1314 Color map
1300 Graphical interface
1301, 1305 Selection screen
Claims (14)
- An ultrasonic diagnostic apparatus that displays a three-dimensional image of an object based on luminance volume data, comprising:
a light source information setting unit that sets light source data representing characteristics of a light source set in a three-dimensional space;
an optical characteristic setting unit that sets a weighting factor representing an optical characteristic of the luminance volume data with respect to the light source;
an illuminance calculation unit that calculates illuminance at positions corresponding to coordinates of the luminance volume data based on the light source data and the weighting factor, and creates illuminance volume data based on the calculated illuminance; and
a projection processing unit that creates the three-dimensional image from the luminance volume data and the illuminance volume data.
- The ultrasonic diagnostic apparatus according to claim 1, wherein the illuminance calculation unit comprises:
a two-dimensional convolution processing unit that generates two-dimensional convolution integration data by performing two-dimensional convolution integration on the light source data; and
a weighted addition unit that creates the illuminance volume data by performing weighted addition on the light source data and the two-dimensional convolution integration data based on the weighting factor.
- The ultrasonic diagnostic apparatus according to claim 1 or 2, wherein the illuminance calculation unit comprises a light source data holding unit that holds, as input light source data, an initial value of the light source data and the result of the weighted addition by the weighted addition unit, and
creates the illuminance volume data by generating two-dimensional convolution integration data through two-dimensional convolution integration performed on the input light source data while switching the voxel luminance from an illuminance calculation start position to an illuminance calculation end position in the luminance volume data, and by performing weighted addition on the input light source data and the two-dimensional convolution integration data based on the weighting factor.
- The ultrasonic diagnostic apparatus according to any one of claims 1 to 3, wherein the projection processing unit creates the three-dimensional image based on the illuminance of the illuminance volume data and the opacity referenced by the luminance of the luminance volume data.
- The ultrasonic diagnostic apparatus according to any one of claims 1 to 4, wherein the optical characteristic setting unit sets the weighting factor according to the luminance of the luminance volume data and the distance from the surface of the object.
- The ultrasonic diagnostic apparatus according to any one of claims 1 to 5, wherein the weighting factor is defined by a two-dimensional weighting factor table indexed by the luminance of the luminance volume data and the distance from the surface of the object.
- The ultrasonic diagnostic apparatus according to any one of claims 1 to 6, wherein the light source information setting unit sets the light source data by adjusting at least one of the intensity of the light source, the position of the light source in the three-dimensional space, the direction of the light source, the color tone of the light source, and the shape of the light source.
- The ultrasonic diagnostic apparatus according to any one of claims 1 to 7, wherein the light source information setting unit sets the light source data corresponding to a plurality of wavelengths of the light source,
the optical characteristic setting unit sets the weighting factor for each of the plurality of wavelengths, and
the illuminance calculation unit creates the illuminance volume data for each of the plurality of wavelengths.
- The ultrasonic diagnostic apparatus according to any one of claims 1 to 8, further comprising:
a correction light source information setting unit that sets a direction opposite to a line-of-sight direction in the three-dimensional space as a correction light source direction, and sets correction light source data representing characteristics of a correction light source that emits light in the correction light source direction;
a correction optical characteristic setting unit that sets a weighting factor representing an optical characteristic of the luminance volume data with respect to the correction light source; and
an illuminance correction unit that calculates illuminance at positions corresponding to the coordinates of the luminance volume data based on the correction light source data and the weighting factor, and creates corrected illuminance volume data based on the calculated corrected illuminance,
wherein the projection processing unit creates the three-dimensional image from the luminance volume data and the corrected illuminance volume data.
- The ultrasonic diagnostic apparatus according to claim 9, wherein the illuminance correction unit comprises:
an addition unit that adds the correction light source data and the illuminance volume data;
a two-dimensional convolution processing unit that generates two-dimensional convolution integration data by performing two-dimensional convolution integration on the added value of the correction light source data and the illuminance volume data; and
a weighted addition unit that creates the corrected illuminance volume data by performing weighted addition on the correction light source data and the two-dimensional convolution integration data based on the weighting factor.
- The ultrasonic diagnostic apparatus according to any one of claims 1 to 10, comprising a display unit that displays a color map corresponding to a two-dimensional weighting factor table defining the weighting factor and indexed by the luminance of the luminance volume data and the distance from the surface of the object.
- The ultrasonic diagnostic apparatus according to claim 11, wherein the display unit selectively displays a plurality of color maps corresponding to a plurality of two-dimensional weighting factor tables that define the weighting factor according to the tissue of the object.
- The ultrasonic diagnostic apparatus according to any one of claims 2 to 12, wherein the two-dimensional convolution processing unit generates a plurality of two-dimensional convolution integration data by performing a plurality of two-dimensional convolution integrations on the light source data.
- An ultrasonic three-dimensional image creation method for displaying a three-dimensional image of an object based on luminance volume data, the method comprising:
setting light source data representing characteristics of a light source set in a three-dimensional space;
setting a weighting factor representing an optical characteristic of the luminance volume data with respect to the light source;
calculating illuminance at positions corresponding to coordinates of the luminance volume data based on the light source data and the weighting factor, and creating illuminance volume data based on the calculated illuminance; and
creating the three-dimensional image from the luminance volume data and the illuminance volume data.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014538389A JP6222847B2 (ja) | 2012-09-26 | 2013-09-12 | Ultrasonic diagnostic apparatus and ultrasonic three-dimensional image creation method |
CN201380060365.7A CN104812312B (zh) | 2012-09-26 | 2013-09-12 | Ultrasonic diagnostic apparatus and ultrasonic three-dimensional image creation method |
US14/431,362 US10016181B2 (en) | 2012-09-26 | 2013-09-12 | Ultrasound diagnostic apparatus and ultrasound three-dimensional image creation method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012213185 | 2012-09-26 | ||
JP2012-213185 | 2012-09-26 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2014050601A1 true WO2014050601A1 (ja) | 2014-04-03 |
WO2014050601A9 WO2014050601A9 (ja) | 2015-02-12 |
Family
ID=50388008
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/074740 WO2014050601A1 (ja) | 2012-09-26 | 2013-09-12 | 超音波診断装置及び超音波三次元画像作成方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US10016181B2 (ja) |
JP (1) | JP6222847B2 (ja) |
CN (1) | CN104812312B (ja) |
WO (1) | WO2014050601A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016135252A (ja) * | 2015-01-23 | 2016-07-28 | Toshiba Medical Systems Corporation | Medical image processing apparatus and medical image diagnostic apparatus |
JP2017505482A (ja) * | 2014-01-23 | 2017-02-16 | PerkinElmer Cellular Technologies Germany GmbH | Method and system for automated detection of tissue interior to a mammalian ribcage from in vivo images |
JP2017514633A (ja) * | 2014-05-09 | 2017-06-08 | Koninklijke Philips N.V. | Imaging systems and methods for positioning a 3D ultrasound volume in a desired orientation |
US9999400B2 (en) | 2015-07-29 | 2018-06-19 | Perkinelmer Health Services, Inc. | Systems and methods for automated segmentation of individual skeletal bones in 3D anatomical images |
US10136869B2 (en) | 2016-03-25 | 2018-11-27 | Perkinelmer Health Sciences, Inc. | Systems and methods for characterizing a central axis of a bone from a 3D anatomical image |
US10813614B2 (en) | 2017-05-24 | 2020-10-27 | Perkinelmer Health Sciences, Inc. | Systems and methods for automated analysis of heterotopic ossification in 3D images |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102054680B1 (ko) * | 2013-01-23 | 2020-01-22 | Samsung Electronics Co., Ltd. | Image processing apparatus, ultrasound imaging apparatus, and image processing method |
KR101851221B1 (ko) * | 2013-07-05 | 2018-04-25 | Samsung Electronics Co., Ltd. | Ultrasound imaging apparatus and control method thereof |
WO2017163103A1 (en) * | 2016-03-21 | 2017-09-28 | Ultrasonix Medical Corporation | Visualization of ultrasound vector flow imaging (vfi) data |
US11259782B2 (en) | 2017-06-20 | 2022-03-01 | Canon Medical Systems Corporation | Medical imaging data processing apparatus and method |
CN110584709B (zh) * | 2019-08-14 | 2022-03-11 | Shenzhen Delica Medical Equipment Co., Ltd. | Method for acquiring cerebral blood flow data, storage medium, and ultrasound device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000254123A (ja) * | 1993-08-05 | 2000-09-19 | Olympus Optical Co Ltd | Ultrasonic image diagnostic apparatus |
JP2003061956A (ja) * | 2001-08-30 | 2003-03-04 | Toshiba Corp | Ultrasonic diagnostic apparatus, medical diagnostic apparatus, and image processing method |
JP2008259697A (ja) * | 2007-04-12 | 2008-10-30 | Fujifilm Corp | Image processing method, apparatus, and program |
JP2010188118A (ja) * | 2009-01-20 | 2010-09-02 | Toshiba Corp | Ultrasonic diagnostic apparatus, ultrasonic image processing apparatus, image processing method, and image display method |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5497776A (en) * | 1993-08-05 | 1996-03-12 | Olympus Optical Co., Ltd. | Ultrasonic image diagnosing apparatus for displaying three-dimensional image |
JP5361103B2 (ja) * | 2000-10-24 | 2013-12-04 | Toshiba Corporation | Image processing apparatus |
JP2006130071A (ja) | 2004-11-05 | 2006-05-25 | Matsushita Electric Ind Co Ltd | Image processing apparatus |
US8465433B2 (en) * | 2008-05-27 | 2013-06-18 | Volusonics Medical Imaging Ltd. | Ultrasound garment |
CN102753103B (zh) * | 2010-02-09 | 2015-09-30 | 株式会社日立医疗器械 | 超声波诊断装置以及超声波图像显示方法 |
US8839672B2 (en) * | 2010-10-19 | 2014-09-23 | Board Of Regents, The University Of Texas System | Combined ultrasound and photoacoustic imaging of metal objects |
-
2013
- 2013-09-12 JP JP2014538389A patent/JP6222847B2/ja active Active
- 2013-09-12 CN CN201380060365.7A patent/CN104812312B/zh active Active
- 2013-09-12 US US14/431,362 patent/US10016181B2/en active Active
- 2013-09-12 WO PCT/JP2013/074740 patent/WO2014050601A1/ja active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000254123A (ja) * | 1993-08-05 | 2000-09-19 | Olympus Optical Co Ltd | Ultrasonic image diagnostic apparatus |
JP2003061956A (ja) * | 2001-08-30 | 2003-03-04 | Toshiba Corp | Ultrasonic diagnostic apparatus, medical diagnostic apparatus, and image processing method |
JP2008259697A (ja) * | 2007-04-12 | 2008-10-30 | Fujifilm Corp | Image processing method, apparatus, and program |
JP2010188118A (ja) * | 2009-01-20 | 2010-09-02 | Toshiba Corp | Ultrasonic diagnostic apparatus, ultrasonic image processing apparatus, image processing method, and image display method |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017505482A (ja) * | 2014-01-23 | 2017-02-16 | PerkinElmer Cellular Technologies Germany GmbH | Method and system for automated detection of tissue interior to a mammalian ribcage from in vivo images |
JP2017514633A (ja) * | 2014-05-09 | 2017-06-08 | Koninklijke Philips N.V. | Imaging systems and methods for positioning a 3D ultrasound volume in a desired orientation |
US10376241B2 (en) | 2014-05-09 | 2019-08-13 | Koninklijke Philips N.V. | Imaging systems and methods for positioning a 3D ultrasound volume in a desired orientation |
JP2016135252A (ja) * | 2015-01-23 | 2016-07-28 | Toshiba Medical Systems Corporation | Medical image processing apparatus and medical image diagnostic apparatus |
US9999400B2 (en) | 2015-07-29 | 2018-06-19 | Perkinelmer Health Services, Inc. | Systems and methods for automated segmentation of individual skeletal bones in 3D anatomical images |
US10178982B2 (en) | 2015-07-29 | 2019-01-15 | Perkinelmer Health Sciences, Inc. | System and methods for automated segmentation of individual skeletal bones in 3D anatomical images |
US10136869B2 (en) | 2016-03-25 | 2018-11-27 | Perkinelmer Health Sciences, Inc. | Systems and methods for characterizing a central axis of a bone from a 3D anatomical image |
US10548553B2 (en) | 2016-03-25 | 2020-02-04 | Perkinelmer Health Sciences, Inc. | Systems and methods for characterizing a central axis of a bone from a 3D anatomical image |
US10813614B2 (en) | 2017-05-24 | 2020-10-27 | Perkinelmer Health Sciences, Inc. | Systems and methods for automated analysis of heterotopic ossification in 3D images |
Also Published As
Publication number | Publication date |
---|---|
CN104812312A (zh) | 2015-07-29 |
US10016181B2 (en) | 2018-07-10 |
JPWO2014050601A1 (ja) | 2016-08-22 |
JP6222847B2 (ja) | 2017-11-01 |
WO2014050601A9 (ja) | 2015-02-12 |
US20160038124A1 (en) | 2016-02-11 |
CN104812312B (zh) | 2016-11-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6222847B2 (ja) | Ultrasonic diagnostic apparatus and ultrasonic three-dimensional image creation method | |
JP6427486B2 (ja) | Ultrasonic diagnostic apparatus and ultrasonic three-dimensional image creation method | |
JP6202757B2 (ja) | Ultrasonic diagnostic apparatus and ultrasonic two-dimensional tomographic image generation method | |
JP5730196B2 (ja) | Ultrasonic diagnostic apparatus, ultrasonic image processing apparatus, and ultrasonic image generation method | |
JP4847334B2 (ja) | Ultrasonic imaging apparatus and projection image generation method | |
WO2014162966A1 (ja) | Ultrasonic diagnostic apparatus and elasticity evaluation method | |
CN102028500B (zh) | Ultrasonic diagnostic apparatus, ultrasonic image processing apparatus, and ultrasonic image processing method | |
JP5723790B2 (ja) | Ultrasonic diagnostic apparatus | |
WO2007043310A1 (ja) | Image display method and medical image diagnostic system | |
JP5848709B2 (ja) | Ultrasonic diagnostic apparatus and ultrasonic image display method | |
WO2011099410A1 (ja) | Ultrasonic diagnostic apparatus and ultrasonic image display method | |
WO2011052400A1 (ja) | Ultrasonic diagnostic apparatus and image construction method | |
JP2012005593A (ja) | Ultrasonic diagnostic apparatus that generates and displays three-dimensional ultrasonic images | |
JP5996268B2 (ja) | Ultrasonic diagnostic apparatus, image processing apparatus, and program | |
JP5653045B2 (ja) | Ultrasonic diagnostic apparatus | |
JP7286025B2 (ja) | Systems and methods for assessing a placenta | |
EP2508136A1 (en) | Apparatus and method for ultrasound imaging with contrast agents | |
KR20150005799A (ko) | Ultrasound imaging apparatus and control method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13842669 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014538389 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14431362 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13842669 Country of ref document: EP Kind code of ref document: A1 |