JP5996268B2 - Ultrasonic diagnostic apparatus, image processing apparatus, and program - Google Patents

Info

Publication number
JP5996268B2
Authority
JP
Japan
Prior art keywords
image data, medical, processing, voxel, image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2012117819A
Other languages
Japanese (ja)
Other versions
JP2013244047A (en)
Inventor
吉田 哲也
川岸 哲也
田中 豪
Original Assignee
東芝メディカルシステムズ株式会社 (Toshiba Medical Systems Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 東芝メディカルシステムズ株式会社 (Toshiba Medical Systems Corporation)
Priority to JP2012117819A
Publication of JP2013244047A
Application granted
Publication of JP5996268B2
Legal status: Active

Description

  Embodiments described herein relate generally to an ultrasonic diagnostic apparatus, an image processing apparatus, and a program that generate medical two-dimensional image data from medical three-dimensional image data.

  In recent years, in the field of medical image acquisition apparatuses, it has become possible to acquire three-dimensional image data using various modalities such as CT apparatuses, MRI apparatuses, and ultrasonic diagnostic apparatuses. An ultrasonic diagnostic apparatus is used, for example, to scan a fetus three-dimensionally and to observe and diagnose the fetus's body and face in three dimensions.

  When such three-dimensional image data is displayed on a typical output device such as a monitor or a printer, it must be expressed as a two-dimensional image. Many methods and apparatuses for doing so have been proposed (see, for example, Patent Document 1).

JP 2010-519627 A

  To express three-dimensional image data as a two-dimensional image, the perspective projection method is generally used. In two-dimensional image data generated by projection with this perspective projection method, the amount of volume data mapped onto each image region increases as the distance from the viewpoint (reference point) increases, because the projected range widens. As a result, such two-dimensional image data yields a display image in which, unlike a scene observed with the naked eye, a sense of perspective is difficult to perceive, and in which the region of interest does not stand out.

  A further visibility problem is that it is difficult to acquire data with a high S/N ratio throughout a three-dimensional region in ultrasonic diagnostic imaging; parts of the volume data often lack reflected echoes or contain artifacts. For example, when the data far from the viewpoint (reference point) lies in a noise region, the noise becomes conspicuous and visibility is degraded.

  The present invention has been made in view of the above circumstances, and an object thereof is to provide an ultrasonic diagnostic apparatus, an image processing apparatus, and a program capable of generating, from a medical three-dimensional image data set, medical two-dimensional image data with improved visibility.

An ultrasonic diagnostic apparatus according to an embodiment is
an ultrasonic diagnostic apparatus that generates medical two-dimensional image data from a medical three-dimensional image data set, comprising:
a three-dimensional image data collection unit that collects a medical three-dimensional image data set corresponding to a three-dimensional region of a subject;
a filter processing execution unit that executes, for each voxel of the medical three-dimensional image data set, a filtering process according to the relative position between the voxel and a predetermined reference point set in the three-dimensional space of the medical three-dimensional image data set; and
a rendering processing unit that performs volume rendering processing on the medical three-dimensional image data set filtered by the filter processing execution unit.

An ultrasonic diagnostic apparatus according to an embodiment is
an ultrasonic diagnostic apparatus that generates medical two-dimensional image data from a medical three-dimensional image data set, comprising:
a three-dimensional image data collection unit that collects a medical three-dimensional image data set by scanning a subject with ultrasound;
a filter processing execution unit that executes, for each voxel of the medical three-dimensional image data set, a filtering process according to the magnitude of the S/N ratio of the received signal obtained when ultrasonic waves are transmitted to the subject; and
a rendering processing unit that performs volume rendering processing on the medical three-dimensional image data set filtered by the filter processing execution unit.

An image processing apparatus according to an embodiment is
an image processing apparatus that generates medical two-dimensional image data from a medical three-dimensional image data set, comprising:
a three-dimensional image data storage unit that stores medical three-dimensional image data collected corresponding to a three-dimensional region of a subject;
a filter processing execution unit that executes, for each voxel of the medical three-dimensional image data set, a filtering process according to the relative position between the voxel and a predetermined reference point; and
a rendering processing unit that performs volume rendering processing on the medical three-dimensional image data set filtered by the filter processing execution unit.

A program according to an embodiment causes a computer to realize:
a three-dimensional image data collection function of collecting a medical three-dimensional image data set corresponding to a three-dimensional region of a subject;
a filter processing execution function of executing, for each voxel of the medical three-dimensional image data set, a filtering process according to the relative position between the voxel and a predetermined reference point; and
a rendering processing function of performing volume rendering processing on the medical three-dimensional image data set filtered by the filter processing execution function.

FIG. 1 is a block diagram showing the system configuration of an ultrasonic diagnostic apparatus according to an embodiment of the present invention.
FIG. 2 is a flowchart of the processing from collection of medical three-dimensional image data to generation of medical two-dimensional image data.
FIG. 3 is a diagram showing a display example in which the present embodiment is applied to the generation of a "virtual endoscopic image", a perspective projection image generated and displayed in the fly-through mode.
FIG. 4 is a diagram showing a display example in which the present embodiment is applied to the generation of a "virtual endoscopic image", a perspective projection image generated and displayed in the fly-through mode.
FIG. 5 is a diagram showing a display example in which the present embodiment is applied to the generation of a "virtual endoscopic image", a perspective projection image generated and displayed in the fly-through mode.
FIG. 6 is a diagram showing a display example of a perspective projection image uniformly filtered by a conventional technique.
FIG. 7 is a diagram showing a display example of a perspective projection image uniformly filtered by a conventional technique.

  Hereinafter, an ultrasonic diagnostic apparatus, an image processing apparatus, and a program according to an embodiment of the present invention will be described.

  FIG. 1 is a block diagram showing the system configuration of an ultrasonic diagnostic apparatus according to an embodiment of the present invention. As shown in the figure, the ultrasonic diagnostic apparatus 1 includes an ultrasonic probe 12, an input device 13, a monitor 14, an ultrasonic transmission unit 21, an ultrasonic reception unit 22, a B-mode processing unit 23, a blood flow detection unit 24, a RAW data memory 25, a volume data generation unit 26, an image processing unit 28, a control processor (CPU) 29, a display processing unit 30, a storage unit 31, and an interface unit 32. Each component is described below.

  The ultrasonic probe 12 is a device (probe) that transmits ultrasonic waves to the subject P and receives the reflected waves from the subject produced by the transmitted ultrasonic waves, and includes piezoelectric vibrators, a matching layer, a backing material, and the like. Each piezoelectric vibrator transmits an ultrasonic wave in a desired direction within the scan region based on a drive signal from the ultrasonic transmission unit 21, and converts the reflected wave from the subject into an electric signal. The matching layer is an intermediate layer provided on the piezoelectric vibrators to propagate ultrasonic energy efficiently. The backing material prevents ultrasonic waves from propagating backward from the piezoelectric vibrators.

  When ultrasonic waves are transmitted from the ultrasonic probe 12 to the subject P, the transmitted ultrasonic waves are successively reflected at discontinuities in the acoustic impedance of the body tissue and are received as echo signals by the ultrasonic probe 12. The amplitude of an echo signal depends on the difference in acoustic impedance at the discontinuity where the reflection occurs. Furthermore, when a transmitted ultrasonic pulse is reflected by moving blood flow, the echo undergoes a frequency shift due to the Doppler effect that depends on the velocity component of the moving reflector in the ultrasonic transmission/reception direction.

  Note that the ultrasonic probe 12 according to the present embodiment is a probe capable of collecting medical three-dimensional image data (hereinafter referred to as a volume data set) of the scan region to be diagnosed in the subject, such as a two-dimensional array probe (a probe in which ultrasonic transducers are arranged in a two-dimensional matrix) or a mechanical 4D probe (a probe capable of performing ultrasonic scanning while mechanically swinging a one-dimensional ultrasonic transducer array in the direction perpendicular to its arrangement direction).

  However, the configuration is not limited to this example; a volume data set may also be acquired by adopting, for example, a one-dimensional array probe as the ultrasonic probe 12 and performing ultrasonic scanning while manually swinging the probe.

  The input device 13 is connected to the apparatus main body 11 and includes various switches, buttons, a trackball, a mouse, a keyboard, and the like for capturing, into the apparatus main body 11, various instructions and conditions from the operator, the processing criteria used in the filter processing described later (hereinafter referred to as filter processing criteria: a reference point, a reference depth, and a region of interest), various image quality condition setting instructions, and so on.

  In addition, the input device 13 has a dedicated switch for inputting the diagnostic region, a dedicated knob for controlling the range of color data used for imaging, and a dedicated knob for controlling the transparency (opacity of voxels) in the near-lumen blood flow rendering function described later.

  The monitor 14 displays in vivo morphological information and blood flow information as an image based on the video signal from the display processing unit 30.

  The ultrasonic transmission unit 21 has a trigger generation circuit, a delay circuit, a pulser circuit, and the like (not shown). The trigger generation circuit repeatedly generates trigger pulses for forming transmission ultrasonic waves at a predetermined rate frequency fr Hz (period: 1/fr seconds). The delay circuit gives each trigger pulse, for each channel, the delay time required to focus the ultrasonic wave into a beam and to determine the transmission directivity. The pulser circuit applies a drive pulse to the ultrasonic probe 12 at a timing based on the trigger pulse.

  The ultrasonic transmission unit 21 has a function of instantaneously changing the transmission frequency, the transmission drive voltage, and the like in order to execute a predetermined scan sequence in accordance with instructions from the control processor 29. In particular, changing the transmission drive voltage is realized by a linear-amplifier-type transmission circuit capable of switching its output value instantaneously, or by a mechanism that electrically switches among a plurality of power supply units.

  The ultrasonic receiving unit 22 includes an amplifier circuit, an A/D converter, a delay circuit, an adder, and the like (not shown). The amplifier circuit amplifies the echo signal captured via the ultrasonic probe 12 for each channel. The A/D converter converts the amplified analog echo signals into digital echo signals. The delay circuit gives the digitized echo signals the delay times needed to determine the reception directivity and to perform reception dynamic focusing, after which the adder performs an addition process. This addition emphasizes the reflection component arriving from the direction corresponding to the reception directivity of the echo signals, and the reception directivity and the transmission directivity together form the overall beam for ultrasonic transmission and reception.
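
  The delay-then-add operation described above can be illustrated with a minimal Python sketch. This is an illustration only, not the apparatus's actual circuit implementation: the function name delay_and_sum, the simplified single-transmit geometry, and the values of fs and c are assumptions made for the example.

    import numpy as np

    def delay_and_sum(rf, elem_x, focus, fs=40e6, c=1540.0):
        """Align each channel's digitized echoes so that echoes from the
        focal point add coherently (receive focusing), then sum them.
        rf: (channels, samples) echo signals; elem_x: element positions [m];
        focus: (x, z) focal point [m]; fs, c: assumed sampling rate [Hz]
        and speed of sound [m/s]."""
        fx, fz = focus
        # Two-way path: array origin -> focus (approximated by fz), focus -> element.
        dist = fz + np.sqrt((np.asarray(elem_x) - fx) ** 2 + fz ** 2)
        delays = dist / c
        shifts = np.round((delays - delays.min()) * fs).astype(int)
        n = rf.shape[1]
        out = np.zeros(n)
        for ch, s in enumerate(shifts):
            out[: n - s] += rf[ch, s:]  # delay each channel, then add
        return out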

  The B-mode processing unit 23 receives the echo signals from the ultrasonic receiving unit 22, performs logarithmic amplification, envelope detection processing, and the like, and generates data in which the signal intensity is expressed as brightness.

  The blood flow detection unit 24 extracts a blood flow signal from the echo signals received from the ultrasonic receiving unit 22 and generates blood flow data. The blood flow is usually extracted by CFM (Color Flow Mapping). In this case, the blood flow signal is analyzed, and blood flow information such as mean velocity, variance, and power is obtained as blood flow data at many points.
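
  Estimating mean velocity, variance, and power from the ensemble of echoes at one sample volume is commonly done with the Kasai autocorrelation method; the sketch below assumes complex baseband (IQ) input, illustrative prf and f0 values, and uses a simplified spread estimate on the variance line rather than the apparatus's exact formula.

    import numpy as np

    def cfm_estimates(iq, prf=4e3, f0=5e6, c=1540.0):
        """Kasai-style estimates at one sample volume from an ensemble of
        complex baseband samples iq (length = number of firings)."""
        r0 = np.mean(np.abs(iq) ** 2)                  # power
        r1 = np.mean(np.conj(iq[:-1]) * iq[1:])        # lag-1 autocorrelation
        v = c * prf * np.angle(r1) / (4 * np.pi * f0)  # mean axial velocity
        var = 1.0 - np.abs(r1) / r0                    # simplified spread measure
        return v, var, r0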

  The RAW data memory 25 generates B-mode RAW data, which is B-mode data on a three-dimensional ultrasonic scanning line, using a plurality of B-mode data received from the B-mode processing unit 23. The RAW data memory 25 generates blood flow RAW data, which is blood flow data on a three-dimensional ultrasonic scanning line, using a plurality of blood flow data received from the blood flow detection unit 24.

  In order to reduce noise and improve image continuity, a three-dimensional image filter (hereinafter simply referred to as filter processing) that performs spatial smoothing may be inserted after the RAW data memory 25.

  The volume data generation unit 26 generates a B-mode volume data set from the B-mode RAW data received from the RAW data memory 25 by executing RAW-voxel conversion. This RAW-voxel conversion generates, by interpolation processing that takes spatial position information into account, B-mode voxel data on each line of sight within the view volume used in the near-lumen blood flow rendering function described later. Similarly, the volume data generation unit 26 generates a blood flow volume data set on each line of sight within the view volume from the blood flow RAW data received from the RAW data memory 25 by executing RAW-voxel conversion.

  The image processing unit 28 performs predetermined image processing, such as volume rendering processing, multi-planar reconstruction (MPR) display, maximum intensity projection (MIP), and the "filter processing" described later, on the volume data set output from the volume data generation unit 26. The image processing unit 28 improves the visibility of the region of interest by applying the filter processing described later to the volume data set output from the volume data generation unit 26 before performing the volume rendering process.

  In this embodiment, the perspective projection method may be applied to the volume rendering process, or the parallel projection method may be applied. In the following example, a case where a perspective projection method is applied to volume rendering processing will be described.
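
  As a concrete picture of applying the perspective projection method to the volume rendering process, the following sketch casts rays that diverge from a viewpoint and composites samples front to back. It is a minimal illustration under assumed parameters (opacity mapping, step size, image size), not the actual renderer of the image processing unit 28.

    import numpy as np
    from scipy.ndimage import map_coordinates

    def render_perspective(vol, eye, look, up, fov_deg=60.0,
                           size=(128, 128), n_steps=256, step=1.0):
        """Front-to-back compositing along rays diverging from `eye`."""
        eye = np.asarray(eye, float)
        look = np.asarray(look, float); look /= np.linalg.norm(look)
        right = np.cross(look, np.asarray(up, float)); right /= np.linalg.norm(right)
        up2 = np.cross(right, look)
        h, w = size
        t = np.tan(np.deg2rad(fov_deg) / 2)
        ys, xs = np.mgrid[-t:t:h * 1j, -t:t:w * 1j]
        dirs = look + xs[..., None] * right + ys[..., None] * up2
        dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
        peak = float(vol.max())
        img = np.zeros(size)
        trans = np.ones(size)                      # accumulated transparency
        for k in range(n_steps):
            pos = eye + (k + 1) * step * dirs      # (h, w, 3) sample points
            sample = map_coordinates(vol, pos.reshape(-1, 3).T,
                                     order=1, cval=0.0).reshape(size)
            alpha = 0.05 * np.clip(sample / peak, 0, 1)   # assumed opacity map
            img += trans * alpha * sample          # front-to-back compositing
            trans *= 1.0 - alpha
        return img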

  The control processor 29 functions as an information processing apparatus (computer) and controls the operation of the main body of the ultrasonic diagnostic apparatus. The control processor 29 reads out a dedicated program for realizing the near-lumen blood flow rendering function described later from the storage unit 31, loads it into its own memory, and executes the computation and control related to the various kinds of processing.

  The display processing unit 30 executes various kinds of processing, such as dynamic range adjustment, brightness, contrast, γ-curve correction, and RGB conversion, on the image data generated and processed by the image processing unit 28.

  The storage unit 31 stores a dedicated program for realizing the near-lumen blood flow rendering function described later, diagnostic information (patient IDs, doctors' findings, etc.), diagnostic protocols, transmission and reception conditions, a dedicated program for realizing a speckle removal function, a body mark generation program, a conversion table for presetting the range of color data used for imaging for each diagnostic part, and other data groups. It is also used to store images in an image memory (not shown) as required. Data in the storage unit 31 can be transferred to external peripheral devices via the interface unit 32.

  Hereinafter, “filter processing” executed by the image processing unit 28 under the control of the control processor 29 will be described. FIG. 2 is a flowchart illustrating a series of processes from collection of medical three-dimensional image data (volume data set) to generation of medical two-dimensional image data (perspective projection image data) under the control of the control processor 29.

  First, the control processor 29 collects a volume data set of the diagnosis target region of the subject P by the volume data set collection processing described above (step S1). Subsequently, the image processing unit 28 performs MPR processing on the volume data set collected in step S1 and generates cross-sectional images (MPR images).

  The user operates the input device 13 to designate, on the MPR images displayed on the monitor 14, the viewpoint/line of sight of the perspective projection image to be displayed, and to designate the "filter processing criterion" and "filter processing content" described later. The control processor 29 sets the "filter processing criterion" (step S2) and the "filter processing content" (step S3) based on these input operations on the input device 13.

  The "filter processing criterion" and "filter processing content" are described in detail below.

  The "filter processing criterion" is the criterion that determines, when the filter processing is executed in step S4, which filter processing is applied to which voxels of the volume data set. The image processing unit 28 changes the "filter processing content" applied to each voxel according to the filter processing criterion (reference point, reference depth, or region of interest).

  Specifically, when a "reference point" is used as the filter processing criterion, the "filter processing content" applied to each voxel is changed according to the relative position between that voxel and the reference point. When a "reference depth" is used as the filter processing criterion, the "filter processing content" applied to the voxels at each depth is changed according to the relative position between that depth and the reference depth. When a "region of interest" is used as the filter processing criterion, the "filter processing content" applied to each voxel is changed between the inside and the outside of the region of interest. Specific examples are described later.

  In this embodiment, "filter processing" is a concept that, in addition to ordinary filtering, includes "processing for giving a predetermined color tone" to each voxel. Accordingly, examples of the above "change of filter processing content" include "changing the intensity of the filter processing (its coefficients or weights)" and "changing the color tone applied to each voxel".

  Specifically, the parameters related to the intensity of the filter processing include the number of taps, the filter coefficients (weights), the kernel size, and the like. The type of filter processing is arbitrary as long as it performs general smoothing or noise reduction; examples of the filter processing performed by the image processing unit 28 on the volume data set therefore include arbitrary three-dimensional filters such as a Gaussian filter, an average filter, or a median filter.
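
  The three filter families named above are available as ready-made 3D operations in scipy.ndimage. The sketch below shows one assumed way to map a single "intensity" parameter onto each filter's sigma or kernel size; the mapping itself is illustrative, not specified by the text.

    import numpy as np
    from scipy import ndimage

    def smooth_volume(vol, kind="gaussian", strength=2.0):
        """Apply one of the 3D smoothing filters named in the text; a larger
        `strength` means stronger smoothing (more blur)."""
        if kind == "gaussian":
            return ndimage.gaussian_filter(vol, sigma=strength)
        if kind == "average":
            return ndimage.uniform_filter(vol, size=int(2 * strength + 1))
        if kind == "median":
            return ndimage.median_filter(vol, size=int(2 * strength + 1))
        raise ValueError(kind)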

  FIG. 3 shows a display example in which the present embodiment is applied to the generation of a "virtual endoscopic image", a perspective projection image generated and displayed in the lumen display mode called "fly-through mode". In the example shown in FIG. 3, a "reference point" is set at the center of the image as the filter processing criterion, and the filter processing coefficients are changed between the region near the center of the image and the region near its periphery.

  More specifically, with "the relative position between each voxel and the reference point" taken to be "the distance between each voxel and the reference point", filter processing with relatively weak smoothing is applied to the region near the center of the image, while the region near the periphery of the image is strongly blurred by strong smoothing. By thus relatively improving the visibility of the region of interest, the image comes closer to the appearance seen with the naked eye and becomes well suited to observation.
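
  One practical way to realize this center-sharp, periphery-blurred behavior is to smooth the volume twice (weakly and strongly) and blend the two copies per voxel with a weight derived from the distance to the reference point. The two-pass blend is an implementation choice assumed for this sketch, and the distance scale d_scale is likewise an assumed parameter.

    import numpy as np
    from scipy import ndimage

    def distance_weighted_smoothing(vol, ref_point, sigma_weak=0.5,
                                    sigma_strong=4.0, d_scale=32.0):
        """Blur grows with distance from ref_point (voxel coordinates)."""
        weak = ndimage.gaussian_filter(vol, sigma_weak)
        strong = ndimage.gaussian_filter(vol, sigma_strong)
        grid = np.indices(vol.shape).astype(float)
        ref = np.asarray(ref_point, float).reshape(3, 1, 1, 1)
        dist = np.sqrt(((grid - ref) ** 2).sum(axis=0))
        w = np.clip(dist / d_scale, 0.0, 1.0)   # 0 near the point, 1 far away
        return (1.0 - w) * weak + w * strong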

  Here, the "fly-through mode" is described in detail. The fly-through mode is a type of lumen display mode in which the observer's viewpoint and line-of-sight direction are set virtually inside a luminal organ in the volume data obtained by three-dimensionally scanning the subject, and the inner surface of the luminal organ observed from that viewpoint is displayed as virtual endoscopic image (fly-through image) data.

  In this fly-through mode, endoscope-like image data can be generated and displayed based on volume data collected from outside the subject, which greatly reduces the invasiveness to the subject at the time of examination. Furthermore, since the viewpoint and line-of-sight direction can be set arbitrarily even for luminal organs into which it is difficult to insert an endoscope, such as thin digestive tracts and blood vessels, highly accurate examinations that are impossible with conventional endoscopy can be performed safely and efficiently.

  In the inversion method, the lumen, rendered at low brightness, is viewed from outside the tube. In contrast, in the fly-through method used in the fly-through mode, the lumen is visualized as seen from inside the tube. With the fly-through method, for example, when checking for an intraluminal polyp, the lumen can be observed as if with an endoscope even when an endoscope cannot be inserted because of a stenosis or the like.

  The fly-through method applies to an ultrasonic diagnostic apparatus a technology that has been commercialized as virtual endoscopy using CT data, and technically it has mainly the following two features.

  The first feature is that, whereas the parallel projection method is generally used as the projection method, the fly-through method uses the perspective projection method. In the parallel projection method the viewpoint is effectively at infinity, so the sense of depth is difficult to grasp. In the perspective projection method, on the other hand, an image like one obtained with an endoscope can be displayed, with so-called perspective. It can therefore be said to be a display method suitable for observing the inside of a lumen.

  The second feature is that the fly-through method can automatically move the viewpoint along the center axis of the lumen by automatically tracking the lumen and recognizing its center axis. Therefore, by using the fly-through method, once the operator specifies the lumen to be observed, an ultrasonic diagnostic apparatus is realized in which the viewpoint is automatically moved within the lumen and the lumen interior is displayed sequentially. Of course, the viewpoint position and line-of-sight direction may also be changed manually.

  In addition, with the fly-through method, an MPR image orthogonal to the line of sight can also be displayed and used as navigation, drawing the three orthogonal sections of the lumen.

  As described above, the fly-through method has many advantages. However, since it uses the perspective projection method for the volume rendering process, it also has the disadvantages described earlier: the generated two-dimensional image data forms a display image in which the sense of perspective is difficult to perceive and the region of interest does not stand out.

  These disadvantages resulting from the perspective projection method can be overcome by applying the present embodiment to the fly-through method; that is, this embodiment can make the fly-through method even more useful.

  FIG. 4 is a diagram showing a display example in which the present embodiment is applied to the generation of a "virtual endoscopic image", a perspective projection image generated and displayed in the fly-through mode. The case where a "reference depth" is used as the filter processing criterion is described below with reference to FIG. 4.

  When a "reference depth" is used as the filter processing criterion, for example, a pointer generation unit (not shown) for designating a pointer on the perspective projection image is provided in the ultrasonic diagnostic apparatus, and, based on information about the reference depth at which the user has placed the pointer by operating the input device 13 (referred to as depth information), the depths around the reference depth and the other depths may be filtered with different filter processing coefficients (or given different color tones). The pointer P shown in FIG. 4 is a UI (User Interface) with which the user designates the reference depth on the virtual endoscopic image.

  In the example shown in FIG. 4, filter processing with relatively weak smoothing may be applied at the reference depth where the pointer P is placed, and strong smoothing filter processing may be applied at the other depths. As a result, the reference depth where the pointer P is placed shows little blur, and an image with clear edges is obtained. In addition, since the depths other than the reference depth are strongly blurred, noise and artifacts are reduced, the visibility of the display image as a whole is improved, and it becomes easy for third parties to recognize where the image creator was paying attention. The depth information of the pointer P may be stored in the storage unit 31 together with the perspective projection image.
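
  The reference-depth variant can be sketched the same way as the distance-based blend above, except that the weight depends only on the distance from the pointer's depth along one axis; depth_axis, depth_window, and the Gaussian strength are assumed parameters.

    import numpy as np
    from scipy import ndimage

    def depth_weighted_smoothing(vol, ref_depth, depth_axis=0,
                                 depth_window=8.0, sigma_strong=4.0):
        """Keep voxels near ref_depth sharp; blur the other depths."""
        strong = ndimage.gaussian_filter(vol, sigma_strong)
        depth = np.arange(vol.shape[depth_axis], dtype=float)
        w = np.clip(np.abs(depth - ref_depth) / depth_window, 0.0, 1.0)
        shape = [1, 1, 1]
        shape[depth_axis] = -1
        w = w.reshape(shape)                 # broadcast over the volume
        return (1.0 - w) * vol + w * strong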

  Hereinafter, a case where “region of interest (ROI)” is used as a filter processing criterion will be described with reference to FIG. FIG. 5 is a diagram illustrating a display example when the present embodiment is applied to a process of generating a “virtual endoscope image” that is a perspective projection image generated and displayed in the fly-through mode.

  When a "region of interest" is used as the filter processing criterion, for example, an ROI generation unit (not shown) capable of setting a region of interest on the perspective projection image is provided in the ultrasonic diagnostic apparatus, and filter processing may be performed with the filter processing coefficients changed between the inside and the outside of the region of interest set by the ROI generation unit (or with different color tones). The region of interest ROI shown in FIG. 5 is a UI with which the user sets a desired region on the virtual endoscopic image as the region of interest.

  In the example shown in FIG. 5, filter processing with relatively weak smoothing may be applied within the range where the region of interest ROI is set, and strong smoothing filter processing may be applied elsewhere. As a result, an image is obtained in which the region of interest ROI shows little blur and has clear edges. Moreover, since the region outside the region of interest ROI is strongly blurred, noise and artifacts are reduced, the visibility of the display image as a whole is improved, and it becomes easy for third parties to recognize where the image creator was paying attention. The ROI information, i.e., the information relating to the region of interest ROI, may also be stored in the storage unit 31 together with the perspective projection image.
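
  For the ROI criterion the blend weight is simply a binary mask of the region of interest. The sketch below additionally feathers the mask edge so the sharp/blurred transition is not a hard seam; the feathering is an assumed refinement rather than something the text requires.

    import numpy as np
    from scipy import ndimage

    def roi_weighted_smoothing(vol, roi_mask, sigma_strong=4.0, feather=2.0):
        """Blur outside the ROI while leaving the inside sharp."""
        strong = ndimage.gaussian_filter(vol, sigma_strong)
        w = ndimage.gaussian_filter(roi_mask.astype(float), feather)
        return w * vol + (1.0 - w) * strong   # w=1 inside the ROI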

  Besides the filter processing criteria described above, the filter processing may be performed on each voxel according to, for example, "the magnitude of the S/N ratio of the received signal obtained when ultrasonic waves are transmitted to the subject". Specifically, a region where the S/N ratio is large is regarded as having a clear received signal and receives filter processing with relatively weak smoothing, while a region where the S/N ratio is small is strongly blurred by strong smoothing. In other words, filter processing that strongly blurs regions containing many noise components, where the received signal is unclear, is performed. As a result, the visibility (impression) of the display image is not dominated by regions where the S/N ratio is small.
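
  A per-voxel S/N map can drive the same blend as before: high-S/N voxels keep the original data and low-S/N voxels take the strongly smoothed copy. The transition band snr_lo..snr_hi is an assumed parameterization.

    import numpy as np
    from scipy import ndimage

    def snr_weighted_smoothing(vol, snr, snr_lo=5.0, snr_hi=20.0,
                               sigma_strong=4.0):
        """Blur strongly where the per-voxel S/N ratio is low."""
        strong = ndimage.gaussian_filter(vol, sigma_strong)
        w = np.clip((snr - snr_lo) / (snr_hi - snr_lo), 0.0, 1.0)
        return w * vol + (1.0 - w) * strong   # w=1 keeps the clear signal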

  Furthermore, a different filter process may be applied to each voxel of a volume data set, collected with the gain set so that the noise level is spatially constant, according to the magnitude of the received signal. Since it is desirable that the echo level be relatively uniform when performing perspective projection, the gain is corrected in a post-process, and the rendering process is performed on the corrected volume data set.

  As examples of "changing the filter processing content", the following can be given in addition to "changing the coefficients (weights) of the filter processing" described above.

  For example, a different color tone may be assigned to each voxel according to the filter processing criterion. Specifically, each voxel is colored with a color tone that changes according to its distance from the predetermined reference point. Alternatively, after defining a coloring function that expresses the relationship between the voxel-to-reference-point distance and the coloring ratio as a two-dimensional function, each voxel may be colored according to that function. Furthermore, the degree of mixing of a specific color tone may be increased according to the filter processing criterion (for example, increasing the ratio of B in RGB).
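
  A minimal sketch of distance-dependent coloring follows, raising the ratio of B in RGB with distance as in the text's example; the linear mixing function and d_scale are assumed choices for the coloring function.

    import numpy as np

    def tint_by_distance(vol, ref_point, d_scale=32.0):
        """Map voxel intensity to RGB, mixing in more blue farther from
        ref_point (the B ratio rises while R and G fade)."""
        grid = np.indices(vol.shape).astype(float)
        ref = np.asarray(ref_point, float).reshape(3, 1, 1, 1)
        dist = np.sqrt(((grid - ref) ** 2).sum(axis=0))
        mix = np.clip(dist / d_scale, 0.0, 1.0)   # coloring ratio
        rgb = np.empty(vol.shape + (3,))
        rgb[..., 0] = vol * (1.0 - mix)   # R fades with distance
        rgb[..., 1] = vol * (1.0 - mix)   # G fades with distance
        rgb[..., 2] = vol                 # B kept, so its ratio increases
        return rgb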

  When the various settings are completed in steps S2 and S3 described above, the image processing unit 28 executes, for each voxel of the volume data set, the filter processing according to the filter processing criterion as described above (step S4).

  The process in step S4 is a process for improving the visibility of the diagnosis target region in the display of the perspective projection image data (generated in step S5 described later) of the volume data set.

  After completing the processing in step S4, the image processing unit 28 performs volume rendering processing on the volume data set after the filtering processing in step S4 by, for example, a perspective projection method, and generates perspective projection image data (medical two-dimensional image data). (Step S5). Thereafter, the perspective projection image data is subjected to a predetermined conversion process or the like by the display processing unit 30, and is then output to the monitor 14 for display and stored in the storage unit 31.

  If a reference point for the filtering process is set in step S2, the reference point may be used in the volume rendering process in step S5. That is, the reference point in the filtering process in step S4 and the reference point in the volume rendering process in step S5 may be shared. In this case, the volume rendering image generated by the volume rendering process is drawn with the reference point as the drawing center.

  After finishing the process of step S5, the control processor 29 determines whether or not to update the filter processing criterion (reference point, reference depth, or region of interest) set in step S2 and/or the filter processing content set in step S3 (step S6).

  Specifically, in step S6 it is determined, for example, whether or not a signal for updating the settings related to the filter processing has been input from the input device 13 (that is, whether or not an operation for updating the criterion has been performed on the input device 13).

  When step S6 branches to YES, the user, while viewing the perspective projection image (and the orthogonal three-section images, etc.) generated in step S5 and displayed on the monitor 14, operates the input device 13 to designate, on the MPR images, the viewpoint/line of sight of the perspective projection image to be displayed, and to designate the "filter processing criterion" and "filter processing content" (step S7). Thereafter, the process returns to step S2. On the other hand, when step S6 branches to NO, the process of generating the perspective projection image is terminated.

  The above-described series of processing by the ultrasonic diagnostic apparatus according to the present embodiment can also be implemented as a program, or as a program written to a storage medium, independently of the ultrasonic diagnostic apparatus. This facilitates sale and distribution as a standalone software product and allows the technique according to the present embodiment to be used on other hardware.

  As described above, the present embodiment provides an ultrasonic diagnostic apparatus, an image processing apparatus, and a program capable of generating, from a medical three-dimensional image data set, medical two-dimensional image data with improved visibility.

  Specifically, according to the present embodiment, the visibility of the medical two-dimensional image data generated by the volume rendering process is improved by applying spatially non-uniform filter processing to the three-dimensional volume data set.

  That is, in the medical two-dimensional image data generated by the present embodiment, as shown in FIGS. 3 to 5 described above, the region of interest is displayed at high resolution while the regions of non-interest become inconspicuous, so the area the image creator was attending to (the diagnostic area) is clear at a glance. Furthermore, information unnecessary for diagnosis, such as noise, is made inconspicuous by the filter processing described above, which improves the visibility of the display image as a whole.

  By contrast, the conventional technique, which lacks the technical idea of the characteristic part of the present embodiment, can only produce perspective projection images uniformly filtered over all voxels, as shown in FIGS. 6 and 7. FIGS. 6 and 7 are diagrams showing display examples of perspective projection images uniformly filtered by the conventional technique; the example shown in FIG. 6 has been filtered relatively more strongly than the example shown in FIG. 7.

  In particular, when the rendering process is performed by the perspective projection method, as in the fly-through mode described above, the disadvantages of the perspective projection method cannot be eliminated without the present embodiment, and a perspective projection image with good visibility such as those shown in FIGS. 3 to 5 cannot be obtained.

  Although the present invention has been described above based on one embodiment, the present invention is of course not limited to the embodiment described above, and various modifications and applications are possible within the scope of the gist of the present invention.

<< First Modification >>
In the following, to avoid duplicated explanation, only the differences from the ultrasonic diagnostic apparatus, image processing apparatus, and program according to the embodiment described above are described. In the first modification, the reference points are set by the following process.

  That is, before the filter processing by the image processing unit 28, the image processing unit 28 performs volume rendering processing on the volume data set to generate perspective projection image data (first volume rendering image data). The ROI generation unit (not shown) then sets a region of interest on this volume rendering image data, and a plurality of reference points (for example, covering the entire region of interest) are set within the region of interest.

  After the reference points have been set in this way, the image processing unit 28 executes, for each voxel, filter processing that is changed according to the distance between that voxel and the one of the plurality of reference points closest to it.
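
  With many reference points (for example, every voxel of the region of interest), the nearest-point distance can be computed efficiently with a KD-tree; the KD-tree is an assumed implementation device, and d_scale and sigma_strong are illustrative parameters.

    import numpy as np
    from scipy import ndimage
    from scipy.spatial import cKDTree

    def multi_refpoint_smoothing(vol, ref_points, d_scale=32.0,
                                 sigma_strong=4.0):
        """Blend sharp and blurred data using, for every voxel, the distance
        to the nearest of several reference points."""
        tree = cKDTree(np.asarray(ref_points, float))
        coords = np.indices(vol.shape).reshape(3, -1).T
        dist, _ = tree.query(coords)               # nearest-point distances
        w = np.clip(dist.reshape(vol.shape) / d_scale, 0.0, 1.0)
        strong = ndimage.gaussian_filter(vol, sigma_strong)
        return (1.0 - w) * vol + w * strong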

  Note that the volume rendering process may be performed again on the volume data set after this filter processing to generate perspective projection image data (second volume rendering image data); the ROI generation unit (not shown) may then set a region of interest on the second volume rendering image data again, set reference points, and execute the filter processing once more, based on those reference points, on the volume data set from before the filter processing was executed.

<< Second Modification >>
In the following, to avoid duplicated explanation, only the differences from the ultrasonic diagnostic apparatus, image processing apparatus, and program according to the embodiment described above are described. In the second modification, the image processing unit 28 changes and executes the filter processing according to the following criteria.

  That is, the image processing unit 28 calculates, for each voxel of the volume data set, a first coefficient based on the distance between the processing-target voxel and the reference point, and a second coefficient based on the S/N ratio of the processing-target voxel. The image processing unit 28 then changes the filter coefficient based on both the first coefficient and the second coefficient and executes the filter process (for example, a smoothing process).

  Here, "changing the filter coefficient based on both the first coefficient and the second coefficient" means, for example, that a third coefficient is calculated and set by summing or multiplying the first and second coefficients, or by adopting the larger of the two, and the filter coefficient used in the filter processing is changed based on this third coefficient. Specifically, the image processing unit 28 may, for example, perform the smoothing process with the filter coefficient changed so that the blur becomes stronger as the distance between the voxel and the reference point increases.
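
  The combination of the two coefficients can be sketched directly; here the maximum is taken as the third coefficient, one of the options named above, and d_scale and snr_scale are assumed normalizations.

    import numpy as np
    from scipy import ndimage

    def combined_coefficient_smoothing(vol, snr, ref_point,
                                       d_scale=32.0, snr_scale=20.0):
        """Per-voxel blur driven by both distance to the reference point
        (first coefficient) and S/N ratio (second coefficient), combined
        by taking their maximum."""
        grid = np.indices(vol.shape).astype(float)
        ref = np.asarray(ref_point, float).reshape(3, 1, 1, 1)
        dist = np.sqrt(((grid - ref) ** 2).sum(axis=0))
        c1 = np.clip(dist / d_scale, 0.0, 1.0)           # grows with distance
        c2 = np.clip(1.0 - snr / snr_scale, 0.0, 1.0)    # grows as S/N falls
        c3 = np.maximum(c1, c2)                          # the third coefficient
        strong = ndimage.gaussian_filter(vol, 4.0)
        return (1.0 - c3) * vol + c3 * strong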

  Although several embodiments of the present invention have been described, these embodiments are presented by way of example and are not intended to limit the scope of the invention. These novel embodiments can be implemented in various other forms, and various omissions, replacements, and changes can be made without departing from the scope of the invention. These embodiments and modifications thereof are included in the scope and gist of the invention, and are included in the invention described in the claims and the equivalents thereof.

    P ... pointer, ROI ... region of interest, 1 ... ultrasonic diagnostic apparatus, 11 ... apparatus main body, 12 ... ultrasonic probe, 13 ... input device, 14 ... monitor, 21 ... ultrasonic transmission unit, 22 ... ultrasonic reception unit, 23 ... B-mode processing unit, 24 ... blood flow detection unit, 25 ... RAW data memory, 26 ... volume data generation unit, 28 ... image processing unit, 29 ... control processor, 30 ... display processing unit, 31 ... storage unit, 32 ... interface unit.

Claims (13)

1. An image processing apparatus comprising:
a three-dimensional image data storage unit that stores medical three-dimensional image data collected corresponding to a three-dimensional region of a subject; and
a processing unit that executes, for each voxel of the medical three-dimensional image data, a blurring process according to the relative position between the voxel and a predetermined reference, and renders the processed medical three-dimensional image data.

2. An image processing apparatus comprising:
a three-dimensional image data storage unit that stores medical three-dimensional image data collected corresponding to a three-dimensional region of a subject; and
a processing unit that executes, for each voxel of the medical three-dimensional image data, a process of assigning a color tone according to the relative position between the voxel and a predetermined reference, and renders the processed medical three-dimensional image data.

3. An image processing apparatus comprising:
a three-dimensional image data storage unit that stores medical three-dimensional image data collected corresponding to a three-dimensional region of a subject; and
a processing unit that executes, for each voxel of the medical three-dimensional image data, a filter process using a filter coefficient corresponding to the relative position between the voxel and a predetermined reference, and renders the filtered medical three-dimensional image data.

4. An image processing apparatus comprising:
a three-dimensional image data storage unit that stores medical three-dimensional image data collected corresponding to a three-dimensional region of a subject; and
a processing unit that executes, for each voxel of the medical three-dimensional image data, a filter process using a filter coefficient based on both a first coefficient corresponding to the relative position between the voxel and a predetermined reference and a second coefficient based on the S/N ratio of the voxel, and renders the filtered medical three-dimensional image data.

5. An image processing apparatus comprising:
a three-dimensional image data storage unit that stores medical three-dimensional image data collected corresponding to a three-dimensional region of a subject; and
a rendering unit that executes, for each voxel of the medical three-dimensional image data, a filter process according to the relative position between the voxel and a predetermined reference point, and renders the filtered medical three-dimensional image data by a perspective projection method viewed from the reference point.

6. The image processing apparatus according to claim 1, wherein the blurring process includes a smoothing process.

7. The image processing apparatus according to claim 1, wherein the relative position between the voxel and the predetermined reference is the distance between the voxel and the predetermined reference.

8. An ultrasonic diagnostic apparatus comprising the image processing apparatus according to claim 1.

9. A program that causes a computer to realize:
a three-dimensional image data storage function of storing medical three-dimensional image data collected corresponding to a three-dimensional region of a subject; and
a processing function of executing, for each voxel of the medical three-dimensional image data, a blurring process according to the relative position between the voxel and a predetermined reference, and rendering the processed medical three-dimensional image data.

10. A program that causes a computer to realize:
a three-dimensional image data storage function of storing medical three-dimensional image data collected corresponding to a three-dimensional region of a subject; and
a processing function of executing, for each voxel of the medical three-dimensional image data, a process of assigning a color tone according to the relative position between the voxel and a predetermined reference, and rendering the processed medical three-dimensional image data.

11. A program that causes a computer to realize:
a three-dimensional image data storage function of storing medical three-dimensional image data collected corresponding to a three-dimensional region of a subject; and
a processing function of executing, for each voxel of the medical three-dimensional image data, a filter process using a filter coefficient corresponding to the relative position between the voxel and a predetermined reference, and rendering the filtered medical three-dimensional image data.

12. A program that causes a computer to realize:
a three-dimensional image data storage function of storing medical three-dimensional image data collected corresponding to a three-dimensional region of a subject; and
a processing function of executing, for each voxel of the medical three-dimensional image data, a filter process using a filter coefficient based on both a first coefficient corresponding to the relative position between the voxel and a predetermined reference and a second coefficient based on the S/N ratio of the voxel, and rendering the filtered medical three-dimensional image data.

13. A program that causes a computer to realize:
a three-dimensional image data storage function of storing medical three-dimensional image data collected corresponding to a three-dimensional region of a subject; and
a rendering function of executing, for each voxel of the medical three-dimensional image data, a filter process according to the relative position between the voxel and a predetermined reference point, and rendering the filtered medical three-dimensional image data by a perspective projection method viewed from the reference point.
JP2012117819A 2012-05-23 2012-05-23 Ultrasonic diagnostic apparatus, image processing apparatus, and program Active JP5996268B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2012117819A JP5996268B2 (en) 2012-05-23 2012-05-23 Ultrasonic diagnostic apparatus, image processing apparatus, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2012117819A JP5996268B2 (en) 2012-05-23 2012-05-23 Ultrasonic diagnostic apparatus, image processing apparatus, and program

Publications (2)

Publication Number Publication Date
JP2013244047A (en) 2013-12-09
JP5996268B2 (en) 2016-09-21

Family

ID=49844324

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2012117819A Active JP5996268B2 (en) 2012-05-23 2012-05-23 Ultrasonic diagnostic apparatus, image processing apparatus, and program

Country Status (1)

Country Link
JP (1) JP5996268B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7019610B2 (en) 2017-01-24 2022-02-15 ナブテスコ株式会社 Automatic door device with guide rail and stopper, stopper attached to guide rail and mounting method to attach stopper to guide rail

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102297346B1 (en) * 2014-10-31 2021-09-03 삼성메디슨 주식회사 Medical image apparatus and displaying medical image thereof
KR20170068944A (en) 2015-12-10 2017-06-20 삼성메디슨 주식회사 Method of displaying a ultrasound image and apparatus thereof
US10198621B2 (en) * 2016-11-28 2019-02-05 Sony Corporation Image-Processing device and method for foreground mask correction for object segmentation

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4768495B2 (en) * 2006-04-04 2011-09-07 日立アロカメディカル株式会社 Ultrasonic diagnostic equipment
JP4909137B2 (en) * 2007-03-15 2012-04-04 日立アロカメディカル株式会社 Volume data processing apparatus and method
JP5523681B2 (en) * 2007-07-05 2014-06-18 株式会社東芝 Medical image processing device
US8224056B2 (en) * 2009-12-15 2012-07-17 General Electronic Company Method for computed tomography motion estimation and compensation

Also Published As

Publication number Publication date
JP2013244047A (en) 2013-12-09

Legal Events

Date Code Title Description
RD04 Notification of resignation of power of attorney; Free format text: JAPANESE INTERMEDIATE CODE: A7424; Effective date: 20131219
RD04 Notification of resignation of power of attorney; Free format text: JAPANESE INTERMEDIATE CODE: A7424; Effective date: 20131226
RD04 Notification of resignation of power of attorney; Free format text: JAPANESE INTERMEDIATE CODE: A7424; Effective date: 20140109
A621 Written request for application examination; Free format text: JAPANESE INTERMEDIATE CODE: A621; Effective date: 20150420
A977 Report on retrieval; Free format text: JAPANESE INTERMEDIATE CODE: A971007; Effective date: 20160127
A131 Notification of reasons for refusal; Free format text: JAPANESE INTERMEDIATE CODE: A131; Effective date: 20160301
A521 Written amendment; Free format text: JAPANESE INTERMEDIATE CODE: A523; Effective date: 20160425
A711 Notification of change in applicant; Free format text: JAPANESE INTERMEDIATE CODE: A711; Effective date: 20160511
TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model); Free format text: JAPANESE INTERMEDIATE CODE: A01; Effective date: 20160726
A61 First payment of annual fees (during grant procedure); Free format text: JAPANESE INTERMEDIATE CODE: A61; Effective date: 20160824
R150 Certificate of patent or registration of utility model; Ref document number: 5996268; Country of ref document: JP; Free format text: JAPANESE INTERMEDIATE CODE: R150
S533 Written request for registration of change of name; Free format text: JAPANESE INTERMEDIATE CODE: R313533
R350 Written notification of registration of transfer; Free format text: JAPANESE INTERMEDIATE CODE: R350