JP4677199B2 - Ultrasonic diagnostic equipment - Google Patents

Ultrasonic diagnostic equipment

Info

Publication number: JP4677199B2
Application number: JP2004118985A
Authority: JP (Japan)
Prior art keywords: tomographic image, subject, ultrasonic, volume data, coordinate system
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Japanese (ja)
Other versions: JP2005296436A (en), JP2005296436A5 (en)
Inventors: 毅 三竹 (Takeshi Mitake), 隆雄 岩崎 (Takao Iwasaki), 修 荒井 (Osamu Arai)
Original assignee: 株式会社日立メディコ (Hitachi Medical Corporation)
Priority date: 2004-04-14 (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed)
Filing date: 2004-04-14
Application filed by 株式会社日立メディコ (Hitachi Medical Corporation)
Priority to JP2004118985A
Publication of JP2005296436A and JP2005296436A5
Application granted; publication of JP4677199B2
Application status: Active


Description

  The present invention relates to an ultrasonic diagnostic apparatus, and more particularly to a technique suitable for making a diagnosis by comparing two tomographic images of the same part captured at different times.

  An ultrasonic diagnostic apparatus transmits and receives ultrasonic waves to and from a subject via a probe, reconstructs a tomographic image of the imaging region from the reflected echo signals output by the probe, and displays it, so that the imaging site can be diagnosed non-invasively and in real time.

  With such an ultrasonic diagnostic apparatus, for example, in order to confirm the therapeutic effect on an affected area, a tomographic image of the affected area taken before treatment (hereinafter referred to as a pre-treatment tomographic image) and a tomographic image of the affected area taken during or after treatment (hereinafter referred to as a post-treatment tomographic image) are displayed side by side and compared for diagnosis. In such a case, because the probe is held and operated by hand in normal ultrasonic scanning, it is difficult during or after treatment to reproduce the position and inclination the probe had during the pre-treatment imaging. As a result, it becomes difficult to match the positions and inclinations of the display cross sections of the post-treatment and pre-treatment tomographic images.

Therefore, in order to reduce the positional deviation of the display cross section, it has been proposed to obtain the deviation between the pre-treatment and post-treatment tomographic images as a correlation index and to manually adjust the position and inclination of the probe so that the obtained correlation index becomes small, thereby matching the display cross section of the post-treatment tomographic image with that of the pre-treatment tomographic image (for example, see Patent Document 1).
Japanese Patent Laid-Open No. 2000-300557

  However, in Patent Document 1 the probe is adjusted manually in order to match the post-treatment tomogram with the pre-treatment tomogram, so the operation is complicated and time-consuming. One conceivable alternative is to acquire the pre-treatment tomograms as three-dimensional volume data (hereinafter referred to as volume data) and, in real time, extract from the acquired volume data a pre-treatment tomogram corresponding to the scan plane of the post-treatment tomogram and display it. However, if the correspondence between the position coordinates of the volume data and the subject is not taken into account, there are cases where the display cross sections of the pre-treatment and post-treatment tomographic images cannot be made to coincide.

  Furthermore, when volume data is acquired by scanning the probe, which is preferable from the viewpoint of simplicity and real-time operation, the acquired data is limited to the scanning range of the probe. If the scanning range is narrow, tomographic data corresponding to the post-treatment tomographic image may not be extractable from the volume data.

  An object of the present invention is to make it easy to match the display cross sections of a tomographic image captured earlier and a tomographic image captured later.

In order to solve the above problems, the ultrasonic diagnostic apparatus of the present invention comprises: storage means for storing volume data of tomographic images of a subject, acquired by the ultrasonic diagnostic apparatus, in association with a subject coordinate system set in advance for the subject; calculating means for calculating, based on the detection values of a sensor that detects the position and inclination of the ultrasonic probe, the coordinates of the scan plane of an ultrasonic tomographic image (for example, a post-treatment tomographic image) captured by the ultrasonic probe; extracting means for extracting, from the volume data, tomographic image data corresponding to the scan plane of the ultrasonic tomographic image and reconstructing a reference tomographic image (for example, a pre-treatment tomographic image); and display means for displaying the ultrasonic tomographic image and the reference tomographic image. The subject coordinate system with which the volume data is associated in the storage means is defined based on the position and inclination of the ultrasonic probe detected in a state where a specific part of the subject, serving as a reference point, is depicted in the captured ultrasonic tomographic image. The calculating means associates the coordinate system of an ultrasonic tomographic image captured after the acquisition of the volume data with the subject coordinate system of the volume data, based on the position and inclination of the ultrasonic probe detected in a state where the specific part of the subject serving as the reference point is depicted in that ultrasonic tomographic image.

  As described above, the volume data stored in the storage means is associated with the subject coordinate system. Further, when an ultrasonic tomographic image is captured later, its subject coordinate system can be specified based on the position and inclination of the probe. Therefore, tomographic image data corresponding to the scan plane of the captured ultrasonic tomographic image can be extracted from the volume data. As a result, the display cross sections of the reference tomographic image and the ultrasonic tomographic image can easily be made to coincide.

  In this case, a specific part of the subject can be set as the origin of the subject coordinate system. For example, when determining the therapeutic effect on the abdomen, the xiphoid process or a rib may be set as the origin. This makes it easy to align the ultrasonic probe with the origin. Then, when tomographic image acquisition is started, the ultrasound probe is aligned with the origin of the subject coordinate system, so that the subject coordinate systems at the time of volume data acquisition and at the time of ultrasound imaging are easily matched. This in turn makes it easy to extract the tomographic image data corresponding to the scan plane of the ultrasonic tomographic image.

  As the imaging apparatus that acquires the volume data, an ultrasonic imaging apparatus is desirable from the viewpoints of simplicity and real-time operation, but when a wide range of volume data is to be acquired, an X-ray CT apparatus or a magnetic resonance imaging (MRI) apparatus can be used.

  According to the present invention, the display cross sections of a tomographic image captured earlier and a tomographic image captured later can easily be matched, so the pre-treatment and post-treatment tomographic images can be displayed with their cross sections aligned, and diagnosis, such as determining the therapeutic effect on the affected area, can be performed effectively.

  First Embodiment: A first embodiment of an ultrasonic diagnostic apparatus to which the present invention is applied will be described with reference to FIGS. 1 to 6. FIG. 1 is a configuration diagram of the ultrasonic diagnostic apparatus.

  As shown in FIG. 1, the ultrasonic diagnostic apparatus is roughly divided into a system for reconstructing an ultrasonic tomographic image and a system for reconstructing a reference tomographic image. The system for reconstructing an ultrasonic tomographic image includes: a probe 10, an ultrasonic probe in which a plurality of transducers that transmit and receive ultrasonic waves to and from the subject are arranged in, for example, an arc shape; a transmission/reception unit 12 that supplies a drive signal to the probe 10 and performs processing such as amplification, analog-to-digital conversion, and phasing addition on the reflected echo signals output from the probe 10; an ultrasonic image constructing unit 14 that reconstructs an ultrasonic tomographic image from the reflected echo signals output from the transmission/reception unit 12; an image memory 16 that stores the reconstructed ultrasonic tomographic image; and a display unit 18 that displays the ultrasonic tomographic image read from the image memory 16.

  A position sensor 20 is attached to the probe 10. The position sensor 20 includes, for example, a magnetic sensor that detects a magnetic signal generated by a source 22 attached to the bed or the like, and detects the three-dimensional position and inclination of the probe 10 in the source coordinate system S. The source coordinate system S is a three-dimensional orthogonal coordinate system with the source 22 as its origin So, in which the X axis is aligned with the short-side direction of the bed on which the subject lies, the Y axis with the longitudinal direction of the bed, and the Z axis with the vertical direction. The source coordinate system S is not limited to a three-dimensional orthogonal coordinate system; any coordinate system in which the position of the probe 10 can be specified may be used. Furthermore, the position sensor 20 is not limited to one using a magnetic field; one using light, for example, may be used.
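
  As a rough illustration (not part of the patent disclosure), the following sketch shows how a pose reported by such a sensor might be represented in software: a position and a set of tilt angles in the source coordinate system S are combined into a single homogeneous transform. The function name and the roll/pitch/yaw convention are assumptions made for the example.

```python
import numpy as np

def pose_to_matrix(position_mm, tilt_deg):
    """Build a 4x4 homogeneous transform for the probe pose in the
    source coordinate system S.

    position_mm : (x, y, z) of the probe as reported by the position sensor
    tilt_deg    : (roll, pitch, yaw) tilt angles in degrees -- an assumed
                  sensor convention, used here only for illustration.
    """
    rx, ry, rz = np.radians(tilt_deg)
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    # Elementary rotations about the X, Y and Z axes of S.
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx      # probe inclination
    T[:3, 3] = position_mm        # probe position relative to the source 22
    return T

# Example: probe 120 mm along the bed axis, tilted 15 degrees about X.
T_probe_in_S = pose_to_matrix((0.0, 120.0, 50.0), (15.0, 0.0, 0.0))
```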

  On the other hand, the system for reconstructing the reference tomographic image according to the present invention includes: a volume data creation unit 26 that creates three-dimensional volume data (hereinafter referred to as volume data) from the reflected echo signals output from the transmission/reception unit 12 while the probe 10 is scanned in the direction perpendicular to the ultrasonic scan plane, and associates the volume data with a subject coordinate system P set in advance on the subject; a volume data storage unit 28, serving as storage means, that stores the volume data output from the volume data creation unit 26; and a reference image constructing unit 30, serving as extracting means, that extracts predetermined tomographic image data from the volume data in the volume data storage unit 28, reconstructs a reference tomographic image, and outputs the reconstructed reference tomographic image to the image memory 16.

  For the subject coordinate system P, a specific part of the subject is set in advance as the origin Po. For example, when determining the therapeutic effect on the abdomen, the xiphoid process or a rib can be set as the origin Po. The predetermined tomographic image data extracted by the reference image constructing unit 30 is the data corresponding to the scan plane of the ultrasonic tomographic image, which is calculated based on the detection values of the position sensor 20 that detects the position and inclination of the probe 10 during scanning. As for the volume data input to the volume data storage unit 28, volume data associated with position data of the subject coordinate system may be acquired by another ultrasonic imaging apparatus and input via a network, or the acquired volume data may be input via a general storage medium such as a magneto-optical disk. Furthermore, instead of inputting the reflected echo signals output from the transmission/reception unit 12, tomographic image data output from the ultrasonic image constructing unit 14 may be input to the volume data creation unit 26. In short, it suffices that data correlated with the ultrasonic tomographic image is associated with the position coordinates of the subject.

  In addition, a control unit 40 is provided that controls each part of the system for reconstructing an ultrasonic tomogram and each part of the system for reconstructing a reference tomogram. In FIG. 1, for convenience of explanation, the flow of commands according to the present invention is indicated by broken lines. The control unit 40 includes coordinate system setting means 43, which sets the subject coordinate system P with a specific part of the subject as the origin, and scan plane calculation means 45, which calculates the coordinates of the scan plane of the ultrasonic tomographic image captured by the probe 10 in association with the subject coordinate system P, based on the detection values of the position sensor 20 attached to the probe 10. Commands from a console 41 are also input to the control unit 40. For example, by entering commands at the console 41, the processing system can be switched as required: before the affected area is treated, the apparatus is operated mainly as the system for reconstructing the reference tomographic image, whereas during or after treatment it is operated mainly as the system for reconstructing the ultrasonic tomographic image.

  Processing according to the present invention will now be described with reference to FIGS. 2 to 4. Although an example of determining the therapeutic effect on an affected area is described, the present invention can, in short, be applied whenever a diagnosis is made by comparing two tomographic images of the same site captured at different times. FIG. 2 is a flowchart showing the pre-treatment process of associating position data with volume data acquired by ultrasonic imaging. FIG. 3 is a flowchart showing the post-treatment process of displaying a reference image corresponding to the scan plane of an ultrasonic tomographic image captured by ultrasonic imaging. FIG. 4 is a conceptual diagram for explaining the processing of FIGS. 2 and 3.

  The process before treating an affected area (for example, an abdominal tumor) will be described with reference to FIGS. 2 and 4. First, the subject coordinate system P is initially set for the subject. For example, by scanning the probe 10 in contact with the body surface of the subject, an ultrasonic tomogram is displayed on the display unit 18 in real time. When a specific part of the subject (for example, the xiphoid process) is depicted in the displayed ultrasonic tomogram, the position sensor 20 detects the position and inclination of the probe 10 in the source coordinate system S at that moment (S102). The detection value of the position sensor 20 may be captured by pressing an input key of the console 41 when the xiphoid process is depicted, or it may be captured automatically based on the luminance distribution of the ultrasonic tomogram. The subject coordinate system P is then initially set by the control unit 40, with the detected position and inclination of the probe 10 defining the origin Po (Px, Py, Pz) (S104). In short, a subject coordinate system P is set that uses a specific part of the subject as an objective reference.
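
  Conceptually, S102 and S104 amount to freezing the probe pose at the moment the xiphoid process is depicted and treating that pose as the origin and axes of the subject coordinate system P. The sketch below (illustrative only, reusing the hypothetical pose_to_matrix helper from the previous example) converts later sensor readings from the source coordinate system S into coordinates of P.

```python
import numpy as np

def set_subject_coordinate_system(T_probe_at_xiphoid_in_S):
    """S104: take the probe pose detected while the xiphoid process is
    depicted (a 4x4 transform in the source coordinate system S) and use
    it as the origin Po and axes of the subject coordinate system P.
    Returns the transform that maps S coordinates into P coordinates."""
    return np.linalg.inv(T_probe_at_xiphoid_in_S)

def to_subject_coords(T_S_to_P, point_in_S):
    """Express a point measured in S (e.g. a later probe position) in P."""
    p = np.append(np.asarray(point_in_S, dtype=float), 1.0)
    return (T_S_to_P @ p)[:3]

# Usage sketch: the origin Po maps to (0, 0, 0) in P.
# T_S_to_P = set_subject_coordinate_system(T_probe_in_S)
# po = to_subject_coords(T_S_to_P, T_probe_in_S[:3, 3])   # -> ~[0, 0, 0]
```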

  The subject coordinate system P is a three-dimensional orthogonal coordinate system in which the X axis is aligned with the short direction of the bed on which the subject lies, the Y axis is aligned with the longitudinal direction of the bed, and the Z axis is aligned with the vertical direction. However, it is not limited to a three-dimensional orthogonal coordinate system, and any coordinate system that can specify the position of the probe 10 may be used. The set subject coordinate system P is held in the storage unit of the control unit 40.

  Next, volume data is acquired. For example, the probe 10 is brought into contact with the body surface near the abdomen, and the position and inclination of the probe 10 at that time are detected by the position sensor 20. The volume coordinate system V is set by the control unit 40, with the detected position as the origin Vo (Vx, Vy, Vz) of the volume data area (S106); that is, the volume coordinate system V is set with the scanning start point of the probe 10 as its origin Vo. The volume coordinate system V is preferably an orthogonal coordinate system like the subject coordinate system P, but is not limited to this. A position vector from the origin Po of the subject coordinate system P to the origin Vo of the volume coordinate system V (hereinafter referred to as position data PV) is calculated and stored in the storage means of the control unit 40.

  Next, the probe 10 positioned at the origin Vo is scanned in the direction perpendicular to the ultrasonic scan plane, so that a reflected echo signal is acquired on each scanning line. Based on the position and inclination of the probe 10, the acquired reflected echo signals are output from the transmission/reception unit 12 to the volume data creation unit 26 as three-dimensional volume data (S108). The scanning range of the probe 10 at this time is set as the volume data area (SizeX, SizeY, SizeZ). Even when the probe 10 meanders during scanning, the volume data can be assigned to the correct coordinates of the volume coordinate system V by using the detection values of the position sensor 20. The acquired volume data includes data corresponding to the affected area (for example, an abdominal tumor). When the acquired volume data are sparse, the number of data points can be increased by interpolation based on weighting coefficients or the like.
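
  A much simplified sketch of S108 follows, under the assumption that each acquired frame has already been converted, using the position sensor readings, into sample points whose positions are known in the volume coordinate system V. It only illustrates nearest-voxel accumulation plus a crude neighbour-mean fill-in standing in for the weighted interpolation mentioned above; it is not the actual processing of the volume data creation unit 26.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def accumulate_frames(points_V_mm, intensities, size_mm, voxel_mm=1.0):
    """Scatter echo samples (positions known in the volume coordinate
    system V via the position sensor 20) into a voxel grid covering the
    volume data area (SizeX, SizeY, SizeZ)."""
    dims = tuple(int(np.ceil(s / voxel_mm)) for s in size_mm)
    acc = np.zeros(dims)
    cnt = np.zeros(dims)
    idx = np.clip((np.asarray(points_V_mm) / voxel_mm).astype(int),
                  0, np.array(dims) - 1)
    for (i, j, k), v in zip(idx, intensities):
        acc[i, j, k] += v
        cnt[i, j, k] += 1
    vol = np.divide(acc, cnt, out=np.zeros(dims), where=cnt > 0)
    # Crude stand-in for the weighted interpolation mentioned in the text:
    # empty voxels take the local mean of their filled neighbours.
    smooth = uniform_filter(vol, size=3)
    weight = uniform_filter((cnt > 0).astype(float), size=3)
    fill = np.divide(smooth, weight, out=np.zeros(dims), where=weight > 0)
    return np.where(cnt > 0, vol, fill)
```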

  Next, the volume data is associated with the position data of the subject coordinate system P (S110). For example, the volume data creation unit 26 uses the position data (x, y, z) of the volume coordinate system V and the position data PV output from the control unit 40 to associate each voxel of the volume data acquired in S108 with a coordinate of the subject coordinate system P. The associated volume data is stored in the volume data storage unit 28 (S112).
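
  In other words, once the position data PV is known, a voxel index (i, j, k) of the volume coordinate system V maps to the subject coordinate system P by a simple offset, as in the schematic below (the axes of P and V are assumed parallel for simplicity, both being described as bed-aligned orthogonal systems):

```python
import numpy as np

def voxel_to_subject(ijk, voxel_mm, PV_mm):
    """Map a voxel index (i, j, k) of the volume coordinate system V to a
    coordinate of the subject coordinate system P (S110).

    voxel_mm : voxel pitch in mm
    PV_mm    : position vector from the subject origin Po to the volume
               origin Vo, expressed in P (stored by the control unit 40).
    """
    return np.asarray(PV_mm, float) + np.asarray(ijk, float) * voxel_mm

def subject_to_voxel(p_mm, voxel_mm, PV_mm):
    """Inverse mapping, used later when extracting a reference slice."""
    return (np.asarray(p_mm, float) - np.asarray(PV_mm, float)) / voxel_mm
```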

  The processing after treating the abdominal tumor will be described with reference to FIGS. 3 and 4; this processing may also be executed during treatment. First, as in S102, the probe 10 is scanned over the subject to locate the xiphoid process and is moved to the origin Po of the subject coordinate system P set in S104 (S202). As a result, the post-treatment coordinate system is again aligned with the subject coordinate system P (S204). At this time, a body mark indicating the position of the subject is displayed on the display unit 18 in association with the subject coordinate system P, and the volume data area set in S108 is also displayed on the body mark (S206). The displayed volume data area corresponds to the subject coordinate system P and is, for example, shown in color. The position of the probe 10 displayed on the body mark also moves according to the detection values of the position sensor 20. In this way, the examiner can quickly move the probe 10 to the vicinity of the affected area while referring to the display screen. If no body mark or the like is displayed, the probe 10 may be moved to the vicinity of the affected area at the examiner's discretion.

  Next, ultrasonic scanning of the volume data area is performed with the probe 10 (S208), and an ultrasonic tomographic image (for example, a post-treatment tomographic image) is displayed (S209). When the affected area is depicted in the displayed ultrasonic tomographic image, the scan plane of the post-treatment tomographic image is calculated by the control unit 40 (S210). For example, the position and inclination of the probe 10 at the moment the tumor is depicted are captured by the control unit 40, either via the console 41 or automatically, and the coordinates of the scan plane of the post-treatment tomographic image in the subject coordinate system P are calculated from them. Tomographic image data corresponding to the calculated scan plane is extracted from the volume data in the volume data storage unit 28 (S212), and the extracted tomographic image data is reconstructed as a reference tomographic image (for example, a pre-treatment tomographic image) by the reference image constructing unit 30 (S214). The reconstructed pre-treatment tomogram is displayed on the display unit 18 side by side, on the same screen and at the same time, with the post-treatment tomogram displayed in S209 (S216).
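
  Conceptually, S210 to S214 amount to resampling the stored volume on the plane swept by the probe: the scan plane is expressed in the subject coordinate system P, each pixel of the reference image is converted to a voxel coordinate, and the volume is interpolated there. The sketch below uses trilinear interpolation via scipy's map_coordinates; the plane parameterization (an origin plus two in-plane unit vectors) is an illustrative simplification of what the scan plane calculation means 45 would supply.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def extract_reference_slice(volume, plane_origin_P, u_axis, v_axis,
                            shape_px, pixel_mm, voxel_mm, PV_mm):
    """Reconstruct a reference tomogram (e.g. a pre-treatment image) for
    the scan plane of the current ultrasonic tomogram (S212/S214).

    volume         : 3-D voxel array acquired before treatment
    plane_origin_P : corner of the scan plane in the subject coord. system P
    u_axis, v_axis : unit vectors spanning the scan plane, expressed in P
    """
    h, w = shape_px
    u = np.asarray(u_axis, float)
    v = np.asarray(v_axis, float)
    jj, ii = np.meshgrid(np.arange(w), np.arange(h))        # pixel grid
    pts_P = (np.asarray(plane_origin_P, float)
             + ii[..., None] * pixel_mm * v
             + jj[..., None] * pixel_mm * u)                # points in P
    pts_V = (pts_P - np.asarray(PV_mm, float)) / voxel_mm   # voxel coords
    coords = np.moveaxis(pts_V, -1, 0)                      # shape (3, h, w)
    # Trilinear interpolation; points outside the volume data area become 0.
    return map_coordinates(volume, coords, order=1, mode='constant', cval=0.0)
```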

  Thus, in this embodiment, the volume data stored in the volume data storage unit 28 before treatment is associated with the subject coordinate system P. Furthermore, when a post-treatment tomographic image is captured, its position in the subject coordinate system P can be specified based on the position and inclination of the probe 10. Therefore, tomographic image data corresponding to the scan plane of the captured post-treatment tomographic image can be extracted from the volume data, and the display cross sections of the pre-treatment and post-treatment tomographic images can easily be displayed in alignment. In other words, since the display cross section of a tomographic image captured earlier can easily be matched with that of a tomographic image captured later, diagnosis such as determining the therapeutic effect on the affected area can be performed effectively.

  Furthermore, in the present embodiment, the subject coordinate system P is set with a specific part of the subject (for example, the xiphoid process) as its origin. Therefore, not only before treatment but also during or after treatment, the probe 10 can easily be aligned with the origin Po and the subject coordinate system P can be specified accurately. As a result, the subject coordinate systems P at the time of volume data acquisition (for example, before treatment) and at the time of ultrasonic imaging (for example, during or after treatment) can be matched, which simplifies the process of extracting the tomographic image data corresponding to the scan plane of the ultrasonic tomographic image.

  The specific part of the subject set as the origin Po is not limited to the xiphoid process and may be a rib or the like; in short, any part that serves as an objective reference for determining the correspondence between the volume data and the position coordinates of the subject will suffice. Moreover, although the example described here compares ultrasonic tomograms before and after treatment, color flow mapping (CFM) images before and after treatment may be compared instead.

  FIG. 5 shows a comparison of display screens that explains the effect of the present embodiment: FIG. 5A is a conventional display example and FIG. 5B is a display example of the present embodiment. As shown in FIG. 5A, conventionally the positional relationship between the volume data acquired before treatment and the subject is not established, so the display cross sections of the pre-treatment tomographic image 42 and the post-treatment tomographic image 44 are displayed shifted from each other. In contrast, according to the present embodiment, as shown in FIG. 5B, the display cross sections of the pre-treatment tomographic image 46 and the post-treatment tomographic image 48 can be displayed in alignment, so that diagnosis, such as determining the therapeutic effect, can be performed effectively.

  Furthermore, whereas the pre-treatment tomographic image 42 shown in FIG. 5A is displayed as a still image, the pre-treatment tomographic image 46 shown in FIG. 5B is displayed as a moving image whose display cross section changes in real time following changes in the scan plane of the post-treatment tomographic image 48. Therefore, in addition to allowing the therapeutic effect to be determined, this embodiment also improves the usability of the apparatus. Since a convex probe 10 in which a plurality of transducers are arranged in an arc is used in this embodiment, each tomogram in FIG. 5 is displayed in a sector shape, but the same effect can be obtained even when another type of probe is used.

  FIG. 6 is an example of a graphical user interface (GUI). As shown in FIG. 6, the GUI includes a registration button for instructing the start or end of the detection processing for the specific part of the subject, a switching button for switching the processing system between the system for reconstructing an ultrasonic tomographic image and the system for acquiring volume data and reconstructing a reference tomographic image, a volume read button for instructing the storage or acquisition of volume data, a start button for starting ultrasonic imaging, and a freeze button for stopping the post-treatment tomographic image 54, whose scan plane changes in real time, at a desired scan plane. In this way, the examiner can interactively acquire volume data before treatment and capture post-treatment tomograms via the displayed GUI, which improves the usability of the apparatus. Furthermore, the position and size of the volume data area may be displayed as ROI information.

  Second Embodiment: A second embodiment of an ultrasonic diagnostic apparatus to which the present invention is applied will be described with reference to FIG. 7. This embodiment differs from the first embodiment in that the volume data is acquired with an X-ray CT apparatus or a magnetic resonance imaging (MRI) apparatus instead of by ultrasonic scanning before treatment. Description of the parts common to the first embodiment is therefore omitted, and only the differences are described.

  FIG. 7 is a configuration diagram of the ultrasonic diagnostic apparatus of the present embodiment. As shown in FIG. 7, volume data is acquired by the diagnostic imaging apparatus 60, and the acquired volume data is output to the volume data creation unit 26. As the diagnostic imaging apparatus 60, for example, an X-ray CT apparatus or a magnetic resonance imaging (MRI) apparatus can be used.

  When an X-ray CT apparatus, an MRI apparatus, or the like is used instead of an ultrasonic imaging apparatus in this way, the xiphoid process can be identified with reference to the displayed image in a step corresponding to S102 of FIG. 2. For example, a diagnostic image of the subject is captured by the X-ray CT apparatus, and when the xiphoid process is depicted in the captured diagnostic image, its position is specified by pointing at it with, for example, a pointing device of the console 41. A subject coordinate system P with the specified xiphoid process as its origin Po is then set, and each voxel of the volume data is associated with a coordinate position of the set subject coordinate system P. The other processes are basically the same as in the first embodiment. The coordinate position of the xiphoid process may also be detected automatically by image processing based on the luminance distribution of the image.
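
  As a rough sketch of that origin-setting step (the slice index, pixel spacing and volume origin conventions below are illustrative assumptions, not the apparatus's actual interface): the clicked pixel is converted to a physical position, taken as Po, and every voxel is then expressed relative to it.

```python
import numpy as np

def origin_from_click(slice_index, row, col, volume_origin_mm, spacing_mm):
    """Convert a pixel (slice, row, col) clicked with the pointing device
    in a CT/MR volume into a physical position, which is then taken as the
    origin Po of the subject coordinate system P (the xiphoid process)."""
    idx = np.array([slice_index, row, col], dtype=float)
    return np.asarray(volume_origin_mm, float) + idx * np.asarray(spacing_mm, float)

def voxel_coords_relative_to_po(volume_shape, volume_origin_mm, spacing_mm, po_mm):
    """Return, for every voxel of the CT/MR volume, its coordinate in the
    subject coordinate system P (i.e. relative to the xiphoid process)."""
    grids = np.meshgrid(*[np.arange(n) for n in volume_shape], indexing='ij')
    idx = np.stack(grids, axis=-1).astype(float)
    return (np.asarray(volume_origin_mm, float)
            + idx * np.asarray(spacing_mm, float)
            - np.asarray(po_mm, float))
```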

  According to the present embodiment, the volume data acquired before treatment can cover a wider area than volume data acquired by, for example, ultrasonic scanning. Therefore, even when the affected area spans several parts of the subject, the volume data is associated with that wide range of the subject, so that a pre-treatment tomogram corresponding to the post-treatment tomogram can be accurately reconstructed for each affected part and the display cross sections can be matched. Consequently, in addition to the effects of the first embodiment, diagnosis can be performed even more effectively. Moreover, by arranging the pre- and post-treatment tomograms of each affected part on the display screen of the display unit 18 and displaying them simultaneously, the differences in therapeutic effect between the affected parts can be compared.

  Although the present invention has been described based on the first and second embodiments, it is not limited to them. For example, although an example of determining the therapeutic effect on an abdominal tumor has been described, the present invention is applicable to determining the therapeutic effect on various affected areas. The present invention is also not limited to determining therapeutic effects: it can be applied whenever the display cross sections of a tomographic image captured earlier and one captured later are to be matched, for example when following the progress of a disease over time.

  In addition, when determining the therapeutic effect on a moving part (for example, the heart or a blood vessel), it is desirable to associate not only the coordinate position of the subject but also motion information, for example time-phase data based on the heartbeat or pulse wave, with each voxel of the volume data. In this way, even when the shape of the affected part changes with time, tomographic images before and after treatment at the same cardiac time phase or the same pulse-wave phase can be displayed.
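
  In practice this can be pictured as tagging each stored frame (or voxel set) with a time-phase label and, at display time, selecting the pre-treatment data whose phase best matches the current image. A minimal sketch of such phase matching, assuming the phase is available as a fraction of the cardiac or pulse-wave cycle:

```python
def pick_same_phase(stored_frames, current_phase):
    """stored_frames: list of (phase, image) pairs recorded before treatment,
    where phase is a fraction of the cardiac or pulse-wave cycle in [0, 1).
    Returns the stored image whose phase is closest (cyclically) to the
    phase of the currently displayed post-treatment image."""
    def cyclic_distance(a, b):
        d = abs(a - b) % 1.0
        return min(d, 1.0 - d)

    phase, image = min(stored_frames,
                       key=lambda fr: cyclic_distance(fr[0], current_phase))
    return image
```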

  Furthermore, instead of displaying the pre-treatment and post-treatment tomographic images side by side, the difference between them may be obtained and displayed, or the image may be color-coded according to the obtained difference. It is also possible to change the hue data of one tomographic image to make it translucent and to display the translucent tomographic image superimposed on the other tomographic image. This makes it even easier to determine the therapeutic effect.
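
  The two display options just described, a difference image and a translucent overlay in which the hue of one image is changed, might be realized with simple array operations along the lines of the following sketch (grayscale 0 to 255 images assumed):

```python
import numpy as np

def difference_image(pre, post):
    """Signed difference between pre- and post-treatment tomograms,
    e.g. for colour-coding regions that changed with treatment."""
    return post.astype(float) - pre.astype(float)

def translucent_overlay(pre, post, alpha=0.5):
    """Tint the pre-treatment image (here: red channel only) and blend it
    over the post-treatment image shown in grayscale."""
    post_rgb = np.stack([post, post, post], axis=-1).astype(float)
    pre_rgb = np.zeros_like(post_rgb)
    pre_rgb[..., 0] = pre                      # changed hue: red tint
    blended = (1.0 - alpha) * post_rgb + alpha * pre_rgb
    return np.clip(blended, 0.0, 255.0).astype(np.uint8)
```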

FIG. 1 is a configuration diagram of an ultrasonic diagnostic apparatus according to an embodiment to which the present invention is applied.
FIG. 2 is a flowchart showing the pre-treatment process of associating position data with volume data acquired by ultrasonic imaging.
FIG. 3 is a flowchart showing the post-treatment process of displaying a reference image corresponding to the scan plane of an ultrasonic tomographic image captured by ultrasonic imaging.
FIG. 4 is a conceptual diagram for explaining the processing of FIGS. 2 and 3.
FIG. 5 shows a conventional display example and a display example of an embodiment to which the present invention is applied.
FIG. 6 is an example of a graphical user interface (GUI).
FIG. 7 is a configuration diagram of an ultrasonic diagnostic apparatus according to another embodiment to which the present invention is applied.

Explanation of symbols

10 Probe
12 Transmission/reception unit
14 Ultrasonic image constructing unit
16 Image memory
18 Display unit
20 Position sensor
26 Volume data creation unit
28 Volume data storage unit
30 Reference image constructing unit
40 Control unit

Claims (4)

  1. An ultrasonic diagnostic apparatus comprising:
    storage means for storing volume data of a subject acquired by the ultrasonic diagnostic apparatus in association with a subject coordinate system preset for the subject;
    calculating means for calculating, based on the detection values of a sensor that detects the position and inclination of an ultrasonic probe, the coordinates of the scan plane of an ultrasonic tomographic image captured by the ultrasonic probe in association with the subject coordinate system;
    extracting means for extracting tomographic image data corresponding to the scan plane of the ultrasonic tomographic image from the volume data and reconstructing a reference tomographic image; and
    display means for displaying the ultrasonic tomographic image and the reference tomographic image,
    wherein the subject coordinate system with which the volume data is stored in association in the storage means is defined based on the detection values of the position and inclination of the ultrasonic probe in a state where a specific part of the subject to be used as a reference point is depicted in the captured ultrasonic tomographic image, and
    wherein the calculating means associates the coordinate system of the ultrasonic tomographic image captured after the acquisition of the volume data with the subject coordinate system of the volume data, based on the detection values of the position and inclination of the ultrasonic probe in a state where the specific part of the subject to be used as the reference point is depicted in that ultrasonic tomographic image.
  2. The ultrasonic diagnostic apparatus according to claim 1, wherein the specific part of the subject is either the xiphoid process or a rib of the subject and is set as the origin of the subject coordinate system.
  3.   The ultrasonic diagnostic apparatus according to claim 1, wherein the reference tomographic image has a display cross section that changes in real time following a change in a scan plane of the ultrasonic tomographic image.
  4. The ultrasonic diagnostic apparatus according to claim 1, wherein the volume data is acquired before treatment of the subject and the ultrasonic tomographic image is captured during or after treatment of the subject.
JP2004118985A 2004-04-14 2004-04-14 Ultrasonic diagnostic equipment Active JP4677199B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2004118985A JP4677199B2 (en) 2004-04-14 2004-04-14 Ultrasonic diagnostic equipment


Publications (3)

Publication Number Publication Date
JP2005296436A JP2005296436A (en) 2005-10-27
JP2005296436A5 JP2005296436A5 (en) 2005-10-27
JP4677199B2 (en) 2011-04-27

Family

ID=35328691

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2004118985A Active JP4677199B2 (en) 2004-04-14 2004-04-14 Ultrasonic diagnostic equipment

Country Status (1)

Country Link
JP (1) JP4677199B2 (en)


Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101123911B (en) * 2005-11-25 2012-02-15 株式会社东芝 Medical diagnostic imaging device
JP5283839B2 (en) * 2005-11-25 2013-09-04 東芝メディカルシステムズ株式会社 Medical diagnostic imaging system
JP5543681B2 (en) * 2006-03-15 2014-07-09 株式会社日立メディコ Ultrasonic diagnostic equipment
EP2255847A1 (en) * 2006-08-11 2010-12-01 Koninklijke Philips Electronics N.V. Ultrasound system for cerebral blood flow monitoring
JP4818846B2 (en) * 2006-08-16 2011-11-16 富士フイルム株式会社 Medical image processing apparatus and medical image processing program
JP5324041B2 (en) * 2006-11-09 2013-10-23 株式会社日立メディコ Ultrasonic diagnostic equipment
EP2104919A2 (en) * 2006-11-27 2009-09-30 Philips Electronics N.V. System and method for fusing real-time ultrasound images with pre-acquired medical images
JP2008259764A (en) * 2007-04-13 2008-10-30 Toshiba Corp Ultrasonic diagnostic equipment and diagnosis program of the equipment
EP2286370A4 (en) * 2008-05-02 2014-12-10 Eyeic Inc System for using image alignment to map objects across disparate images
JP2009297072A (en) 2008-06-10 2009-12-24 Toshiba Corp Ultrasonic diagnostic apparatus and medical image processing apparatus
US20110144500A1 (en) * 2008-07-22 2011-06-16 Hitachi Medical Corporation Ultrasonic diagnostic apparatus and method for calculating coordinates of scanned surface thereof
US20110224550A1 (en) * 2008-11-14 2011-09-15 Hitachi Medical Corporation Ultrasound diagnostic system and method for generating standard image data for the ultrasound diagnostic system
JP5284123B2 (en) * 2009-01-20 2013-09-11 株式会社東芝 Ultrasonic diagnostic apparatus and position information acquisition program
JP5513790B2 (en) * 2009-07-06 2014-06-04 株式会社東芝 Ultrasonic diagnostic equipment
WO2011052402A1 (en) 2009-10-27 2011-05-05 株式会社 日立メディコ Magnetic-field measurement jig, magnetic-field measurement program, and inspection device provided with magnetic position detector
JP5950619B2 (en) * 2011-04-06 2016-07-13 キヤノン株式会社 Information processing device
CN103476343A (en) * 2011-05-30 2013-12-25 松下电器产业株式会社 Ultrasound diagnostic apparatus and image acquisition method using ultrasonic waves
JP2013135961A (en) * 2013-04-08 2013-07-11 Toshiba Corp Ultrasonic diagnosis apparatus and medical image processing device
JP5685637B2 (en) * 2013-12-09 2015-03-18 株式会社日立メディコ Ultrasonic diagnostic equipment
JP6305773B2 (en) * 2014-01-21 2018-04-04 キヤノンメディカルシステムズ株式会社 Ultrasonic diagnostic apparatus, image processing apparatus, and program
JP6510200B2 (en) * 2014-08-25 2019-05-08 キヤノンメディカルシステムズ株式会社 Ultrasonic diagnostic apparatus and control program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6268442A (en) * 1985-09-24 1987-03-28 Toshiba Corp Ultrasonic diagnostic apparatus
JPH02128759A (en) * 1988-11-10 1990-05-17 Hiroshi Furuhata Ultrasonic probe support apparatus for diagnosis of head
JPH07311834A (en) * 1994-05-19 1995-11-28 Toshiba Corp Image processor and its aid
JPH10151131A (en) * 1996-11-25 1998-06-09 Hitachi Medical Corp Ultrasonograph
JP2000300557A (en) * 1999-04-22 2000-10-31 Aloka Co Ltd Ultrasonic diagnostic device


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015083471A1 (en) * 2013-12-05 2015-06-11 オリンパス株式会社 Ultrasonic observation device, ultrasonic observation device operation method, and ultrasonic observation device operation program
JP5797364B1 (en) * 2013-12-05 2015-10-21 オリンパス株式会社 Ultrasonic observation apparatus, operation method of ultrasonic observation apparatus, and operation program of ultrasonic observation apparatus
US9888906B2 (en) 2013-12-12 2018-02-13 Konica Minolta, Inc. Ultrasound diagnostic apparatus, ultrasound image processing method, and non-transitory computer readable recording medium

Also Published As

Publication number Publication date
JP2005296436A (en) 2005-10-27

Similar Documents

Publication Publication Date Title
JP5345275B2 (en) Superposition of ultrasonic data and pre-acquired image
JP4795099B2 (en) Superposition of electroanatomical map and pre-acquired image using ultrasound
US6413219B1 (en) Three-dimensional ultrasound data display using multiple cut planes
JP4828802B2 (en) Ultrasonic diagnostic equipment for puncture therapy
JP5303147B2 (en) Ultrasonic diagnostic device for generating elastic images
EP1324070A1 (en) Diagnostic ultrasonic imaging system
JP5265091B2 (en) Display of 2D fan-shaped ultrasonic image
KR101140525B1 (en) Method and apparatus for extending an ultrasound image field of view
US20050033160A1 (en) Image processing/displaying apparatus and method of controlling the same
JP2006305358A (en) Three-dimensional cardiac imaging using ultrasound contour reconstruction
JP2006305359A (en) Software product for three-dimensional cardiac imaging using ultrasound contour reconstruction
JPWO2007046272A6 (en) Ultrasonic diagnostic device for generating elastic images
CN102512209B (en) Ultrasound Diagnostic Equipment
EP1815796A1 (en) Ultrasonograph and ultrasonic image display method
DE69831138T2 (en) System for illustrating a twin-dimensional ultrasound image in a three-dimensional image communication environment
EP1982654B1 (en) Ultrasound diagnostic device and control method for ultrasound diagnostic device
US6764449B2 (en) Method and apparatus for enabling a biopsy needle to be observed
JP4919972B2 (en) Elastic image display method and elastic image display device
JP5230589B2 (en) Ultrasonic device, ultrasonic imaging program, and ultrasonic imaging method
US6416476B1 (en) Three-dimensional ultrasonic diagnosis apparatus
JP5208495B2 (en) Medical system
JP5264097B2 (en) Ultrasonic diagnostic equipment
EP1956555A1 (en) Method and device of synthesizing panorama image from ultrasonic images
US8123691B2 (en) Ultrasonic diagnostic apparatus for fixedly displaying a puncture probe during 2D imaging
US6500123B1 (en) Methods and systems for aligning views of image data

Legal Events

Date        Code  Title
2007-04-09  A621  Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
2007-04-09  A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523)
2010-01-19  A131  Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2010-03-15  A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523)
2010-09-28  A131  Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2010-11-26  A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523)
-           TRDD  Decision of grant or rejection written
2011-01-11  A01   Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01)
2011-01-31  A61   First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61)
-           R150  Certificate of patent or registration of utility model (JAPANESE INTERMEDIATE CODE: R150)
-           FPAY  Renewal fee payment (payment until 2014-02-04; year of fee payment: 3)
-           S533  Written request for registration of change of name (JAPANESE INTERMEDIATE CODE: R313533)
-           S111  Request for change of ownership or part of ownership (JAPANESE INTERMEDIATE CODE: R313111)
-           R350  Written notification of registration of transfer (JAPANESE INTERMEDIATE CODE: R350)