JP4470187B2 - Ultrasonic device, ultrasonic imaging program, and ultrasonic imaging method - Google Patents

Ultrasonic device, ultrasonic imaging program, and ultrasonic imaging method

Info

Publication number
JP4470187B2
Authority
JP
Japan
Prior art keywords
ultrasonic
region
interest
image
probe
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
JP2006547993A
Other languages
Japanese (ja)
Other versions
JPWO2006059668A1 (en)
Inventor
雅 山本
Original Assignee
株式会社日立メディコ (Hitachi Medical Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to JP2004351489
Application filed by 株式会社日立メディコ
Priority to PCT/JP2005/022056 (WO2006059668A1)
Publication of JPWO2006059668A1
Application granted
Publication of JP4470187B2
Application status is Expired - Fee Related
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/13 Tomography
    • A61B8/14 Echo-tomography
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/42 Details of probe positioning or probe attachment to the patient
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/461 Displaying means of special interest
    • A61B8/463 Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/48 Diagnostic techniques
    • A61B8/483 Diagnostic techniques involving the acquisition of a 3D volume of data

Description

  The present invention relates to an ultrasonic imaging technique for rendering a region of interest of a subject in an ultrasonic image.

  An ultrasonic device that picks up an ultrasonic image of a subject supplies a driving signal for transmission to an ultrasonic probe so that ultrasonic waves are transmitted into the subject, receives with the probe the reflected echoes generated within the subject, and reconstructs and displays an ultrasonic image (for example, an ultrasonic tomographic image) based on the received signal.

  In such an ultrasonic apparatus, when the therapeutic effect on the subject is being confirmed or ultrasonic treatment is being performed, a region of interest set in advance in the subject is redrawn on the ultrasonic tomographic image.

  For example, before treating a subject, three-dimensional image data (hereinafter referred to as volume data) of the subject is acquired. Next, the treatment site of the subject is set as a region of interest in the volume data. Then, after or during treatment, the position and inclination of the ultrasound probe are adjusted so that the scan plane of the ultrasound probe passes through the position of the region of interest. As a result, the region of interest after or during treatment is redrawn on the ultrasonic tomographic image. At the same time, a reference image having the same cross section as the ultrasonic tomographic image, that is, an image of the region of interest before treatment, is constructed from the volume data. Such an imaging method is described in a patent document (JP10-151131A).

  However, with the method of the above-mentioned patent document, redrawing the region of interest on the ultrasonic image requires adjusting the position and inclination of the ultrasonic probe while visually checking the displayed ultrasonic image. Because this operation depends on the operator's experience and intuition, the accuracy of, and the time required for, redrawing the region of interest on the ultrasonic image may vary from operator to operator.

  An object of the present invention is to realize an ultrasonic apparatus that is more suitable for redrawing a region of interest of a subject on an ultrasonic image.

In order to achieve the above object, an ultrasonic apparatus according to the present invention includes an ultrasonic probe that transmits ultrasonic waves to and receives them from a subject, transmission means for supplying a driving signal for transmission to the ultrasonic probe, receiving means for processing a reception signal output from the ultrasonic probe, an image construction unit for forming an ultrasonic image based on the signal output from the receiving means, and display means for displaying the ultrasonic image. The apparatus further includes acquisition means for capturing volume data relating to the subject, and region-of-interest setting means for setting, as a region of interest, a pixel region whose luminance information or tissue elasticity information falls within a setting range referenced to at least one of the position of a mark superimposed on a tomographic image and the luminance information and tissue elasticity information of the tomographic image. The apparatus also includes guide information generating means that, based on the position data of the ultrasonic probe and the position data of the set region of interest, generates guide information for guiding the scan plane of the ultrasonic probe to the position of the region of interest, displays it on the display means, and changes the color of the guide information in accordance with the distance of the ultrasonic probe to the region of interest.

According to a preferred embodiment of the present invention, the guide information is displayed as an objective index for guiding the position and inclination of the ultrasonic probe being used for imaging to the target state. The target state is the position or inclination of the ultrasonic probe at which the position of the region of interest set in advance on the subject is included in the scan plane. By viewing such guide information, the operator can quantitatively grasp the target movement direction, target movement amount, target inclination angle, and so on of the ultrasonic probe. As a result, when the position and inclination of the ultrasonic probe are adjusted according to the guide information, the region of interest is redrawn on the ultrasonic image accurately and easily, regardless of differences between operators. Further, the guide information generating means may generate and display, as guide information, an arrow image indicating the target movement direction of the ultrasonic probe based on, for example, the position data of the ultrasonic probe and the position data of the region of interest, and may change the width and color of the arrow according to the distance of the ultrasonic probe to the region of interest.
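
As an illustration of how the arrow's width and color might track the probe-to-ROI distance, here is a minimal Python sketch. The thresholds `near_mm` and `far_mm` and the red-to-green mapping are assumptions: the patent only states that width and color change with distance, not the actual mapping.

```python
def arrow_style(distance_mm, near_mm=5.0, far_mm=50.0):
    """Map probe-to-ROI distance to an arrow width and RGB color.

    near_mm/far_mm are hypothetical thresholds; the patent does not
    give concrete values, only that width and color follow distance.
    """
    # Clamp the distance into [near_mm, far_mm] and normalize to [0, 1].
    t = (min(max(distance_mm, near_mm), far_mm) - near_mm) / (far_mm - near_mm)
    width = 2 + int(round((1.0 - t) * 8))   # arrow thickens as the probe closes in
    color = (int(round(255 * t)), int(round(255 * (1.0 - t))), 0)  # red -> green
    return width, color
```

The display control unit could then redraw the arrow with this style each time the position sensor reports new probe coordinates.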

Further, the ultrasonic imaging program of the present invention causes a control computer to execute a procedure for supplying a driving signal for transmission to an ultrasonic probe that transmits and receives ultrasonic waves to and from a subject, a procedure for processing a reception signal output from the ultrasonic probe, a procedure for forming an ultrasonic image based on the signal after the reception processing, and a procedure for displaying the ultrasonic image. The program also causes the control computer to execute a procedure for capturing volume data relating to the subject, a procedure for constructing a tomographic image based on the volume data, and a procedure for setting, as a region of interest, a pixel region whose luminance information or tissue elasticity information falls within a setting range referenced to at least one of the position of a mark superimposed on the tomographic image and the luminance information and tissue elasticity information of the tomographic image. The program is characterized in that it further causes the control computer to execute a procedure for generating, based on the position data of the ultrasonic probe and the position data of the set region of interest, guide information for guiding the scan plane of the ultrasonic probe to the position of the region of interest, displaying it on the display means, and changing the color of the guide information in accordance with the distance of the ultrasonic probe to the region of interest.

The ultrasonic imaging method of the present invention includes a step of supplying a driving signal for transmission to an ultrasonic probe that transmits and receives ultrasonic waves to and from a subject, a step of processing a reception signal output from the ultrasonic probe, a step of forming an ultrasonic image based on the signal after the reception processing, and a step of displaying the ultrasonic image. The method also includes a step of capturing volume data relating to the subject, a step of constructing a tomographic image based on the volume data, and a step of setting, as a region of interest, a pixel region whose luminance information or tissue elasticity information falls within a setting range referenced to at least one of the position of a mark superimposed on the tomographic image and the luminance information and tissue elasticity information of the tomographic image. The method is characterized by further including a step of generating, based on the position data of the ultrasonic probe and the position data of the set region of interest, guide information for guiding the scan plane of the ultrasonic probe to the position of the region of interest, displaying it on the display means, and changing the color of the guide information in accordance with the distance of the ultrasonic probe to the region of interest.

FIG. 1 is a block diagram showing the configuration of an ultrasonic apparatus of one embodiment to which the present invention is applied. FIG. 2 is a flowchart showing the preparation process of the ultrasonic apparatus of FIG. 1. FIG. 3 is a flowchart showing the imaging process of the ultrasonic apparatus of FIG. 1. FIG. 4 shows a display example of a screen for setting a region of interest. FIG. 5 shows a display example of guide information. FIG. 6 shows an example in which guide information for guiding the scan plane of the probe to the position of a region of interest is displayed. FIG. 7 shows a form in which a model image is displayed. FIG. 8 shows a display example of other guide information. FIG. 9 shows another display example of a screen for setting a region of interest. FIG. 10 shows a display example of a setting screen for a target cross section. FIG. 11 shows an example in which the setting screens of FIG. 9 and FIG. 10 are displayed side by side on the display screen.

  An embodiment of an ultrasonic apparatus to which the present invention is applied will be described with reference to the drawings. FIG. 1 is a block diagram showing the configuration of the ultrasonic apparatus of this embodiment.

  As shown in FIG. 1, the ultrasonic apparatus includes an ultrasonic probe 10 (hereinafter referred to as the probe 10) that transmits and receives ultrasonic waves to and from a subject H, a transmission / reception unit 12 that supplies a drive signal for transmission to the probe 10 and processes the reception signal output from the probe 10, an ultrasonic image construction unit 14 as image forming means that forms an ultrasonic image (for example, an ultrasonic tomographic image) based on the reception signal output from the transmission / reception unit 12, and an image display unit 16 as display means that displays on a screen the ultrasonic tomographic image formed by the ultrasonic image construction unit 14.

  Here, the ultrasonic apparatus of the present embodiment includes a guide information calculation unit 20 as means for generating guide information indicating the target movement direction, target movement amount, target inclination angle, and so on of the probe 10. Based on the position data of the probe 10 and the position data of the region of interest set in the volume data relating to the subject H acquired in advance, the guide information calculation unit 20 generates guide information for guiding the scan plane of the probe 10 to the position of the region of interest of the subject H, and displays it on the image display unit 16.

  The ultrasonic apparatus will now be described in more detail. The probe 10 has a plurality of diagnostic transducers arranged in it. Each transducer converts the electrical drive signal supplied from the transmission / reception unit 12 into an ultrasonic wave and emits it toward the subject H, and receives the reflected echo generated within the subject H and converts it into an electrical reception signal. A plurality of therapeutic transducers may be arranged in addition to the diagnostic transducers; in that case, the frequency of the ultrasonic wave transmitted from the therapeutic transducers is set lower than that of the diagnostic transducers. The reception signal output from the probe 10 is processed by the transmission / reception unit 12.

  The transmission / reception unit 12 includes transmission means for supplying a drive signal for transmission to the probe 10, and receiving means for processing the reception signal output from the probe 10. The receiving means performs amplification processing and phasing processing on the reception signal output from the probe 10, and then outputs the processed signal to the ultrasonic image construction unit 14.

  The ultrasonic image constructing unit 14 performs processing such as detection on the reception signal output from the transmitting / receiving unit 12 to construct an ultrasonic tomographic image. The ultrasonic tomographic image here is a two-dimensional image corresponding to the scan plane of the probe 10. Then, the ultrasonic image constructing unit 14 outputs the ultrasonic tomographic image to the image memory control unit 24.

  The image memory control unit 24 associates a frame number with each ultrasonic tomographic image output from the ultrasonic image construction unit 14 and stores it in the storage area. The frame number is an image management number corresponding to the ultrasonic tomographic image.

  In addition, a magnetic position sensor 22 that acquires the position and inclination of the probe 10 is provided. The magnetic position sensor 22 includes a magnetic sensor affixed to the probe 10 as magnetic signal detection means, a source attached to a bed or the like as a magnetic field generator, and arithmetic means for calculating the position, inclination, and so on of the probe 10 (hereinafter referred to as the position data of the probe 10 as appropriate) based on the detection signal output from the magnetic sensor. The magnetic position sensor 22 outputs the position data of the probe 10 to the position information calculation holding unit 26. Note that a form using an optical signal may be applied instead of one using a magnetic signal; in short, any means capable of acquiring the position data of the probe 10 will do.

  The position information calculation holding unit 26 associates the position data of the probe 10 with the frame number of the ultrasonic tomographic image. For example, the position data of the probe 10 output from the magnetic position sensor 22 is associated with the frame number notified from the image memory control unit 24. Then, the position information calculation holding unit 26 outputs the position data of the probe 10 to the position information acquisition unit 28 in accordance with the control command.

  The position information acquisition unit 28 captures the position data of the probe 10 output from the position information calculation holding unit 26. For example, the position information acquisition unit 28 acquires the position data of the probe 10 being imaged from the position information calculation holding unit 26 during real-time imaging. Further, the position information acquisition unit 28 acquires the position data of the probe 10 associated with the ultrasonic tomographic image read from the image memory control unit 24 from the position information calculation holding unit 26 at the time of so-called freeze imaging. Then, the position information acquisition unit 28 outputs position data to the guide information calculation unit 20 and the reference image construction unit 30.

  In addition, a volume data acquisition processing unit 18 that captures three-dimensional image data (hereinafter referred to as volume data) relating to the subject H is provided. The volume data here is acquired by an imaging apparatus such as an ultrasonic imaging apparatus, an X-ray CT imaging apparatus, or a magnetic resonance imaging apparatus, for example before treatment. The volume data acquisition processing unit 18 stores the volume data acquired from the imaging apparatus in its storage area, reads the volume data from the storage area in response to a command input via the operation panel 32, and outputs it to the reference image construction unit 30 or the region-of-interest specifying unit 34.

  The reference image construction unit 30 reconstructs a reference image from the volume data output from the volume data acquisition processing unit 18, based on the position data of the probe 10 notified from the position information acquisition unit 28. For example, during real-time imaging it reconstructs, as the reference image, a tomographic image having the same cross section as the scan plane of the probe 10 being used for imaging; during so-called freeze imaging it reconstructs, as the reference image, a tomographic image having the same cross section as the ultrasonic tomographic image read from the image memory control unit 24.
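
The patent does not specify how the reference image is cut out of the volume data; the following Python sketch shows one simple possibility, nearest-neighbor resampling of a plane defined by a corner point and two in-plane step vectors derived from the probe's position data. The names, the voxel-per-pixel step convention, and the zero fill value are all illustrative assumptions.

```python
def extract_slice(volume, origin, u, v, rows, cols):
    """Nearest-neighbor resampling of a plane out of 3-D volume data.

    volume: 3-D nested lists indexed as volume[z][y][x];
    origin: (x, y, z) voxel position of the slice's corner;
    u, v:   in-plane step vectors (one voxel per output pixel).
    Voxels sampled outside the volume are filled with 0.
    """
    nz, ny, nx = len(volume), len(volume[0]), len(volume[0][0])
    image = []
    for r in range(rows):
        row = []
        for c in range(cols):
            x = round(origin[0] + c * u[0] + r * v[0])
            y = round(origin[1] + c * u[1] + r * v[1])
            z = round(origin[2] + c * u[2] + r * v[2])
            if 0 <= x < nx and 0 <= y < ny and 0 <= z < nz:
                row.append(volume[z][y][x])
            else:
                row.append(0)
        image.append(row)
    return image
```

A production implementation would use trilinear interpolation rather than rounding, but the plane geometry is the same.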

  The region-of-interest specifying unit 34 sets a region of interest for the volume data output from the volume data acquisition processing unit 18. The region of interest here is a point or range corresponding to a site to be diagnosed or treated (for example, a liver tumor). For example, the region-of-interest specifying unit 34 sequentially displays a plurality of tomographic images having different cross-sectional directions configured from volume data. Then, the region-of-interest designating unit 34 designates the region of interest on the tomographic image, and outputs the voxel coordinates of the designated region (hereinafter, appropriately referred to as region-of-interest position data) to the guide information calculation unit 20. Such a region of interest is specified via the operation panel 32.

  The guide information calculation unit 20 acquires the position data of the region of interest from the region-of-interest specifying unit 34, and acquires the position data of the probe 10 from the position information acquisition unit 28 at a set time interval (for example, in real time). The guide information calculation unit 20 then calculates the relative position of the probe 10 with respect to the region of interest, and generates, as guide information, an image whose display form changes following changes in the magnitude of that relative position. The guide information here is an objective index for guiding the position and inclination of the probe 10 being used for imaging to the target state. For example, the guide information is a guide image or character image indicating the target movement direction, target movement amount, target inclination angle, rotation direction, and so on of the probe 10, and is updated as needed according to the position and inclination of the probe 10. The target state is the position or inclination of the probe 10 at which the position of the region of interest is included in the scan plane of the probe 10.
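
The relative position of the probe with respect to the region of interest, from which the target movement amount and direction can be derived, might be computed as in this sketch. The function name and the return convention are assumptions, not the patent's own formulation.

```python
import math

def guide_vector(probe_pos, roi_pos):
    """Relative position of the ROI with respect to the probe.

    Returns the target movement amount (Euclidean distance) and the
    target movement direction as a unit vector; the direction is None
    when the probe already sits on the ROI (relative position zero).
    """
    delta = [r - p for r, p in zip(roi_pos, probe_pos)]
    amount = math.sqrt(sum(d * d for d in delta))
    if amount == 0.0:
        return 0.0, None
    return amount, [d / amount for d in delta]
```

Re-evaluating this at each sensor update is what lets the guide image follow the probe in real time.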

  In addition, the guide information calculation unit 20 can output a command to the sound generation unit 35 to change the sound generation interval following changes in the relative position of the probe 10 with respect to the region of interest. For example, the guide information calculation unit 20 can output a command to shorten the sound generation interval as the magnitude of the relative position of the probe 10 with respect to the region of interest decreases, and a command to generate a notification sound when the relative position becomes zero, that is, when the probe 10 reaches the target state. The sound generation unit 35 includes a buzzer, a speaker, or the like that generates sound intermittently.
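
One plausible mapping from the relative-position magnitude to the beep interval is sketched below; the concrete interval bounds and distance cap are invented for illustration, since the patent only says the interval shortens as the probe nears the target.

```python
def beep_interval_ms(distance_mm, min_ms=100, max_ms=1000, far_mm=100.0):
    """Sound-generation interval that shortens as the probe nears the ROI.

    Returns None when the relative position is zero, signalling that the
    one-shot notification sound should be played instead of the
    intermittent beep. min_ms/max_ms/far_mm are illustrative values.
    """
    if distance_mm <= 0.0:
        return None  # target state reached: play the notification sound
    t = min(distance_mm, far_mm) / far_mm
    return int(min_ms + t * (max_ms - min_ms))
```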

  The display control unit 36 displays, on the image display unit 16, the ultrasonic tomographic image read from the image memory control unit 24, the reference image output from the reference image construction unit 30, and the guide information output from the guide information calculation unit 20. The ultrasonic tomographic image here is the ultrasonic image corresponding to the scan plane of the probe 10. The reference image is a tomographic image having the same cross section as the ultrasonic tomographic image being displayed. The guide information is, for example, an arrow image that guides the probe 10 to the position of the region of interest set in the reference image, in order to bring the scan plane of the probe 10 into the same cross section as the tomographic image of the reference image.

  The display control unit 36 performs control according to a command input from the operation panel 32. For example, in response to a command input from the operation panel 32, the display control unit 36 performs control for reading an ultrasonic tomographic image from the image memory control unit 24 during freeze imaging, control for displaying a reference image side by side on the ultrasonic tomographic image, Control for displaying guide information and control for selecting and switching guide information are performed. The operation panel 32 has input means such as a keyboard, a mouse, and a pointing device.

  The basic operation of the ultrasonic apparatus configured as described above will be described. First, the probe 10 is brought into contact with, for example, the body surface of the subject H. Thereafter, when a drive signal is supplied to the probe 10 from the transmission / reception unit 12, ultrasonic waves are emitted from the probe 10 toward the subject H. A reflected echo generated in the process of ultrasonic waves propagating through the subject H is received by the probe 10 and converted into a received signal. The reception signal output from the probe 10 is subjected to processing such as amplification by the transmission / reception unit 12. An ultrasonic tomographic image is reconstructed by the ultrasonic image construction unit 14 based on the received signal after processing. The reconstructed ultrasonic tomographic image is stored in the image memory control unit 24. The stored ultrasonic tomographic image is read by the display control unit 36 and then displayed on the screen of the image display unit 16.

  At the same time as the ultrasonic tomographic image is captured, a reference image having the same cross section as the ultrasonic tomographic image is reconstructed from the volume data by the reference image construction unit 30. The volume data here is taken from, for example, an X-ray CT apparatus by the volume data acquisition processing unit 18 and relates to the subject H before treatment. The reference image is read by the display control unit 36 and then displayed side by side on the same screen as the ultrasonic tomographic image on the image display unit 16.

  The ultrasonic imaging process of this embodiment will be described in detail. This process is broadly divided into a preparation process performed before the treatment of the subject H, for example, and an imaging process performed after the treatment of the subject H, for example. FIG. 2 is a flowchart showing the preparation steps of this embodiment.

<Volume data acquisition process: S100>
First, a plurality of volume data sets relating to subjects are constructed in advance by an image capturing apparatus such as an X-ray CT apparatus. Of these, the volume data relating to the subject H before treatment is designated on the operation panel 32 as the data to be processed. For example, after a graphical user interface (GUI) is displayed on the image display unit 16, a volume data selection command is input from the operation panel 32 via the GUI. Here, the GUI is a menu screen that displays an input field for designating the storage destination of the volume data (for example, an imaging device or a database server) and an input field for designating the name of the volume data (for example, the subject H before treatment). The volume data acquisition processing unit 18 then acquires the volume data relating to the subject H before treatment from the storage destination.

<Volume data coordinate reference point setting step: S101>
The volume data acquisition processing unit 18 sets a reference coordinate origin for the volume data relating to the subject H before treatment. For example, the volume data acquisition processing unit 18 constructs a plurality of tomographic images from the volume data, and causes the image display unit 16 to display them in order via the display control unit 36. Next, a characteristic part on a displayed tomographic image (for example, the xiphoid process) is designated as a reference point with the operation panel 32. The volume data acquisition processing unit 18 then assigns three-dimensional orthogonal coordinates with the reference point as the origin as the coordinates of the volume data. The volume data acquisition processing unit 18 may also take in information about the probe 10 (for example, the scanning range of a sector type probe or of a convex type probe) and, based on that information, limit the range in which the reference point can be specified on the tomographic image. If a reference point has already been set in the volume data, this step may be omitted.

<Region of interest setting step: S102>
The region-of-interest specifying unit 34 sets a region to be diagnosed or treated (for example, a liver tumor) as a region of interest (ROI) in the volume data relating to the subject H before treatment. For example, the region-of-interest specifying unit 34 constructs a plurality of tomographic images from the volume data, and causes the image display unit 16 to display them in order via the display control unit 36. Next, when the liver tumor part on a displayed tomographic image is designated as the region of interest with the operation panel 32, the region-of-interest specifying unit 34 obtains the position data of the region of interest in the coordinate system set in step S101 and outputs it to the guide information calculation unit 20. Note that the region of interest here may be one or more points, or a range having a certain extent. If a region of interest has already been set in the volume data, this step can be omitted.
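
As one possible reading of setting a pixel region whose luminance falls within a setting range referenced to a mark position, a flood fill grown from the mark could look like the sketch below. The 4-neighbor connectivity criterion and the `tolerance` parameter are assumptions; the patent does not fix the exact criterion.

```python
from collections import deque

def set_roi(image, mark, tolerance):
    """Grow an ROI from a mark: flood fill of pixels whose luminance lies
    within +/- tolerance of the luminance at the mark position.

    image: 2-D list of luminance values; mark: (row, col) seed.
    Returns the set of (row, col) pixels making up the region of interest.
    """
    rows, cols = len(image), len(image[0])
    seed_val = image[mark[0]][mark[1]]
    roi, queue = {mark}, deque([mark])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in roi
                    and abs(image[nr][nc] - seed_val) <= tolerance):
                roi.add((nr, nc))
                queue.append((nr, nc))
    return roi
```

The same routine would work on a tissue elasticity map by swapping in the elasticity values for luminance.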

<Position coordinate setting step of the probe 10: S103>
The position coordinates of the probe 10 are associated with the coordinates of the volume data set in step S101. More specifically, when ultrasonic imaging is performed by bringing the probe 10 into contact with the body surface of the subject H, the image display unit 16 displays an ultrasonic tomographic image. Next, when a characteristic part on the displayed ultrasonic tomographic image (for example, the xiphoid process) is designated as a reference point with the operation panel 32, the magnetic position sensor 22 assigns three-dimensional orthogonal coordinates with the reference point as the origin as the position coordinates of the probe 10. In short, by matching the origin of the coordinates of the probe 10 with the origin of the coordinates of the volume data, the scan plane coordinates of the probe 10 are associated with the coordinates of the volume data. Note that the position of the probe 10 when the xiphoid process is drawn on the ultrasonic tomographic image may be taken as the reference point; in that case, it is desirable to use the position at which the xiphoid process is drawn at its maximum size. The timing for capturing the reference point of the probe 10 is determined by an input command to the operation panel 32.
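
In the simplest axis-aligned case, matching the probe's coordinate origin to the volume data origin is a pure translation, as in this sketch. A real registration would also need to align orientation (a rotation), which this deliberately omits; the names and the assumption of shared axes are illustrative.

```python
def register(probe_ref, volume_ref):
    """Build a mapping from probe position coordinates to volume data
    coordinates by matching the two reference points (e.g. the xiphoid
    process picked in both the ultrasonic image and the volume data).

    Pure translation: both coordinate systems are assumed to already
    share axis directions.
    """
    offset = [v - p for v, p in zip(volume_ref, probe_ref)]

    def probe_to_volume(point):
        # Shift a probe-space point by the offset between the two origins.
        return [c + o for c, o in zip(point, offset)]

    return probe_to_volume
```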

  Although the preparation process of this embodiment has been described as divided into steps S101 to S103, it is not restricted to this form. In short, any method that can associate the coordinates of the volume data acquired by the volume data acquisition processing unit 18 with the scan plane coordinates of the probe 10 may be used.

  FIG. 3 is a flowchart showing the imaging process of the present embodiment. The imaging process shown in FIG. 3 redraws, on the ultrasonic image, the region of interest set before treatment, for example during or after treatment of the subject H. Therefore, by visually checking the ultrasonic image displayed in this imaging process, the therapeutic effect on the subject H can be confirmed, and minimally invasive treatment (IVR: interventional radiology), in which the region of interest is treated with therapeutic ultrasonic waves, can be performed accurately.

<Acquisition step of probe 10 position data: S200>
When ultrasonic scanning is performed while the probe 10 is in contact with the subject H during or after treatment, the magnetic position sensor 22 acquires position data of the probe 10 at set time intervals. The set time interval can be changed as necessary, but it is desirable to acquire the position data in real time. Then, the magnetic position sensor 22 passes the position data of the probe 10 to the position information acquisition unit 28 via the position information calculation holding unit 26.

<Guide information generation step: S201>
Based on the position data of the probe 10 acquired from the position information acquisition unit 28 and the position data of the region of interest acquired from the region-of-interest specifying unit 34, the guide information calculation unit 20 calculates guide information for guiding the position and inclination of the probe 10 being used for imaging to the target state. The target position data here is the position and inclination of the ultrasonic probe at which the region of interest is included in the scan plane of the probe 10. More specifically, the guide information calculation unit 20 calculates the relative position of the probe 10 with respect to the region of interest at a set time interval (for example, in real time), and then generates guide information based on the calculated relative position. The guide information is a guide image, character image, or the like indicating the target movement direction, target movement amount, target inclination angle, or target rotation direction of the probe 10, but is not limited thereto. In short, the guide information may be any objective index indicating how far and in what direction the position of the probe 10 being used for imaging should be moved so that the region of interest can be redrawn in the ultrasonic tomographic image.

  Note that although the guide information calculation unit 20 calculates the relative position of the probe 10 with respect to the region of interest, the same result is obtained if the relative position of the scan plane of the probe 10 with respect to the region of interest is calculated instead. Therefore, in this embodiment, the relative position of the probe 10 with respect to the region of interest is assumed to include the relative position of the scan plane of the probe 10 with respect to the region of interest. In addition, for cases where a bone or an organ makes it difficult to redraw the region of interest on an ultrasonic tomographic image, the guide information calculation unit 20 can generate 3D guide information that displays the relative positional relationship between the bone, the region of interest, and the scan plane of the probe 10.
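The relative-position calculation described above can be illustrated with a minimal Python sketch. All function and variable names here are hypothetical, and the tilt computation is one plausible choice; the patent does not specify the actual computation performed by the guide information calculation unit 20.

```python
import math

def guide_info(probe_pos, probe_tilt_deg, roi_pos):
    """Sketch of the guide-information calculation: per-axis target
    distances from the probe to the region of interest, plus the
    remaining tilt toward the ROI in the X-Z plane (illustrative)."""
    dx = roi_pos[0] - probe_pos[0]  # X: longitudinal direction of the subject
    dy = roi_pos[1] - probe_pos[1]  # Y: lateral direction
    dz = roi_pos[2] - probe_pos[2]  # Z: depth direction
    target_tilt = math.degrees(math.atan2(dx, dz)) - probe_tilt_deg
    return {"dx": dx, "dy": dy, "dz": dz, "tilt": target_tilt}

# The operator adjusts the probe until the in-plane target distances reach zero.
info = guide_info((0.0, 0.0, 0.0), 0.0, (12.5, 5.8, 18.5))
print(info["dx"], info["dy"], info["dz"])  # 12.5 5.8 18.5
```

Recomputing this at each set time interval (for example, in real time) yields guide information that tracks the probe as it moves.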

<Display process: S202>
The display control unit 36 simultaneously displays, on the screen of the image display unit 16, the ultrasonic tomographic image read from the image memory control unit 24, the reference image read from the reference image configuration unit 30, and the guide information read from the guide information calculation unit 20. The ultrasonic tomographic image here changes as the position and inclination of the probe 10 change, and can be called, for example, a post-treatment image. The reference image, displayed in the same cross section as the ultrasonic tomographic image, is updated following changes in the ultrasonic tomographic image, and can be called, for example, a pre-treatment image. By comparing the ultrasonic tomographic image with the reference image, the state of the subject H before treatment can be compared with that after (or during) treatment, and minimally invasive treatment can be performed accurately. Further, the guide information is updated as needed as the position and inclination of the probe 10 change. By adjusting the position and inclination of the probe 10 while referring to this guide information, it is possible to monitor whether the scan plane of the probe 10 is approaching or moving away from the region of interest.

  As described above, according to the present embodiment, the guide information is displayed on the image display unit 16 as an objective index for guiding the position and inclination of the probe 10 being used for imaging to the target state. By visually recognizing this guide information, the target movement direction, target movement amount, target inclination angle, and so on of the probe 10 can be grasped quantitatively. As a result, when the position and inclination of the probe 10 are adjusted according to the guide information, the region of interest can be redrawn on the ultrasonic image accurately and easily, regardless of differences between operators. That is, redrawing the region of interest on the ultrasonic tomographic image being captured becomes easy and accurate, and usability for the operator is improved.

  FIG. 4 is a display example of a screen for setting a region of interest in volume data. This setting screen is displayed in step S102 of FIG. In the setting screen shown in FIG. 4, a plurality of tomographic images with different cross-sectional directions are displayed side by side. These tomographic images are constructed from volume data relating to the subject H. More specifically, the setting screen shown in FIG. 4 includes a tomographic image display area 52 in which a short-axis cross section of the subject H is drawn, a tomographic image display area 54 in which a long-axis cross section of the subject H is drawn, and a tomographic image display area 55 in which a circular cross section of the subject H is drawn.

  In addition, the setting screen includes a display area 56 for a three-dimensional stereoscopic image constructed from the volume data relating to the subject H by, for example, rendering processing, and a display area 60 for a composite image of the three-dimensional stereoscopic image and a slice plane image 58 that determines a cross section of the three-dimensional stereoscopic image. The setting screen also has a tomographic image display area 59 for region-of-interest setting, showing an image selected from the tomographic images displayed in the plurality of display areas 52, 54, and 55.

  Further, the setting screen includes a GUI menu 64 for setting a region of interest via the operation panel 32. The GUI menu 64 includes a button for moving a designated point of the region of interest, an input field for designating the region of interest using three-dimensional orthogonal coordinates having an X axis, a Y axis, and a Z axis, and a region of interest determination button.

  With this setting screen, the treatment site is set as a region of interest in the volume data before treatment of the subject H. First, a three-dimensional stereoscopic image and a slice plane image 58 are displayed in the display area 60. By adjusting the position and inclination of the slice plane image 58 via the operation panel 32, a plurality of cross sections of the three-dimensional stereoscopic image are determined, and the tomographic images corresponding to the respective cross sections are displayed in the display areas 52, 54, and 55. When a desired tomographic image is selected from these tomographic images, the selected image is displayed in the display area 59. Next, when the position of the region of interest is input to the GUI menu 64 while referring to the tomographic image in the display area 59, a mark corresponding to the region of interest is displayed on the tomographic image. To move the mark of the region of interest, X-axis, Y-axis, and Z-axis coordinates may be input via the GUI menu 64, or the mark may follow the movement of a mouse or the like. When the region-of-interest determination button is clicked, the position of the region of interest is determined. With this setting screen, the operation of setting the region of interest on the subject H to be treated or diagnosed can be performed interactively, so that convenience for the operator is improved.

  FIG. 5 is a display example in which the region of interest of the subject H is redrawn in an ultrasonic image. The ultrasonic image shown in FIG. 5 is displayed in step S103 of FIG. As shown in FIG. 5, the display screen includes an ultrasonic tomographic image display area 68 corresponding to the scan plane of the probe 10 being used for imaging, and a reference image display area 66 having the same cross section as the ultrasonic image of the display area 68. The display screen also includes a guide information display area 70 for guiding the scan plane of the probe 10 to the position of the region of interest, and a display area 71 in which the position mark of the probe 10 is displayed over a body mark relating to the subject H.

  For example, when the imaging of the present embodiment is started after the subject H is treated, an ultrasonic tomographic image corresponding to the scan surface of the probe 10 is displayed in the display area 68. A reference image having the same cross section as the ultrasonic tomographic image in the display area 68 is simultaneously displayed in the display area 66. The ultrasonic tomogram in the display area 68 is a drawing of the current tissue of the region of interest of the subject H, and is updated in real time according to changes in the position and tilt of the probe 10. The reference image in the display area 66 is a drawing of a past tissue in the region of interest of the subject H, and is updated following the change in the ultrasonic tomographic image in the display area 68.

  When the position and inclination of the probe 10 are adjusted while referring to the guide information in the display area 70, the current region of interest is drawn on the ultrasonic tomographic image in the display area 68, and the past region of interest is drawn in the reference image in the display area 66. The region of interest here is the one set in the process of S102 in FIG. By comparing the ultrasonic tomographic image in the display area 68 with the reference image in the display area 66, for example, the therapeutic effect on the region of interest and the cure of the disease can be confirmed.

  In the example of FIG. 5, the display area 66 and the display area 68 are arranged side by side, a body mark is displayed over a part of the display area 66, and guide information is displayed over a part of the display area 68. However, the display positions are not limited to this form, and can be changed, in accordance with a command input to the operation panel 32, within a range that does not hinder diagnosis or treatment.

  FIG. 6 is a diagram illustrating an example in which guide information for guiding the scan plane of the probe 10 to the position of the region of interest is displayed based on the relative position of the probe 10 with respect to the region of interest. As shown in FIG. 6A, numerical values and figures are displayed as guide information. As numerical guide information, a target distance 80 in the vertical direction (X-axis direction) of the probe 10, a target distance 82 in the horizontal direction (Y-axis direction), a target distance 84 in the depth direction (Z-axis direction), and a target angle 86 indicating the inclination of the probe are displayed. Here, the X-axis direction corresponds to the longitudinal direction of the subject H, the Y-axis direction corresponds to the lateral direction of the subject H orthogonal to the X-axis direction, and the Z-axis direction, orthogonal to both the X axis and the Y axis, corresponds to the depth direction of the subject H. Further, as graphic guide information, an image obtained by combining a scan plane image 74 and a three-dimensional stereoscopic image 76 in the same coordinate system is displayed. The three-dimensional stereoscopic image 76 is a rendered image in which a surface is drawn based on the volume data relating to the subject H. The scan plane image 74 is a plate-like image corresponding to the scan plane of the probe 10, and is displayed inserted into the three-dimensional stereoscopic image 76. Here, for a pixel at a coordinate where the three-dimensional stereoscopic image 76 and the scan plane image 74 overlap, the image nearer in the viewing direction is displayed. Therefore, the positional relationship in the line-of-sight direction, that is, the depth direction, between the three-dimensional stereoscopic image 76 and the scan plane image 74 can be easily grasped. Further, the relative position of the probe 10 with respect to the three-dimensional stereoscopic image 76 may be displayed.

  For example, the target distances 80, 82, and 84 and the target angle 86 in FIG. 6 are guide information based on the relative position of the probe 10 with respect to the region of interest. In this example, the target distance 80 is displayed as 12.5 mm, the target distance 82 as 5.8 mm, the target distance 84 as 18.5 mm, and the target angle 86 as 60°. In this case, the probe 10 is moved by 12.5 mm in the X-axis direction while visually checking the target distance 80, moved by 5.8 mm in the Y-axis direction while visually checking the target distance 82, and inclined by 60° with respect to the Z axis while visually checking the target angle 86. That is, the position and inclination of the probe 10 are adjusted so that the target distances 80 and 82 and the target angle 86 become zero. Then, by performing ultrasonic beam scanning at a depth of 18.5 mm in the Z-axis direction, an ultrasonic tomographic image and a reference image in which the region of interest 72 is drawn can be acquired. When the region of interest 72 is drawn, for example, the guide information may be blinked.

  Further, when the position and inclination of the probe 10 are adjusted, as shown in FIG. 6B, the scan plane image 74 moves (for example, tilts) following the change of the probe 10, so that the scan plane and The relative position of the region of interest can be visually grasped. By displaying the guide information in this way, it becomes accurate and easy to redraw the region of interest of the subject H on the ultrasonic tomographic image, so that the usability for the operator is improved.

  Moreover, although a form in which numerical values or figures are displayed as guide information has been described, sound can also be applied as guide information instead of, or in addition to, them. For example, in the process of adjusting the position and inclination of the probe 10, when the position of the region of interest 72 comes to be included in the scan plane of the probe 10, a notification sound is generated from the sound generator 35. In addition, when the sound generator 35 generates sound intermittently, the sound generation interval can be shortened as the scan plane of the probe 10 approaches the region of interest and, conversely, lengthened as the scan plane of the probe 10 moves away from it. As a result, the positional deviation of the scan plane with respect to the region of interest can be grasped easily just by listening to the sound generation interval, which improves usability for the operator.
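The intermittent-sound behavior above amounts to a mapping from scan-plane-to-ROI distance to beep interval. The sketch below is a minimal illustration; the specific scaling, limits, and names are assumptions, not taken from the patent.

```python
def beep_interval_ms(distance_mm, min_interval=100.0, max_interval=1000.0,
                     max_distance=50.0):
    """Map the scan-plane-to-ROI distance to a beep interval:
    shorter interval as the scan plane approaches the region of interest,
    longer interval as it moves away (clamped to [min, max])."""
    if distance_mm <= 0.0:
        return min_interval  # ROI lies in the scan plane: fastest beeping
    frac = min(distance_mm / max_distance, 1.0)
    return min_interval + frac * (max_interval - min_interval)

print(beep_interval_ms(0.0))   # 100.0
print(beep_interval_ms(25.0))  # 550.0
print(beep_interval_ms(80.0))  # 1000.0 (clamped)
```

A linear ramp is one design choice; any monotonic mapping preserves the property that the operator can judge the deviation by ear alone.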

  FIG. 7 is a diagram showing a form in which a model image 88 is displayed as guide information in addition to the display form of FIG. 6. As shown in FIG. 7, the model image 88 is an image that schematically represents the three-dimensional stereoscopic image 76, and is constructed from the volume data relating to the subject H. More specifically, the model image 88 is a lattice-shaped transparent image in which solid lines are drawn in units of volume data voxels. That is, the model image 88 differs from the three-dimensional stereoscopic image 76, in which only the surface of the subject H is displayed, in that the depth direction of the subject H is displayed transparently. The model image 88 here is displayed combined with the scan plane image 74 and the region of interest 72 in the same coordinate system. Accordingly, the relative positions of the subject H, the scan plane of the probe 10, and the region of interest can be grasped even more easily.

  For example, as in the case of FIG. 6, the probe 10 is moved by 12.5 mm in the X-axis direction while visually checking the target distance 80, moved by 5.8 mm in the Y-axis direction while visually checking the target distance 82, and inclined by 60° with respect to the Z axis while visually checking the target angle 86. Then, the region of interest 72 is drawn on the ultrasonic tomographic image by scanning the ultrasonic beam at a depth of 18.5 mm in the Z-axis direction. In the process of adjusting the position and inclination of the probe 10 in this way, in the example shown in FIG. 7, the relative positional relationship between the scan plane of the probe 10 and the region of interest 72 is grasped visually on the model image 88. This makes the operation of adjusting the position and inclination of the probe 10 even easier.

  FIG. 8 is a diagram illustrating a display example of other guide information based on the relative position of the probe 10 with respect to the region of interest. For example, as shown in FIG. 8, rotation arrows 92a and 92b are displayed. The rotation arrows 92a and 92b indicate the direction in which the probe 10 should be rotated with respect to the region of interest 72, for example, around the Z axis in the forward or reverse direction. An arrow 93 indicating the target movement direction of the probe 10 in the X-axis direction and an arrow 94 indicating the target movement direction in the Y-axis direction are also displayed. The arrows 93 and 94 change in width and color according to the distance of the probe 10 from the region of interest 72. For example, when the probe 10 and the region of interest 72 are relatively close, the widths of the arrows 93 and 94 are reduced, or the arrows 93 and 94 are displayed in blue. Conversely, when the probe 10 and the region of interest 72 are relatively far apart, the widths of the arrows 93 and 94 are increased, or the arrows 93 and 94 are displayed in red.

  Further, a three-dimensional arrow indicating the target position of the probe 10 in three dimensions may be displayed. The three-dimensional arrow is a vector image constructed based on the target distances 80, 82, and 84 and the target angle 86 of the probe 10. The three-dimensional arrow changes in width and color according to the three-dimensional distance of the probe 10 from the region of interest 72. The three-dimensional distance here is obtained from the Pythagorean theorem based on the target distances 80, 82, and 84.
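The three-dimensional distance and the width/color change described here can be sketched as follows. The threshold and style values are assumptions for illustration; the patent specifies only that width and color vary with distance.

```python
import math

def probe_distance_3d(dx, dy, dz):
    """Three-dimensional distance of the probe from the region of interest,
    from the Pythagorean theorem applied to the per-axis target distances."""
    return math.sqrt(dx * dx + dy * dy + dz * dz)

def arrow_style(distance_mm, near_mm=10.0):
    """Illustrative mapping of distance to arrow width and color:
    thin/blue when close, thick/red when far (threshold is an assumption)."""
    if distance_mm < near_mm:
        return {"width": 2, "color": "blue"}
    return {"width": 6, "color": "red"}

d = probe_distance_3d(12.5, 5.8, 18.5)
print(round(d, 1))              # 23.1
print(arrow_style(d)["color"])  # red
```

A continuous mapping (for example, width proportional to distance) would serve equally well; a two-level threshold keeps the sketch minimal.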

  FIG. 9 is a diagram showing another display example of a screen for setting a region of interest in volume data. FIG. 9 shows a setting screen used when the region of interest has a predetermined extent, and differs in this respect from the setting screen of FIG. 4, in which the region of interest is a point. Therefore, the description focuses on the differences from the setting screen of FIG. 4.

  As shown in FIG. 9A, the setting screen includes display areas 59, 52, and 54 for tomographic images of different cross sections of the subject H. In the display area 59, a caliper mark is displayed superimposed on the tomographic image. Next, as shown in FIG. 9B, the caliper mark is aligned with the position of the region of interest 100 via the operation panel 32. When a region-of-interest determination command is input from the operation panel 32, the tissue range including the position of the caliper mark is set as the region of interest 100, as shown in FIG. Here, with reference to the luminance information and tissue elasticity information at the position of the caliper mark, a pixel region whose luminance information and tissue elasticity information fall within a setting range is automatically set as the region of interest 100. For example, an image region displayed with the same luminance as the position of the caliper mark is set as the region of interest 100. In addition, it is preferable that the setting of the region of interest 100 be recognizable, for example, by displaying the region of interest 100 in a predetermined color.
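The automatic setting of a pixel region around the caliper mark could be sketched, for the luminance case, as a simple flood fill over a toy 2D image. The tolerance value and 4-connectivity are assumptions; the patent does not specify the extraction algorithm.

```python
from collections import deque

def grow_region(image, seed, tol=10):
    """Flood-fill from the caliper-mark position: collect the connected
    pixel region whose luminance lies within +/- tol of the seed value."""
    rows, cols = len(image), len(image[0])
    seed_val = image[seed[0]][seed[1]]
    roi, queue = set(), deque([seed])
    while queue:
        r, c = queue.popleft()
        if (r, c) in roi or not (0 <= r < rows and 0 <= c < cols):
            continue
        if abs(image[r][c] - seed_val) > tol:
            continue
        roi.add((r, c))
        queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return roi

image = [
    [200, 200,  50,  50],
    [200, 205,  50,  50],
    [ 60,  60, 210,  50],
]
roi = grow_region(image, (0, 0))
print(sorted(roi))  # [(0, 0), (0, 1), (1, 0), (1, 1)]
```

Note that the pixel at (2, 2) has a luminance within tolerance but is not connected to the seed, so it is excluded, which matches the notion of a single tissue range around the caliper mark.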

  When the region of interest 100 is determined, a target cross section 102 including the region of interest 100 is designated, as shown in FIG. The target cross section is the cross section to which the scan plane of the probe 10 being used for imaging is guided when ultrasonic imaging is performed after treatment of the subject H. For example, a cross section in which the region of interest 100 is drawn at its maximum is designated. However, any cross section necessary for diagnosis or treatment can be designated as appropriate, and a new target cross section 102a can be set by moving or rotating the target cross section 102, as shown, for example, in FIG. 9D.

  The setting of the target cross section 102 in FIG. 9D will now be described. FIG. 10 is a diagram showing a display example of a target cross section setting screen. FIG. 10A shows the region of interest 100 extracted from the display area 54 in FIG. 9D for convenience of explanation. When the target cross section 102 is designated for the region of interest 100, a tomographic image 104 corresponding to the target cross section 102 is displayed. The region of interest 100 here is drawn relatively small on the tomographic image 104, as shown in FIG. Next, when the target cross section 102 is changed and a new target cross section 102a is designated, a tomographic image 104a corresponding to the target cross section 102a is displayed. As shown in FIG. 10B, the region of interest 100 is drawn relatively large on the tomographic image 104a. When the tomographic image 104a is determined to be suitable for diagnosis or treatment, the target cross section 102a is selected instead of the target cross section 102. That is, the target cross section 102a is set as the cross section to which the scan plane of the probe 10 being used for imaging is guided when ultrasonic imaging is performed after treatment of the subject H.

  FIG. 11 is a diagram showing an example in which the setting screens of FIGS. 9 and 10 are displayed side by side with the display screen of FIG. 5. As shown in FIG. 11, the display screen includes an ultrasonic tomographic image display area 68 corresponding to the scan plane of the probe 10 being used for imaging, and a reference image display area 66 having the same cross section as the ultrasonic image of the display area 68. A setting screen is arranged alongside this display screen. The setting screen has display areas 59, 52, and 54 for tomographic images of different cross sections of the subject H, and a display area for the tomographic image 104a corresponding to the target cross section 102a.

According to the setting screens shown in FIGS. 9 to 11, even when the region of interest 100 has a predetermined extent, the target cross section 102a in which the region of interest 100 is appropriately drawn can be set interactively, so that usability is improved. For example, the target cross section 102a in which the region of interest 100 is drawn at its maximum can be set easily. During or after treatment of the subject H, guide information for guiding the scan plane of the probe 10 to the target cross section 102a is displayed.

  Although the present invention has been described above by way of the embodiment, it is not limited thereto. When a region of interest with motion (for example, a circulatory organ such as the heart or a blood vessel) is diagnosed or treated, it is desirable to associate not only the coordinates of the subject H but also motion data (for example, cardiac phase data or pulse wave data) with each voxel of the volume data. Thereby, even when the shape of the region of interest changes with time, a reference image having the same cross section and the same time phase as the ultrasonic tomographic image can be displayed.
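Selecting a reference image with the same time phase could be sketched as picking the stored frame whose cardiac phase is circularly nearest to the current phase. The names and the phase-in-[0, 1) representation are hypothetical illustrations, not taken from the patent.

```python
def nearest_phase_frame(frames, current_phase):
    """frames: list of (cardiac_phase, volume) pairs, phase in [0, 1).
    Return the volume whose phase is circularly nearest to current_phase."""
    def circ_dist(a, b):
        # Distance on the unit circle of phases (0.95 and 0.05 are close).
        d = abs(a - b) % 1.0
        return min(d, 1.0 - d)
    return min(frames, key=lambda f: circ_dist(f[0], current_phase))[1]

frames = [(0.0, "end-diastole"), (0.3, "mid-systole"), (0.7, "end-systole")]
print(nearest_phase_frame(frames, 0.9))  # end-diastole
```

The circular distance matters because the cardiac cycle wraps: a phase of 0.9 is nearer to 0.0 than to 0.7.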

  In addition, the ultrasonic apparatus according to the present embodiment applies the so-called RVS (Real-time Virtual Sonography) technique, which displays an ultrasonic image being captured and simultaneously extracts, from volume data, and displays a reference image having the same cross section as the ultrasonic image. However, the present invention is not limited to ultrasonic apparatuses to which the RVS technique is applied. In short, the present invention can be applied to any case where a region of interest is preset in volume data relating to the subject H and the current scan plane of the probe 10 needs to be guided to the position of the region of interest.

  Further, the control functions necessary for the ultrasonic imaging according to the present embodiment have been described in units of blocks as shown in FIG. 1, but these control functions may also be aggregated into an ultrasonic imaging program and executed by a control computer. For example, the ultrasonic imaging program causes the control computer to execute a procedure for supplying a driving signal for transmission to the probe 10, which transmits and receives ultrasonic waves to and from the subject H, a procedure for processing a reception signal output from the probe 10, a procedure for constructing an ultrasonic image based on the processed reception signal, and a procedure for displaying the ultrasonic image. This ultrasonic imaging program further causes the control computer to execute a procedure for generating, using the position data of the probe 10 and the position data of the region of interest set in volume data relating to the subject H acquired in advance, guide information for guiding the scan plane of the probe 10 to the position of the region of interest, and displaying it on the image display unit 16.

  As described above, according to the present embodiment, it is possible to realize an ultrasonic apparatus that is more suitable for redrawing the region of interest of the subject in an ultrasonic image.

  As described above, an ultrasonic apparatus according to an embodiment to which the present invention is applied has been described. However, the ultrasonic apparatus to which the present invention is applied can be implemented in various other forms without departing from its spirit or main features. Therefore, the above-described embodiment is merely an illustration in all respects and should not be construed restrictively. That is, the scope of the present invention includes modifications and changes belonging to its range of equivalents.

Claims (17)

  1. An ultrasonic apparatus comprising: an ultrasonic probe for transmitting and receiving ultrasonic waves to and from a subject; transmission means for supplying a driving signal for transmission to the ultrasonic probe; reception means for processing a reception signal output from the ultrasonic probe; image construction means for constructing an ultrasonic image based on a signal output from the reception means; and display means for displaying the ultrasonic image, the ultrasonic apparatus further comprising:
    acquisition means for capturing volume data relating to the subject; region-of-interest setting means for constructing a tomographic image based on the volume data and for setting, as a region of interest, a pixel region in which at least one of luminance information and tissue elasticity information is included in a setting range, with reference to at least one of the luminance information and the tissue elasticity information of the tomographic image at the position of a mark superimposed on the tomographic image; and
    guide information generating means for generating, based on position data of the ultrasonic probe and position data of the set region of interest, guide information for guiding the scan plane of the ultrasonic probe to the position of the region of interest, displaying the guide information on the display means, and changing a color of the guide information according to a distance of the ultrasonic probe with respect to the region of interest.
  2.   The ultrasonic apparatus according to claim 1, wherein the guide information generating means captures the position data of the ultrasonic probe at set time intervals, calculates a relative position of the ultrasonic probe with respect to the region of interest based on the position data of the ultrasonic probe and the position data of the region of interest, and generates the guide information from the relative position.
  3. The guide information generating means generates and displays an arrow image indicating the target movement direction of the ultrasonic probe as the guide information based on the position data of the ultrasonic probe and the position data of the region of interest. The ultrasonic apparatus according to claim 1, wherein a width and a color of an arrow of the arrow image are changed according to a distance of the ultrasonic probe with respect to the region of interest.
  4.   The ultrasonic apparatus according to claim 1, wherein the guide information generating means calculates a relative position of the ultrasonic probe with respect to the region of interest based on the position data of the ultrasonic probe and the position data of the region of interest, and outputs to sound generating means a command for changing a sound generation interval in accordance with a change in a magnitude of the relative position.
  5.   The ultrasonic apparatus according to claim 1, wherein the guide information includes at least one of a movement direction, a movement amount, a tilt angle, and a rotation direction of the ultrasonic probe, and is information for guiding the scan plane of the ultrasonic probe to a tomographic plane in which the region of interest of the volume data is set.
  6.   The ultrasound apparatus according to claim 1, wherein the display unit displays an ultrasound image corresponding to a scan surface of the ultrasound probe and the guide information on the same screen.
  7. The display means displays an ultrasound image corresponding to the scan surface of the ultrasound probe, a reference image having the same cross section as the ultrasound image, based on the volume data, and the guide information on the same screen. The ultrasonic device according to claim 1 .
  8. The ultrasonic apparatus according to claim 1, wherein the display means synthesizes and displays, in the same coordinate system, an image corresponding to the scan plane of the ultrasonic probe on a three-dimensional image constructed based on the volume data.
  9. The ultrasonic apparatus according to claim 1, wherein the display means displays a lattice-shaped transparent model image constructed based on the volume data, combined in the same coordinate system with an image corresponding to the scan plane of the ultrasonic probe and an image corresponding to the region of interest.
  10. The ultrasonic apparatus according to claim 1, wherein the acquisition means acquires the volume data constructed by at least one of an ultrasonic imaging apparatus, an X-ray CT imaging apparatus, and a magnetic resonance imaging apparatus.
  11. An ultrasonic imaging program for causing a control computer to execute a procedure for supplying a driving signal for transmission to an ultrasonic probe that transmits and receives ultrasonic waves to and from a subject, a procedure for processing a reception signal output from the ultrasonic probe, a procedure for constructing an ultrasonic image based on the processed reception signal, and a procedure for displaying the ultrasonic image,
    the program further causing the control computer to execute: a procedure for capturing volume data relating to the subject; a procedure for constructing a tomographic image based on the volume data and setting, as a region of interest, a pixel region in which at least one of luminance information and tissue elasticity information is included in a setting range, with reference to at least one of the luminance information and the tissue elasticity information of the tomographic image at the position of a mark superimposed on the tomographic image; and
    a procedure for generating, based on position data of the ultrasonic probe and position data of the set region of interest, guide information for guiding the scan plane of the ultrasonic probe to the position of the region of interest, displaying the guide information on display means, and changing a color of the guide information according to a distance of the ultrasonic probe with respect to the region of interest.
  12. An ultrasonic imaging method comprising the steps of supplying a driving signal for transmission to an ultrasonic probe that transmits and receives ultrasonic waves to and from a subject, processing a reception signal output from the ultrasonic probe, constructing an ultrasonic image based on the processed reception signal, and displaying the ultrasonic image,
    the method further comprising: capturing volume data relating to the subject; constructing a tomographic image based on the volume data and setting, as a region of interest, a pixel region in which at least one of luminance information and tissue elasticity information is included in a setting range, with reference to at least one of the luminance information and the tissue elasticity information of the tomographic image at the position of a mark superimposed on the tomographic image; and
    generating, based on position data of the ultrasonic probe and position data of the set region of interest, guide information for guiding the scan plane of the ultrasonic probe to the position of the region of interest, displaying the guide information on display means, and changing a color of the guide information according to a distance of the ultrasonic probe with respect to the region of interest.
  13. The ultrasonic imaging method according to claim 12, wherein the step of generating the guide information calculates a relative position of the ultrasonic probe with respect to the region of interest based on the position data of the ultrasonic probe and the position data of the region of interest, and generates the guide information based on the relative position.
  14. The ultrasonic imaging method according to claim 12, wherein the step of generating the guide information calculates a relative position of the ultrasonic probe with respect to the region of interest based on the position data of the ultrasonic probe and the position data of the region of interest, and generates a command that changes a sound generation interval in accordance with changes in the magnitude of the relative position.
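The relative-position calculation of claims 13 and 14 could be sketched as follows. This is a minimal illustration, not the patent's implementation: the coordinate frame, the linear distance-to-interval mapping, and all constants (`near_ms`, `far_ms`, `max_mm`) are illustrative assumptions.

```python
import math

def relative_position(probe_pos, roi_pos):
    """Vector from the probe to the region of interest, assuming both
    positions are given in the same position-sensor coordinate frame."""
    return tuple(r - p for p, r in zip(probe_pos, roi_pos))

def distance(probe_pos, roi_pos):
    """Euclidean magnitude of the relative position."""
    return math.sqrt(sum(d * d for d in relative_position(probe_pos, roi_pos)))

def beep_interval_ms(dist_mm, near_ms=100, far_ms=1000, max_mm=100.0):
    """Sound generation interval that shortens as the probe nears the ROI
    (illustrative linear mapping; the patent only requires the interval
    to follow changes in the magnitude of the relative position)."""
    t = min(dist_mm, max_mm) / max_mm      # 0 at the ROI, 1 at/beyond max range
    return near_ms + t * (far_ms - near_ms)
```

As the operator moves the probe toward the region of interest, `distance` shrinks and the beeps speed up, giving audible guidance without looking away from the screen.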
  15. The ultrasonic imaging method according to claim 12, wherein the step of generating the guide information generates, as the guide information, an arrow image indicating a target movement direction of the ultrasonic probe based on the position data of the ultrasonic probe and the position data of the region of interest, and changes a width and a color of the arrow in the arrow image in accordance with the distance of the ultrasonic probe from the region of interest.
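The distance-dependent arrow styling of claim 15 could look like the sketch below. The pixel widths, the green-to-red gradient, and the `max_mm` normalization range are illustrative assumptions, not values from the patent.

```python
def arrow_style(dist_mm, max_mm=100.0):
    """Map probe-to-ROI distance to an arrow width (px) and RGB color:
    narrow and green when the probe is close, wide and red when far."""
    t = max(0.0, min(dist_mm / max_mm, 1.0))        # normalize to [0, 1]
    width = 2 + int(t * 18)                         # 2 px (close) .. 20 px (far)
    color = (int(255 * t), int(255 * (1.0 - t)), 0) # green -> red
    return width, color
```

A renderer would then draw the arrow along the target movement direction (the normalized relative-position vector) with this width and color on each frame.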
  16. The ultrasonic imaging method according to claim 12, wherein the guide information includes at least one of a movement direction, a movement amount, a tilt angle, and a rotation direction of the ultrasonic probe, and is information for guiding the scan plane of the ultrasonic probe to a tomographic plane in which the region of interest of the volume data is set.
  17. The ultrasonic imaging method according to claim 12, wherein in the step of displaying the ultrasonic image, an ultrasonic image corresponding to the scan plane of the ultrasonic probe and the guide information are displayed on the same screen.
JP2006547993A 2004-12-03 2005-12-01 Ultrasonic device, ultrasonic imaging program, and ultrasonic imaging method Expired - Fee Related JP4470187B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2004351489 2004-12-03
PCT/JP2005/022056 WO2006059668A1 (en) 2004-12-03 2005-12-01 Ultrasonic device, ultrasonic imaging program, and ultrasonic imaging method

Publications (2)

Publication Number Publication Date
JPWO2006059668A1 JPWO2006059668A1 (en) 2008-06-05
JP4470187B2 true JP4470187B2 (en) 2010-06-02

Family

ID=36565101

Family Applications (2)

Application Number Title Priority Date Filing Date
JP2006547993A Expired - Fee Related JP4470187B2 (en) 2004-12-03 2005-12-01 Ultrasonic device, ultrasonic imaging program, and ultrasonic imaging method
JP2009277262A Expired - Fee Related JP5230589B2 (en) 2004-12-03 2009-12-07 Ultrasonic device, ultrasonic imaging program, and ultrasonic imaging method

Family Applications After (1)

Application Number Title Priority Date Filing Date
JP2009277262A Expired - Fee Related JP5230589B2 (en) 2004-12-03 2009-12-07 Ultrasonic device, ultrasonic imaging program, and ultrasonic imaging method

Country Status (2)

Country Link
JP (2) JP4470187B2 (en)
WO (1) WO2006059668A1 (en)


Families Citing this family (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5148094B2 (en) 2006-09-27 2013-02-20 株式会社東芝 Ultrasonic diagnostic apparatus, medical image processing apparatus, and program
JP2008142151A (en) * 2006-12-07 2008-06-26 Matsushita Electric Ind Co Ltd Ultrasonic diagnostic apparatus and ultrasonic diagnostic system
JP2010514524A (en) * 2006-12-29 2010-05-06 Verathon Inc. Ultrasound harmonic imaging system and method
JP4936281B2 (en) * 2007-01-24 2012-05-23 国立大学法人名古屋大学 Ultrasonic diagnostic equipment
EP2155068B1 (en) * 2007-05-16 2016-12-21 Verathon, Inc. System and method for ultrasonic harmonic imaging
JP5127371B2 (en) 2007-08-31 2013-01-23 キヤノン株式会社 Ultrasound image diagnostic system and control method thereof
JP4935606B2 (en) * 2007-09-28 2012-05-23 オムロン株式会社 Imaging system
JP2009125181A (en) * 2007-11-21 2009-06-11 Hitachi Medical Corp Ultrasonic diagnostic system
CN102159138B (en) * 2008-07-15 2013-12-25 株式会社日立医疗器械 Ultrasound diagnostic device and method for displaying probe operation guide of same
JP5400466B2 (en) * 2009-05-01 2014-01-29 キヤノン株式会社 Diagnostic imaging apparatus and diagnostic imaging method
KR101121245B1 (en) 2009-08-20 2012-03-23 삼성메디슨 주식회사 Ultrasound system and method for providing elastic change information
JP5538861B2 (en) * 2009-12-18 2014-07-02 キヤノン株式会社 Information processing apparatus, information processing method, information processing system, and program
JP5606076B2 (en) * 2010-01-08 2014-10-15 株式会社東芝 Ultrasonic diagnostic apparatus, medical image processing apparatus, and medical image processing program
FR2957514B1 (en) * 2010-03-17 2013-03-22 Gen Electric Medical imaging device comprising radiographic acquisition means and guiding means for an ultrasonic probe
US9241685B2 (en) 2010-08-06 2016-01-26 Hitachi Medical Corporation Ultrasonic imaging apparatus and three-dimensional image display method using ultrasonic image
JPWO2012164892A1 (en) * 2011-05-30 2015-02-23 コニカミノルタ株式会社 Ultrasonic diagnostic apparatus and image acquisition method using ultrasonic wave
JP6071282B2 (en) * 2011-08-31 2017-02-01 キヤノン株式会社 Information processing apparatus, ultrasonic imaging apparatus, and information processing method
JP5682873B2 (en) * 2011-09-27 2015-03-11 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Ultrasonic diagnostic equipment
JP2012055765A (en) * 2011-12-22 2012-03-22 Toshiba Corp Ultrasonic diagnostic device and program
JP5995449B2 (en) * 2012-01-24 2016-09-21 キヤノン株式会社 Information processing apparatus and control method thereof
JP6039903B2 (en) 2012-01-27 2016-12-07 キヤノン株式会社 Image processing apparatus and operation method thereof
KR101386102B1 (en) 2012-03-09 2014-04-16 삼성메디슨 주식회사 Method for providing ultrasound images and ultrasound apparatus thereof
JP6068017B2 (en) * 2012-06-21 2017-01-25 東芝メディカルシステムズ株式会社 Ultrasonic diagnostic apparatus and image generation program
JP5677399B2 (en) * 2012-10-31 2015-02-25 キヤノン株式会社 Information processing apparatus, information processing system, information processing method, and program
KR101538658B1 (en) * 2012-11-20 2015-07-22 삼성메디슨 주식회사 Medical image display method and apparatus
JP6457054B2 (en) * 2013-01-23 2019-01-23 キヤノンメディカルシステムズ株式会社 Ultrasonic diagnostic equipment
JP6129577B2 (en) * 2013-02-20 2017-05-17 東芝メディカルシステムズ株式会社 Ultrasonic diagnostic apparatus and medical image diagnostic apparatus
JP6202841B2 (en) * 2013-03-18 2017-09-27 東芝メディカルシステムズ株式会社 Ultrasonic diagnostic equipment
JP6203514B2 (en) * 2013-03-29 2017-09-27 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Ultrasonic diagnostic apparatus and control program therefor
JP6081299B2 (en) * 2013-06-13 2017-02-15 東芝メディカルシステムズ株式会社 Ultrasonic diagnostic equipment
JP5701362B2 (en) * 2013-10-24 2015-04-15 キヤノン株式会社 Diagnostic imaging apparatus and diagnostic imaging method
JP6293452B2 (en) * 2013-10-30 2018-03-14 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Ultrasonic diagnostic apparatus and image analysis apparatus
KR101595718B1 (en) * 2014-02-04 2016-02-19 한국디지털병원수출사업협동조합 Scan position guide method of three dimensional ultrasound system
KR101611450B1 (en) 2014-03-04 2016-04-11 삼성메디슨 주식회사 Method and device for guiding measure in ultrasound diagnosis apparatus, storage medium thereof
JP6263447B2 (en) * 2014-06-30 2018-01-17 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Ultrasonic diagnostic apparatus and program
JP6510200B2 (en) * 2014-08-25 2019-05-08 キヤノンメディカルシステムズ株式会社 Ultrasonic diagnostic apparatus and control program
JP6338510B2 (en) * 2014-10-29 2018-06-06 キヤノン株式会社 Information processing apparatus, information processing method, information processing system, and program
CN105769468B (en) * 2016-01-30 2018-08-24 张美玲 A kind of ultrasound monitoring surgical operation care device
JP6263248B2 (en) * 2016-11-07 2018-01-17 キヤノン株式会社 Information processing apparatus, information processing method, and program
JP6355788B2 (en) * 2017-03-27 2018-07-11 キヤノン株式会社 Information processing apparatus, information processing method, information processing system, and program

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08110B2 (en) * 1989-01-24 1996-01-10 アロカ株式会社 Ultrasonic Doppler diagnostic equipment
JP3325300B2 (en) * 1992-02-28 2002-09-17 株式会社東芝 Ultrasound therapy device
JP4443672B2 (en) * 1998-10-14 2010-03-31 株式会社東芝 Ultrasonic diagnostic equipment
JP2001061861A (en) * 1999-06-28 2001-03-13 Siemens Ag System having image photographing means and medical work station
JP3864846B2 (en) * 2002-05-21 2007-01-10 株式会社デンソー Navigation device
JP4088104B2 (en) * 2002-06-12 2008-05-21 株式会社東芝 Ultrasonic diagnostic equipment
CN102512209B (en) * 2003-05-08 2015-11-11 株式会社日立医药 Ultrasonic diagnostic equipment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170075435A (en) * 2015-12-23 2017-07-03 지멘스 메디컬 솔루션즈 유에스에이, 인크. Ultrasound system and method for displaying ultrasound images
KR102030567B1 (en) 2015-12-23 2019-10-10 지멘스 메디컬 솔루션즈 유에스에이, 인크. Ultrasound system and method for displaying ultrasound images

Also Published As

Publication number Publication date
JP5230589B2 (en) 2013-07-10
JP2010051817A (en) 2010-03-11
WO2006059668A1 (en) 2006-06-08
JPWO2006059668A1 (en) 2008-06-05

Similar Documents

Publication Publication Date Title
US8226560B2 (en) Reference image display method for ultrasonography and ultrasonic diagnosis apparatus
JP4828802B2 (en) Ultrasonic diagnostic equipment for puncture therapy
EP2185077B1 (en) Ultrasonic diagnostic imaging system and control method thereof
US8123691B2 (en) Ultrasonic diagnostic apparatus for fixedly displaying a puncture probe during 2D imaging
JP2008514264A (en) Method and apparatus for performing ultrasonic diagnostic imaging of breast with high accuracy
JP2007000226A (en) Medical image diagnostic apparatus
JP2006167267A (en) Ultrasonograph
DE69831138T2 System for illustrating a two-dimensional ultrasound image in a three-dimensional image communication environment
JP2006006935A (en) Method and apparatus for real-time ultrasonic multiple surface imaging
JP2010183935A (en) Ultrasonic diagnostic apparatus and control program for ultrasonic diagnostic apparatus
KR101140525B1 (en) Method and apparatus for extending an ultrasound image field of view
JP5495357B2 (en) Image display method and medical image diagnostic system
JP5738507B2 (en) Ultrasonic probe trajectory expression device and ultrasonic diagnostic device
JP2005296436A (en) Ultrasonic diagnostic apparatus
US9202301B2 (en) Medical image display apparatus and X-ray diagnosis apparatus
WO2007114375A1 (en) Ultrasound diagnostic device and control method for ultrasound diagnostic device
JP2007319190A (en) Ultrasonograph, and medical image processing apparatus and program
JP5495593B2 (en) Ultrasonic diagnostic apparatus and puncture support control program
JP2010042151A (en) Ultrasonic diagnostic apparatus, ultrasonic image display device, and ultrasonic image display program
US7991108B2 (en) Medical image processing apparatus, ultrasound imaging apparatus, X-ray CT (computed tomography) apparatus, and method of processing medical image
US20100324422A1 (en) Image diagnosis apparatus and image diagnosis method
CN101779969B (en) Ultrasound diagnosis apparatus, medical image display apparatus and medical image displaying method
US9386964B2 (en) 3D view of 2D ultrasound images
US9173632B2 (en) Ultrasonic diagnosis system and image data display control program
JP2006271588A (en) Ultrasonic apparatus

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20081121

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20090714

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20090908

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20091006

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20091207

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20100126


A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20100218

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130312

Year of fee payment: 3

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140312

Year of fee payment: 4

LAPS Cancellation because of no payment of annual fees