US20090137907A1 - Imaging diagnosis apparatus having needling navigation control system and a needling navigation controlling method - Google Patents


Info

Publication number
US20090137907A1
US20090137907A1 · Application US12/275,886
Authority
US
United States
Prior art keywords
data
puncturing
region
puncturing needle
diagnosis apparatus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/275,886
Other languages
English (en)
Inventor
Masao Takimoto
Fumiyasu Sakaguchi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Medical Systems Corp
Original Assignee
Toshiba Corp
Toshiba Medical Systems Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Corp, Toshiba Medical Systems Corp filed Critical Toshiba Corp
Assigned to KABUSHIKI KAISHA TOSHIBA, TOSHIBA MEDICAL SYSTEMS CORPORATION reassignment KABUSHIKI KAISHA TOSHIBA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SAKAGUCHI, FUMIYASU, TAKIMOTO, MASAO
Publication of US20090137907A1 publication Critical patent/US20090137907A1/en
Assigned to TOSHIBA MEDICAL SYSTEMS CORPORATION reassignment TOSHIBA MEDICAL SYSTEMS CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KABUSHIKI KAISHA TOSHIBA
Priority to US15/398,242 priority Critical patent/US10881375B2/en
Current legal status: Abandoned


Classifications

    • A — HUMAN NECESSITIES › A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE › A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/12 — Arrangements for detecting or locating foreign bodies (apparatus or devices for radiation diagnosis, A61B 6/00)
    • A61B 8/0833 — Detecting organic movements or changes, e.g. tumours, cysts, swellings, involving detecting or locating foreign bodies or organic structures (diagnosis using ultrasonic, sonic or infrasonic waves, A61B 8/00)
    • A61B 8/0841 — Detecting or locating foreign bodies or organic structures for locating instruments
    • A61B 34/20 — Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis (computer-aided surgery, A61B 34/00)
    • A61B 8/466 — Displaying means of special interest adapted to display 3D data (interfacing with the operator or the patient, A61B 8/46; displaying means of special interest, A61B 8/461)
    • A61B 8/483 — Diagnostic techniques involving the acquisition of a 3D volume of data (A61B 8/48)
    • A61B 8/5207 — Devices using data or image processing involving processing of raw data to produce diagnostic data, e.g. for generating an image (A61B 8/52)

Definitions

  • the present invention relates to an imaging diagnosis apparatus, such as an ultrasound diagnosis apparatus or a computer tomography (CT) apparatus, having a needling navigation control system, and to a needling navigation controlling method. More particularly, the present invention relates to an imaging diagnosis apparatus and a needling navigation controlling method that can accurately and safely navigate an invasive instrument, such as a puncturing needle, into a diagnostic target organ in the body of an object while monitoring 3-D images of the target organ and surrounding tissues, such as the blood vessels, that are acquired by the imaging diagnosis apparatus.
  • the puncturing needle navigation control system consistent with the present invention is applicable to an imaging diagnosis apparatus, such as an ultrasound diagnosis apparatus or a computer tomography (CT) apparatus.
  • an ultrasound imaging diagnosis is explained for applying the puncturing needle navigation control system according to the invention.
  • An ultrasound imaging diagnosis apparatus transmits ultrasound to and receives reflected ultrasound from a target in an object in a plurality of directions, through a plurality of ultrasound transducers installed in an ultrasound probe, in order to display an image of the target on a monitor. Since the ultrasound imaging diagnosis apparatus can easily create and display continuous two dimensional (2-D) images or continuous three dimensional (3-D) images on a monitoring screen in real time by simply placing the ultrasound probe on a patient's body surface, it is widely utilized for imaging diagnosis of various bodily organs and for other purposes, such as identifying the existence, location, and size of tumors.
  • an invasive instrument, such as a catheter or a puncturing needle, is inserted into the body of a patient in order to remove tissue samples from an examined or treated portion, such as a tumor, or to perform other medical treatments and medications.
  • a puncturing adaptor is mounted on the ultrasound probe so as to guide a puncturing needle along a needle guide provided on the adaptor.
  • various types of puncturing adaptors have been proposed for attaching on the ultrasound probe, for instance, as illustrated in FIGS. 14A and 14B .
  • FIG. 14A illustrates one example of the conventional adaptor proposed in Japanese Patent Application 2004-147984.
  • a puncturing adaptor 216 a is attachably mounted on a head portion 211 a of the ultrasound probe 201 a.
  • the head portion 211 a includes a plurality of transducers.
  • a needle guide 217 having a prescribed guiding slant is provided so that the insertion direction of the puncturing needle 218 a coincides with the slice plane of an object for generating ultrasound 2-D image data.
  • a tip position data of the puncturing needle 218 a can be displayed together with 2-D image data of a tumor portion on a monitor.
  • FIG. 14B illustrates another example of the conventional ultrasound probe having a puncturing adaptor configuration proposed in Japanese Patent Application 2005-342109.
  • a notch groove 219 is provided at one end of the head portion 211 b so as to attachably connect a puncturing adaptor 216 b into the groove 219.
  • the adaptor 216 b includes a needle guide.
  • the puncturing needle 218 b can be inserted into the body through the head portion 211 b of the ultrasound probe 201 b and the tip position of the puncturing needle 218 b can be displayed together with 2-D image data of a tumor portion on a monitor.
  • in addition to 1-D array ultrasound probes, which include a plurality of transducers arrayed in one dimension (1-D), 2-D array ultrasound probes have been used for acquiring 3-D ultrasound image data of an object.
  • the 2-D array ultrasound probe includes a plurality of transducers arrayed in two dimensions (2-D) (i.e., an azimuth direction and an elevation direction).
  • 3-D image data are generated and displayed by rendering the acquired 3-D data (hereinafter, frequently referred to as “volume data”).
  • Japanese Patent Application 2005-342109 has proposed a method for inserting a puncturing needle while monitoring the 3-D images of the volume data acquired by the 2-D array ultrasound probe.
  • the conventional techniques have proposed inserting a puncturing needle into a tumor portion while confirming the tip position of the puncturing needle as a 2-D or 3-D image on a monitor.
  • during insertion of the puncturing needle, it is not unusual for the inserting direction to shift from the prescribed slice plane for acquiring image data, due to unevenness of living body tissues along the insertion route, i.e., differences among muscle portions and fatty portions.
  • in such a case, the proposed conventional techniques could not confirm the inserting status of the puncturing needle on the 2-D image data. Consequently, it has been difficult to accurately insert the puncturing needle into the tumor portion without injuring the tissues surrounding the tumor portion, such as the blood vessels and other organs.
  • the present invention provides a new imaging diagnosis apparatus having a puncturing navigation control system that can navigate an accurate and safe puncturing of an invasive instrument, such as a catheter or a puncturing needle, into a target tumor portion while monitoring ultrasound 3-D images of the puncturing target region and the surrounding regions of the blood vessels and the other organs.
  • according to the imaging diagnosis apparatus and the needling navigation method consistent with the present invention, since the puncturing needle can always be navigated along the eye direction even when unevenness of tissues exists along the inserting direction, it becomes possible to accurately and safely insert the puncturing needle into the tumor portion while avoiding any injury to the surrounding blood vessels and other organs near the target tumor portion in the body of an object.
  • the imaging diagnosis apparatus and the method consistent with the present invention can significantly improve the efficiency and the safety of puncturing diagnostic examinations and treatments performed by the insertion of a puncturing needle. Further, the imaging diagnosis apparatus consistent with the present invention can significantly reduce the burden on puncturing operators.
  • an ultrasound imaging diagnosis apparatus comprising:
  • a volume data acquiring unit configured to acquire volume data from a volume (3-D) scan region on an object;
  • a puncturing needle position detecting unit configured to detect a position of a puncturing needle inserted in a body of the object;
  • a puncturing navigation data generating unit configured to generate puncturing navigation data in order to display an anatomy in a living body that is located in an inserting direction of the puncturing needle, based on the detected position of the puncturing needle; and
  • a display unit configured to display the puncturing navigation data.
  • One aspect of the needling navigation controlling method for an imaging apparatus consistent with the present invention is a method for controlling puncturing operations applicable to an imaging diagnosis apparatus comprising:
  • acquiring volume data from a volume (3-D) scan region on an object.
  • FIG. 1 is a block diagram illustrating an ultrasound imaging diagnosis apparatus in accordance with a preferred embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating the transmission and reception unit, and the receiving signal processing unit in the ultrasound imaging diagnosis apparatus shown in FIG. 1 .
  • FIG. 3A illustrates the direction of ultrasound transmission and reception in a volume scan by 2-D array transducers provided in an ultrasound probe.
  • FIG. 3B illustrates the direction of ultrasound transmission and reception projected on the x-z plane in the volume scan shown in FIG. 3A .
  • FIG. 3C illustrates the direction of ultrasound transmission and reception projected on the y-z plane in the volume scan shown in FIG. 3A .
  • FIG. 4 is a block diagram illustrating the volume data generating unit, and the region setting unit in the ultrasound imaging diagnosis apparatus shown in FIG. 1 .
  • FIG. 5A illustrates the three multi planar reconstruction (MPR) planes set on the B-mode volume data that include the target portion generated by the ultrasound imaging diagnosis apparatus shown in FIG. 1 .
  • FIG. 5B illustrates MPR image data generated at each of three MPR scan planes shown in FIG. 5A .
  • FIG. 6 illustrates 3-D data of tumor region, organ region and blood vessel regions that are generated by the 3-D data generating unit in the embodiment of the ultrasound imaging diagnosis apparatus shown in FIG. 1 .
  • FIG. 7 illustrates an example of the puncturing navigation data at a front of the tumor region just before the insertion of the puncturing needle in the embodiment shown in FIG. 6 .
  • FIGS. 8A-8E illustrate examples of practical puncturing navigation data in the insertions of the puncturing needle into the body of the object.
  • FIG. 9 illustrates the puncturing navigation data at the rear of the tumor region before insertion or during insertion of the puncturing needle in the embodiment shown in FIG. 6 .
  • FIG. 10A illustrates an eye direction of the puncturing navigation data before insertion of a puncturing needle at an initial setting.
  • FIG. 10B illustrates a renewed eye direction of the puncturing navigation data due to flexion of the puncturing needle in the body of the object.
  • FIG. 11A illustrates an example of display of the puncturing navigation data in which a center of the tumor region is located at a center of the monitor.
  • FIG. 11B illustrates an example of display of the puncturing navigation data in which the expected inserting position to the tumor region is located at a center of the monitor.
  • FIG. 12A illustrates a practical example of the puncturing navigation data displayed on a monitor in which the insertion error region overlaps one portion of a plurality of blood vessel regions in front of the tumor region.
  • FIG. 12B illustrates an example in which an operator renewed the inserting direction of the puncturing needle until puncturing navigation data were acquired in which the plurality of blood vessel regions do not overlap the insertion error region.
  • FIG. 13 is a flowchart illustrating a generating process of the puncturing navigation data in accordance with the embodiment of the ultrasound imaging diagnosis apparatus consistent with the present invention.
  • FIG. 14A illustrates one configuration of the conventionally proposed puncturing adaptor for an ultrasound probe.
  • FIG. 14B illustrates another configuration of the conventionally proposed puncturing adaptor for an ultrasound probe.
  • a target tumor region for puncturing (hereinafter, “tumor region”), major blood vessel regions (hereinafter, “blood vessel regions”) and another organ region located near the tumor region (hereinafter, “organ region”) are set based on the volume data acquired by the 3-D scan on the object, in order to avoid any possibility of unwanted insertion of the puncturing needle into the blood vessel regions or the organ region located near the tumor region.
  • the tumor region and the organ region are approximated as a sphere or an ellipsoid.
  • the blood vessel regions, on the other hand, are set as 3-D images of the vessels themselves.
  • the present invention is applicable to acquiring the volume data by moving the 1-D array ultrasound probe. It is also possible to set the blood vessel region by using the volume data based on the 3-D B mode data at the time of contrast media medication instead of using 3-D color Doppler data.
  • FIG. 1 is a block diagram of an ultrasound diagnosis system 100 in accordance with preferred embodiments of the present invention.
  • the ultrasound diagnosis system 100 includes a transmission/reception unit 2 , a 2-D ultrasound probe 3 , a receiving signal processing unit 4 , a volume data generating unit 5 , a multi-planar reconstruction (MPR) data generating unit 6 , a region setting unit 7 and a 3-D data generating unit 8 .
  • the transmission/reception unit 2 includes a transmitting unit 21 for supplying driving signals to the transducers in the ultrasound probe 3 and a receiving unit 22 for adding receiving signals supplied from the transducers.
  • the ultrasound probe 3 includes a plurality of 2-D arrayed transducers for transmitting ultrasound pulses (transmission ultrasound) over a 2-D area or 3-D volume of a diagnosis object portion in an object in accordance with driving signals from the transmission unit 21 and also for converting ultrasound echo signals into electric signals.
  • the receiving signals acquired from a plurality (N) of channels of the transducers in the ultrasound probe 3 are arranged in phases and added in the receiving unit 22 .
  • the added receiving signals are processed in the receiving signal processing unit 4 in order to generate B mode image data and color Doppler data acquired by 3-D scanning over an object.
  • the volume data generating unit 5 generates volume data by arranging the B mode data and color Doppler data so as to correspond to the ultrasound transmission and reception directions.
  • the ultrasound diagnosis system 100 further includes a puncturing needle position detecting unit 9 and an expected puncturing position calculating unit 10.
  • the MPR image data generating unit 6 generates MPR image data of the B mode volume data at a slice plane set by an input unit 14 that is explained later.
  • the region setting unit 7 sets a 3-D tumor region and 3-D organ regions based on outline data set by the input unit 14 for the tumor and the organs located near the tumor on the MPR image data.
  • the region setting unit 7 also sets 3-D blood vessel regions by extracting outlines from color Doppler volume data.
  • the 3-D data generating unit 8 generates monitoring 3-D image data by composing the tumor region data, the organ region data and the blood vessel region data.
  • the puncturing needle position detecting unit 9 detects a tip position and an insertion direction of a puncturing needle 150 inserted into the body of an object along a needle guide 161 of a puncturing adaptor 16 mounted on a head portion of the 2-D ultrasound probe 3.
  • the expected puncturing position calculating unit 10 calculates an expected tip position and an expected insertion direction of the puncturing needle to the tumor region based on various data for the puncturing needle 150 including the puncturing position and the insertion direction that will be explained later.
  • the ultrasound diagnosis system 100 further includes a puncturing navigation data generating unit 11 , a display unit 12 , a warning unit 13 , an input unit 14 and a system control unit 15 .
  • the puncturing navigation data generating unit 11 generates puncturing navigation data based on a puncturing expect position data and insertion error region data in connection with tumor region data, organ region data and blood vessel region data.
  • the display unit 12 displays MPR image data, 3-D data and puncturing navigation data.
  • the warning unit 13 issues warning signals in a case where the organ region or the blood vessel regions in the puncturing navigation data are included in the insertion error region.
  • the input unit 14 sets slice planes for MPR image data and outlines of the tumor and the surrounding organs on the MPR image data.
  • the system control unit 15 controls all of the above-mentioned units.
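The geometric core of the expected puncturing position calculating unit 10 and the warning unit 13 can be sketched as follows: the expected insertion point is the first intersection of the needle ray with the sphere-approximated tumor region, and a warning is warranted when an organ or blood vessel region overlaps an error zone around the ray. This is a minimal illustration, not the patent's implementation; the function names, the spherical region model and the cylindrical error zone are assumptions.

```python
import numpy as np

def expected_insertion_point(tip, direction, center, radius):
    """First intersection of the needle ray with a sphere-approximated
    tumor region; returns None when the ray misses the sphere."""
    tip = np.asarray(tip, float)
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    oc = tip - np.asarray(center, float)
    b = np.dot(oc, d)
    disc = b * b - (np.dot(oc, oc) - radius ** 2)
    if disc < 0.0:
        return None                      # the ray misses the tumor region
    t = -b - np.sqrt(disc)               # nearer of the two intersections
    return tip + t * d if t >= 0.0 else None

def region_in_error_zone(tip, direction, region_center, region_radius, error_radius):
    """Warning condition: True when a spherical organ/vessel region overlaps
    a cylindrical insertion error zone of radius error_radius around the ray."""
    tip = np.asarray(tip, float)
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    v = np.asarray(region_center, float) - tip
    t = max(np.dot(v, d), 0.0)           # closest point on the ray, not behind the tip
    dist = np.linalg.norm(v - t * d)     # distance from region center to the ray
    return dist <= region_radius + error_radius
```

With a tumor sphere 50 mm ahead of the tip, the expected insertion point lies on the near surface of the sphere along the ray, and a vessel whose surface comes within the error radius of the ray triggers the warning condition.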
  • the ultrasound probe 3 includes a plurality (N) of 2-D arrayed transducers provided on a top surface portion of the probe. Ultrasound transmission and reception of echo ultrasound are performed by touching the top surface portion to a body surface of an object. Often, a gel is used as an intermediary between the body surface and probe surface.
  • the transducers convert driving pulse signals to transmission signals composed of ultrasound pulses during transmission time, and convert ultrasound echoes to receiving signals during reception time.
  • Each of the plurality N of transducers is coupled to the transmission and reception unit 2 through a multi-channel cable.
  • a puncturing adaptor 16 is mounted on the ultrasound probe 3 in order to insert a puncturing needle 150 into the body of an object along a needle guide 161 provided on the puncturing adaptor 16 . Thus, an insertion position and an insertion direction of the puncturing needle 150 are primarily determined by the needle guide 161 .
  • a 2-D array sector scan ultrasound probe including a plurality of N transducers is used as an example of the ultrasound probe 3 .
  • a linear scan ultrasound probe or a convex scan ultrasound probe is also possible.
  • FIG. 2 is a block diagram illustrating the transmission and reception unit, and the receiving signal processing unit in the ultrasound imaging diagnosis apparatus shown in FIG. 1 .
  • the transmission and reception unit 2 includes a transmission unit 21 for supplying drive signals to the plurality of N transducers in the ultrasound probe 3 and a reception unit 22 for adding the receiving signals of N channels acquired through the plurality of transducers.
  • the transmission unit 21 includes a rate pulse generator 211 , a transmission delay circuit 212 and a driving circuit 213 .
  • the rate pulse generator 211 generates rate pulses which determine the repetition period of the transmission ultrasound.
  • the generated rate pulses are supplied to the transmission delay circuit 212 .
  • the transmission delay circuit 212 includes a plurality of independent delay circuits, equal in number to the N transducers used for transmission.
  • the transmission delay circuit 212 gives the rate pulses a convergence delay time for converging the transmission ultrasound to a prescribed depth and a deviation delay time for transmitting ultrasound in a prescribed direction (θp, θq), and supplies them to the driving circuit 213.
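The two delay terms can be illustrated numerically: the sketch below computes per-element transmit delays for a 2-D array so that all waves arrive simultaneously at a focal point lying at a prescribed depth in the direction (θp, θq). The projected-angle convention, the element layout and the sound speed are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

SOUND_SPEED = 1540.0  # m/s, a typical soft-tissue value (assumption)

def transmit_delays(elem_xy, focus_depth, theta_p, theta_q, c=SOUND_SPEED):
    """Per-element transmit delays (s) that steer the beam to (theta_p,
    theta_q) and converge it at focus_depth; elem_xy is an (N, 2) array
    of 2-D array element positions (m) in the transducer plane."""
    d = np.array([np.tan(theta_p), np.tan(theta_q), 1.0])
    d /= np.linalg.norm(d)                        # beam direction unit vector
    focus = focus_depth * d                       # focal point in space
    elems = np.column_stack([elem_xy, np.zeros(len(elem_xy))])
    path = np.linalg.norm(focus - elems, axis=1)  # element-to-focus distance
    return (path.max() - path) / c                # farthest element fires first
```

An element farther from the focal point fires earlier (smaller delay), so all contributions arrive at the focus in phase; for an unsteered beam the delays of symmetric elements are equal.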
  • the reception unit 22 includes a plurality (N channels) of pre-amplifiers 221, A/D converters 222 and reception delay circuits 223, and an adder 224. The pre-amplifiers 221 amplify weak reception signals so as to obtain a sufficient S/N ratio.
  • the N channels of reception signals amplified in the pre-amplifiers 221 are converted to digital signals in the A/D converters 222 and supplied to the reception delay circuits 223.
  • the reception delay circuit 223 gives each of the reception signals outputted from the A/D converter 222 a convergence delay time for converging reception ultrasound from a prescribed depth and a deviation delay time for setting a reception directivity to a predetermined direction (θp, θq).
  • the reception signals acquired from the prescribed direction (θp, θq) are added in the adder 224.
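The delay-and-sum operation performed by the reception delay circuits 223 and the adder 224 can be sketched as below; rounding the delays to whole samples is a simplification for illustration, not how a hardware delay circuit need work.

```python
import numpy as np

def delay_and_sum(channels, delays, fs):
    """Receive beamforming sketch: shift each channel by its reception
    delay (seconds, rounded to whole samples) and sum the N channels."""
    channels = np.asarray(channels, float)
    n = channels.shape[1]
    out = np.zeros(n)
    for sig, d in zip(channels, delays):
        shift = int(round(d * fs))       # delay in whole samples
        if shift > 0:
            out[shift:] += sig[:n - shift]
        else:
            out += sig
    return out
```

Echoes arriving from the steered direction line up after the per-channel delays and add coherently, while off-axis echoes add incoherently and are suppressed.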
  • FIG. 3A illustrates an ultrasound probe 3 having 2-D array transducers Trs and an ultrasound transmission/reception position P (r, θp, θq).
  • the ultrasound probe 3 has a center axis (z-axis).
  • the ultrasound transmission/reception position P (r, θp, θq) is located at a distance r from the surface of the transducers Trs, at angles θp in the X0-axis (azimuth) direction and θq in the Y0-axis (elevation) direction.
  • FIG. 3B illustrates the position P projected on the X0-Z0 plane, transmitting and receiving ultrasound at an angle θp from the Z0-axis in the X0-axis (azimuth) direction.
  • FIG. 3C illustrates the position P projected on the Y0-Z0 plane, transmitting and receiving ultrasound at an angle θq from the Z0-axis in the Y0-axis (elevation) direction.
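Under a projected-angle reading of FIG. 3 (x = z·tanθp and y = z·tanθq, which is an assumption about the figure's convention), the position P (r, θp, θq) converts to Cartesian coordinates as:

```python
import numpy as np

def scan_to_cartesian(r, theta_p, theta_q):
    """Convert range r and the projected angles theta_p (azimuth, X0-Z0
    plane) and theta_q (elevation, Y0-Z0 plane) to Cartesian (x, y, z)."""
    tp, tq = np.tan(theta_p), np.tan(theta_q)
    z = r / np.sqrt(1.0 + tp ** 2 + tq ** 2)   # so that x*x + y*y + z*z = r*r
    return np.array([z * tp, z * tq, z])
```

This conversion is what lets samples acquired per (r, θp, θq) direction be placed into a Cartesian volume.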
  • the receiving signal processing unit 4 shown in FIG. 2 includes a B mode data generating unit 41 for generating B mode data by processing the reception signals supplied from the adder 224 in the reception unit 22, a Doppler signal detection unit 42 for detecting Doppler signals by orthogonally detecting the phases of the reception signals, and a color Doppler data generating unit 43 for generating color Doppler data reflecting blood flow data in the main blood vessels based on the detected Doppler signals.
  • the B mode data generating unit 41 includes an envelope detector 411 for detecting the envelope of the reception signals supplied from the adder 224 in the reception unit 22 and a logarithmic converter 412 for generating B mode data by logarithmically converting the amplitude of the envelope-detected reception signals. It is possible to exchange the positions of the envelope detector 411 and the logarithmic converter 412.
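Envelope detection followed by logarithmic compression can be sketched as below; the FFT-based analytic signal and the 60 dB dynamic range are illustrative choices, not values from the disclosure.

```python
import numpy as np

def envelope(rf):
    """Envelope detection via the analytic signal (FFT-based Hilbert)."""
    n = len(rf)
    spec = np.fft.fft(rf)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0        # keep positive frequencies, doubled
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.abs(np.fft.ifft(spec * h))

def b_mode(rf, dynamic_range_db=60.0):
    """B mode sample values: envelope detection, then logarithmic
    compression into [0, 1] over the given dynamic range."""
    env = envelope(rf)
    env = env / env.max()
    db = 20.0 * np.log10(np.maximum(env, 1e-12))
    return np.clip((db + dynamic_range_db) / dynamic_range_db, 0.0, 1.0)
```

The log compression is what maps the very wide amplitude range of tissue echoes into the displayable grey-scale range.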
  • the Doppler signal detection unit 42 includes a π/2 phase converter (shifter) 421, mixers 422-1 and 422-2 and low pass filters (LPFs) 423-1 and 423-2 in order to detect Doppler signals by orthogonally detecting the phases of the reception signals supplied from the adder 224 in the reception unit 22.
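Orthogonal (quadrature) phase detection with a π/2-shifted reference carrier and low pass filters can be sketched as follows; the windowed-sinc filter and its cutoff are illustrative assumptions.

```python
import numpy as np

def quadrature_detect(rx, f0, fs, ntaps=101):
    """Mix the reception signal with the reference carrier and its
    pi/2-shifted copy, then low-pass filter to obtain the I and Q
    Doppler components (filter design is an illustrative choice)."""
    n = np.arange(len(rx))
    ref_i = np.cos(2 * np.pi * f0 * n / fs)
    ref_q = np.sin(2 * np.pi * f0 * n / fs)   # pi/2 phase-shifted reference
    lpf = np.sinc(np.linspace(-4, 4, ntaps)) * np.hamming(ntaps)
    lpf /= lpf.sum()                          # unity DC gain
    i = np.convolve(rx * ref_i, lpf, mode="same")
    q = np.convolve(rx * ref_q, lpf, mode="same")
    return i, q
```

Mixing shifts the echo spectrum down by f0, so the filtered I/Q pair rotates at the Doppler shift frequency; its phase slope directly gives the shift.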
  • the color Doppler data generation unit 43 includes a Doppler signals memory circuit 431, an MTI filter 432 and an auto-correlation computing unit 433.
  • the Doppler signals memory circuit 431 stores Doppler signals detected by the Doppler signals detection unit 42 .
  • the MTI filter 432 removes, from the detected Doppler signals, clutter components that are generated by fixed reflectors in an organ and by breathing or pulsatory movements of the organ.
  • the auto-correlation computing unit 433 generates color Doppler data by using three kinds of characteristic values, i.e., a mean velocity value, a dispersion value and a power value of blood flows, based on the auto-correlation value.
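These three characteristic values are conventionally derived from the lag-one auto-correlation of an ensemble of complex I/Q samples (the auto-correlation method commonly attributed to Kasai); a minimal sketch, in which the symbol names and the variance normalization are assumptions:

```python
import numpy as np

def autocorrelation_estimates(iq, prf, f0, c=1540.0):
    """Mean axial velocity (m/s), a dispersion (variance) value and a
    power value from the lag-1 auto-correlation of an I/Q ensemble."""
    iq = np.asarray(iq)
    r0 = np.mean(np.abs(iq) ** 2)                 # power
    r1 = np.mean(iq[1:] * np.conj(iq[:-1]))       # lag-1 auto-correlation
    fd = np.angle(r1) * prf / (2.0 * np.pi)       # mean Doppler shift (Hz)
    velocity = fd * c / (2.0 * f0)                # Doppler equation, axial flow
    variance = (prf ** 2 / 2.0) * (1.0 - np.abs(r1) / r0)
    return velocity, variance, r0
```

A pure-tone ensemble yields zero dispersion and a mean velocity set by the Doppler equation; broadband (turbulent) flow lowers |R(1)|/R(0) and raises the dispersion value.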
  • FIG. 4 is a block diagram illustrating a construction of the volume data generating unit 5 and the regions setting unit 7 shown in FIG. 1 .
  • the volume data generating unit 5 includes a B mode data memory unit 51 , a color Doppler data memory unit 52 , an interpolation processing unit 53 and a volume data memory unit 54 .
  • the B mode data memory unit 51 in the volume data generating unit 5 successively stores the B mode data generated by the B mode data generating unit 41 ( FIG. 2 ) from the reception signals acquired by 3-D scans over the object, together with their ultrasound transmission/reception directions as attached data.
  • the color Doppler data memory unit 52 in the volume data generating unit 5 successively stores the color Doppler data generated by the color Doppler data generating unit 43 from the acquired reception signals, together with their ultrasound transmission/reception directions as attached data.
  • the interpolation processing unit 53 generates 3-D B mode data by reading out a plurality of B mode data at a prescribed time phase in order to arrange the plurality of B mode data so as to correspond to their transmission/reception directions.
  • the interpolation processing unit 53 further generates B mode volume data comprised of equal interval voxels by performing interpolation processes for the unequal interval voxels of the generated B mode data.
  • the interpolation processing unit 53 generates 3-D color Doppler data by reading out a plurality of color Doppler data at a prescribed time phase so as to arrange the plurality of color Doppler data corresponding to each transmission/reception directions.
  • the interpolation processing unit 53 further generates color Doppler volume data by performing interpolation processes for the 3-D color Doppler data.
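The interpolation from unequal-interval voxels to equal-interval voxels is essentially a scan conversion; the sketch below reduces it to 2-D and to nearest-neighbour lookup for brevity, which is an illustrative simplification of unit 53, not its disclosed implementation.

```python
import numpy as np

def scan_convert_2d(beam_data, angles, radii, grid_x, grid_z):
    """Resample sector-scan samples, stored per beam angle and range at
    unequal spatial intervals, onto an equally spaced Cartesian grid."""
    xs, zs = np.meshgrid(grid_x, grid_z, indexing="ij")
    r = np.hypot(xs, zs)
    th = np.arctan2(xs, zs)                 # angle from the probe axis Z0
    ai = np.rint(np.interp(th, angles, np.arange(len(angles)))).astype(int)
    ri = np.rint(np.interp(r, radii, np.arange(len(radii)))).astype(int)
    out = beam_data[ai, ri]
    outside = (th < angles[0]) | (th > angles[-1]) | (r > radii[-1])
    out[outside] = 0.0                      # grid points outside the scanned sector
    return out
```

A production implementation would use (bi/tri)linear interpolation rather than nearest-neighbour lookup, but the index mapping from Cartesian voxel to acquisition sample is the same.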
  • the generated B mode volume data and color Doppler volume data are stored in the volume data memory unit 54.
  • the B mode volume data stored in the volume data memory unit 54 in the volume data generating unit 5 are read out and supplied to the MPR image data generating unit 6 .
  • the MPR image data generating unit 6 sets a plurality of MPR planes on the B mode volume data based on MPR planes data supplied from the MPR planes setting unit 141 in the input unit 14 as explained later.
  • MPR image data are generated by extracting the voxels of the B mode volume data corresponding to each of the plurality of MPR planes.
  • FIG. 5A illustrates an example of the plurality of MPR planes set on the B mode volume data including the tumor portion Tm.
  • three MPR planes Pm 1 to Pm 3 are set on the B mode volume data Vb including the tumor portion Tm.
  • MPR plane Pm 1 is set in parallel to the X 0 -Z 0 plane.
  • MPR plane Pm 2 is set in parallel to the Y 0 -Z 0 plane.
  • MPR plane Pm 3 is set perpendicular to the center axis Z 0 of the ultrasound probe 3 as shown in FIG. 3 .
  • X 0 shows an azimuth direction
  • Y 0 shows an elevation direction.
  • these three MPR planes Pm 1 to Pm 3 are set orthogonally to each other so that their common intersection point is located at the approximate center of the tumor portion Tm.
  • FIG. 5B illustrates three MPR image data Mp 1 to Mp 3 that are respectively generated at each of the three MPR planes Pm 1 to Pm 3 .
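For an axis-aligned volume, extracting the three orthogonal MPR images through a chosen point reduces to simple slicing; a sketch assuming a volume indexed as [x, y, z] with z along the probe axis (the indexing convention is an assumption):

```python
import numpy as np

def mpr_slices(volume, center):
    """Three orthogonal MPR images through `center` (voxel indices):
    Pm1 parallel to the X0-Z0 plane, Pm2 parallel to the Y0-Z0 plane,
    and Pm3 perpendicular to the probe axis Z0."""
    cx, cy, cz = center
    pm1 = volume[:, cy, :]   # X0-Z0 plane through the center's y index
    pm2 = volume[cx, :, :]   # Y0-Z0 plane through the center's x index
    pm3 = volume[:, :, cz]   # plane perpendicular to Z0 at the center's depth
    return pm1, pm2, pm3
```

Oblique MPR planes would instead require resampling the volume along an arbitrary plane, but the orthogonal case of FIG. 5A is exactly this slicing.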
  • FIG. 4 illustrates a practical construction of the regions setting unit 7 shown in FIG. 1 .
  • The region setting unit 7 includes a tumor region setting unit 71, an organ region setting unit 72 and a blood vessel region setting unit 73.
  • the tumor region setting unit 71 sets a 3-D tumor region based on the outline data that the outline setting unit 142 in the input unit 14 sets for the tumor portion Tm in each of the three MPR image data Mp 1 to Mp 3, which are generated by the MPR image data generating unit 6 and displayed on the display unit 12.
  • the 3-D tumor region is approximated by a wire-frame sphere or ellipsoid.
  • the organ region setting unit 72 sets 3-D organ regions, approximated by spheres or ellipsoids, for the major organs located near the tumor portion Tm in the MPR image data Mp 1 to Mp 3, based on the outline data set by the outline setting unit 142.
  • the blood vessel region setting unit 73 reads out the color Doppler volume data stored in the volume data memory unit 54 in the volume data generating unit 5. Based on the blood flow data in the color Doppler volume data, the main blood vessels running in the neighborhood of the tumor portion Tm are set as three-dimensional (3-D) blood vessel regions.
  • the 3-D data generating unit 8 shown in FIG. 1 generates monitoring 3-D data by composing the respective 3-D data of the tumor region, the organ region and the blood vessel regions with the tip position data and inserting direction data of the puncturing needle 150 that are detected by the puncturing needle position detecting unit 9.
  • FIG. 6 illustrates an example of the 3-D data generated by the 3-D data generating unit 8 .
  • The tumor region Tr is approximated by a wire-frame sphere.
  • The organ region Rr of neighboring main internal organs, such as a bone, is approximated by an ellipsoid of revolution.
  • Four blood vessel regions Vr1 to Vr4 are shown running in the neighborhood of the tumor portion Tm.
  • A puncturing marker Bn showing the inserting direction and the expected inserting position of the puncturing needle 150 is generated as 3-D data by composing the tip position data and inserting direction data of the puncturing needle 150 that are detected by the puncturing needle position detecting unit 9.
  • the eye direction setting unit 144 in the input unit 14 can arbitrarily set a desired eye direction.
  • The eye direction setting unit 144 can set the eye direction so that the inserting direction and the expected inserting position of the puncturing marker Bn relative to the tumor region Tr can be observed without being blocked by the blood vessel regions Vr and/or the organ region Rr.
  • The puncturing needle position detecting unit 9 (FIG. 1) detects the tip position and inserting direction of the puncturing needle 150 both just before insertion into the object and during the insertion.
  • The puncturing needle position detecting unit 9 receives slant angle data of the needle guide 161 from the system control unit 15 in accordance with the identification data of the puncturing adaptor 16 inputted through the input unit 14. Based on this slant angle data, it detects the inserting position and inserting direction of the puncturing needle 150 just before insertion.
  • During insertion into the body, the puncturing needle position detecting unit 9 detects the tip position of the puncturing needle 150 based on the ultrasound reflected from the tip portion of the needle.
  • the inserting direction of the puncturing needle 150 is detected based on the time variation of the tip position.
  • Alternatively, the tip position of the puncturing needle 150 may be detected based on the insertion distance of the needle measured by a sensor provided on the needle guide. An encoder that mechanically engages the puncturing needle 150 can be used as the sensor; an optical sensor or a magnetometric sensor is also applicable. When the puncturing needle 150 is inserted freehand without using the needle guide 161, the tip position may be detected by a position sensor mounted on a portion of the puncturing needle 150.
  • The expected inserting position calculating unit 10 receives the tip position and inserting direction data of the puncturing needle 150, both just before insertion into the object and during the insertion, and calculates the distance between the tip portion of the puncturing needle and the tumor region. Based on this distance data and the inserting direction data of the puncturing needle 150, the expected inserting position on the tumor region is calculated by assuming that the puncturing needle 150 goes straight through the tissue of the living body.
  • The expected inserting position calculating unit 10 further estimates a possible bend of the puncturing needle 150 during insertion based on various data supplied from the system control unit 15, such as the distance between the needle tip and the tumor region, material data of the puncturing needle (e.g., its stiffness) and living body data (e.g., tissue hardness of the object). Based on the estimated bend, a possible error range of the expected inserting position is calculated as the insertion error region.
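The straight-insertion extrapolation above, plus an error radius, can be sketched as follows. This is an illustrative Python sketch: the linear error model (error grows with remaining distance and tissue load, shrinks with needle stiffness) and all parameter names are assumptions, not the patent's actual bend-estimation method.

```python
import numpy as np

def expected_insertion(tip, direction, target_center, needle_stiffness, tissue_load):
    """Extrapolate the needle tip straight along its inserting direction to
    the depth of the target center (the straight-insertion assumption), and
    attach a crude insertion-error radius.  The error formula is a
    placeholder for the bend estimation described in the embodiment."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    to_target = np.asarray(target_center, dtype=float) - np.asarray(tip, dtype=float)
    distance = float(np.dot(to_target, d))      # remaining distance along the path
    expected = np.asarray(tip, dtype=float) + distance * d
    error_radius = distance * tissue_load / needle_stiffness   # hypothetical model
    return expected, distance, error_radius

# Needle tip at the body surface, aimed straight down at a target 50 mm deep.
pos, dist, err = expected_insertion((0, 0, 0), (0, 0, 1), (0, 0, 50),
                                    needle_stiffness=1.0, tissue_load=0.02)
```

Note that under this model the error radius naturally shrinks as the needle advances, matching the behavior of the insertion error region Av described later.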
  • the puncturing navigation data generating unit 11 ( FIG. 1 ) generates puncturing navigation data based on the tumor region data, blood vessel region data and organ region data that are supplied from the regions setting unit 7 and the data of the expected inserting position and insertion error region that are supplied from the expected inserting position calculating unit 10 .
  • FIG. 7 illustrates a practical example of the puncturing navigation data when the puncturing needle 150 is mounted along the needle guide 161 of the puncturing adaptor 16 .
  • The puncturing navigation data are generated by rendering or projecting, along the eye direction, the tumor region Tr, the blood vessel regions Vr1 to Vr4 located in the depth range between the body surface of the object and the tumor region Tr, and the organ region Rr, based on the slant angle data θo (FIG. 1) supplied from the system control unit 15, and by superimposing on these processed data the expected inserting position data Pi and the insertion error region data Av along the insertion direction taken as the eye direction.
  • FIGS. 8A to 8E illustrate practical examples of the puncturing navigation data that are acquired in accordance with successive inserting depths of the puncturing needle 150 into the object body along the inserting direction as an eye viewing direction.
  • FIG. 8A illustrates the puncturing navigation data that is generated just after insertion of the puncturing needle 150 into a body surface of the object.
  • The puncturing navigation data include the blood vessel regions Vr1 to Vr4 and the organ region Rr that exist in the depth region between the body surface of the object and the tumor region Tr.
  • FIGS. 8B-8E show how the puncturing navigation data successively change depending on the insertion depth of the puncturing needle.
  • FIG. 8B illustrates a first stage in which the blood vessel region Vr1 and the organ region Rr, which lie nearest to the body surface, have disappeared from the puncturing navigation data. As the insertion depth of the puncturing needle increases further, the blood vessel regions Vr2, Vr3 and Vr4 disappear in order from the puncturing navigation data, as illustrated in FIGS. 8C and 8D.
  • The insertion error region Av also gradually shrinks as the insertion depth increases.
  • Finally, the insertion error region Av formed around the expected inserting position Pi disappears, as illustrated in FIG. 8E.
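The progressive disappearance of regions already passed by the needle tip amounts to filtering regions by their depth along the insertion path. A minimal sketch, assuming each region is reduced to a (name, depth) pair; the function name and data shape are illustrative:

```python
def regions_ahead_of_tip(regions, tip_depth):
    """Keep only the regions still in front of the needle tip.  Each region
    is a (name, depth_along_insertion) pair; regions already passed are
    dropped, mirroring how Vr1..Vr4 vanish from the navigation data as the
    needle advances."""
    return [name for name, depth in regions if depth > tip_depth]

# Depths (e.g. in mm) of the four vessel regions along the insertion path.
regions = [("Vr1", 10.0), ("Vr2", 20.0), ("Vr3", 30.0), ("Vr4", 40.0)]
```

With the tip at 25 mm, only Vr3 and Vr4 would remain in the displayed navigation data.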
  • FIG. 9 illustrates the puncturing navigation data that is generated before or during insertion of the puncturing needle at the rear of the tumor region Tr.
  • The puncturing navigation data is generated by composing the tumor region data Tr supplied from the regions setting unit 7, blood vessel region data Vr5 located behind the tumor region Tr, the expected inserting position data Pi and the insertion error region data Av.
  • FIG. 10A illustrates an eye direction of the puncturing navigation data before insertion of a puncturing needle at an initial setting time.
  • The eye direction of the puncturing navigation data before insertion coincides with direction A. If the inserting direction bends due to flexion of the puncturing needle in the body of the object, the eye direction of the puncturing navigation data during insertion is renewed to direction B, as illustrated in FIG. 10B.
  • the display unit 12 ( FIG. 1 ) displays MPR image data generated in the MPR image data generating unit 6 ( FIG. 5B ), 3-D data generated in the 3-D data generating unit 8 ( FIG. 6 ) and the puncturing navigation data generated in the puncturing navigation data generating unit 11 before and during the insertion of the puncturing needle ( FIGS. 7-9 ).
  • the display unit 12 includes a display data generating circuit, a conversion circuit and a monitor (not shown).
  • The display data generating circuit in the display unit 12 generates display data by superimposing supplementary data, such as object data, on the MPR image data, 3-D data and puncturing navigation data.
  • the conversion circuit in the display unit 12 executes D/A conversions and television format conversions of the display data so as to display the display data on the monitor.
  • On the monitor, it is desirable to display the blood vessel region data Vr using different colors or brightness in accordance with the depth of each blood vessel region. For instance, the blood vessel regions Vr1 to Vr4 illustrated in FIG. 7 are displayed in white, yellow, orange and red, respectively.
  • The colored display allows each depth of the blood vessel regions to be recognized easily. It is also possible to display the tumor region, the organ region and the blood vessel regions in different colors or brightness so that each region can be distinguished easily.
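The white/yellow/orange/red depth coding can be sketched as a simple lookup from normalized depth to a palette bucket. This is an assumed mapping for illustration; the patent does not specify how depths are quantized into colors:

```python
def depth_color(depth, max_depth):
    """Map a region's depth along the insertion path to a display color:
    shallow regions render white, the deepest render red, echoing the
    white/yellow/orange/red scheme in the text (palette and binning assumed)."""
    palette = ["white", "yellow", "orange", "red"]
    idx = min(int(depth / max_depth * len(palette)), len(palette) - 1)
    return palette[idx]
```

For example, with a 40 mm deep tumor, a vessel at 15 mm would render yellow and one at 39 mm would render red.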
  • It is desirable to display the puncturing navigation data on the monitor so that the center of the tumor region Tr, or the expected inserting position Pi on the tumor region Tr, is located at the center of the monitor.
  • This positioning, however, is not mandatory.
  • the warning unit 13 ( FIG. 1 ) generates warning signals in order to urge a re-setup of the puncturing conditions including the inserting position and inserting direction of the puncturing needle 150 .
  • The warning unit 13 conveys the warning signals to an operator by means such as a warning lamp, a warning buzzer or a display of warning indications.
  • The warning unit 13 issues warning signals to indicate a possibility of accidentally puncturing the main blood vessels running in the neighborhood of the tumor region.
  • The puncturing navigation data in FIG. 12A show a status in which one portion of the blood vessel regions Vr1 to Vr4 overlaps the insertion error region Av.
  • In that case, the operator renews the inserting direction of the puncturing needle 150 until the puncturing navigation data show no overlap between the insertion error region Av and the blood vessel regions Vr1 to Vr4, as shown in FIG. 12B.
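The overlap condition that drives the warning can be checked with elementary geometry. A minimal sketch, assuming both the insertion error region and each vessel region are approximated as spheres (the patent approximates regions by spheres/ellipsoids, but this specific test is an assumption):

```python
import math

def overlaps_error_region(expected_pos, error_radius, vessel_center, vessel_radius):
    """True when a spherically approximated blood vessel region intersects
    the insertion error region around the expected inserting position, i.e.
    the condition under which the warning unit would prompt a re-setup of
    the puncturing conditions."""
    return math.dist(expected_pos, vessel_center) < error_radius + vessel_radius
```

For example, a vessel 2 mm off the expected inserting position overlaps a 3 mm error region, while one 6 mm away does not.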
  • FIG. 12A depicts the puncturing navigation data in the area in front of the tumor region. It is also possible to change the inserting direction of the puncturing needle 150 while confirming safety by observing the puncturing navigation data behind the tumor region, i.e., at a position deeper than the tumor region Tr. In this case, as explained later, the support data selection unit 143 in the input unit 14 switches the display on the display unit 12 between the puncturing navigation data in front of the tumor region and the puncturing navigation data behind it.
  • the input unit 14 includes an MPR plane setting unit 141 , an outline setting unit 142 , a support data selection unit 143 and an eye direction setting unit 144 .
  • the MPR plane setting unit 141 sets one or a plurality of MPR planes against the volume data acquired through the 3-D scan on the object.
  • the outline setting unit 142 sets outlines of the tumor and major organs existing near to the tumor in the MPR image data generated at the plurality of MPR planes.
  • The support data selection unit 143 selects between the puncturing navigation data in front of and behind the tumor region.
  • The eye direction setting unit 144 sets an eye direction of the 3-D data.
  • These setting operations are executed by using input devices such as a display panel, a keyboard unit, selection buttons or a mouse.
  • Through these input devices, the operator sets the volume data acquisition conditions and the display conditions for the MPR image data, 3-D data and puncturing navigation data.
  • The input devices are further used to input various command signals.
  • the system control unit 15 shown in FIG. 1 includes a central processing unit (CPU) and a memory (not shown).
  • the memory in the system control unit 15 stores above-mentioned various data that are inputted, selected and set by each of the devices of the input unit 14 .
  • the CPU in the system control unit 15 controls each of the units in the ultrasound imaging diagnosis apparatus 100 so as to generate and display the puncturing navigation data.
  • FIG. 13 is a flowchart illustrating generation processes of the puncturing navigation data in accordance with the present embodiment of the invention.
  • The operator of the ultrasound imaging diagnosis apparatus 100 initially inputs object data and puncturing adaptor recognition data, and also sets various conditions such as the volume data acquisition conditions and the display conditions of the MPR image data, 3-D data and puncturing navigation data. Then the operator places the ultrasound probe 3 at a desired position on the body surface of the object and sets the puncturing needle 150 along the needle guide 161 of the puncturing adaptor 16 mounted on the ultrasound probe 3 (FIG. 13, step S1).
  • After completing the initial settings, the operator inputs start commands for generating the puncturing navigation data through the input unit 14 (FIG. 13, step S2).
  • Generation of the puncturing navigation data is started by supplying the start command signals to the system control unit 15.
  • The rate pulse generator 211 in the transmission unit 21 (FIG. 2) generates rate pulses, which determine the repetition period of the transmission ultrasound, by dividing the reference signals supplied from the system control unit 15.
  • the generated rate pulses are supplied to the transmission delay circuit 212 .
  • The transmission delay circuit 212 gives the rate pulses a convergence delay time for converging the transmission ultrasound at a prescribed depth and a deviation delay time for transmitting ultrasound in each of a plurality of transmission/reception directions (θp, φq), and supplies the delayed rate pulses to the driving circuit 213.
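The convergence (focusing) delays can be illustrated with the standard geometric time-of-flight calculation. A minimal sketch under stated assumptions: a 1-D element layout, a single focal point, and a nominal sound speed of 1540 m/s; the patent's delay circuit is hardware, so this is only a model of the arithmetic it performs:

```python
import math

def focus_delays(element_x, focus_x, focus_z, c=1540.0):
    """Per-element transmit delays (seconds) that make all wavefronts arrive
    at the focal point simultaneously, referenced to the farthest element.
    `element_x` lists element positions (m) along the array; `c` is the
    assumed speed of sound in tissue (m/s)."""
    times = [math.hypot(x - focus_x, focus_z) / c for x in element_x]
    t_max = max(times)
    return [t_max - t for t in times]   # distant elements fire first

# Three-element array, focus 30 mm straight ahead of the center element.
delays = focus_delays([-1e-3, 0.0, 1e-3], 0.0, 30e-3)
```

Symmetric outer elements get identical (zero) delays, while the center element, being closest to the focus, fires last.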
  • the driving circuit 213 generates driving signals based on the rate pulses supplied from the transmission delay circuit 212 .
  • the driving signals are supplied to the selected number N of transducers in the ultrasound probe 3 in order to emit transmitting ultrasounds into the body of an object.
  • The transmitted ultrasound reflects at the boundary surfaces of the organs or tissues in the object and is received by the same transducers used for transmission as reception signals of N channels.
  • The reception signals are amplified in the pre-amplifier 221 and converted into digital signals in the A/D converter. Each of the reception signals is then given a convergence delay time for converging the reception ultrasound from a prescribed depth and a deviation delay time for setting the reception directivity to the first transmission/reception direction (θ1, φ1).
  • The reception signals acquired from the first transmission/reception direction (θ1, φ1) are added in the adder 224.
  • The envelope detector 411 and the logarithmic converter 412 in the B-mode data generating unit generate B-mode data by detecting the envelope of the reception signals and performing logarithmic conversion.
  • The generated B-mode data are stored in the B-mode data memory unit 51 in the volume data generating unit 5, with the transmission/reception direction attached as affixed data.
  • the B-mode data acquired at each of the transmission/reception directions are stored in the B-mode data memory unit 51 ( FIG. 4 ).
  • The system control unit 15 repeats the ultrasound transmission/reception a predetermined number of times (L times) along the transmission/reception direction (θ1, φ1) by controlling the transmission delay times in the transmission delay circuit 212 of the transmission unit 21 and the reception delay times in the reception delay circuit 223 of the reception unit 22, and supplies the reception signals acquired from the reception unit 22 in each ultrasound transmission/reception to the Doppler signal detection unit 42.
  • In the Doppler signal detection unit 42, quadrature phase detection is performed on the reception signals.
  • the detected Doppler signals are stored in the Doppler signal memory circuit 431 in the color Doppler data generating unit 43 .
  • When the storage of the Doppler signals acquired by performing the predetermined L times of ultrasound transmission and reception in the first transmission/reception direction (θ1, φ1) has been completed, the system control unit 15 successively reads the L Doppler signals corresponding to a prescribed position or depth from among the Doppler signals stored in the Doppler signal memory circuit 431 and supplies them to the MTI filter 432.
  • The MTI filter 432 extracts the Doppler components due to blood flow by filtering the supplied Doppler signals and supplies them to the auto-correlation computing unit 433.
  • The auto-correlation computing unit 433 performs the auto-correlation calculation using the Doppler signals supplied from the MTI filter 432 and further calculates blood flow data based on the result. The same calculation is performed at different positions or depths.
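The auto-correlation step is conventionally the lag-one (Kasai) estimator: the mean Doppler phase shift per pulse, to which blood velocity is proportional, is the angle of the lag-one auto-correlation of the L quadrature-detected samples. The patent does not spell out its formula, so this Python sketch assumes the standard method:

```python
import numpy as np

def mean_doppler_phase(iq):
    """Estimate the mean Doppler phase shift per pulse at one sample
    position from L complex (quadrature-detected) samples via the lag-one
    auto-correlation; blood velocity is proportional to this phase."""
    iq = np.asarray(iq, dtype=complex)
    r1 = np.sum(iq[1:] * np.conj(iq[:-1]))   # lag-1 auto-correlation
    return float(np.angle(r1))

# L = 8 samples with a constant phase step of 0.3 rad per pulse.
samples = np.exp(1j * 0.3 * np.arange(8))
phase = mean_doppler_phase(samples)
```

For a constant phase step the estimator recovers the step exactly; with noisy signals the sum over lags averages the phase noise down.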
  • The calculated blood flow data in the transmission/reception direction (θ1, φ1) are stored in the color Doppler data memory unit 52 in the volume data generating unit 5, with the direction attached as affixed data.
  • The color Doppler data acquired in each of the transmission/reception directions are likewise stored in the color Doppler data memory unit 52 with their directions as affixed data.
  • The interpolation processing unit 53 generates 3-D B-mode data by arranging the plurality of B-mode data read out from the B-mode data memory unit 51 in correspondence with the transmission/reception directions, and further generates B-mode volume data by performing an interpolation process on the 3-D B-mode data.
  • Similarly, the interpolation processing unit 53 generates 3-D color Doppler data by arranging the plurality of color Doppler data read out from the color Doppler data memory unit 52 in correspondence with the transmission/reception directions, and further generates color Doppler volume data by performing an interpolation process on the 3-D color Doppler data.
  • the generated B mode volume data and color Doppler volume data are stored in the volume data memory unit 54 ( FIG. 13 , step S 3 ).
  • The MPR image data generating unit 6 reads out the B-mode volume data stored in the volume data memory unit 54 in the volume data generating unit 5 and sets a plurality of MPR planes crossing at a desired position on the tumor based on the MPR plane data supplied from the MPR plane setting unit 141 in the input unit 14.
  • A plurality of MPR image data are generated by extracting the voxels of the B-mode volume data corresponding to these MPR planes.
  • the generated MPR image data are displayed on the monitor in the display unit 12 ( FIG. 13 , step S 4 ).
  • An operator sets the outline of the tumor portion and of other organs located near the tumor portion on each of the MPR image data by using the outline setting unit 142 in the input unit 14.
  • The tumor region setting unit 71 in the regions setting unit 7 sets a 3-D tumor region, approximated by a sphere or ellipsoid of revolution, based on the outline data of the tumor portion.
  • The organ region setting unit 72 in the regions setting unit 7 sets 3-D organ regions, approximated by spheres or ellipsoids of revolution, based on the outline data of the organs received by the outline setting unit 142 (FIG. 13, step S5).
  • The blood vessel region setting unit 73 reads the color Doppler volume data stored in the volume data memory unit 54 in the volume data generating unit 5 and sets the 3-D main blood vessels surrounding the tumor portion as blood vessel regions based on the blood flow data of the color Doppler volume data (FIG. 13, step S6).
  • The puncturing needle position detecting unit 9 detects the tip position and the inserting direction of the puncturing needle 150 just before insertion, based on the slant angle data of the needle guide 161 supplied from the system control unit 15 (FIG. 13, step S7).
  • the expected inserting position calculating unit 10 calculates a distance between the tip position of the puncturing needle 150 and the tumor region.
  • The expected inserting position calculating unit 10 further calculates the expected inserting position on the tumor region by supposing that the puncturing needle 150 goes straight through the living body over the distance between its tip position and the tumor region (FIG. 13, step S8).
  • The expected inserting position calculating unit 10 further calculates an insertion error region, i.e., a possible error range of the expected inserting position, by assuming a degree of bending of the puncturing needle 150 during insertion based on the distance between the tip position and the tumor region, the material data of the puncturing needle 150 supplied from the system control unit 15 and the anatomy data of the living body (FIG. 13, step S9).
  • the puncturing navigation data generating unit 11 generates the puncturing navigation data just before the insertion based on the data supplied from the regions setting unit 7 , i.e., tumor region data, blood vessel regions data and organ region data, the inserting direction data of the puncturing needle 150 supplied from the puncturing needle position detecting unit 9 , the expected inserting position data to the tumor region and the insertion error region data that are supplied from the expected inserting position calculating unit 10 .
  • the generated puncturing navigation data are displayed on a monitor in the display unit 12 ( FIG. 13 , step S 10 ).
  • In parallel with the generation of the puncturing navigation data, the 3-D data generating unit 8 generates 3-D data by composing the 3-D data of the tumor region, organ region and blood vessel regions with the tip position and inserting direction data of the puncturing needle 150 detected by the puncturing needle position detecting unit 9.
  • The generated 3-D data are displayed on the display unit 12 as reference data for the puncturing navigation data, as needed.
  • The operator starts the insertion of the puncturing needle 150 into the body of the object (FIG. 13, step S12). If the operator finds that the organ region or a blood vessel region in the puncturing navigation data overlaps or contacts the expected inserting position or the insertion error region, the operator renews the position of the ultrasound probe 3 or the inserting position or direction of the puncturing needle 150 (FIG. 13, step S13). Then the above-mentioned steps S3-S11 are repeated.
  • the puncturing needle position detecting unit 9 detects the tip position of the puncturing needle 150 during the insertion based on the reception signals of ultrasound reflection data obtained through the tip portion. Further, the puncturing needle position detecting unit 9 detects the inserting direction of the puncturing needle 150 based on the time variation of the tip position ( FIG. 13 , step S 7 ).
  • The tip position data and inserting direction data of the puncturing needle 150 during insertion, detected by the puncturing needle position detecting unit 9, are supplied to the expected inserting position calculating unit 10 (FIG. 1) in order to calculate the distance between the tip of the puncturing needle and the tumor region.
  • The expected inserting position calculating unit 10 further calculates the expected inserting position and the insertion error region with respect to the tumor region based on the material data of the puncturing needle 150 and the anatomy data of the living body supplied through the system control unit 15 (FIG. 13, steps S8 and S9).
  • the puncturing navigation data generating unit 11 ( FIG. 1 ) generates the puncturing navigation data during the insertion based on various data, i.e., data of the tumor region, blood vessel regions and organ region that are supplied from the regions setting unit 7 , the inserting direction data of the puncturing needle 150 during the insertion that are supplied from the puncturing needle position detecting unit 9 and the expected inserting position data and the insertion error region data that are supplied from the expected inserting position calculating unit 10 .
  • the generated puncturing navigation data is displayed on the monitor in the display unit 12 ( FIG. 13 , step S 10 ).
  • The insertion of the puncturing needle 150 into the body proceeds while the operator monitors the puncturing navigation data displayed on the monitor in the display unit 12 (FIG. 13, step S12).
  • Inspections or treatments, such as medication or removal of tumor tissue, are executed with the insertion of the puncturing needle 150 stopped (FIG. 13, step S15).
  • the puncturing needle 150 is extracted from the body of the object ( FIG. 13 , step S 16 ).
  • As described above, the tumor region and the neighboring organ region are approximated as spheres or ellipsoids of revolution, and the main blood vessel regions are indicated by their outlines.
  • The puncturing navigation data are generated by composing these region data. Consequently, it becomes possible to emphasize notable structures, such as the tumor portion, other organs or blood vessels, on the monitor display during treatments using the puncturing needle.
  • With the ultrasound imaging diagnosis apparatus consistent with the present invention, it becomes possible to significantly improve the efficiency and safety of puncturing diagnostic examinations and treatments, and to significantly reduce the burden on the puncturing operators and the risk of injury to the patient.
  • Further, only the organ region or the blood vessel regions located in front of the tip portion of the puncturing needle are displayed, in accordance with the insertion depth of the puncturing needle.
  • Thus, the organ region or the blood vessel regions located in the dangerous insertion region can be displayed as emphasized puncturing navigation data.
  • In addition, warning signals are generated to prompt a re-setup of the inserting position or inserting direction of the puncturing needle. Consequently, a dangerous insertion can be reliably prevented. Further, since the expected inserting position or its surrounding portion is displayed blinking in the puncturing navigation data, it becomes possible to accurately confirm the timing at which the tip portion of the puncturing needle reaches the tumor. Consequently, excessive insertion into the tumor region can be prevented.
  • The puncturing navigation data are generated by composing the tumor region, organ region and blood vessel regions together with the expected inserting position and the insertion error region along the eye direction corresponding to the inserting direction of the puncturing needle. Consequently, the operator can easily and accurately grasp the positional relationship between the puncturing needle and each of the tumor region, organ region and blood vessel regions.
  • According to the embodiment consistent with the present invention, it becomes possible to significantly improve the efficiency and safety of puncturing diagnostic examinations and treatments, and also to significantly reduce the burden on the puncturing operators and the risk of injury to the patient. In particular, not only the expected inserting position of the puncturing needle but also the insertion error region is set against the tumor region in the puncturing navigation data, taking into account possible flexion of the puncturing needle during insertion. This significantly improves the safety of ultrasound diagnostic examinations and treatments using a puncturing needle.
  • In the above embodiment, the volume data are generated based on the 3-D B-mode data and 3-D color Doppler data acquired through the 2-D array ultrasound probe, in which a plurality of transducers are arranged in two directions, and the 3-D tumor region, 3-D organ region and 3-D blood vessel regions are approximately set as spheres or ellipsoids using the generated volume data. It is also possible to acquire the volume data by mechanically moving a 1-D array ultrasound probe. Moreover, it is possible to set the blood vessel regions by using volume data based on B-mode data acquired during contrast media administration, instead of using the color Doppler data.
  • The 3-D tumor region and 3-D organ regions are set as approximated spheres or ellipsoids of revolution based on the outline data that are manually set on the MPR image data.
  • The puncturing needle applicable to the present invention includes various types of catheters, such as an RFA (Radio Frequency Ablation) puncturing needle that can ablate an inspection/treatment area, such as a tumor, and other catheters that can perform medication or tissue extraction at the inspection/treatment area.
  • In the above description, an ultrasound diagnosis apparatus has been explained as an example.
  • However, the puncturing navigation control system consistent with the present invention is also applicable to other types of imaging diagnosis apparatus, such as a CT apparatus.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Optics & Photonics (AREA)
  • Robotics (AREA)
  • Computer Graphics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Ultra Sonic Daignosis Equipment (AREA)
US12/275,886 2007-11-22 2008-11-21 Imaging diagnosis apparatus having needling navigation control system and a needling navigation controlling method Abandoned US20090137907A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/398,242 US10881375B2 (en) 2007-11-22 2017-01-04 Imaging diagnosis apparatus having needling navigation control system and a needling navigation controlling method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2007303040A JP5416900B2 (ja) 2007-11-22 2007-11-22 超音波診断装置及び穿刺支援用制御プログラム
JP2007-303040 2007-11-22

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/398,242 Continuation US10881375B2 (en) 2007-11-22 2017-01-04 Imaging diagnosis apparatus having needling navigation control system and a needling navigation controlling method

Publications (1)

Publication Number Publication Date
US20090137907A1 true US20090137907A1 (en) 2009-05-28

Family

ID=40670344

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/275,886 Abandoned US20090137907A1 (en) 2007-11-22 2008-11-21 Imaging diagnosis apparatus having needling navigation control system and a needling navigation controlling method
US15/398,242 Active US10881375B2 (en) 2007-11-22 2017-01-04 Imaging diagnosis apparatus having needling navigation control system and a needling navigation controlling method

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/398,242 Active US10881375B2 (en) 2007-11-22 2017-01-04 Imaging diagnosis apparatus having needling navigation control system and a needling navigation controlling method

Country Status (2)

Country Link
US (2) US20090137907A1 (ja)
JP (1) JP5416900B2 (ja)

Cited By (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090226069A1 (en) * 2008-03-07 2009-09-10 Inneroptic Technology, Inc. Systems and methods for displaying guidance data based on updated deformable imaging data
WO2011123661A1 (en) * 2010-04-01 2011-10-06 Sonosite, Inc. Systems and methods to assist with internal positioning of instruments
US20120203106A1 (en) * 2011-02-03 2012-08-09 Toshiba Medical Systems Corporation Ultrasound diagnosis apparatus
CN103006261A (zh) * 2012-11-28 2013-04-03 杭州柏拉图科技有限公司 Ultrasound puncture navigation method based on electromagnetic positioning
CN103027712A (zh) * 2012-11-28 2013-04-10 Zhejiang University Ultrasound puncture navigation system based on electromagnetic positioning
WO2013056231A1 (en) * 2011-10-14 2013-04-18 Jointvue, Llc Real-time 3-d ultrasound reconstruction of knee and its complications for patient specific implants and 3-d joint injections
US8482606B2 (en) 2006-08-02 2013-07-09 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US8527033B1 (en) 2010-07-01 2013-09-03 Sonosite, Inc. Systems and methods for assisting with internal positioning of instruments
US8554307B2 (en) 2010-04-12 2013-10-08 Inneroptic Technology, Inc. Image annotation in image-guided medical procedures
US8585598B2 (en) 2009-02-17 2013-11-19 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US8641621B2 (en) 2009-02-17 2014-02-04 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US20140039304A9 (en) * 2012-01-19 2014-02-06 Ge Medical Systems Global Technology Company, Llc Ultrasound diagnostic apparatus and method
US20140051985A1 (en) * 2012-08-17 2014-02-20 Tailin Fan Percutaneous nephrolithotomy target finding system
US8670816B2 (en) 2012-01-30 2014-03-11 Inneroptic Technology, Inc. Multiple medical device guidance
CN103732152A (zh) * 2012-06-25 2014-04-16 Kabushiki Kaisha Toshiba Ultrasonic diagnostic apparatus and image processing method
CN103987324A (zh) * 2012-11-09 2014-08-13 Kabushiki Kaisha Toshiba Puncture support device
US9256947B2 (en) 2010-03-19 2016-02-09 Koninklijke Philips N.V. Automatic positioning of imaging plane in ultrasonic imaging
US9265572B2 (en) 2008-01-24 2016-02-23 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for image guided ablation
CN105491955A (zh) * 2013-08-30 2016-04-13 FUJIFILM Corporation Ultrasonic diagnostic apparatus and ultrasonic image generation method
JP2017070362A (ja) * 2015-10-05 2017-04-13 Toshiba Medical Systems Corporation Ultrasonic diagnostic apparatus and medical image diagnostic apparatus
US9642572B2 (en) 2009-02-02 2017-05-09 Joint Vue, LLC Motion Tracking system with inertial-based sensing units
US9675319B1 (en) 2016-02-17 2017-06-13 Inneroptic Technology, Inc. Loupe display
US9901406B2 (en) 2014-10-02 2018-02-27 Inneroptic Technology, Inc. Affected region display associated with a medical device
US9949700B2 (en) 2015-07-22 2018-04-24 Inneroptic Technology, Inc. Medical device approaches
US10182793B2 (en) 2012-03-26 2019-01-22 Toshiba Medical Systems Corporation Ultrasonic diagnostic apparatus, image processing apparatus, and image processing method
US10188467B2 (en) 2014-12-12 2019-01-29 Inneroptic Technology, Inc. Surgical guidance intersection display
US10278778B2 (en) 2016-10-27 2019-05-07 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space
US10314559B2 (en) 2013-03-14 2019-06-11 Inneroptic Technology, Inc. Medical device guidance
CN110403698A (zh) * 2018-04-28 2019-11-05 北京柏惠维康医疗机器人科技有限公司 Instrument intervention device and system
US10512451B2 (en) 2010-08-02 2019-12-24 Jointvue, Llc Method and apparatus for three dimensional reconstruction of a joint using ultrasound
US10517568B2 (en) 2011-08-12 2019-12-31 Jointvue, Llc 3-D ultrasound imaging device and methods
CN110755136A (zh) * 2019-10-10 2020-02-07 Hefei Cancer Hospital, Chinese Academy of Sciences Puncture method
CN113133813A (zh) * 2021-04-01 2021-07-20 上海复拓知达医疗科技有限公司 Dynamic information display system and method based on the puncture procedure
CN113133814A (zh) * 2021-04-01 2021-07-20 上海复拓知达医疗科技有限公司 Augmented-reality-based puncture surgery navigation device and computer-readable storage medium
WO2021155649A1 (zh) * 2020-02-04 2021-08-12 Zhao Tianli Puncture needle positioning system and method
CN113786229A (zh) * 2021-09-15 2021-12-14 苏州朗润医疗系统有限公司 Assisted puncture navigation method based on AR augmented reality
CN114007517A (zh) * 2019-07-26 2022-02-01 FUJIFILM Corporation Measurement device, ultrasonic diagnostic device, measurement method, and measurement program
US11259879B2 (en) 2017-08-01 2022-03-01 Inneroptic Technology, Inc. Selective transparency to assist medical device navigation
US20220183655A1 (en) * 2019-03-19 2022-06-16 Koninklijke Philips N.V. Three dimensional volume flow quantification and measurement
WO2022206434A1 (zh) * 2021-04-01 2022-10-06 上海复拓知达医疗科技有限公司 Interactive registration system, method, electronic device, and readable storage medium for surgical navigation
US11464578B2 (en) 2009-02-17 2022-10-11 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
CN115191953A (zh) * 2022-09-08 2022-10-18 Xuanwu Hospital, Capital Medical University Visualized injection system
US11484365B2 (en) 2018-01-23 2022-11-01 Inneroptic Technology, Inc. Medical image guidance
CN116869652A (zh) * 2023-08-25 2023-10-13 山东卓业医疗科技有限公司 Surgical robot based on ultrasound images and electronic skin, and positioning method therefor

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8425425B2 (en) * 2010-09-20 2013-04-23 M. Dexter Hagy Virtual image formation method for an ultrasound device
JP5829022B2 (ja) * 2010-12-27 2015-12-09 GE Medical Systems Global Technology Company, LLC Ultrasonic diagnostic apparatus
CN105592816B (zh) * 2013-09-30 2019-06-07 皇家飞利浦有限公司 具有用户可定义的感兴趣区域的图像引导系统
EP3389499A4 (en) * 2015-12-16 2019-07-17 Glo-Tip, LLC NEEDLE TRACKING TRANSFORMER ARRAY METHOD AND DEVICE
GB2552544A (en) 2016-07-29 2018-01-31 Micrima Ltd A medical imaging system and method
WO2019044075A1 (ja) * 2017-08-31 2019-03-07 FUJIFILM Corporation Image generation device and operating method
US20210259660A1 (en) * 2018-06-29 2021-08-26 Koninklijke Philips N.V. Biopsy prediction and guidance with ultrasound imaging and associated devices, systems, and methods
KR102188176B1 (ko) * 2018-12-06 2020-12-07 Korea Institute of Oriental Medicine Ultrasound imaging device with acupuncture procedure guide function
CN110420050B (zh) * 2019-07-18 2021-01-19 沈阳爱健网络科技有限公司 CT-guided puncture method and related device
EP4005494A4 (en) * 2019-07-23 2022-09-07 FUJIFILM Corporation ULTRASONIC DIAGNOSTIC DEVICE AND CONTROL METHOD OF THE ULTRASONIC DIAGNOSTIC DEVICE
CN111436923B (zh) * 2020-03-27 2022-05-20 武汉联影智融医疗科技有限公司 Puncture path determination device, surgical navigation system, equipment, and storage medium
CN112022294B (zh) * 2020-08-24 2022-02-18 Tongji University Operation trajectory planning method for a venipuncture robot based on ultrasound image guidance
CN112022346B (zh) * 2020-08-31 2022-02-18 Tongji University Control method for a fully automatic venipuncture recognition integrated robot

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5080104A (en) * 1986-08-05 1992-01-14 University Of Wales College Of Medicine Proximity detector with a medical instrument
US5963211A (en) * 1995-06-29 1999-10-05 Hitachi, Ltd. Method and apparatus for directly generating three-dimensional images from voxel data with dividing image generating processes and utilizing parallel processes
US6055449A (en) * 1997-09-22 2000-04-25 Siemens Corporate Research, Inc. Method for localization of a biopsy needle or similar surgical tool in a radiographic image
US20050256402A1 (en) * 2002-09-27 2005-11-17 Olympus Corporation Ultrasonograph
US20060140798A1 (en) * 2003-08-21 2006-06-29 Terumo Kabushiki Kaisha Infusion device
WO2006067676A2 (en) * 2004-12-20 2006-06-29 Koninklijke Philips Electronics N.V. Visualization of a tracked interventional device
US20060281987A1 (en) * 2005-04-11 2006-12-14 Alberto Bartesaghi Systems, devices, and methods for bundle segmentation in diffusion tensor magnetic resonance imaging
US20070049861A1 (en) * 2005-08-05 2007-03-01 Lutz Gundel Device and method for automated planning of an access path for a percutaneous, minimally invasive intervention
US20070129631A1 (en) * 2005-11-18 2007-06-07 Siemens Medical Solutions Usa, Inc. Synchronized three or four-dimensional medical ultrasound imaging and measurements
US20070167769A1 (en) * 2005-12-28 2007-07-19 Olympus Medical Systems Corp. Ultrasonic diagnosis apparatus
US20080022144A1 (en) * 2003-09-05 2008-01-24 Seiko Epson Corporation Data transfer control device and electronic instrument
US20080039723A1 (en) * 2006-05-18 2008-02-14 Suri Jasjit S System and method for 3-d biopsy
US20080167551A1 (en) * 2007-01-04 2008-07-10 Michael Burns Feature emphasis and contextual cutaways for image visualization
US20080221446A1 (en) * 2007-03-06 2008-09-11 Michael Joseph Washburn Method and apparatus for tracking points in an ultrasound image
US7876942B2 (en) * 2006-03-30 2011-01-25 Activiews Ltd. System and method for optical position measurement and guidance of a rigid or semi-flexible tool to a target

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5095910A (en) * 1990-04-18 1992-03-17 Advanced Technology Laboratories, Inc. Ultrasonic imaging of biopsy needle
JPH05277091A (ja) * 1992-03-31 1993-10-26 Toshiba Corp Display method for magnetic resonance diagnostic images
JPH05329155A (ja) * 1992-05-29 1993-12-14 Toshiba Corp Ultrasonic diagnostic apparatus
US6167296A (en) * 1996-06-28 2000-12-26 The Board Of Trustees Of The Leland Stanford Junior University Method for volumetric image navigation
JP4443672B2 (ja) * 1998-10-14 2010-03-31 Kabushiki Kaisha Toshiba Ultrasonic diagnostic apparatus
JP4936607B2 (ja) * 2001-06-27 2012-05-23 GE Medical Systems Global Technology Company, LLC Image display device and ultrasonic diagnostic apparatus
US6733458B1 (en) * 2001-09-25 2004-05-11 Acuson Corporation Diagnostic medical ultrasound systems and methods using image based freehand needle guidance
JP4342167B2 (ja) * 2002-11-12 2009-10-14 Kabushiki Kaisha Toshiba Ultrasonic irradiation apparatus
JP4205957B2 (ja) * 2003-01-09 2009-01-07 Aloka Co., Ltd. Ultrasonic diagnostic apparatus
JP4280098B2 (ja) * 2003-03-31 2009-06-17 Kabushiki Kaisha Toshiba Ultrasonic diagnostic apparatus and puncture treatment support program
CA2523727A1 (en) * 2003-04-28 2005-01-06 Bracco Imaging Spa Surgical navigation imaging system
JP4594675B2 (ja) * 2004-08-20 2010-12-08 Kabushiki Kaisha Toshiba Ultrasonic diagnostic apparatus and control method therefor
JP2007000226A (ja) * 2005-06-22 2007-01-11 Toshiba Corp Medical image diagnostic apparatus
US8478386B2 (en) * 2006-01-10 2013-07-02 Accuvein Inc. Practitioner-mounted micro vein enhancer
US8160677B2 (en) * 2006-09-08 2012-04-17 Medtronic, Inc. Method for identification of anatomical landmarks
US8731643B2 (en) * 2007-11-13 2014-05-20 Siemens Aktiengesellschaft Imaging system and methods for medical needle procedures
IT1392888B1 (it) * 2008-07-24 2012-04-02 Esaote Spa Device and method for guiding surgical tools by ultrasound imaging.

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5080104A (en) * 1986-08-05 1992-01-14 University Of Wales College Of Medicine Proximity detector with a medical instrument
US5963211A (en) * 1995-06-29 1999-10-05 Hitachi, Ltd. Method and apparatus for directly generating three-dimensional images from voxel data with dividing image generating processes and utilizing parallel processes
US6055449A (en) * 1997-09-22 2000-04-25 Siemens Corporate Research, Inc. Method for localization of a biopsy needle or similar surgical tool in a radiographic image
US20050256402A1 (en) * 2002-09-27 2005-11-17 Olympus Corporation Ultrasonograph
US20060140798A1 (en) * 2003-08-21 2006-06-29 Terumo Kabushiki Kaisha Infusion device
US20080022144A1 (en) * 2003-09-05 2008-01-24 Seiko Epson Corporation Data transfer control device and electronic instrument
WO2006067676A2 (en) * 2004-12-20 2006-06-29 Koninklijke Philips Electronics N.V. Visualization of a tracked interventional device
US20060281987A1 (en) * 2005-04-11 2006-12-14 Alberto Bartesaghi Systems, devices, and methods for bundle segmentation in diffusion tensor magnetic resonance imaging
US20070049861A1 (en) * 2005-08-05 2007-03-01 Lutz Gundel Device and method for automated planning of an access path for a percutaneous, minimally invasive intervention
US20070129631A1 (en) * 2005-11-18 2007-06-07 Siemens Medical Solutions Usa, Inc. Synchronized three or four-dimensional medical ultrasound imaging and measurements
US20070167769A1 (en) * 2005-12-28 2007-07-19 Olympus Medical Systems Corp. Ultrasonic diagnosis apparatus
US7876942B2 (en) * 2006-03-30 2011-01-25 Activiews Ltd. System and method for optical position measurement and guidance of a rigid or semi-flexible tool to a target
US20080039723A1 (en) * 2006-05-18 2008-02-14 Suri Jasjit S System and method for 3-d biopsy
US20080167551A1 (en) * 2007-01-04 2008-07-10 Michael Burns Feature emphasis and contextual cutaways for image visualization
US20080221446A1 (en) * 2007-03-06 2008-09-11 Michael Joseph Washburn Method and apparatus for tracking points in an ultrasound image

Cited By (78)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10733700B2 (en) 2006-08-02 2020-08-04 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US11481868B2 (en) 2006-08-02 2022-10-25 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US10127629B2 (en) 2006-08-02 2018-11-13 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US9659345B2 (en) 2006-08-02 2017-05-23 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US8482606B2 (en) 2006-08-02 2013-07-09 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US9265572B2 (en) 2008-01-24 2016-02-23 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for image guided ablation
US8831310B2 (en) 2008-03-07 2014-09-09 Inneroptic Technology, Inc. Systems and methods for displaying guidance data based on updated deformable imaging data
US20090226069A1 (en) * 2008-03-07 2009-09-10 Inneroptic Technology, Inc. Systems and methods for displaying guidance data based on updated deformable imaging data
US11004561B2 (en) 2009-02-02 2021-05-11 Jointvue Llc Motion tracking system with inertial-based sensing units
US11342071B2 (en) 2009-02-02 2022-05-24 Jointvue, Llc Noninvasive diagnostic system
US9642572B2 (en) 2009-02-02 2017-05-09 Joint Vue, LLC Motion Tracking system with inertial-based sensing units
US10136951B2 (en) 2009-02-17 2018-11-27 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US8585598B2 (en) 2009-02-17 2013-11-19 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US8641621B2 (en) 2009-02-17 2014-02-04 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US9398936B2 (en) 2009-02-17 2016-07-26 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US9364294B2 (en) 2009-02-17 2016-06-14 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US8690776B2 (en) 2009-02-17 2014-04-08 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US11464575B2 (en) * 2009-02-17 2022-10-11 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US11464578B2 (en) 2009-02-17 2022-10-11 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US10398513B2 (en) 2009-02-17 2019-09-03 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US9256947B2 (en) 2010-03-19 2016-02-09 Koninklijke Philips N.V. Automatic positioning of imaging plane in ultrasonic imaging
US20110245659A1 (en) * 2010-04-01 2011-10-06 Sonosite, Inc. Systems and methods to assist with internal positioning of instruments
US20130035590A1 (en) * 2010-04-01 2013-02-07 Sonosite, Inc. Systems and methods to assist with internal positioning of instruments
WO2011123661A1 (en) * 2010-04-01 2011-10-06 Sonosite, Inc. Systems and methods to assist with internal positioning of instruments
US9107698B2 (en) 2010-04-12 2015-08-18 Inneroptic Technology, Inc. Image annotation in image-guided medical procedures
US8554307B2 (en) 2010-04-12 2013-10-08 Inneroptic Technology, Inc. Image annotation in image-guided medical procedures
US8527033B1 (en) 2010-07-01 2013-09-03 Sonosite, Inc. Systems and methods for assisting with internal positioning of instruments
US10512451B2 (en) 2010-08-02 2019-12-24 Jointvue, Llc Method and apparatus for three dimensional reconstruction of a joint using ultrasound
US9192356B2 (en) * 2011-02-03 2015-11-24 Kabushiki Kaisha Toshiba Ultrasound diagnosis apparatus
US20120203106A1 (en) * 2011-02-03 2012-08-09 Toshiba Medical Systems Corporation Ultrasound diagnosis apparatus
US10517568B2 (en) 2011-08-12 2019-12-31 Jointvue, Llc 3-D ultrasound imaging device and methods
WO2013056231A1 (en) * 2011-10-14 2013-04-18 Jointvue, Llc Real-time 3-d ultrasound reconstruction of knee and its complications for patient specific implants and 3-d joint injections
US20140221825A1 (en) * 2011-10-14 2014-08-07 Jointvue, Llc Real-Time 3-D Ultrasound Reconstruction of Knee and Its Implications For Patient Specific Implants and 3-D Joint Injections
US20210378631A1 (en) * 2011-10-14 2021-12-09 Jointvue, Llc Real-Time 3-D Ultrasound Reconstruction of Knee and Its Implications For Patient Specific Implants and 3-D Joint Injections
US11819359B2 (en) * 2011-10-14 2023-11-21 Jointvue, Llc Real-time 3-D ultrasound reconstruction of knee and its implications for patient specific implants and 3-D joint injections
US11123040B2 (en) 2011-10-14 2021-09-21 Jointvue, Llc Real-time 3-D ultrasound reconstruction of knee and its implications for patient specific implants and 3-D joint injections
US11529119B2 (en) * 2011-10-14 2022-12-20 Jointvue, Llc Real-time 3-D ultrasound reconstruction of knee and its implications for patient specific implants and 3-D joint injections
US20140039304A9 (en) * 2012-01-19 2014-02-06 Ge Medical Systems Global Technology Company, Llc Ultrasound diagnostic apparatus and method
US8670816B2 (en) 2012-01-30 2014-03-11 Inneroptic Technology, Inc. Multiple medical device guidance
US10182793B2 (en) 2012-03-26 2019-01-22 Toshiba Medical Systems Corporation Ultrasonic diagnostic apparatus, image processing apparatus, and image processing method
CN103732152A (zh) * 2012-06-25 2014-04-16 Kabushiki Kaisha Toshiba Ultrasonic diagnostic apparatus and image processing method
US20140051985A1 (en) * 2012-08-17 2014-02-20 Tailin Fan Percutaneous nephrolithotomy target finding system
CN103987324A (zh) * 2012-11-09 2014-08-13 Kabushiki Kaisha Toshiba Puncture support device
US11864835B2 (en) 2012-11-09 2024-01-09 Canon Medical Systems Corporation Puncture support device for determining safe linear puncture routes by puncture region classification and superimposing of images
CN103006261A (zh) * 2012-11-28 2013-04-03 杭州柏拉图科技有限公司 Ultrasound puncture navigation method based on electromagnetic positioning
CN103027712A (zh) * 2012-11-28 2013-04-10 Zhejiang University Ultrasound puncture navigation system based on electromagnetic positioning
US10314559B2 (en) 2013-03-14 2019-06-11 Inneroptic Technology, Inc. Medical device guidance
CN105491955A (zh) * 2013-08-30 2016-04-13 FUJIFILM Corporation Ultrasonic diagnostic apparatus and ultrasonic image generation method
US9901406B2 (en) 2014-10-02 2018-02-27 Inneroptic Technology, Inc. Affected region display associated with a medical device
US10820944B2 (en) 2014-10-02 2020-11-03 Inneroptic Technology, Inc. Affected region display based on a variance parameter associated with a medical device
US11684429B2 (en) 2014-10-02 2023-06-27 Inneroptic Technology, Inc. Affected region display associated with a medical device
US10188467B2 (en) 2014-12-12 2019-01-29 Inneroptic Technology, Inc. Surgical guidance intersection display
US11534245B2 (en) 2014-12-12 2022-12-27 Inneroptic Technology, Inc. Surgical guidance intersection display
US11931117B2 (en) 2014-12-12 2024-03-19 Inneroptic Technology, Inc. Surgical guidance intersection display
US10820946B2 (en) 2014-12-12 2020-11-03 Inneroptic Technology, Inc. Surgical guidance intersection display
US11103200B2 (en) 2015-07-22 2021-08-31 Inneroptic Technology, Inc. Medical device approaches
US9949700B2 (en) 2015-07-22 2018-04-24 Inneroptic Technology, Inc. Medical device approaches
JP2017070362A (ja) * 2015-10-05 2017-04-13 Toshiba Medical Systems Corporation Ultrasonic diagnostic apparatus and medical image diagnostic apparatus
US9675319B1 (en) 2016-02-17 2017-06-13 Inneroptic Technology, Inc. Loupe display
US11179136B2 (en) 2016-02-17 2021-11-23 Inneroptic Technology, Inc. Loupe display
US10433814B2 (en) 2016-02-17 2019-10-08 Inneroptic Technology, Inc. Loupe display
US10772686B2 (en) 2016-10-27 2020-09-15 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space
US11369439B2 (en) 2016-10-27 2022-06-28 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space
US10278778B2 (en) 2016-10-27 2019-05-07 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space
US11259879B2 (en) 2017-08-01 2022-03-01 Inneroptic Technology, Inc. Selective transparency to assist medical device navigation
US11484365B2 (en) 2018-01-23 2022-11-01 Inneroptic Technology, Inc. Medical image guidance
CN110403698A (zh) * 2018-04-28 2019-11-05 北京柏惠维康医疗机器人科技有限公司 Instrument intervention device and system
US20220183655A1 (en) * 2019-03-19 2022-06-16 Koninklijke Philips N.V. Three dimensional volume flow quantification and measurement
CN114007517A (zh) * 2019-07-26 2022-02-01 FUJIFILM Corporation Measurement device, ultrasonic diagnostic device, measurement method, and measurement program
CN110755136A (zh) * 2019-10-10 2020-02-07 Hefei Cancer Hospital, Chinese Academy of Sciences Puncture method
WO2021155649A1 (zh) * 2020-02-04 2021-08-12 Zhao Tianli Puncture needle positioning system and method
WO2022206416A1 (zh) * 2021-04-01 2022-10-06 上海复拓知达医疗科技有限公司 Dynamic information display system and method based on the puncture procedure
WO2022206434A1 (zh) * 2021-04-01 2022-10-06 上海复拓知达医疗科技有限公司 Interactive registration system, method, electronic device, and readable storage medium for surgical navigation
CN113133814A (zh) * 2021-04-01 2021-07-20 上海复拓知达医疗科技有限公司 Augmented-reality-based puncture surgery navigation device and computer-readable storage medium
CN113133813A (zh) * 2021-04-01 2021-07-20 上海复拓知达医疗科技有限公司 Dynamic information display system and method based on the puncture procedure
CN113786229A (zh) * 2021-09-15 2021-12-14 苏州朗润医疗系统有限公司 Assisted puncture navigation method based on AR augmented reality
CN115191953A (zh) * 2022-09-08 2022-10-18 Xuanwu Hospital, Capital Medical University Visualized injection system
CN116869652A (zh) * 2023-08-25 2023-10-13 山东卓业医疗科技有限公司 Surgical robot based on ultrasound images and electronic skin, and positioning method therefor

Also Published As

Publication number Publication date
US20170112465A1 (en) 2017-04-27
JP5416900B2 (ja) 2014-02-12
JP2009125280A (ja) 2009-06-11
US10881375B2 (en) 2021-01-05

Similar Documents

Publication Publication Date Title
US10881375B2 (en) Imaging diagnosis apparatus having needling navigation control system and a needling navigation controlling method
US7477763B2 (en) Computer generated representation of the imaging pattern of an imaging device
US7796789B2 (en) Guidance of invasive medical devices by three dimensional ultrasonic imaging
US6733458B1 (en) Diagnostic medical ultrasound systems and methods using image based freehand needle guidance
US7270634B2 (en) Guidance of invasive medical devices by high resolution three dimensional ultrasonic imaging
US7529393B2 (en) Guidance of invasive medical devices by wide view three dimensional ultrasonic imaging
JP4443672B2 (ja) Ultrasonic diagnostic apparatus
US8556815B2 (en) Freehand ultrasound imaging systems and methods for guiding fine elongate instruments
US20220273258A1 (en) Path tracking in ultrasound system for device tracking
EP2077526A2 (en) Three-dimensional image reconstruction using doppler ultrasound
US20170095226A1 (en) Ultrasonic diagnostic apparatus and medical image diagnostic apparatus
JP2008535560A (ja) Three-dimensional imaging for guided interventional medical devices in a body volume
US20060270934A1 (en) Guidance of invasive medical devices with combined three dimensional ultrasonic imaging system
JP2010220770A (ja) Ultrasonic diagnostic apparatus and puncture support control program
EP2866672B1 (en) Ultrasonically guided biopsies in three dimensions
WO2014001963A9 (en) Ultrasonic guidance of multiple invasive devices in three dimensions
US20230181148A1 (en) Vascular system visualization
CN115348839A (zh) Ultrasound probe, user console, system, and method
JP2007117384A (ja) Image diagnosis/treatment support apparatus and method of generating image data for assessing treatment effect
JP4398405B2 (ja) Medical system
CN109310393B (zh) 对外部微凸线性超声探头的图像取向识别
JP6078134B1 (ja) Medical system

Legal Events

Date Code Title Description
AS Assignment

Owner name: TOSHIBA MEDICAL SYSTEMS CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TAKIMOTO, MASAO;SAKAGUCHI, FUMIYASU;REEL/FRAME:022211/0659

Effective date: 20090109

Owner name: KABUSHIKI KAISHA TOSHIBA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TAKIMOTO, MASAO;SAKAGUCHI, FUMIYASU;REEL/FRAME:022211/0659

Effective date: 20090109

AS Assignment

Owner name: TOSHIBA MEDICAL SYSTEMS CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KABUSHIKI KAISHA TOSHIBA;REEL/FRAME:038926/0365

Effective date: 20160316

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION