WO1996025881A1 - Ultrasound guidance method for clinical procedures


Info

Publication number
WO1996025881A1
Authority
WO
WIPO (PCT)
Prior art keywords
tool
image
tools
ultrasound
ultrasonic
Prior art date
Application number
PCT/NO1996/000029
Other languages
English (en)
Inventor
Åge GRÖNNINGSÆTER
Björn OLSTAD
Geirmund Unsgaard
Original Assignee
Groenningsaeter Aage
Olstad Bjoern
Geirmund Unsgaard
Priority date
Filing date
Publication date
Application filed by Groenningsaeter Aage, Olstad Bjoern, Geirmund Unsgaard
Priority to AU48513/96A (patent AU4851396A)
Priority to US08/894,229 (patent US6019724A)
Publication of WO1996025881A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/4218 Probe positioning or attachment to the patient by using holders, e.g. positioning frames, characterised by articulated arms
    • A61B 34/20 Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/70 Manipulators specially adapted for use in surgery
    • A61B 8/0833 Detecting organic movements or changes, e.g. tumours, cysts, swellings, involving detecting or locating foreign bodies or organic structures
    • A61B 8/0858 Detecting organic movements or changes involving measuring tissue layers, e.g. skin, interfaces
    • A61B 8/12 Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
    • A61B 8/483 Diagnostic techniques involving the acquisition of a 3D volume of data
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/101 Computer-aided simulation of surgical operations
    • A61B 2034/102 Modelling of surgical devices, implants or prosthesis
    • A61B 2034/107 Visualisation of planned trajectories or target regions
    • A61B 2034/2051 Electromagnetic tracking systems
    • A61B 2034/2063 Acoustic tracking systems, e.g. using ultrasound
    • A61B 2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/367 Creating a 3D dataset from 2D images using position information
    • A61B 2090/378 Surgical systems with images on a monitor during operation, using ultrasound
    • A61B 5/489 Locating particular structures in or on the body: blood vessels
    • A61B 8/0808 Detecting organic movements or changes for diagnosis of the brain
    • A61B 90/10 Instruments for stereotaxic surgery, e.g. frame-based stereotaxis
    • A61N ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
    • A61N 5/1001 Radiation therapy using radiation sources introduced into or applied onto the body; brachytherapy

Definitions

  • This invention relates to a method for generating useful real-time feedback about tissue characteristics and the position of anatomical objects relative to at least one tool used during clinical procedures in living biological structures, employing an ultrasonic transducer/probe.
  • More particularly, the invention relates to the field of ultrasound imaging during a clinical procedure that involves a tool being inserted into the imaged scene, and in particular to methods for combining the geometric localization of the said tools with the acquired ultrasound images.
  • The method to be described here combines acquisition of ultrasonic images, localization of tools and/or tool trajectories used during the clinical procedure, and processing of the ultrasonic images based on knowledge of the position of the tools in the imaged scene, in order to obtain visualizations with real-time feedback to the operator.
  • the visualizations integrate information obtained both from the ultrasonic images and the geometric localizations of the tools in the imaged scene.
  • The invention describes alternative procedures for obtaining the geometric localization of tools and describes how the said images and geometric localizations can be processed in order to obtain useful feedback based on the information content in both data sources (ultrasonic images and tool positions).
  • the method finds application in surgical, therapeutic and diagnostic procedures.
  • Any invasive diagnostic, therapeutic or surgical procedure, as for example: open surgery, endoscopic/laparoscopic surgery, cyst aspiration, biopsy (sampling), injection, implantation etc.
  • Any therapeutic and/or diagnostic procedure based on energy emission in terms of fields, waves or particles, for example: radiotherapy, laser therapy or ultrasound therapy (ultrasound hyperthermia or shock waves).
  • Any similar clinical procedure where at least one mechanical object and/or energy field is applied to the imaged, living biological structures.
  • the term tool will be used throughout this invention to designate:
  • a surgical tool used in a clinical procedure for example: a cutting or resecting device (scalpel, diathermy, scissors, suction, ultrasound aspirator, thermal knife, laser, argon beam), a coagulating device (monopolar or bipolar diathermy, laser), a stapler, biopsy forceps, needle, cannula etc.
  • An imaging device like an ultrasound catheter, ultrasound probe or any optical imaging system.
  • Combined devices such as an endoscope that includes imaging capabilities and at least one surgical tool as described above.
  • The term quasi real-time will be used throughout this invention to designate that a process (such as ultrasound data acquisition, position determination of tools in the imaged scene and/or data visualization) runs fast enough to allow for interactive feedback/operation by the user. This includes truly real-time operation, where the absolute time delay between data acquisition and the final data visualization is below the acceptable level for interactive feedback.
  • Quasi real-time thus also refers to processes that appear truly real-time to the user.
  • The term positioning system will be used throughout this invention to designate: 1. Any system that provides information about the position and/or direction of an ultrasound probe, a tool or other objects within the operating theater.
  • The positioning system can optionally provide mechanical support by limiting the movement of the ultrasound probe, tool or other objects to a predetermined space, plane, direction or point. 2. Any system where the positions and/or directions are determined by measurements or by predetermined geometry.
  • Position measurement is achieved by any magnetic, electromagnetic, optical or acoustical system (wireless or not) or by any mechanical arrangement with angle and/or position sensors.
  • Such techniques require positioning and manipulation of tools in relation to organs and other biological structures within the body, all of which may be hidden from visual inspection by the human eye.
  • The irregular and unpredictable shape and position of most biological structures and organs make absolute positioning within the body difficult or impossible from the outside. Positions and shapes may also change during the procedure.
  • Various imaging techniques are currently in use to provide geometric information to the operator prior to, during and after the procedure. Preoperative MR, CT or X-ray scans are commonly used in order to obtain a description of a lesion and its relation to other structures.
  • Endoscopic techniques based on optics or a video camera provide high-quality, real-time visualizations which allow intra-operative procedures.
  • However, the lack of penetration through biological structures limits their use.
  • Ultrasound technology has several advantages in that it penetrates through biological structures, the instruments are portable, and interactive imaging during the procedure is possible, even in real time. This means that structures that change during the clinical procedure can be monitored continuously or repetitively.
  • this invention takes as a starting-point known methods for acquisition of 2D and 3D ultrasonic images and established clinical procedures where at least one tool is applied to the imaged, living biological structures.
  • the invention describes new techniques for computation of 2-dimensional and 3-dimensional ultrasonic images and/or visualizations that utilizes the localization of at least one tool in the imaged scene.
  • the methods to be described allow extension of non-invasive and minimally invasive techniques by providing valuable additional features to existing technology.
  • Previously acquired images: The value of previously acquired images is limited to cases where the anatomical structures do not change very much. Brain surgery is an example of one area where this technique is widely used and where new technology develops rapidly. Previously acquired MR/CT scans and angiograms form the basis for planning the location of the craniotomy and the least damaging route down to the lesion.
  • The access to multiplane images provides information about the size and location of a lesion relative to other structures, and this information helps the surgeon to perform, for example, free-hand catheter/needle interventions into a lesion. However, the accuracy is limited.
  • Stereotaxy has been developed in order to improve the accuracy in navigating tools within the brain [1], but this mechanical system is cumbersome to use, and it provides no other information than the position and direction of a tool relative to the coordinate system of the previously acquired images.
  • Image guided surgery techniques consist of a system for measuring the position and direction of a surgical tool relative to previously acquired digital 3D images. With this technique, the surgeon can move the tool freely by hand and simultaneously observe on a monitor a set of images or other visualizations which in some way are related to the position and/or direction of the tool.
  • Such a system is described by Galloway et al. where the position of the tool was measured by a six-degree-of-freedom articulated arm [2].
  • A commercial product based on the same method is the "Viewing Wand", which was developed by ISG Technologies Inc. (Ontario, Canada).
  • A challenge in image guided surgery techniques is to relate the coordinate system of the previously acquired images to the coordinate system of the tool positioning system. This problem is solved by calibration in the Viewing Wand system: the tip of the tool is located on some points on the patient's head whose coordinates are known to the image database.
  • Such systems have advantages over, for example, stereotaxy in that they are less cumbersome, the tool can be moved freely, and image information is available prior to and during intervention. The route down to the lesion can be planned, and the actual intervention can be monitored to the extent one can trust the accuracy of the coordinate system alignment.
  • Intraoperative real-time imaging: Ultrasound imaging is currently used to guide different kinds of surgical procedures. Several authors have demonstrated the value of using ultrasound imaging to determine the shape and location of a lesion in the brain in order to plan the least damaging route down to it. However, the accuracy in hitting a deep-seated cyst or tumor with a needle from the brain surface is very low, and the success rate depends on the operator's skills [3, 4].
  • A tool that improves the accuracy and success rate is a mechanical device that is fixed to the ultrasound probe. It contains a guiding channel for a needle whose direction coincides with the ultrasound scan plane. The angle can be tilted, and the direction of the needle is marked and superimposed on the ultrasound image.
  • the drawback with this method is that needle intervention is performed under real-time 2D imaging guidance only (in contrast to real-time 3D) and the flexibility in manipulating the needle is low.
  • A clinical procedure is guided interactively by quasi real-time ultrasonography. More specifically, the planning and/or execution of a surgical, therapeutic or diagnostic procedure is guided by on-site ultrasound imaging, which allows interactive tool manipulation with arbitrary positioning of the tool(s) relative to the imaging device, including freehand movement of the tool or the imaging device. The procedure is guided by quasi real-time feedback through ultrasonographic visualizations.
  • The method can optionally include previously acquired images and provide visualizations that combine on-site ultrasound images with the pre-operative scans.
  • The main advantage of this method is the on-site and quasi real-time imaging capability, which extends its use to applications where the shape and/or location of organs and other biological structures change during the procedure.
  • One example is removal of brain tumors, where surrounding tissue collapses during resection.
  • Another example is the problem of positioning a tumor or other lesion in the dose-planning and radio-therapy machines.
  • Dose planning and therapeutic radiation are performed in different equipment and typically with an interval of several days. Organs in the abdomen are especially subject to movements between dose planning and therapeutic radiation.
  • the position of a surgical tool can be registered directly in the ultrasound image in situations where the tool is located in the image field.
  • The tool position relative to the ultrasound image coordinates, as well as relative to other biological structures, can be measured directly.
  • The process of relating an image coordinate set to a tool position coordinate system thereby becomes obsolete. No calibration procedure is required, thus eliminating the risk of misalignments between the coordinate systems. Misalignments during the procedure can be difficult to discover, especially if the calibration is performed only once, prior to surgery.
  • The invention describes the possibility of letting the tool include a second ultrasound probe. This allows calculation of combined visualizations, as for example "bifocal imaging":
  • The image from a high resolution (low penetration) imaging device located at the tip of the tool may, for example, be superimposed on the image from a lower resolution overview scanner.
  • Combined use of ultrasound imaging and one or more previously acquired image data bases have several advantages.
  • the use of previously acquired images can be extended to cases with tissue movement.
  • the ultrasound imaging system can track the movement of an organ or other biological structures and transfer the change in coordinate systems to the previously acquired data base.
  • Figure 1 illustrates a system for ultrasound guided intervention (biopsy, aspiration or equivalent) in the brain.
  • An ultrasound probe is located on the brain surface in a burr-hole or craniotomy.
  • A surgical tool is inserted through the same or a different hole in the skull. The positions of the probe and the tool are measured and coregistered in the computer.
  • Figure 2 is a top view illustration of an alternative implementation of that in Figure 1 except that a mechanical position measuring system is attached to the ultrasound probe arm, making initial calibration obsolete.
  • the tool slides in a guiding tube whose direction is measured and known to the system. Information about the location of the tool tip is provided through direct visualization of the tool in the quasi real-time ultrasound image.
  • Figure 3 gives a detailed view of the tool direction measuring system described in Figure 2. It consists of a guiding tube in which the tool slides, and four rotational joints with angle sensors.
  • Figure 4 illustrates how blood vessels can be localized and related to the position of the tool.
  • Figure 5 illustrates the use of a hand-held high-resolution ultrasound probe during brain tumor resection.
  • the position of the probe is measured in order to allow coregistered visualizations of structures including blood vessels, ventricles, lesions and lesion borders. Imaging from a separate burr-hole may be convenient.
  • Figure 6 illustrates one possible realization of an endoscope for brain surgery where (a) provide a top view and (b) a view towards the distal tip of the endoscope.
  • The endoscope consists of two parts: (c) an imaging part, which contains an ultrasound probe and one or two channels for a light source and optical view, and (d) a surgical part, which contains three channels: one working channel for surgical tools, one channel for suction and one for irrigation.
  • Figure 7(a) and (b) are side view (sagittal) and cross sectional (axial) illustrations respectively of ultrasound guided dose planning and/or treatment in radiotherapy.
  • The position and direction of the ultrasound scan plane(s) are measured and coregistered with the radiation field coordinates. This gives the physician the opportunity to verify the shape and location of the tumor and/or other biological structures (such as organs or organ parts), and to align the radiation field accurately on target.
  • Figure 8(a) and 9 illustrate simplified versions of Figure 7 (a) and (b) where the ultrasound probe movement is limited so that the ultrasound scan plane intersects the center of radiation. This reduces investment costs at the expense of freedom of operability.
  • Figure 10 illustrates how the position of a tool and/or the tool trajectory can be visualized on top of a 2D ultrasonic image.
  • Figure 11 illustrates how the position of a tool and/or the tool trajectory can be visualized together with either 3D ultrasonic datasets or 2D ultrasonic images extracted from a 3D ultrasonic dataset.
  • Figure 12 illustrates how a tool can introduce an acoustical shadow in a 2D or 3D ultrasonic dataset.
  • Figure 13 illustrates how a region of interest or variation in the opacity function can be defined based on the location of the tool and/or tool trajectory in a 3 dimensional ultrasonic dataset.
  • Figure 14 illustrates how a 2D ultrasonic plane arbitrarily positioned inside a 3D ultrasonic dataset can be defined relative to the position of the tool. Similarly, 3D visualizations can be related in 3D space relative to the position of the tool inside the imaged scene.
  • Figure 15 illustrates how a 2D imaging technique (including high resolution ultrasonic imaging and video imaging) can be combined with a visualization of a 3D ultrasonic image covering the entire region of interest.
  • Figure 16 illustrates how a secondary 2D/3D image (including all pre-operative and/or intra-operative medical imaging modalities such as: a high resolution ultrasonic 2D/3D image, a magnetic resonance 2D/3D image, a computer tomographic 2D/3D image, an X-ray image, an arteriogram and/or a video image) is related to the ultrasound acquisitions described elsewhere in the invention. All techniques described in this invention for 2D and 3D visualizations/presentations of ultrasonic data and/or tool positions can therefore be extended with mixed or additional visualizations where the image data is fetched from the secondary 2D/3D image instead of from the ultrasonic imaging device used on site during the clinical procedure.
  • Figure 17 illustrates how a transducer is moved with rotation and tilting in order to allow for quasi real-time synchronization with the movement of the tool.
  • the 2D scanplane is oriented such that the tool is contained inside the acquired 2D image.
  • This invention is applicable in surgery, therapy and diagnosis, and possible clinical fields include abdominal, urological, thoracic, gastroenterological, neurosurgical, gynecological and orthopedic applications.
  • the invention is described through specific examples of how the method can be applied, a summary at the end of this section and the appended claims and illustrations.
  • Several examples are provided from three different clinical cases: I) deep-seated brain intervention, II) open and endoscopic brain tumor resection, and III) radiotherapy.
  • The method described in this case is a minimally invasive brain interventional procedure, such as biopsy, aspiration or equivalent, at a deep-seated location in the brain, guided by ultrasound.
  • A tool is inserted into the lesion through the normal parenchyma under ultrasound imaging guidance in order to find the least damaging route to the site.
  • The tool can be inserted through the burr-hole or craniotomy that is made for the ultrasound probe. However, it may be more convenient to drill an extra hole for the tool in order to obtain a different angle of incidence.
  • The system illustrated in Figure 1 consists of an ultrasound instrument 1.11 with a built-in computer 1.10 that communicates with the ultrasound instrument.
  • a 3D positioning system that is based on low frequency magnetic fields is connected to the computer.
  • the positioning system consists of a control unit 1.9, a source 1.8 and a sensor 1.6.
  • One possible realization of 1.6, 1.8 and 1.9 is the product 3SPACE ISOTRAK (Polhemus Navigation Science, Colchester, Vermont), which is based on low frequency magnetic fields.
  • the sensor is connected by cable to a tool 1.5 which can be moved freely by hand 1.7.
  • The position and direction of the sensor 1.6 are measured several times per second, and the position and direction of the tool can thereby be calculated relative to the source 1.8.
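  • The calculation described above amounts to composing the measured sensor pose with the fixed offset from the sensor to the tool tip. A minimal sketch of this step (the function name and the assumption of a known, rigid sensor-to-tip offset are illustrative, not taken from the patent):

```python
import numpy as np

def tool_tip_position(sensor_pos, sensor_rot, tip_offset):
    """Tool tip position in source (1.8) coordinates.

    sensor_pos : (3,) sensor origin relative to the magnetic source
    sensor_rot : (3, 3) rotation matrix, sensor frame -> source frame
    tip_offset : (3,) fixed vector from sensor to tool tip, in sensor frame
    """
    # Rotate the rigid offset into source coordinates, then translate.
    return np.asarray(sensor_pos, float) + np.asarray(sensor_rot, float) @ np.asarray(tip_offset, float)
```

  • Repeating this computation at the sensor's measurement rate gives the quasi real-time tool pose stream the text refers to.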
  • An alternative system is the OPTOTRAK (Northern Digital Inc., Ontario, Canada).
  • An ultrasound probe 1.1 is connected to the ultrasound instrument by a cable, and the probe is mounted to a fixed point 1.4, for example the bed, by a positioning system which consists of three stiff arms 1.3 that are interconnected by flexible ball joints with locks 1.15.
  • The flexible ball joints 1.15 are unlocked, and the ultrasound probe is located on the brain surface 1.2 and aligned until a good ultrasound image of the lesion and surrounding tissue is obtained.
  • The flexible ball joints 1.15 are then locked in order to fix the ultrasound probe in the same position throughout the procedure.
  • Initial calibration is required in order to relate the coordinate system of the ultrasound probe 1.1 to the coordinate system of the source 1.8. This is achieved by touching the tip of the tool on three landmarks 1.14 located on the ultrasound probe 1.1 and its arm.
  • the ultrasound probe provide 3D information about the size and shape of the lesion as well as its relation to other biological structures.
  • Preoperative MR or CT images are loaded into the computer prior to the procedure, and a calibration of the tomographic coordinate system relative to the magnetic positioning system is performed by measuring the position of the landmarks 1.16 (located on the patient's head during the preoperative scan) by touching them with the tip of the tool.
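  • Such landmark calibration amounts to a least-squares rigid registration between two coordinate systems. The patent does not prescribe an algorithm; a standard SVD-based (Kabsch/Horn) solution is one common way to sketch it. At least three non-collinear landmarks are needed:

```python
import numpy as np

def register_landmarks(src_pts, dst_pts):
    """Least-squares rigid transform (R, t) mapping src_pts onto dst_pts.

    src_pts, dst_pts : (N, 3) arrays of corresponding landmark coordinates
    measured in the two coordinate systems (e.g. MR/CT vs. positioning system).
    """
    src = np.asarray(src_pts, float)
    dst = np.asarray(dst_pts, float)
    src_c = src - src.mean(axis=0)          # center both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                     # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so that det(R) = +1
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

  • Once (R, t) is known, any point measured by the positioning system can be mapped into the preoperative image volume, and vice versa.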
  • the computer calculates repeatedly the position of the tool tip and the direction of the tool.
  • the surgeon can now plan the least damaging route to the lesion by aiming the tool towards the lesion and by observing a set of images that is displayed with an update rate of several images per second.
  • Image selection is done by the computer according to a predefined format and based on the tool direction.
  • One example is to display two perpendicular planes of ultrasound and MR/CT data that intersect along the aiming trajectory.
  • the surgeon can plan the route in advance and if desirable, mark some milestones that can be used during tool insertion to give an alarm if the actual trajectory deviates too much from the planned route.
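  • The milestone alarm described above reduces to monitoring the tool tip's distance from the planned route. A small sketch under the assumption that the route is a straight segment from entry point to target (the names and the tolerance value are illustrative):

```python
import numpy as np

def route_deviation(tip, entry, target):
    """Distance from the tool tip to the planned route (segment entry->target)."""
    entry, target, tip = (np.asarray(v, float) for v in (entry, target, tip))
    d = target - entry
    # Parameter of the closest point on the segment, clamped to [0, 1]
    s = np.clip(np.dot(tip - entry, d) / np.dot(d, d), 0.0, 1.0)
    return np.linalg.norm(tip - (entry + s * d))

def deviation_alarm(tip, entry, target, tolerance_mm=3.0):
    """True if the actual trajectory deviates too much from the planned route."""
    return route_deviation(tip, entry, target) > tolerance_mm
```

  • Evaluating this check on every position update gives the quasi real-time alarm during tool insertion.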
  • The tool insertion is done by free hand, which means that minor tilting and pushing is acceptable during the procedure.
  • i) The image selection is given by the tool direction; the scenes change by tool manipulation.
  • ii) The image selection is specified in advance and stays constant during the procedure (or during parts of it).
  • The imaging scenes include the lesion, and the tool direction is superimposed as lines or symbols on the MR/CT/ultrasound images. As the tool enters into the quasi real-time ultrasound image, it is visualized directly.
  • Visualizations/graphic presentations are computed that utilize the relative position between the scene imaged by the ultrasound transducer and the tools inserted in the imaged scene.
  • 10. 1 , 10.2, 10.3 and 10.4 illustrate 2D ultrasonic images either obtained with 2D imaging or by extraction of a 2D image from a 3D ultrasonic image. The known localization of a tool is superimposed on the 2D images
  • The tool (10.6) and the tool trajectory (10.7) can be superimposed on a 2D image. If the tool is not inside the plane defined by the 2D ultrasonic image, the tool or tool trajectory will intercept the 2D image in a single point (10.8). In this case one can also provide feedback on the distance between the 2D image and the tip of the tool if only the tool trajectory intercepts the 2D image. Similarly, one can provide feedback to the user about the relative orientation between the tool and the 2D image. 10.9 illustrates a tool, given by for example a radiation field, that has a 2-dimensional interception with the 2D image.
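  • The interception point (10.8) and the tip-to-plane distance follow from a standard line-plane intersection. A sketch (function and parameter names are illustrative), where the image plane is given by any point on it and its normal:

```python
import numpy as np

def trajectory_plane_intercept(tip, direction, plane_point, plane_normal):
    """Point where the tool trajectory intercepts the 2D image plane.

    Returns (intercept, s) where s is the signed distance from the tool
    tip to the plane along the (unit) trajectory direction, or (None, None)
    if the trajectory is parallel to the plane.
    """
    tip = np.asarray(tip, float)
    direction = np.asarray(direction, float)
    direction = direction / np.linalg.norm(direction)
    n = np.asarray(plane_normal, float)
    n = n / np.linalg.norm(n)
    denom = np.dot(direction, n)
    if abs(denom) < 1e-9:                       # parallel: no single intercept
        return None, None
    s = np.dot(np.asarray(plane_point, float) - tip, n) / denom
    return tip + s * direction, s
```

  • The sign of s distinguishes whether the tip has already passed the image plane, which is exactly the distance feedback mentioned above.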
  • the tools or tool trajectories can also be added to visualizations of 3 dimensional ultrasound data.
  • 11.1 illustrates a 3D ultrasonic dataset where the tool (11.5) and the tool trajectory (11.6) are superimposed on the rendering.
  • a visualization created by slicing through the 3D dataset is illustrated by 11.2 where a cavity 11.7 is shown together with the tool 11.8 and the tool trajectory 11.9.
  • These visualizations can be combined with 2D images extracted from the 3D dataset 11.3 and 11.4 where the tools are indicated (11.10, 11.11) as described earlier in the invention.
  • Tools inserted in the scene imaged by the ultrasound transducer might produce an acoustical shadow behind the tool.
  • 12.1 illustrates a 3D dataset where a tool 12.3 produces the shadow 12.4.
  • the shadow 12.7 is produced.
  • Knowing the position of the tool, one can compute the regions of the 2D or 3D image where the ultrasound beam has been affected by the tool. Knowledge of the beam profile and the point spread function further increases the ability to accurately locate the image samples in a 2D or 3D ultrasonic image that have been affected by the tool.
  • These artifacts can either be corrected by inserting measurements from earlier 2D or 3D images, acquired when the tool was in a different position, or by making the affected spatial locations transparent in a 3-dimensional visualization so that the image artifacts are not included in the derived visualizations.
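A minimal sketch of the shadow-region computation and the earlier-frame correction follows, assuming an idealized straight-beam geometry in which beams travel along one volume axis (the patent's beam-profile refinement is omitted; all names are illustrative).

```python
import numpy as np

def shadow_mask(volume_shape, tool_voxels, beam_axis=2):
    """Boolean mask of voxels lying behind the tool along the beam axis
    (simplified: each beam is a straight line toward +beam_axis)."""
    mask = np.zeros(volume_shape, bool)
    for v in tool_voxels:
        idx = list(v)
        z0 = idx[beam_axis]
        sl = [slice(i, i + 1) for i in idx]
        sl[beam_axis] = slice(z0 + 1, volume_shape[beam_axis])
        mask[tuple(sl)] = True       # everything distal to the tool voxel
    return mask

def fill_from_previous(current, previous, mask):
    """Replace shadowed samples with measurements from an earlier volume
    acquired when the tool was elsewhere."""
    out = current.copy()
    out[mask] = previous[mask]
    return out
```

Alternatively, the mask can be mapped to zero opacity in the renderer instead of being filled in.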
  • An opacity field can be computed relative to the position of the inserted tool.
  • 13.1 illustrates a 3D ultrasonic dataset with a tool 13.3 and the tool trajectory 13.4.
  • An opacity field 13.5 is indicated as a rotationally symmetric region around the tool trajectory. Any shape for the opacity field might be applied, but the field is moved through the 3D scene according to the movements of the tool inside the imaged scene.
  • 13.2 illustrates a 3D visualization with a cavity 13.9 as an example.
  • the tool 13.7, tool trajectory 13.8 and the associated opacity field 13.6 are illustrated.
  • the opacity field can be specified as the set of volume elements that will be exposed to the radiation field. This setting is illustrated with a 3D ultrasonic dataset 13.10 and the associated radiation field/opacity field 13.11.
  • the described opacity field can also be utilized to optimize the acquisition of a 3D ultrasonic dataset.
  • the spatial resolution can be optimized inside the high-opacity regions and the resolution outside this region can either be lower or completely ignored.
  • the opacity field can constitute an arbitrarily positioned 2D plane and the 3D acquisition can also in this case be optimized in order to acquire a minimal amount of data with a maximal resolution around the 2D plane.
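The rotationally symmetric opacity field around the tool trajectory can be sketched as a cylinder test on voxel coordinates. This is one simple shape out of the arbitrary shapes the text allows; the binary 0/1 opacity and all names are assumptions of this sketch.

```python
import numpy as np

def opacity_field(shape, tip, direction, radius):
    """Opacity 1 inside a cylinder of the given radius about the tool
    trajectory line (tip + s * direction), 0 outside. The field follows
    the tool simply by recomputing with the current tip/direction."""
    d = np.asarray(direction, float)
    d /= np.linalg.norm(d)
    # voxel coordinates as an (N, 3) array, C order
    idx = np.indices(shape).reshape(3, -1).T.astype(float)
    rel = idx - np.asarray(tip, float)
    along = rel @ d                              # component along the line
    radial = np.linalg.norm(rel - np.outer(along, d), axis=1)
    return (radial <= radius).reshape(shape).astype(float)
```

For the acquisition-optimization use mentioned above, the same field can weight where scan lines are concentrated rather than what is rendered.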
  • the position of the tool can be utilized to extract 2D planes from an ultrasonic 3D data set.
  • 14.1 illustrates a 3D dataset.
  • a tool 14.2 and the tool trajectory 14.3 are also indicated.
  • a 2D plane is extracted relative to the tool position such that the tool/tool trajectory is a normal vector to the extracted plane and such that the distance between the extracted plane and the tip of the tool can be controlled by a user parameter.
  • 14.5 illustrates the 2D plane visualized together with a marker 14.6 indicating the intersection with the tool or tool trajectory. The content of 14.5 will hence change according to the movement of the tool 14.2.
  • other relative orientations of the 2D plane relative to the tool position can be specified.
  • Of particular interest are 2D planes that intercept the tool along a line and not a single point.
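The extraction of a 2D plane normal to the tool, at a user-controlled offset ahead of the tip, can be sketched as below (nearest-neighbour sampling for brevity; an interpolating resampler would be used in practice, and all names are illustrative).

```python
import numpy as np

def extract_normal_plane(volume, tip, direction, offset, half_size):
    """Sample the 2D plane whose normal is the tool direction, centered
    at tip + offset * direction; offset is the user parameter controlling
    the distance between the extracted plane and the tool tip."""
    n = np.asarray(direction, float)
    n /= np.linalg.norm(n)
    # build an orthonormal in-plane basis (u, v) perpendicular to n
    a = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(n, a)
    u /= np.linalg.norm(u)
    v = np.cross(n, u)
    center = np.asarray(tip, float) + offset * n
    img = np.zeros((2 * half_size + 1, 2 * half_size + 1))
    for i in range(-half_size, half_size + 1):
        for j in range(-half_size, half_size + 1):
            p = np.rint(center + i * u + j * v).astype(int)
            if all(0 <= p[k] < volume.shape[k] for k in range(3)):
                img[i + half_size, j + half_size] = volume[tuple(p)]
    return img
```

Re-running the extraction with each new tool pose yields the display that "changes according to the movement of the tool" described above.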
  • the tool position can also be utilized to control the viewing parameters of visualizations computed from a 3D dataset.
  • 14.7 illustrates a 3D visualization with a cavity 14.8 as an example.
  • the visualization can be created by slicing the 3D dataset with a 2D plane and creating the visualization by raytracing along lines that start in the 2D plane and are perpendicular to the 2D plane. Similar or equivalent volume rendering techniques might also be applied.
  • the viewing direction is hence given by the location of the 2D plane slicing the 3D dataset.
  • the localization of this plane can hence be controlled by the same techniques that we have given for extraction of 2D planes relative to the tool position.
  • 14.9 and 14.10 illustrate a tool and the tool trajectory respectively.
  • the tool is contained inside the 2D plane used to define the viewing orientation of the 3D visualization.
  • 14.4/14.5 could have been used as the 2D slice plane such that the 3D visualization is computed along rays parallel with the orientation of the tool.
  • the tool might include an imaging device like high resolution ultrasound or video imaging.
  • 15.1 illustrates a 3D ultrasonic dataset.
  • 15.2 and 15.3 indicate a tool and the tool trajectory respectively.
  • An example with a high resolution ultrasound image (15.4) acquired from the tip of the tool is indicated.
  • the high resolution 2D image can be displayed separately (15.5), but also integrated into the rendering of the 3D scene (15.1/15.4).
  • Another example of such integration is given by 15.5 which illustrates a 3D visualization with a cavity (15.7) as an example.
  • a tool 15.8 and the tool trajectory 15.9 are indicated.
  • the high resolution 2D ultrasound image (15.10) is integrated in a coregistered manner to the 3D visualization.
  • Figure 16 illustrates how a secondary 2D/3D image 16.2 (including all pre-operative and/or intra-operative medical imaging modalities such as: a high resolution ultrasonic 2D/3D image, a magnetic resonance 2D/3D image, a computer tomographic 2D/3D image, an X-ray image, an arteriogram and/or a video image) is related to the ultrasound acquisitions described elsewhere in the invention (16.1). All techniques described in this invention on 2D and 3D visualizations/presentations of ultrasonic data and/or tool positions can therefore be extended with mixed or additional visualizations where the image data is fetched from the secondary 2D/3D image instead of the ultrasonic imaging device used on site during the clinical procedure.
  • the figure contains two examples of visualizations that can be combined with coregistered image data from the secondary image (16.2).
  • 16.6 illustrates a 2D ultrasound image acquired on site and a coregistered image 16.7 extracted from the secondary image 16.2.
  • 16.8 illustrates a 3D visualization as described earlier in the invention with a cavity 16.10.
  • the tool 16.11, tool trajectory 16.12 and a high resolution ultrasound image 16.9 are indicated.
  • the visualization can be mixed with either 2D images or 3D visualizations based on the secondary image 16.2. Both visualizations are combined in a coregistered manner in the final rendering.
  • the coregistration of the secondary image 16.2 and the coordinate system given by the tool positioning can be performed with prior art [7].
  • Figure 4 illustrates this particular example with a transducer 4.1, a tool 4.2, blood vessels 4.3 located either in 2D or in 3D and the imaged scene 4.4.
  • any present and future ultrasound modality like tissue imaging, color-flow, power-doppler etc. might be utilized as the basis for the visualizations/graphic presentations described in this invention.
  • the image resolution of a 2D ultrasonic image directly acquired by a transducer is usually higher than a 2D image extracted from a 3D ultrasonic dataset.
  • the position of the tool can be utilized to modify the orientation of the 2D plane acquired by the transducer.
  • Figure 17 illustrates this concept. 17.1 is the ultrasound transducer and the tool is given by 17.2.
  • the 2D image acquired by the transducer is given by 17.3.
  • the transducer head (17.4) and two points on a straight tool 17.5 and 17.6 define a unique 2D plane.
  • the transducer is equipped with mechanical orientation devices that can be used to orient the transducer such that the said unique 2D plane is acquired.
  • 17.7 illustrates a rotational motion of the transducer around the center axis and 17.8 indicates a tilting motion for the 2D scanplane.
  • the mechanical orientation devices are controlled according to the position information from the tool, such that the operator can move the tool freely and the 2D ultrasound image acquired by the transducer 17.1 will still contain the tool inside the imaged plane.
  • Various time delays between tool movement and transducer readjustment are possible depending on requirements of the said mechanical orientation devices and preferred interactivity for the operator.
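The unique scan plane through the transducer head and two points on a straight tool, and the angular error the mechanical orientation devices must servo out, can be sketched as follows (illustrative names; the decomposition into the rotation 17.7 and tilt 17.8 actuators is omitted).

```python
import numpy as np

def scanplane_from_tool(head, p1, p2):
    """Unit normal of the unique plane through the transducer head (17.4)
    and two points on a straight tool (17.5, 17.6)."""
    head = np.asarray(head, float)
    n = np.cross(np.asarray(p1, float) - head, np.asarray(p2, float) - head)
    norm = np.linalg.norm(n)
    if norm < 1e-12:
        raise ValueError("points are collinear; plane undefined")
    return n / norm

def orientation_error(current_normal, target_normal):
    """Angle (radians) between the current scan plane and the target
    plane; the servo loop drives this toward zero."""
    c = np.clip(np.dot(current_normal, target_normal), -1.0, 1.0)
    return float(np.arccos(abs(c)))   # abs: the normal's sign is irrelevant
```

Each new tool pose gives a new target normal; the time delay of the loop depends on the mechanics, as noted above.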
  • Blood vessel detection: the localization of blood vessels is an important task in surgery, especially in endoscopic techniques where the ability to stop bleeding is limited.
  • This section describes a slightly modified version of a method previously described for automatic differentiation of blood signal and tissue signal for the purpose of blood noise reduction in intravascular ultrasound imaging. The method is expected to be more efficient at high frequencies (high resolution imaging) than at low frequencies (overview imaging), since scattering from blood increases with frequency. A detailed description of the method is provided in [5].
  • The method described here is based on the assumption that tissue moves more slowly than blood relative to the ultrasound probe.
  • tissue signal will be correlated (if the transit time is longer than T) while blood signal will be uncorrelated (if the transit time is shorter than T).
  • the cross correlation coefficient is estimated in each spatial point of the image based on the signal from two adjacent frames.
  • The outcome of such an estimate ranges between zero and one, with values close to zero indicating blood and values close to one indicating tissue. From these measurements one can generate a 2D or 3D blood vessel detection map that locates the vascular tree 4.3 in relation to the surgical tool 4.2 or biological structures, see Figure 4.
  • the positioning system for the tool which is described in Example 1 can be replaced with a mechanical system.
  • One option is the six degree-of-freedom articulated arm described by Galloway et al. [2] or the Viewing Wand (ISG Technologies Inc. Ontario, CA).
  • A five degree-of-freedom system may be sufficient for the purpose described here, since quasi real-time ultrasound imaging provides information about the location of the tool tip. What is measured is the direction of the tool, which requires four angle sensors.
  • a top view of the skull with a five burrhole craniotomy is shown in Figure 2.
  • the ultrasound probe 2.1 is aligned and fixed on the brain surface 2.2 as described in Example 1.
  • a tool direction measuring device 2.3 is attached to the arm 2.4 which holds the ultrasound probe.
  • the surgical tool is entered through a guiding tube 2.5 which is located at the tip of the positioning system.
  • Four rotational axes with angle measurement devices 2.6 allow flexible orientation and direction measurement of the tool relative to the ultrasound probe.
  • a detailed description of the positioning system is provided in Figure 3.
  • the tool 3.1 is inserted in a guiding tube 3.2.
  • The tool can slide with low friction in the tube, which gives one degree of freedom without position measurement.
  • the guiding tube 3.2 is attached to a rotational joint 3.3 which is located on an arm 3.4.
  • the joint includes an angle measurement device 3.5.
  • The distal arm 3.4 is connected to the proximal arm 3.6 in a way that allows rotation of the distal arm and the guiding tube.
  • the rotational angle is measured by 3.7.
  • a joint 3.8 with an angle measuring device 3.9 connects the proximal outer arm 3.6 and the inner arm 3.10, while the inner arm is attached to the ultrasound probe arm 3.11 by a joint 3.12 with an angle measuring device 3.13.
  • the ultrasound probe 3.14 is fixed to the ultrasound probe arm 3.11.
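The tool direction in probe coordinates follows from the four measured joint angles by composing one rotation per instrumented joint. The sketch below assumes known, fixed joint axes and a rest direction; it ignores the link offsets, which the tip position recovered from the ultrasound image makes unnecessary for direction measurement.

```python
import numpy as np

def rot(axis, theta):
    """Rotation matrix about a unit axis (Rodrigues' formula)."""
    axis = np.asarray(axis, float)
    axis /= np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def tool_direction(joint_axes, joint_angles, rest_direction):
    """Tool direction in ultrasound-probe coordinates, composed from the
    measured angles of the instrumented joints (3.5, 3.7, 3.9, 3.13)."""
    R = np.eye(3)
    for ax, th in zip(joint_axes, joint_angles):
        R = R @ rot(ax, th)
    return R @ np.asarray(rest_direction, float)
```

With all angles at zero the rest direction is returned, which is the usual calibration check for such an arm.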
  • A conventional surgical procedure for a deep-seated brain tumor starts by resecting an access path through the normal brain tissue, typically 1-5 cm² in cross section.
  • Ultrasound imaging, possibly with coregistered MR/CT data as described in Example 1, is supposed to play a role in planning the least damaging route to the tumor, in that crossing blood vessels may be discovered (by color-flow detection or the method described previously). It may also be possible to select an insertion path according to the detection of gyri.
  • Tissue differentiation and global orientation: this can be achieved by the said ultrasound probe, which provides overview imaging from the brain surface.
  • Remaining tumor tissue can be localized in the ultrasound image, and the exact location within the brain can be found by moving the tool around within the brain until the tip of the tool appears in the ultrasound image (in the immediate vicinity of the remaining tumor tissue). There is also a need for high resolution close-up imaging in the resection cavity during resection.
  • Such a probe may play an important role during resection in order to: i) help in determining the lesion border (in advance) and help the surgeon decide how much tumor is left, ii) detect blood vessels in advance in order to pay special attention during resection of the surrounding area, iii) perform quality control after tumor removal, serving as a supplement to or a substitute for the time-consuming biopsy sampling which is currently used.
  • Figure 5 illustrates a situation where a high frequency, high resolution ultrasound probe is used during brain surgery.
  • This probe 5.1 is connected by a cable to an ultrasound instrument 5.2 which supports dual frequency capabilities.
  • a positioning system measures the position of the high resolution ultrasound probe, here illustrated by the sensor 5.3, the source 5.4 and the control unit 5.5 as described in Example 1.
  • The ultrasound probe 5.6 and the ultrasound instrument 5.2 with a computer are the same as described in Example 1.
  • Acoustic contact between the probe and the tissue is achieved by filling the resection hole with saline 5.7.
  • the tumor 5.10 is partially resected in this illustration.
  • the high frequency ultrasound probe 5.1 is conveniently held by one hand while a surgical tool such as a suction device, diathermy, ultrasound aspirator or a biopsy forceps is held by the other hand.
  • This setting makes it possible to guide the procedure by close-up, quasi real-time high- frequency ultrasound imaging as well as medium frequency (for example 7.5 MHz) overview imaging.
  • This instrumentation opens up the following possibilities: i) Coregistered ultrasound imaging, for example "image in image": The location of the scan plane 5.8 of the high resolution ultrasound probe 5.1 is measured and known to the computer. This scan plane can be extended by data from the 3D ultrasound data set acquired by the probe 5.6. The high resolution short range image 5.8 is inserted in the overlapping part of the lower resolution long range image 5.9. This method "fills in the shadow" of the high resolution ultrasound probe in the lower resolution long range image.
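Once the high-resolution scan plane has been resampled into the coordinates of the overview image, the "image in image" insertion itself is a coregistered paste. A minimal 2D sketch (the coregistration/resampling step is assumed done; names are illustrative):

```python
import numpy as np

def image_in_image(overview, highres, top_left):
    """Insert a coregistered high-resolution image into the overlapping
    part of the lower-resolution overview image, 'filling in the shadow'
    of the high-resolution probe.

    top_left: (row, col) of the overlap region in overview coordinates.
    """
    out = overview.astype(float).copy()
    r, c = top_left
    h, w = highres.shape
    out[r:r + h, c:c + w] = highres   # overwrite the overlap region
    return out
```

In 3D the same idea applies with a voxel region instead of a pixel rectangle.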
  • Coregistered MR/CT and high resolution ultrasound imaging: data from the 3D MR data set may be visualized in a predetermined relation to the location of the scan plane 5.8, for example by visualizing the coinciding MR/CT image plane.
  • Visualizations utilizing the localization of the tool can be computed with the techniques as described in Example 1.
  • the location of the tool can be detected in the overview image by temporal high pass filtering if the tool is continuously moving. This is commonly the case during brain tumor resection.
  • One implementation of temporal high pass filtering is to subtract two 2D or 3D data sets: stationary targets will cancel while the moving tool will be highlighted. The detected locations might be correlated with a priori knowledge about the tool geometry.
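The frame-subtraction detector can be sketched in a few lines (the threshold is an illustrative tuning parameter, not a value from the patent):

```python
import numpy as np

def detect_moving_tool(prev_frame, curr_frame, threshold):
    """Temporal high-pass tool detection: subtract consecutive frames;
    stationary tissue cancels while the moving tool leaves a large
    residual. Returns a boolean detection mask."""
    diff = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    return diff > threshold
```

The resulting mask can then be matched against the known tool geometry, as the text suggests, to reject spurious detections.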
  • Introduction: the current use of endoscopic techniques for brain tumor removal is limited. Some tumors located close to the ventricular system can be resected by endoscopes that provide optical view in combination with channels for laser and other surgical tools.
  • The primary advantage of endoscopic surgery is a lower risk of damaging normal brain tissue. Endoscopic removal of tumors that are surrounded by normal brain tissue is not possible today due to limitations in imaging techniques, and methods to avoid and stop bleeding are not developed.
  • Improved imaging, and the integration of high resolution ultrasound imaging at the tip of the endoscope, possibly in combination with signal processing that allows blood vessel detection, may bring this technique a major step forward.
  • The technique described in the following is expected to be useful for removing dense tumors where there is a low risk of bleeding, and also for patients with bleedings (stroke). If this technique can be developed and applied through a single burrhole, then treatment can be offered to patients who are offered no treatment today.
  • A suggested endoscope for brain surgery with a high resolution ultrasound imaging capability at the tip is illustrated schematically in Figure 6.
  • a top view is shown in Figure 6(a) and a front view towards the distal tip is shown in (b).
  • The endoscope 6.1 consists of two parts which can be separated: the imaging part of the endoscope 6.2, see Figure 6(c), and the surgical part of the endoscope 6.3, see Figure 6(d).
  • The ability to separate the endoscope into a fragile imaging part and a more robust surgical part allows different procedures for cleaning and sterilization.
  • the imaging part of the endoscope 6.1 contains an ultrasound probe at the distal tip.
  • the ultrasound beam can either be generated by a tightly curved switched array 6.4 as illustrated in the figure or by a mechanically driven transducer or by a fixed transducer and a mechanically driven mirror.
  • the ultrasound scan plane 6.5 as illustrated in the figure covers a 180 degrees scan angle providing forward and partially side looking capabilities simultaneously.
  • An optical channel 6.6 can optionally be built into the imaging part of the endoscope.
  • the lens 6.7 is located at the distal tip of the endoscope, and the field of view 6.8 covers the area distal to the tip of the endoscope.
  • the surgical part of the endoscope 6.3 is less expensive to make and it can be designed in different ways according to the application.
  • the solution suggested in Figure 6 consists of a channel 6.9 for surgical tools like forceps, suction device 6.10, ultrasound aspirator, diathermy, laser or other. There is also a channel for suction 6.11 and one for irrigation 6.12.
  • The channel 6.9 for surgical tools is shaped so that the surgical tool is forced to cross the ultrasound scan plane at a specified distance from the probe, and the optical field of view 6.8 is also aimed towards this intersection.
  • the insertion of the endoscope should be guided by the overview ultrasound probe, the high resolution ultrasound probe in the endoscope and by the optical channel.
  • a vessel wall detection capability in the high resolution ultrasound imaging system is desirable since this would reduce the risk of hurting vital blood vessels.
  • The optical vision will be limited or inhibited during endoscope entrance due to the small (or absent) cavity in front of the endoscope.
  • The endoscope may be withdrawn slightly during the procedure, and saline may be injected under pressure in order to generate a cavity for visual inspection. This requires combined flushing and irrigation in order to clear the sight in case of bleeding.
  • resection can start, guided by ultrasound and/or optical vision.
  • the position where the surgical tool crosses the high resolution ultrasound scan plane is known and can be marked on the ultrasound image. This allows positioning of the endoscope to a region where resection is supposed to be done. The surgical tool is then advanced until it is seen in the ultrasound image, and resection can start. A resection cavity is made in front of the endoscope as tumor tissue or blood is removed, and this cavity should be kept open in order to maintain good visual inspection.
  • Irrigation and suction may serve several purposes: i) provide acoustical contact between the high resolution ultrasound probe and the tissue, ii) flush and clear the sight for the optical system, iii) keep the cavity open by applying a pressure that inhibits tissue collapse around the endoscope, iv) remove the resected tissue and blood from the cavity.
  • There is one irrigation channel in the endoscope, but suction may be performed through both the suction channel and the surgical tool/working channel. This means that a control unit is required which measures the volume flow in all channels and provides user-selectable control of the total flow pattern.
  • The techniques for coregistered imaging and visualization as described in Example 3 apply here as well during endoscopic tumor resection.
  • Case III: Radiotherapy.
  • One of the most challenging tasks during radiotherapy of cancer is positioning of the patient in the radiation field.
  • The patient is placed in a radiation simulator after a diagnostic survey and tumor/organ mapping using MR, CT, X-ray or ultrasound imaging.
  • This machine is functionally a copy of the radiotherapy machine with the same characteristics regarding patient positioning and radiation field characteristics, but where the high energy radiation field is replaced by low energy X-ray.
  • The physician will plan the actual treatment by applying low energy X-ray fields and simultaneously recording the X-ray images. These images are now a map, or a control, of the correct targeting of the tumor/organs by the radiation fields, and are used to ensure that the surrounding tissues (often critical organs) do not receive a lethal dose of high energy radiation.
  • the patient is aligned in the simulator machine with customized braces and patient holders, preventing the patient from moving on the couch.
  • Ink marks are applied to the skin of the patient according to alignment lasers calibrated to transfer the patient from the coordinate system of the simulator machine to the coordinate system of the therapeutic machine.
  • The patient is then transferred to the therapeutic high radiation machine and aligned on the couch with the same braces and supports used in the simulator, according to the alignment lasers.
  • Low energy X-ray images are often taken with the therapeutic machine (minimal dose shots) to verify the patient alignment according to the simulator X-ray images.
  • Using ultrasound 2D or 3D imaging as an aid for locating the target area and positioning the patient in the simulator and in the therapeutic machine will increase the probability of actually irradiating the targeted area, correct for movements of the skin and internal organs, and speed up alignment of the patient. This is especially true in cases where the tumor is not visible on conventional X-ray images.
  • Ultrasound guided location of tumors and target areas requires accurate knowledge of the transducer position and the image orientation relative to the coordinate system of the simulator or therapeutic machine.
  • This example describes a possible design of a system for ultrasound guided target verification and patient alignment in a high energy radiation therapeutic machine.
  • An ultrasound probe (7.1) has an attached position and direction sensor (7.12), part of a fixed tracking system (7.13) (previously described) that keeps track of the absolute coordinates of the ultrasound image (7.2) relative to the coordinate system of the simulator (7.5).
  • the patient (7.3) is placed on the simulator couch (7.4), having three-degrees-of-freedom movement (x-y-z-positioning), and stabilized with customized braces and supports (7.6).
  • the target area (7.8) (tumor/organ) is scanned by 2D or 3D ultrasound imaging (7.7) and the physician marks or traces the radiation target (7.8) (tumor) on the ultrasound image (7.9).
  • The coordinates of the targeted area, calculated by the tracking system (7.13), are transferred to the coordinate system of the simulator (7.5), and the radiation fields are positioned to intersect the target area (7.8) accordingly.
  • the direction and extent of these fields (7.51) can be projected onto the ultrasound image (7.9).
  • The quasi-real-time feedback of radiation field positioning (7.55) gives the physician the opportunity to make on-the-fly adjustments and corrections to the treatment scheme, reducing the time spent in the simulator due to fewer exposure and development cycles with conventional X-ray imaging.
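The coordinate transfer from a target marked in the ultrasound image to the simulator/machine coordinate system is a chain of homogeneous transforms supplied by the tracking system. A minimal sketch (4x4 matrices; the helper and all names are illustrative, not from the patent):

```python
import numpy as np

def translation(t):
    """4x4 homogeneous translation matrix (helper for the example)."""
    T = np.eye(4)
    T[:3, 3] = t
    return T

def image_to_machine(point_img, T_probe_to_machine, T_img_to_probe):
    """Map a target marked in ultrasound-image coordinates into the
    simulator/machine coordinate system via the tracked probe pose."""
    p = np.append(np.asarray(point_img, float), 1.0)  # homogeneous point
    return (T_probe_to_machine @ T_img_to_probe @ p)[:3]
```

In practice both transforms also carry rotations; the tracking system updates T_probe_to_machine continuously, which is what enables the last-minute field corrections described below.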
  • the patient is transferred to the therapeutic high radiation machine, aligned and positioned according to the treatment scheme and simulator data.
  • 2D or 3D ultrasound imaging can again be utilized to correctly position and align the patient and the high energy radiation field.
  • the position detection system at the therapeutic machine allows last minute correction for field adjustments due to skin and internal organ movement.
  • An ultrasound probe (8.1) is rigidly connected to the simulator (8.5) with a two-degree-of-freedom arm (8.12) supporting movement of the ultrasound transducer (8.1) in the plane spanned by the ultrasound image sector (8.2) aligned with the radiation field center (8.51).
  • The arm joints (8.13) are supplied with angle sensors. These sensors, in combination, measure the position and orientation of the ultrasound transducer (8.1) relative to the coordinate system of the simulator (8.5).
  • the patient (8.3) is placed on the simulator couch (8.4), having three degrees of freedom movement (x-y-z-positioning), and stabilized with customized braces and supports (8.6).
  • the direction of the radiation field (8.51) can be projected onto the ultrasound image (8.9).
  • The physician marks the desired point for the radiation field center in the ultrasound image (8.9), the coordinates of this point are transferred to the coordinate system of the simulator (8.5), and the direct feedback of target position will aid the placement of radiation fields (8.51) and their relative angles (8.55) to the patient (8.3).
  • Ultrasound imaging combined with position feedback linked to the therapeutic radiation system gives the advantages of reducing simulator time, increasing patient safety, minimizing high energy radiation exposure to sensitive organs and increasing treatment quality control during radiation therapy.
  • At least one ultrasound imaging system is included to generate image information about the biological structures; other imaging systems (including additional ultrasound systems) can optionally be included.
  • the method applies to both 2D and 3D ultrasound imaging.
  • the given definition of the term tool is very broad which means that this method finds applications within different clinical fields including invasive surgery, non- invasive therapy and diagnostics.
  • Possible tools are simple mechanical devices used in surgery, more complex multifunction devices like endoscopes, energy fields in radiotherapy, or laser light in diagnostic equipment.
  • the invention allows for arbitrary movement of the tool relative to the imaging devices including freehand movement of the tool or ultrasound probe.
  • The localization of the tool in relation to an ultrasound image allows computation of various visualizations that may be synchronized with the operator's tool movement.
  • One example is visualization of multiplane images that intersect the axis of the tool, in order to visualize structures located in the vicinity of the tool and distal to its tip.
  • Another example is to apply image processing functions that limit the amount of data fed to the display in order to focus on a special area, for example by applying an opacity function to a 3D data set. Techniques like image-in-image are possible if more than one ultrasound imaging system is in use.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Robotics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The present invention relates to a method for producing quasi real-time feedback by ultrasound imaging to guide surgical, therapeutic or diagnostic procedures. The localization of the surgical instrument, the therapeutic radiation field or the diagnostic energy field uses a coordinate system provided by an intra-operative two-dimensional and/or three-dimensional ultrasound imaging system (and optionally pre-operative magnetic resonance, computed tomography or X-ray data). The method thereby establishes synchronized relations between data acquisition, tool movements and image displays.
PCT/NO1996/000029 1995-02-22 1996-02-08 Procede de guidage par ultrasons pour actes cliniques WO1996025881A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU48513/96A AU4851396A (en) 1995-02-22 1996-02-08 Method for ultrasound guidance during clinical procedures
US08/894,229 US6019724A (en) 1995-02-22 1996-02-08 Method for ultrasound guidance during clinical procedures

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
PCT/NO1995/000040 WO1996025882A1 (fr) 1995-02-22 1995-02-22 Procede de guidage par ultrasons utilisable dans le cadre d'examens cliniques
NOPCT/NO95/00040 1995-02-22

Publications (1)

Publication Number Publication Date
WO1996025881A1 true WO1996025881A1 (fr) 1996-08-29

Family

ID=19907785

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/NO1995/000040 WO1996025882A1 (fr) 1995-02-22 1995-02-22 Procede de guidage par ultrasons utilisable dans le cadre d'examens cliniques
PCT/NO1996/000029 WO1996025881A1 (fr) 1995-02-22 1996-02-08 Procede de guidage par ultrasons pour actes cliniques

Family Applications Before (1)

Application Number Title Priority Date Filing Date
PCT/NO1995/000040 WO1996025882A1 (fr) 1995-02-22 1995-02-22 Procede de guidage par ultrasons utilisable dans le cadre d'examens cliniques

Country Status (2)

Country Link
AU (1) AU4851396A (fr)
WO (2) WO1996025882A1 (fr)

Cited By (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0825833A1 (fr) * 1995-04-13 1998-03-04 Neovision Corporation Appareil a biopsie a guidage par image, et a capacite d'imagerie renforcee, et procedes correspondants
WO1999002100A1 (fr) 1997-07-10 1999-01-21 Mueller Wolfram Systeme de guidage par coordonnees et de positionnement de reference
WO1998046120A3 (fr) * 1997-04-16 1999-03-11 Storz Karl Gmbh & Co Systeme endoscopique
EP0930046A2 (fr) * 1997-11-26 1999-07-21 Picker International, Inc. Méthode et appareil d'imagerie
WO2000021450A1 (fr) * 1998-10-09 2000-04-20 Auer, Dorothee Dispositif pour effectuer des interventions medicales et procede pour produire une image
WO2000024317A1 (fr) * 1998-10-26 2000-05-04 Universitätsklinikum Charite Medizinische Fakultät Der Humboldt-Universität Zu Berlin Systeme pour ponctionner des vaisseaux
WO2001039683A1 (fr) 1999-12-03 2001-06-07 Sinvent As Instrument de navigation
EP1109034A1 (fr) * 1999-12-16 2001-06-20 HILTI Aktiengesellschaft Procédé et dispositif permettant l'investigation et l'identification de la nature d'une surface sous-jacente
WO2001058359A1 (fr) * 2000-02-11 2001-08-16 Zanelli Claudio I Imageur a ultrasons
EP1199996A1 (fr) * 1999-07-19 2002-05-02 Light Sciences Corporation Surveillance en temps reel de therapie photodynamique pendant une duree prolongee
WO2001034051A3 (fr) * 1999-10-28 2002-05-10 Medtronic Surgical Navigation Superposition d'informations de navigation sur une imagerie a ultrasons
WO2002036013A1 (fr) 2000-10-18 2002-05-10 Paieon Inc. Procede et systeme de positionnement d'un dispositif dans un organe tubulaire
EP1217947A1 (fr) * 1999-07-23 2002-07-03 University of Florida Guidage ultrasons de structures cibles pour interventions medicales
WO2002062224A1 (fr) * 2001-02-05 2002-08-15 Koninklijke Philips Electronics N.V. Methode d'imagerie diagnostique
US6587709B2 (en) 2001-03-28 2003-07-01 Koninklijke Philips Electronics N.V. Method of and imaging ultrasound system for determining the position of a catheter
DE10313829A1 (de) * 2003-03-21 2004-10-07 Aesculap Ag & Co. Kg Verfahren und Vorrichtung zur Auswahl eines Bildausschnittes aus einem Operationsgebiet
WO2004091418A1 (fr) * 2003-04-15 2004-10-28 Dror Nir Procede et systeme pour selectionner et enregistrer des sites de biopsie dans un organe corporel
FR2856577A1 (fr) * 2003-06-27 2004-12-31 Medicrea International Dispositif d'examen des caracteristiques d'un os
WO2005000124A3 (fr) * 2003-06-27 2005-03-17 Medicrea International Dispositif d'examen des caracteristiques d'un os
EP1717601A2 (fr) 2005-04-26 2006-11-02 Biosense Webster, Inc. Affichage d'une pointe de cathéter et la direction d'un faisceau ultrasonore pour système ultrasonographique
EP1720038A2 (fr) 2005-04-26 2006-11-08 Biosense Webster, Inc. Superposition de données ultrasonographiques avec une image pré-acquise
EP1720039A2 (fr) * 2005-04-26 2006-11-08 Biosense Webster, Inc. Représentation d'un champ ultrasonore de deux dimensions en forme d'éventail
DE102005041602A1 (de) * 2005-09-01 2007-04-05 Siemens Ag Verfahren zur Darstellung eines medizinischen Implantats in einem Bild sowie medizinisches bildgebendes System
EP1804079A3 (fr) * 2005-12-28 2007-09-12 Olympus Medical Systems Corp. Appareil de diagnostic à ultrasons
EP1858418A1 (fr) * 2005-02-28 2007-11-28 Robarts Research Institute Systeme et procede pour la realisation d'une biopsie d'un volume cible et dispositif informatique pour sa planification
EP1937153A2 (fr) * 2005-06-21 2008-07-02 Traxtal Inc. Dispositif et procédé pour appareil à ultrasons traçable
US20080298660A1 (en) * 2007-01-11 2008-12-04 Kabushiki Kaisha Toshiba 3-dimensional diagnostic imaging system
WO2009063423A1 (fr) * 2007-11-16 2009-05-22 Koninklijke Philips Electronics, N.V. Navigation interventionnelle par imagerie ultrasonore 3d avec injection de contraste
US7587074B2 (en) 2003-07-21 2009-09-08 Paieon Inc. Method and system for identifying optimal image within a series of images that depict a moving organ
US7742629B2 (en) 2003-09-25 2010-06-22 Paieon Inc. System and method for three-dimensional reconstruction of a tubular organ
AU2006201646B2 (en) * 2005-04-26 2011-01-06 Biosense Webster, Inc. Display of catheter tip with beam direction for ultrasound system
US8126241B2 (en) 2001-10-15 2012-02-28 Michael Zarkh Method and apparatus for positioning a device in a tubular organ
US8290303B2 (en) 2007-10-11 2012-10-16 General Electric Company Enhanced system and method for volume based registration
US8295577B2 (en) 2005-03-31 2012-10-23 Michael Zarkh Method and apparatus for guiding a device in a totally occluded or partly occluded tubular organ
AU2012258444B2 (en) * 2005-04-26 2014-01-09 Biosense Webster, Inc. Display of two-dimensional ultrasound fan
CN103908297A (zh) * 2012-12-31 2014-07-09 通用电气公司 用于识别来自阴影区域的数据的超声成像系统和方法
US8989842B2 (en) 2007-05-16 2015-03-24 General Electric Company System and method to register a tracking system with intracardiac echocardiography (ICE) imaging system
US9549713B2 (en) 2008-04-24 2017-01-24 Boston Scientific Scimed, Inc. Methods, systems, and devices for tissue characterization and quantification using intravascular ultrasound signals
US10456105B2 (en) 2015-05-05 2019-10-29 Boston Scientific Scimed, Inc. Systems and methods with a swellable material disposed over a transducer of an ultrasound imaging system
US20230310729A1 (en) * 2011-05-13 2023-10-05 Vascular Technology Inc. Remotely controlled suction/irrigation for surgery

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001012057A1 (fr) 1999-08-16 2001-02-22 Super Dimension Ltd. Procede et systeme de presentation d'images en coupe transversale d'un corps
WO2009147671A1 (fr) 2008-06-03 2009-12-10 Superdimension Ltd. Procédé d'alignement basé sur des caractéristiques
US8218847B2 (en) 2008-06-06 2012-07-10 Superdimension, Ltd. Hybrid registration method
US8428328B2 (en) 2010-02-01 2013-04-23 Superdimension, Ltd Region-growing algorithm
EP4151154A3 (fr) * 2010-06-30 2023-06-07 Muffin Incorporated Introduction percutanée, guidée par ultrasons de dispositifs médicaux
WO2017037705A1 (fr) * 2015-08-30 2017-03-09 M.S.T. Medical Surgery Technologies Ltd Système de commande d'outil chirurgical intelligent pour chirurgies laparoscopiques
CN113729941B (zh) * 2021-09-23 2024-01-30 上海卓昕医疗科技有限公司 基于vr的手术辅助定位系统及其控制方法
WO2023108625A1 (fr) * 2021-12-17 2023-06-22 上海卓昕医疗科技有限公司 Système de positionnement de ponction et son procédé de commande
AU2023232839A1 (en) * 2022-03-08 2024-10-17 Sononurse Vs Inc. Apparatus and method to guide the insertion of medical device into a subject
CN114419044B (zh) * 2022-03-30 2022-06-17 广东恒腾科技有限公司 一种基于人工智能的医用超声图像分析系统及方法

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4672963A (en) * 1985-06-07 1987-06-16 Israel Barken Apparatus and method for computer controlled laser surgery
US4750367A (en) * 1986-01-31 1988-06-14 U.S. Philips Corporation Device for examining moving objects by means of ultrasound echography
US4834089A (en) * 1985-02-12 1989-05-30 Koivukangas John P Adapter for brain surgery
US4869256A (en) * 1987-04-22 1989-09-26 Olympus Optical Co., Ltd. Endoscope apparatus
US5241473A (en) * 1990-10-12 1993-08-31 Ken Ishihara Ultrasonic diagnostic apparatus for displaying motion of moving portion by superposing a plurality of differential images
US5370120A (en) * 1992-12-08 1994-12-06 Siemens Aktiengesellschaft Ultrasound imaging apparatus
US5391139A (en) * 1992-09-03 1995-02-21 William Beaumont Hospital Real time radiation treatment planning system

Cited By (65)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0825833A1 (fr) * 1995-04-13 1998-03-04 Neovision Corporation Appareil a biopsie a guidage par image, et a capacite d'imagerie renforcee, et procedes correspondants
US6656110B1 (en) 1997-04-16 2003-12-02 Karl Storz Gmbh & Co. Kg Endoscopic system
WO1998046120A3 (fr) * 1997-04-16 1999-03-11 Storz Karl Gmbh & Co Systeme endoscopique
US6832985B2 (en) 1997-04-16 2004-12-21 Karl Storz Gmbh & Co. Kg Endoscopic system with instrument position and orientation display
WO1999002100A1 (fr) 1997-07-10 1999-01-21 Mueller Wolfram Systeme de guidage par coordonnees et de positionnement de reference
AT405126B (de) * 1997-07-10 1999-05-25 Graf Reinhard Koordinatenführungssystem und referenzpositioniersystem
EP0930046A2 (fr) * 1997-11-26 1999-07-21 Picker International, Inc. Méthode et appareil d'imagerie
EP0930046A3 (fr) * 1997-11-26 2000-09-20 Picker International, Inc. Méthode et appareil d'imagerie
WO2000021450A1 (fr) * 1998-10-09 2000-04-20 Auer, Dorothee Dispositif pour effectuer des interventions medicales et procede pour produire une image
WO2000024317A1 (fr) * 1998-10-26 2000-05-04 Universitätsklinikum Charite Medizinische Fakultät Der Humboldt-Universität Zu Berlin Systeme pour ponctionner des vaisseaux
EP1199996A1 (fr) * 1999-07-19 2002-05-02 Light Sciences Corporation Surveillance en temps reel de therapie photodynamique pendant une duree prolongee
EP1199996A4 (fr) * 1999-07-19 2009-01-14 Light Sciences Oncology Inc Surveillance en temps reel de therapie photodynamique pendant une duree prolongee
EP1217947A1 (fr) * 1999-07-23 2002-07-03 University of Florida Guidage ultrasons de structures cibles pour interventions medicales
EP1217947A4 (fr) * 1999-07-23 2005-01-19 Univ Florida Guidage ultrasons de structures cibles pour interventions medicales
WO2001034051A3 (fr) * 1999-10-28 2002-05-10 Medtronic Surgical Navigation Superposition d'informations de navigation sur une imagerie a ultrasons
WO2001039683A1 (fr) 1999-12-03 2001-06-07 Sinvent As Instrument de navigation
EP1109034A1 (fr) * 1999-12-16 2001-06-20 HILTI Aktiengesellschaft Procédé et dispositif permettant l'investigation et l'identification de la nature d'une surface sous-jacente
CN100350130C (zh) * 1999-12-16 2007-11-21 希尔蒂股份公司 研究和鉴定基础的类型用的方法和装置
JP2001228125A (ja) * 1999-12-16 2001-08-24 Hilti Ag 基盤の調査・同定方法及び装置
US6640205B2 (en) 1999-12-16 2003-10-28 Hilti Aktiengesellschaft Method and device for investigating and identifying the nature of a material
WO2001058359A1 (fr) * 2000-02-11 2001-08-16 Zanelli Claudio I Imageur a ultrasons
WO2002036013A1 (fr) 2000-10-18 2002-05-10 Paieon Inc. Procede et systeme de positionnement d'un dispositif dans un organe tubulaire
US7778685B2 (en) 2000-10-18 2010-08-17 Paieon Inc. Method and system for positioning a device in a tubular organ
US6654444B2 (en) 2001-02-05 2003-11-25 Koninklijke Philips Electronics N.V. Diagnostic imaging method
WO2002062224A1 (fr) * 2001-02-05 2002-08-15 Koninklijke Philips Electronics N.V. Methode d'imagerie diagnostique
US6587709B2 (en) 2001-03-28 2003-07-01 Koninklijke Philips Electronics N.V. Method of and imaging ultrasound system for determining the position of a catheter
US8126241B2 (en) 2001-10-15 2012-02-28 Michael Zarkh Method and apparatus for positioning a device in a tubular organ
DE10313829A1 (de) * 2003-03-21 2004-10-07 Aesculap Ag & Co. Kg Verfahren und Vorrichtung zur Auswahl eines Bildausschnittes aus einem Operationsgebiet
DE10313829B4 (de) * 2003-03-21 2005-06-09 Aesculap Ag & Co. Kg Verfahren und Vorrichtung zur Auswahl eines Bildausschnittes aus einem Operationsgebiet
WO2004091418A1 (fr) * 2003-04-15 2004-10-28 Dror Nir Procede et systeme pour selectionner et enregistrer des sites de biopsie dans un organe corporel
WO2005000124A3 (fr) * 2003-06-27 2005-03-17 Medicrea International Dispositif d'examen des caracteristiques d'un os
FR2856577A1 (fr) * 2003-06-27 2004-12-31 Medicrea International Dispositif d'examen des caracteristiques d'un os
US7587074B2 (en) 2003-07-21 2009-09-08 Paieon Inc. Method and system for identifying optimal image within a series of images that depict a moving organ
US7742629B2 (en) 2003-09-25 2010-06-22 Paieon Inc. System and method for three-dimensional reconstruction of a tubular organ
US8788019B2 (en) 2005-02-28 2014-07-22 Robarts Research Institute System and method for performing a biopsy of a target volume and a computing device for planning the same
EP1858418A4 (fr) * 2005-02-28 2009-12-30 Robarts Res Inst Systeme et procede pour la realisation d'une biopsie d'un volume cible et dispositif informatique pour sa planification
EP1858418A1 (fr) * 2005-02-28 2007-11-28 Robarts Research Institute Systeme et procede pour la realisation d'une biopsie d'un volume cible et dispositif informatique pour sa planification
US8295577B2 (en) 2005-03-31 2012-10-23 Michael Zarkh Method and apparatus for guiding a device in a totally occluded or partly occluded tubular organ
US8870779B2 (en) 2005-04-26 2014-10-28 Biosense Webster, Inc. Display of two-dimensional ultrasound fan
AU2012258444B2 (en) * 2005-04-26 2014-01-09 Biosense Webster, Inc. Display of two-dimensional ultrasound fan
EP1717601A3 (fr) * 2005-04-26 2008-02-27 Biosense Webster, Inc. Affichage d'une pointe de cathéter et la direction d'un faisceau ultrasonore pour système ultrasonographique
EP1720038A3 (fr) * 2005-04-26 2008-02-27 Biosense Webster, Inc. Superposition de données ultrasonographiques avec une image pré-acquise
EP1720039A3 (fr) * 2005-04-26 2008-02-20 Biosense Webster, Inc. Représentation d'un champ ultrasonore de deux dimensions en forme d'éventail
US7604601B2 (en) 2005-04-26 2009-10-20 Biosense Webster, Inc. Display of catheter tip with beam direction for ultrasound system
AU2006201451B2 (en) * 2005-04-26 2012-09-20 Biosense Webster, Inc. Registration of ultrasound data with pre-acquired image
US10143398B2 (en) 2005-04-26 2018-12-04 Biosense Webster, Inc. Registration of ultrasound data with pre-acquired image
EP1720039A2 (fr) * 2005-04-26 2006-11-08 Biosense Webster, Inc. Représentation d'un champ ultrasonore de deux dimensions en forme d'éventail
EP1720038A2 (fr) 2005-04-26 2006-11-08 Biosense Webster, Inc. Superposition de données ultrasonographiques avec une image pré-acquise
AU2006201646B2 (en) * 2005-04-26 2011-01-06 Biosense Webster, Inc. Display of catheter tip with beam direction for ultrasound system
EP1717601A2 (fr) 2005-04-26 2006-11-02 Biosense Webster, Inc. Affichage d'une pointe de cathéter et la direction d'un faisceau ultrasonore pour système ultrasonographique
EP1937153A4 (fr) * 2005-06-21 2010-02-10 Traxtal Inc Dispositif et procédé pour appareil à ultrasons traçable
EP1937153A2 (fr) * 2005-06-21 2008-07-02 Traxtal Inc. Dispositif et procédé pour appareil à ultrasons traçable
US8498692B2 (en) 2005-09-01 2013-07-30 Siemens Aktiengesellschaft Method for displaying a medical implant in an image and a medical imaging system
DE102005041602A1 (de) * 2005-09-01 2007-04-05 Siemens Ag Verfahren zur Darstellung eines medizinischen Implantats in einem Bild sowie medizinisches bildgebendes System
EP1804079A3 (fr) * 2005-12-28 2007-09-12 Olympus Medical Systems Corp. Appareil de diagnostic à ultrasons
US8340374B2 (en) * 2007-01-11 2012-12-25 Kabushiki Kaisha Toshiba 3-dimensional diagnostic imaging system
US20080298660A1 (en) * 2007-01-11 2008-12-04 Kabushiki Kaisha Toshiba 3-dimensional diagnostic imaging system
US8989842B2 (en) 2007-05-16 2015-03-24 General Electric Company System and method to register a tracking system with intracardiac echocardiography (ICE) imaging system
US8290303B2 (en) 2007-10-11 2012-10-16 General Electric Company Enhanced system and method for volume based registration
WO2009063423A1 (fr) * 2007-11-16 2009-05-22 Koninklijke Philips Electronics, N.V. Navigation interventionnelle par imagerie ultrasonore 3d avec injection de contraste
US9651662B2 (en) 2007-11-16 2017-05-16 Koninklijke Philips N.V. Interventional navigation using 3D contrast-enhanced ultrasound
US9549713B2 (en) 2008-04-24 2017-01-24 Boston Scientific Scimed, Inc. Methods, systems, and devices for tissue characterization and quantification using intravascular ultrasound signals
US20230310729A1 (en) * 2011-05-13 2023-10-05 Vascular Technology Inc. Remotely controlled suction/irrigation for surgery
CN103908297A (zh) * 2012-12-31 2014-07-09 通用电气公司 用于识别来自阴影区域的数据的超声成像系统和方法
US10456105B2 (en) 2015-05-05 2019-10-29 Boston Scientific Scimed, Inc. Systems and methods with a swellable material disposed over a transducer of an ultrasound imaging system

Also Published As

Publication number Publication date
WO1996025882A1 (fr) 1996-08-29
AU4851396A (en) 1996-09-11

Similar Documents

Publication Publication Date Title
US6019724A (en) Method for ultrasound guidance during clinical procedures
WO1996025881A1 (fr) Procede de guidage par ultrasons pour actes cliniques
US11464575B2 (en) Systems, methods, apparatuses, and computer-readable media for image guided surgery
US11800970B2 (en) Computerized tomography (CT) image correction using position and direction (P and D) tracking assisted optical visualization
CN107072736B (zh) 计算机断层扫描增强的荧光透视系统、装置及其使用方法
US11612377B2 (en) Image guided surgical methodology and system employing patient movement detection and correction
US6591130B2 (en) Method of image-enhanced endoscopy at a patient site
US20080188749A1 (en) Three Dimensional Imaging for Guiding Interventional Medical Devices in a Body Volume
US20070225553A1 (en) Systems and Methods for Intraoperative Targeting
US20080243142A1 (en) Videotactic and audiotactic assisted surgical methods and procedures
US20080234570A1 (en) System For Guiding a Medical Instrument in a Patient Body
CA3029348C (fr) Procede et systeme d'imagerie medicale peroperatoire

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BB BG BR BY CA CH CN CZ DE DK EE ES FI GB GE HU IS JP KE KG KP KR KZ LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TR TT UA UG US UZ VN AZ BY KG KZ RU TJ TM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): KE LS MW SD SZ UG AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN ML MR NE

CFP Corrected version of a pamphlet front page
CR1 Correction of entry in section i

Free format text: PAT.BUL.39/96 UNDER INID (51) "IPC" REPLACE THE EXISTING SYMBOLS BY "A61B 8/08, G06T 7/00"

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 08894229

Country of ref document: US

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase