NL1034672C2 - Medical navigation system with tool and/or implant integration in fluoroscopic image projections and method of use - Google Patents

Medical navigation system with tool and/or implant integration in fluoroscopic image projections and method of use.

Info

Publication number
NL1034672C2
NL1034672C2
Authority
NL
Netherlands
Prior art keywords
tool
implant
image
surface boundary
representation
Prior art date
Application number
NL1034672A
Other languages
Dutch (nl)
Other versions
NL1034672A1 (en)
Inventor
Willie Williamson Jr
Original Assignee
Gen Electric
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US56116206 priority Critical
Priority to US11/561,162 priority patent/US7831096B2/en
Application filed by Gen Electric filed Critical Gen Electric
Publication of NL1034672A1 publication Critical patent/NL1034672A1/en
Application granted granted Critical
Publication of NL1034672C2 publication Critical patent/NL1034672C2/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101 Computer-aided simulation of surgical operations
    • A61B2034/102 Modelling of surgical devices, implants or prosthesis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2051 Electromagnetic tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365 Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25 User interfaces for surgical systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00 Indexing scheme for image generation or computer graphics
    • G06T2210/62 Semi-transparency

Description

Brief indication: Medical navigation system with tool and/or implant integration in fluoroscopic image projections and method of use.

The invention generally relates to image-guided surgery (or surgical navigation). In particular, the invention relates to a medical navigation system with tool and/or implant integration in fluoroscopic image projections.

Medical practitioners, such as doctors, surgeons, and other healthcare professionals, often rely on technology when performing a medical procedure, such as an image-guided operation or examination. A tracking system can provide positioning information for a medical instrument relative to, for example, the patient or a reference coordinate system. A physician may consult the tracking system to determine the position of the medical instrument when the instrument is not within the physician's field of vision. A tracking system can also be helpful when planning an operation.

The tracking or navigation system allows the physician to view the patient's anatomy and to follow the position and orientation of the instrument. The physician can use the tracking system to determine when the instrument is positioned at a desired location. The physician can locate and work in a desired or injured area while avoiding other structures. Increased accuracy in placing medical instruments in a patient can provide a less invasive medical procedure by promoting improved control of smaller instruments that have less impact on the patient. Improved control and accuracy with smaller, more refined instruments can also reduce the risks associated with more invasive procedures, such as open surgery.

Medical navigation systems track the precise location of surgical instruments in relation to multidimensional images of a patient's anatomy. Medical navigation systems also use visualization tools to provide the surgeon with views of these surgical instruments aligned with the patient's anatomy. This functionality is typically provided by incorporating components of the medical navigation system on a wheeled cart (or carts) that can be moved around the operating room.

Tracking systems can be, for example, ultrasound, inertial position, or electromagnetic tracking systems. Electromagnetic tracking systems can use coils as receivers and transmitters. Electromagnetic tracking systems can be arranged in sets of three transmitter coils and three receiver coils, such as an industry-standard coil architecture (ISCA) configuration. Electromagnetic tracking systems can also be arranged with a single transmitter coil used with an array of receiver coils, or with an array of transmitter coils and a single receiver coil. Magnetic fields generated by the transmitter coil(s) can be detected by the receiver coil(s). From the obtained parameter measurements, position and orientation information can be determined for the transmitter and/or receiver coil(s).
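By way of a hedged illustration only (this is not the algorithm of the navigation system described herein), the sketch below shows how a receiver position could in principle be estimated from field measurements by fitting a point-dipole model with a generic least-squares solver; the coil moments, units, starting guess, and noise-free simulated measurements are all assumptions for the example.

import numpy as np
from scipy.optimize import least_squares

MU0_4PI = 1e-7  # mu0 / (4*pi) in T*m/A

def dipole_field(moment, r):
    # Magnetic field of a point dipole with moment `moment` at displacement r.
    d = np.linalg.norm(r)
    r_hat = r / d
    return MU0_4PI * (3.0 * np.dot(moment, r_hat) * r_hat - moment) / d**3

# Three orthogonal transmitter coils at the origin (assumed moments, A*m^2).
moments = np.eye(3) * 0.5

def residuals(receiver_pos, measured):
    # `measured` holds one sensed field vector per transmitter coil (3x3 array).
    predicted = np.array([dipole_field(m, receiver_pos) for m in moments])
    return (predicted - measured).ravel()

# Simulate a measurement at a known position, then recover that position.
# (The dipole field is symmetric in r and -r, so the starting guess selects the hemisphere.)
true_pos = np.array([0.10, 0.05, 0.20])  # metres
measured = np.array([dipole_field(m, true_pos) for m in moments])
fit = least_squares(residuals, x0=np.array([0.05, 0.05, 0.05]), args=(measured,))
print("estimated receiver position:", fit.x)

In practice an ISCA-style system also recovers orientation and uses calibrated coil models rather than ideal dipoles; the sketch only conveys the idea of fitting a forward field model to the digitized measurements.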

In medical and surgical imaging, such as internal or external imaging, images of an area of a patient's body are formed. The images are used to assist with an ongoing procedure, in which a surgical tool or instrument is applied to the patient and is monitored in relation to a reference coordinate system formed from the images. Image-guided surgery has particular application in surgical procedures such as brain surgery and arthroscopic procedures on the knee, wrist, shoulder or spine, as well as certain types of angiography, cardiac procedures, interventional radiology and biopsies, in which X-ray images can be taken to display a tool or instrument involved in the procedure, to correct its position, or otherwise to navigate.

Various areas of surgery require highly accurate planning and control for placing an elongated probe or other object in tissue or bone that is internal or otherwise difficult to view directly. In particular, for brain surgery, stereotactic frames that define an entry point, probe angle, and probe depth are used to access a location in the brain, generally in conjunction with previously compiled three-dimensional diagnostic images, such as magnetic resonance imaging (MRI), positron emission tomography (PET) or computed tomography (CT) scan images, which provide accurate tissue images. Such systems have also been found useful for placement of pedicle screws in the spine, where visual and fluoroscopic imaging directions cannot provide the axial view needed to center the profile of an insertion path in bone.

When used with existing CT, PET or MRI image sets, previously recorded diagnostic image sets define a three-dimensional (3D) rectangular coordinate system, either as a result of their precise scan formation or by virtue of the spatial mathematics of their reconstruction algorithms. However, it may be desirable to correlate the available fluoroscopic views and anatomical features visible from the surface or in fluoroscopic images with features in the 3D diagnostic images and with the external coordinates of the tools being used. Correlation is often performed by providing implanted fiducials and/or by adding externally visible or trackable markers that can be imaged. Using a keyboard, mouse or other pointer, the fiducials can be identified in the different images, so that common sets of coordinate registration points can be identified across the images. The common sets of coordinate registration points can also be tracked in an automated manner by an external coordinate measuring device, such as a suitably programmed, off-the-shelf optical tracking assembly. Instead of imageable fiducials, which can be imaged, for example, in both fluoroscopic and MRI or CT images, such systems can to a large extent work with simple optical tracking of the surgical tool, and such systems can apply an initialization protocol in which a surgeon touches or designates a number of bony features or other recognizable anatomical features in order to define external coordinates in relation to the patient's anatomy and to initialize software that tracks those anatomical features.

Generally, image-guided surgery systems operate with an image display positioned in the surgeon's field of view, displaying a number of panels, such as a selected MRI image and various X-ray or fluoroscopic views taken at different angles. Three-dimensional diagnostic images typically have a spatial resolution that is both rectangular and accurate to within a very small tolerance, such as one mm or less. In contrast, fluoroscopic views can be distorted. The fluoroscopic views are shadowgraphic in that they represent the density of the tissue through which the conical X-ray beam has passed. In tool navigation systems, the display visible to the surgeon may show an image of a surgical tool, biopsy instrument, pedicle screw, probe, or other device projected onto a fluoroscopic image, so that the surgeon can view the orientation of the surgical instrument in relation to the imaged patient anatomy. A correspondingly reconstructed CT or MRI image, which can correspond to the tracked coordinates of the probe tip, can also be displayed.
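As a rough sketch of the projection step only, a tracked 3D tool-tip position can be drawn onto a 2D fluoroscopic view with a pinhole camera model once the imaging chain has been calibrated; the intrinsic matrix and the tracker-to-camera transform below are illustrative placeholders, not calibration data from this system.

import numpy as np

K = np.array([[1000.0, 0.0, 512.0],
              [0.0, 1000.0, 512.0],
              [0.0, 0.0, 1.0]])          # assumed focal lengths and principal point, in pixels
T_cam_from_tracker = np.eye(4)           # assumed tracker-to-imaging-chain transform

def project_tool_tip(tip_tracker_xyz):
    # Project a tracked 3D tool-tip position onto the 2D fluoroscopic image plane.
    p = np.append(tip_tracker_xyz, 1.0)
    p_cam = T_cam_from_tracker @ p       # express the tip in the X-ray source/camera frame
    uvw = K @ p_cam[:3]
    return uvw[:2] / uvw[2]              # pixel coordinates (u, v)

print(project_tool_tip(np.array([0.01, -0.02, 0.50])))

A real fluoroscope also requires distortion correction, as discussed below, so this linear model is only a starting point.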

Of the systems proposed for implementing such displays, many rely on closely tracking the position and orientation of the surgical instrument in external coordinates. The various sets of coordinates may be defined by mechanical robot linkages and encoders, or more commonly, by a fixed patient support, two or more receivers, such as video cameras, which may be attached to the support, and a number of transmitters or signaling elements attached to the surgical instrument, allowing the position and orientation of the tool with respect to the patient support and the camera frame to be determined automatically by trigonometric calculation, so that the various transformations between the respective coordinate systems can be calculated. Three-dimensional tracking systems employing two video cameras and a number of transmitters or other position signaling elements have long been commercially available and are easily adapted to such operating room systems. Such systems may also determine external position coordinates using commercially available acoustic interrogation systems, in which three or more acoustic transmitters are activated and their sound is detected by multiple receivers to determine their relative distances to the detecting assemblies, and thus define, by simple trigonometry, the position and orientation of the frames or supports on which the transmitters are mounted. When tracked fiducials appear in the diagnostic images, it is possible to define a transformation between operating room coordinates and the coordinates of the image.

More recently, a number of systems have been proposed in which the accuracy of the 3D diagnostic image data sets is used to improve the accuracy of operating room images, by matching these 3D images to patterns appearing in intra-operative fluoroscopic images. These systems may track bone and match edge profiles, morphologically warp one image onto another to determine a coordinate transformation, or use a different correction process. The procedure of correlating the lower-quality, non-planar fluoroscopic images with planes in the 3D image data sets can be time-consuming. In techniques that use fiducials or added markers, a surgeon may follow a lengthy initialization protocol or a slow and computationally intensive procedure to identify and correlate markers between the different sets of images. These factors have all adversely affected the speed and applicability of intra-operative image-guided systems or navigation systems.

Correlation of patient anatomy or intra-operative fluoroscopic images with previously compiled 3D diagnostic image data sets can also be complicated by intervening movement of the imaged structures, in particular soft-tissue structures, between the time of original imaging and the intra-operative procedure. Transformations between three or more coordinate systems, for two sets of images and the physical coordinates in the operating room, can thus involve a large number of registration points to provide an effective correlation. To track the spine in order to position pedicle screws, the tracking assembly may have to be initialized at ten or more points on a single vertebra to achieve appropriate accuracy. In cases where a growing tumor or evolving condition changes tissue size or position between imaging sessions, further confounding factors may arise.


When the purpose of image-guided tracking is to define an operation on a rigid or bony structure near the surface, as is the case with the placement of pedicle screws in the spine, the alignment can alternatively be accomplished without ongoing reference to patient images, using a computer modeling procedure in which a tool tip is touched to and initialized at each of several bony features to determine their coordinates and displacement, after which motion of the spine is modeled as a whole by optically initializing the alignment and subsequently tracking the tool in relation to the position of these features, while a virtual representation of the spine is mechanically modeled with a tracking element or frame attached to the spine. Such a procedure eliminates the time-consuming and computationally intensive correlation of different image sets from different sources and, by substituting optical tracking of points, can eliminate or reduce the number of X-ray exposures needed to determine the tool position in relation to the patient anatomy with an acceptable degree of accuracy.

However, the foregoing approaches, which correlate high-quality image data sets with more distorted shadowgraphic projection images and use tracking data to display a tool position, or which capture a finite set of points on a dynamic anatomical model onto which extrinsically detected tool coordinates are superimposed, result in a process in which machine calculations produce a synthetic image or select an existing plane from the diagnostic data base to guide the surgeon with regard to the current tool position. Although various templates and individual sub-assemblies have been devised to make each individual coordinate detection or image handling system easier or reasonably reliable to use, the field remains unnecessarily complex. Not only do such systems often require correlation of various image sets and extensive point-by-point initialization of the operation, tracking and image-space coordinates or characteristics, but they are also subject to constraints due to the proprietary limitations of various equipment manufacturers, the physical limitations imposed by tracking systems, and the complex programming task of linking to many different image sources in addition to determining their scale, orientation, and relationship to other images and coordinates of the system.

Various proposals have been made in which fluoroscopic images are corrected to improve their accuracy. This is a complex undertaking, since the nature of the fluoroscopic 3D-to-2D projection results in the loss of a large portion of the information in each shot, so that the reverse conversion is strongly under-determined. Changes in imaging parameters due to camera and source position and orientation that occur with each shot further complicate the problem. This area has been addressed to some extent by one manufacturer, which has provided a stiffer, isocentric C-arm structure. The added positional accuracy of that imaging system offers the prospect that, by taking a large series of fluoroscopic shots of an immobilized patient who remains still, one may be able to undertake some form of planar image reconstruction. However, this appears to be very costly computationally, and the current state of the art suggests that, although it may be possible to produce corrected fluoroscopic image data sets with somewhat less expensive equipment than that used for conventional CT imaging, intra-operative fluoroscopic image guidance will continue to involve access to MRI, PET or CT data sets and to rely on extensive surgical input and set-up for tracking systems that make it possible to perform position or image correlations.

Thus, it remains highly desirable to use simple, low-dose, and low-cost fluoroscopic imaging for surgical guidance, but also to achieve improved accuracy for critical tool positioning.

Alignment is a process of correlating two coordinate systems, such as a patient image coordinate system and an electromagnetic tracking coordinate system.

Various methods can be used to align coordinates in imaging applications. "Known" or predefined objects are located in an image. A known object contains a sensor used by a tracking system. Once the sensor is located in the image, the sensor allows alignment of the two coordinate systems.
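One common way to align two such coordinate systems, given three or more corresponding points (for example, marker or sensor locations known in both the image and the tracker frame), is rigid point-based registration; the SVD-based sketch below is a generic illustration with made-up fiducial coordinates, not the specific alignment procedure of any system discussed here.

import numpy as np

def register_rigid(tracker_pts, image_pts):
    # Return rotation R and translation t such that R @ p_tracker + t ~= p_image.
    ct, ci = tracker_pts.mean(axis=0), image_pts.mean(axis=0)
    H = (tracker_pts - ct).T @ (image_pts - ci)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against a reflection
    R = Vt.T @ D @ U.T
    t = ci - R @ ct
    return R, t

# Four fiducials expressed in both coordinate systems (synthetic example data).
trk = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
true_R = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
img = trk @ true_R.T + np.array([5.0, 2.0, 0.0])
R, t = register_rigid(trk, img)
print(np.allclose(R, true_R), t)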

U.S. Patent No. 5,829,444 to Ferre et al., issued November 3, 1998, refers to a method of tracking and alignment using, for example, a headset. A patient wears a headset containing radiopaque markers when scan images are recorded. Based on a predefined structure of the reference unit, parts of the reference unit can then be located automatically in the scanned images, thereby identifying an orientation of the reference unit with respect to the scanned images. A field generator can be connected to the reference unit to generate a position-characteristic field in an area. Once a relative position of the field generator with respect to the reference unit has been determined, the alignment unit can generate an appropriate mapping function.

Subsequent surfaces can then be located with respect to the stored images.


However, alignment using a reference unit placed on the patient and remote from the fluoroscope camera introduces inaccuracies in coordinate alignment due to the distance between the reference unit and the fluoroscope. Moreover, the reference unit placed on the patient is typically small, or else the unit may interfere with image scanning. A smaller reference unit can produce less accurate position measurements and thus affect the alignment.

Typically, a reference frame used by a navigation system is aligned with an anatomy prior to surgical navigation. Alignment of the reference frame affects the accuracy of a navigated tool in relation to a displayed fluoroscopic image.

During a procedure, a surgeon operating on the spine must maintain an accurate perception of complex 3D anatomical relationships. Fluoroscopy is conventionally used intra-operatively to facilitate visualization of an anatomy (e.g., a pedicle) and placement of tools or implants (e.g., a guidewire or a pedicle screw). Although fluoroscopy is useful, it provides only 2D projections of a complex 3D structure. Furthermore, fluoroscopy can only be performed along axes around the transverse plane, with anteroposterior (AP) and mediolateral (ML) views being the most common. The surgeon therefore cognitively derives surgical placement along the superior/inferior axis (i.e., an axial view) based on interpretation of cues in the images and knowledge of the anatomy. This type of guidance can lead to varying degrees of inaccuracy when placing, for example, pedicle screws in the spine.

Computed tomography (CT) imaging provides patient-specific volumetric 3D images. This series of images can be re-rendered from practically any viewing direction and is conventionally presented as a series of axial cross-sections. It is usually used pre-operatively to diagnose a condition and to plan a surgical strategy.
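The re-rendering from an arbitrary viewing direction mentioned above amounts to sampling the CT volume on an oblique plane; a minimal sketch with a synthetic volume (not patient data) and trilinear interpolation is given below.

import numpy as np
from scipy.ndimage import map_coordinates

volume = np.random.rand(128, 128, 128)   # stand-in CT volume indexed (z, y, x)

def extract_slice(center, u_dir, v_dir, size=64, spacing=1.0):
    # Sample a size-by-size plane through `center`, spanned by unit vectors u_dir and v_dir.
    s = (np.arange(size) - size / 2) * spacing
    uu, vv = np.meshgrid(s, s, indexing="ij")
    pts = (center[:, None, None]
           + u_dir[:, None, None] * uu
           + v_dir[:, None, None] * vv)   # 3 x size x size array of voxel coordinates
    return map_coordinates(volume, pts, order=1, mode="nearest")

# An axial cross-section through the middle of the volume; any oblique plane works the same way.
axial = extract_slice(np.array([64.0, 64.0, 64.0]),
                      np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0]))
print(axial.shape)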

Image-guided navigation is in clinical use for spinal surgery, among other applications. Image-guided applications typically use 2D fluoroscopic images or 3D CT data sets. 3D-based systems require explicit alignment of the data set with the patient, usually accomplished by manual digitization (e.g., point collection) of the patient's anatomy. 2D-based systems are easier to use, since images are intrinsically aligned by tracking the imaging device (e.g., a fluoroscope) relative to the patient. A hybrid 2D/3D navigation system, which combines the ease of use and real-time updates of a 2D system with a simply aligned 3D CT data set, would therefore be highly desirable.


Fluoroscopic images are taken during a navigation procedure in which a tool is navigated. In contrast to CT/MRI images (which are thin slices of the anatomy), fluoroscopic images are projections through a volume of the anatomy. When the navigated tool is drawn, it appears to be "floating" on top of the image, even when the tool is inside or behind the structure (see FIG. 1). A floating representation is not an accurate depiction of the tool location and requires the surgeon to constantly correct the representation presented to him or her.

There is therefore a need for systems and methods for integrating an instrument / tool into a fluoroscopic image.

Certain embodiments of the invention provide systems and methods for representing a tool or an implant in an image.

Certain embodiments provide a method for displaying a tool or implant with respect to an image. The method includes determining a surface boundary for an area of interest shown in an image; determining a position of a tool or implant with respect to the surface boundary; and displaying a representation of the tool or implant on the image. A part of the tool or implant within the surface boundary is shown in the representation with a degree of transparency compared to a part of the tool or implant outside the surface boundary.

Certain embodiments provide a user interface system for displaying a representation of a tool or implant with respect to an image. The system includes a processor adapted to determine a surface boundary for an area of interest shown in the image and to determine a position of the tool or implant with respect to the surface boundary, and a display arranged to dynamically display the image and the representation for a user. The processor generates a representation of the tool or implant based on the position with respect to the surface boundary. A portion of the tool or implant within the surface boundary is shown in the representation with a degree of transparency compared to a portion of the tool or implant outside the surface boundary.

Certain embodiments provide a computer-readable medium that has a series of instructions for execution on a computer. The set of instructions includes a bounding routine for determining a surface boundary for an area of interest shown in an image based on tracking information; an implant representation routine for generating a representation of a tool or implant with respect to the surface boundary based on tracking information; and a display routine for displaying the representation of the tool or implant on the image. A part of the tool or implant within the surface boundary is shown in the representation with a degree of transparency compared to a part of the tool or implant outside the surface boundary.

FIG. 1 shows a known representation of a three-dimensional implant on top of a two-dimensional fluoroscopic image.

FIG. 2 shows a medical navigation system used according to an embodiment of the invention.

FIG. 3 shows a medical navigation system used in accordance with an embodiment of the invention.

FIG. 4 shows a medical navigation system used in accordance with an embodiment of the invention.

FIG. 5 shows an example of a three-dimensional implant, seen in conjunction with a fluoroscopic image according to an embodiment of the invention.

FIG. 6 shows a simplified representation of a tool that shows different shading based on the distance below a boundary surface according to an embodiment of the invention.

FIG. 7 shows a flow chart for a method of implant / tool representation in an image used according to an embodiment of the invention.

FIG. 8 shows an example of an imaging and tracking system used in accordance with an embodiment of the invention.

The foregoing summary as well as the following detailed description of certain embodiments of the invention may be better understood when they are read in conjunction with the accompanying drawings. For illustrative purposes of the invention, certain embodiments are shown in the drawings. It will be understood, however, that the invention is not limited to the devices and instruments shown in the accompanying drawings.

Reference is now made to FIG. 2, in which a medical navigation system (e.g., a surgical navigation system), generally designated by reference numeral 10, is shown, which system comprises a portable computer 12, a display 14 and a navigation link 16. The medical navigation system 10 is adapted to cooperate with an electromagnetic field generator 20 and an electromagnetic sensor 22 to determine the location of a device 24. Although the system 10 and/or other navigation or tracking systems can be used in conjunction with a variety of tracking techniques, including electromagnetic, optical, ultrasound, inertial position and/or other tracking systems, the system 10 is described below with regard to electromagnetic tracking for illustration purposes.

A table 30 is positioned near the electromagnetic sensor 22 to support a patient 40 during a surgical procedure. A cable 50 is provided for transferring data between the electromagnetic sensor 22 and the medical navigation system 10. In the embodiment shown in FIG. 2, the medical navigation system 10 is mounted on a mobile cart 60 with a second display 18.

The electromagnetic sensor 22 may, for example, be a printed circuit board. Certain embodiments may include an electromagnetic sensor 22 that includes a printed circuit board receiver array 26 with a plurality of coils and coil pairs and electronics for digitizing magnetic field measurements detected in the printed circuit board receiver array 26. The magnetic field measurements can be used to calculate the position and orientation of the electromagnetic field generator 20 according to any suitable method or system. After the magnetic field measurements have been digitized using electronics in the electromagnetic sensor 22, the digitized signals are sent via cable 50 to the navigation link 16. As will be explained in detail below, the medical navigation system 10 is arranged to calculate a location of the device 24 based on the digitized signals received.

The medical navigation system 10 described herein is capable of tracking many different types of devices during different procedures. Depending on the procedure, the device 24 may be a surgical instrument (e.g., an imaging catheter, a diagnostic catheter, a therapeutic catheter, a guidewire, a debridement device, a suction device, a guide, etc.), a surgical implant (e.g., an artificial disc, a bone screw, a bypass, a pedicle screw, a plate, an intramedullary rod, etc.) or any other device. Depending on the context in which the medical navigation system 10 is used, any number of suitable devices may be used.

Referring to FIG. 3, an example of a block diagram of the medical navigation system 100 is provided. The medical navigation system 100 is shown conceptually as a collection of modules, but can be implemented using any combination of dedicated hardware boards, digital signal processors, field-programmable gate arrays, and processors. Alternatively, the modules can be implemented using one or more off-the-shelf processors, with the functional operations distributed between the processors. As an example, it may be desirable to have a dedicated processor for position and orientation calculations as well as a dedicated processor for visualization operations. As a further option, the modules can be implemented using a hybrid configuration, in which certain modular functions are performed using dedicated hardware, while the remaining modular functions are performed using an off-the-shelf computer. The operation of the modules can be controlled by a system control unit 210.

The navigation link 160 receives digitized signals from an electromagnetic sensor 222. In the embodiment shown in FIG. 3, the navigation link 160 includes an Ethernet port. This port can, for example, be provided by an Ethernet network interface card or adapter. However, according to various alternative embodiments, the digitized signals may be sent from the electromagnetic sensor 222 to the navigation link 160 using alternative wired or wireless communication protocols and links.

The digitized signals received by the navigation link 160 represent magnetic field information detected by the electromagnetic sensor 222. In the embodiment shown in FIG. 3, the navigation link 160 sends the digitized signals via a local link 215 to the tracking module 250. The tracking module 250 calculates position and orientation information based on the received digitized signals. This position and orientation information provides a location of a device.

The tracking module 250 communicates the position and orientation information via the local link 215 to the navigation module 260. As an example, this local link 215 is a Peripheral Component Interconnect (PCI) bus. However, according to various alternative embodiments, equivalent bus technologies can be substituted without departing from the scope of the invention.

After receiving the position and orientation information, the navigation module 260 is used to align the location of the device with the acquired patient data. In the embodiment shown in FIG. 3, the acquired patient data is stored on a disk 245. The acquired patient data may include computed tomography data, magnetic resonance data, positron emission tomography data, ultrasound data, X-ray data or other suitable data, as well as combinations thereof. For example, the disk 245 is a hard disk drive, but other suitable storage devices and/or memories may be used.

The acquired patient data is loaded from the disk 245 into memory 220.

The navigation module 260 reads the acquired patient data from memory 220. The navigation module 260 aligns the location of the device with the acquired patient data and generates image data suitable for visualizing the patient image data and a representation of the device. In the embodiment shown in Fig. 3, the image data is sent via a local link 215 to a display control unit 230. The display controller 230 is used to output the image data to two displays 214 and 218.


Although two displays 214 and 218 are shown in the embodiment of FIG. 3, alternative embodiments may include different display configurations. Different display configurations can be used to improve operating room ergonomics, to display different views, or to display information to staff at different locations. For example, as shown in FIG. 2, a first display 14 may be included in the medical navigation system 10 and a second display 18, larger than the first display 14, may be mounted on a mobile cart 60. Alternatively, one or more of the displays 214 and 218 may be mounted on a surgical swing arm. The surgical swing arm can be ceiling-mounted, attachable to a surgical table, or mounted on a mobile cart.

Reference is now made to FIG. 4, which shows an alternative embodiment of a medical navigation system 300. The medical navigation system 300 includes a portable computer with a relatively small footprint (e.g., about 1,000 cm²) and an integrated display 382. According to various alternative embodiments, any suitable smaller or larger footprint can be used.

The navigation link 370 receives digitized signals from an electromagnetic sensor 372. In the embodiment shown in FIG. 4, the navigation link 370 sends the digitized signals via a local link 315 to a follower link 350. In addition to the follower link 350, the follower module 356 includes a processor 352 and a memory 354 for calculating position and orientation information based on the received digitized signals.

The follower link 350 communicates the calculated position and orientation information via the local link 315 to the visualization link 360. In addition to the visualization link 360, the navigation module 366 includes a processor 362 and a memory 364 for aligning the location of the device with the acquired patient data stored on a disk 392 and for generating image data suitable for visualizing the patient image data and a representation of the device.

The visualization link 360 sends the image data via the local link 315 to a display control unit 380. The display control unit 380 is used to output the image data to the display 382.

The medical navigation system 300 also includes a processor 342, a system controller 344, and a memory 346, which are used for additional computing applications, such as scheduling, updating patient data, or other suitable applications. The performance of the medical navigation system 300 is improved by using a processor 342 for general computing applications, a processor 352 for position and orientation calculations, and a processor 362 dedicated to visualization operations. Notwithstanding the description of the embodiment of FIG. 4, alternative system architectures can be substituted without departing from the scope of the invention.

As will be further described below, certain embodiments of the invention provide intra-operative navigation on 3D computed tomography (CT) data sets, such as an axial view, in addition to 2D fluoroscopic images. In certain embodiments, the CT data set is aligned with the patient intra-operatively via correlation to standard anteroposterior and lateral fluoroscopic images. Additional 2D images can be acquired and navigated as the procedure proceeds without the need for re-alignment of the CT data set. Certain embodiments provide tools that facilitate placement in multi-level procedures. On-screen templating can be used to select the length and size of an implant. The system can store the locations of multi-level implants in memory. A user can recall stored overlays for reference during placement of additional implants. In addition, certain embodiments help to eliminate trial-and-error fitting of components by performing navigated measurements. In certain embodiments, annotations appear on the screen alongside the relevant anatomy and implants.
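The correlation to standard AP and lateral fluoroscopic images can be thought of as a pose search that maximizes an image-similarity score between the acquired fluoroscopic image and a simulated projection (a digitally reconstructed radiograph, DRR) of the CT data. The normalized cross-correlation below is one such score, shown here as a generic sketch with placeholder arrays rather than the patent's specific algorithm; the DRR generation itself is omitted.

import numpy as np

def normalized_cross_correlation(drr, fluoro):
    # Similarity between a simulated projection and the acquired fluoroscopic image.
    a = drr - drr.mean()
    b = fluoro - fluoro.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# A pose search would evaluate this score for candidate CT poses and keep the best-scoring pose.
rng = np.random.default_rng(0)
fluoro = rng.random((256, 256))
print(normalized_cross_correlation(fluoro, fluoro))            # identical images score 1.0
print(normalized_cross_correlation(rng.random((256, 256)), fluoro))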

Certain embodiments use a correlation-based alignment algorithm to provide reliable alignment. Standard anteroposterior (AP) and lateral (Lat) fluoroscopic images can be acquired. A vertebral level is selected and the images are aligned. The vertebral level is identified by pointing a navigated instrument at, for example, the actual anatomy.

Certain embodiments of the system operate in conjunction with a family of spine instruments and kits, such as a spine visualization instrument kit, a spine surgical instrument kit, a cervical instrument kit, a navigation access needle, etc. These instruments facilitate the placement of, for example, a range of standard pedicle screws. A set of screw geometries is used to represent these screws and to support an overlay of wireframe or fully shaded models. The overlays can be saved and recalled for each vertebral level.

In certain embodiments, recalled overlays can be displayed with various automatic measurements, including spacing between multi-level pedicle screws, curvature between multi-level pedicle screws, and level annotations (e.g., left L4 vertebra). These measurements facilitate an accurate selection of implant length and size. These measurements also help to eliminate trial-and-error fitting of components.


Certain embodiments thus assist a surgeon in locating anatomical structures anywhere on the human body during open or percutaneous procedures. Certain embodiments can be used, for example, at lumbar and/or sacral vertebral levels. Certain embodiments provide Digital Imaging and Communications in Medicine (DICOM) flexibility and support for gantry tilt and/or variable slice spacing. Certain embodiments provide automatic windowing and centering with stored profiles. Certain embodiments provide a correlation-based 2D/3D alignment algorithm and allow, for example, real-time multi-planar reslicing.

Certain embodiments allow a user to store and recall navigated placements. Certain embodiments allow a user to determine a distance between multi-level pedicle screws and/or other implants/instruments. Certain embodiments allow a user to calculate, for example, the length and curvature of a connecting rod.
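As a hedged sketch of such navigated measurements, the snippet below computes the spacing between two stored screw positions and, from three screw-head positions, the radius and arc length of the circle through them as a simple proxy for the length and curvature of a connecting rod; the coordinates and the minor-arc assumption are illustrative, not the system's actual measurement routine.

import numpy as np

def spacing(p, q):
    # Straight-line distance between two stored implant positions (same units as the input).
    return float(np.linalg.norm(np.asarray(q, dtype=float) - np.asarray(p, dtype=float)))

def arc_through_three_points(p1, p2, p3):
    # Radius of the circle through the three points and the (minor) arc length from p1 to p3.
    a = np.linalg.norm(np.subtract(p2, p3))
    b = np.linalg.norm(np.subtract(p1, p3))
    c = np.linalg.norm(np.subtract(p1, p2))
    area = 0.25 * np.sqrt(max((a + b + c) * (-a + b + c) * (a - b + c) * (a + b - c), 0.0))
    radius = a * b * c / (4.0 * area)
    angle = 2.0 * np.arcsin(min(b / (2.0 * radius), 1.0))  # central angle subtending the p1-p3 chord
    return radius, radius * angle

screws = [np.array([0.0, 0.0, 0.0]), np.array([10.0, 4.0, 0.0]), np.array([20.0, 0.0, 0.0])]  # mm
print(spacing(screws[0], screws[1]))
print(arc_through_three_points(*screws))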

FIG. 5 shows an example of a 3D implant shown in conjunction with a fluoroscopic image according to an embodiment of the invention. The image 500 includes a tool/implant 510 and an anatomy 550. The tool/implant 510 includes a portion 520 outside the anatomy 550 and a portion 530 within the anatomy 550.

Using information provided by a user about where a boundary should be, such as the skin of a patient, the system can determine which parts of the tool/implant 510 are located within 530 and outside 520 the "target" anatomy 550. Portions 530 of the tool/implant 510 that are below the boundary surface can be rendered, for example, transparent, semi-transparent and/or otherwise opaque to varying degrees. FIG. 6 shows, for example, a simplified representation of a tool with different shading based on the distance below a boundary surface. Providing a degree of transparency can give the tool/implant 510 the illusion of being within the fluoroscopic image, as shown, for example, in FIG. 5.

In certain embodiments, a user clicks on, selects, highlights, and/or otherwise identifies one or more points to indicate the surface or boundary of the anatomy 550 or another object in the image. A point and/or a plane is thus used to determine a boundary above which the tool/implant 510 is represented as a solid object in an image and below which the tool/implant 510 is represented as a semi-transparent object. The tool/implant 510 can be modeled prior to the imaging operation to enable the system to assess the position of the tool/implant 510 below and above the boundary and to provide position and transparency information to a user. The user may select one or more boundary points by pressing a button and/or applying pressure to a tool or pointer, and/or by positioning the tool/implant 510 and selecting a point via software (e.g., based on keyboard and/or mouse input).
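A minimal sketch of how such digitized points could define the boundary is given below: a plane is fitted to the selected skin points by least squares and oriented with an assumed "outside" reference point, after which any tracked point can be classified as above or below the boundary. The point values and the planar approximation are assumptions for illustration.

import numpy as np

def fit_boundary_plane(points, outside_ref):
    # Fit a plane to three or more digitized skin points; orient the normal toward `outside_ref`.
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    n = vt[-1]                             # direction of least variance = plane normal
    if np.dot(np.asarray(outside_ref, dtype=float) - centroid, n) < 0:
        n = -n                             # make the normal point out of the patient
    return centroid, n

def is_below_boundary(point, plane_point, normal):
    # True if `point` lies on the anatomy side (opposite the normal) of the fitted plane.
    return float(np.dot(np.asarray(point, dtype=float) - plane_point, normal)) < 0.0

skin_points = [(0, 0, 0.1), (10, 0, -0.2), (0, 10, 0.0), (10, 10, 0.1)]  # digitized points (mm)
plane_pt, n = fit_boundary_plane(skin_points, outside_ref=(5, 5, 50))
print(is_below_boundary((5, 5, -8), plane_pt, n))   # a point 8 mm under the skin -> True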

In certain embodiments, opacity/transparency can be weighted based on the distance to the surface boundary. For example, as an implant 510, such as a pedicle screw, extends further and further under the skin of a patient, the end of the screw becomes increasingly transparent. The length of the screw under the skin is shown in the image with varying degrees of transparency to provide insight into depth within the patient.

FIG. 7 shows a flow chart for an implant/tool representation method 700 in an image used in accordance with an embodiment of the invention. In step 710, a user indicates one or more measurement points representative of a boundary, such as the skin of a patient. The user can indicate one or more points by, for example, pressing a button and/or applying pressure to a tool or pointer, and/or by positioning a tool/implant and selecting a point via software (e.g., based on keyboard and/or mouse input), etc., to indicate the presence of a boundary.

In step 720, a position of a tool/implant is measured with respect to a target, such as a patient. The position can be measured using any of a variety of tracking/navigation techniques, as explained above. In step 730, using the information provided by the user regarding the location of the boundary and the position information for the tool/implant, a determination is made as to which part of the tool/implant is within the boundary and which part of the tool/implant is outside the boundary. For example, a portion of the tool/implant below the boundary can be represented with a certain degree of transparency on a displayed image of the patient area. A portion of the tool/implant above the boundary can, for example, be represented as an opaque object or icon. Providing a degree of transparency can help a user gain a better understanding of the tool/implant position with respect to the boundary and the target area.
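A minimal sketch of the determination in step 730, under the assumption that the tool is represented by a straight segment from tip to hind end and the boundary by an oriented plane, is shown below: the segment is split at the boundary and an alpha value that fades with depth is assigned to points of the inside portion. The plane, the tool endpoints, and the alpha fall-off are illustrative, not the exact scheme of the method.

import numpy as np

plane_point = np.array([0.0, 0.0, 0.0])
normal = np.array([0.0, 0.0, 1.0])        # assumed boundary plane, normal pointing out of the patient

def depth_below(p):
    # Depth below the boundary in mm: positive inside the patient, zero at or above the skin.
    return max(0.0, -float(np.dot(p - plane_point, normal)))

def split_at_boundary(tip, tail):
    # Return (inside_segment, outside_segment); either may be None.
    d_tip, d_tail = depth_below(tip), depth_below(tail)
    if d_tip == 0.0 and d_tail == 0.0:
        return None, (tip, tail)
    if d_tip > 0.0 and d_tail > 0.0:
        return (tip, tail), None
    # One endpoint above and one below: find where the segment crosses the plane.
    t = np.dot(plane_point - tail, normal) / np.dot(tip - tail, normal)
    crossing = tail + t * (tip - tail)
    if d_tip > 0.0:
        return (tip, crossing), (crossing, tail)
    return (tail, crossing), (crossing, tip)

def alpha_at(p, falloff=50.0):
    # Fully opaque at the skin, fading toward transparent with depth (clamped at 0.1).
    return float(np.clip(1.0 - depth_below(p) / falloff, 0.1, 1.0))

tip, tail = np.array([5.0, 0.0, -30.0]), np.array([5.0, 0.0, 40.0])   # mm, tip 30 mm deep
inside, outside = split_at_boundary(tip, tail)
print("inside segment:", inside)
print("alpha at tip:", alpha_at(tip))     # 0.4 with the assumed 50 mm fall-off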

In step 740, an image is displayed to a user, which image shows the tool/implant with respect to a target, such as the anatomy of a patient. Portions of the tool/implant are opaque and/or transparent according to the boundary location, as described above. The image can be updated as the tool/implant is repositioned by a user.


In certain embodiments, the degree of opacity/transparency can be weighted based on the distance from the surface boundary. For example, as a tool/implant extends further and further below the boundary, the distal portion(s) of the tool/implant becomes increasingly transparent. The length of the tool/implant below the boundary can be represented in the image with varying degrees of transparency to provide insight into depth within the target. The degree of transparency and the position with respect to the displayed image can be adjusted dynamically as the tool/implant is moved by a user.

Certain embodiments thus make it possible for a surgeon to see a more realistic representation of a 3D tool and/or implant and its location in an anatomy. Certain embodiments provide systems and methods for integrating 3D tracked tools (or implants) into projection images (e.g., fluoroscopic images). Certain embodiments provide the ability to represent "visually" that a tool/implant is located, for example, in a bone, on a 2D view (e.g., a fluoroscopic or X-ray image view), whereby an approximate 3D effect is created on a 2D image. Certain embodiments provide a dynamic, adaptive display effect as a user moves a tool/implant and/or uses another/additional tool and/or implant on a patient.

Certain embodiments thus provide a workflow improvement for surgical navigation and measurement. In addition, navigated pedicle screws and/or other implants can be graphically displayed and represented as an overlay on an image for viewing by a physician. The overlay helps maintain visualization of, for example, screw and/or other implant locations.

Certain embodiments work in conjunction with a hybrid 2D/3D navigation system that combines the real-time updating and ease of use of a 2D system with a simply aligned 3D CT data set. The safety and accuracy of medical procedures can be improved with a 2D/3D navigation system. The use of a CT data set together with a 2D intra-operative imaging function contributes to visualization and understanding of an anatomy in an operating room. Such a system can find application in a variety of medical procedures, such as spine procedures, skull procedures, and other clinical procedures. Spine procedures can include postero-lateral open and minimally invasive surgical (MIS) pedicle screws, posterior C1-C2 transarticular screw fixation, transoral odontoid fixation, cervical lateral mass plate screw fixation, anterior thoracic screw fixation, scoliosis, kyphosis, kyphoplasty, transforaminal lumbar interbody fusion (TLIF), artificial discs, burst fractures, excision of paraspinal neoplasms, etc.

Various embodiments have been described above with reference to the drawings. These drawings show certain details of specific embodiments that implement the systems, methods and programs of the invention. However, describing the invention with reference to drawings should not be construed as imposing on the invention any limitations associated with features shown in the drawings. The invention contemplates methods, systems and program products on machine-readable media for carrying out its operations. As mentioned above, the embodiments of the invention can be implemented using an existing computer processor, by means of a special-purpose computer processor incorporated for this or another purpose, or by means of a hardwired system.

As stated above, embodiments within the scope of the invention include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media that can be accessed by a general-purpose or special-purpose computer or another machine with a processor. By way of example, such machine-readable media may comprise RAM, ROM, PROM, EPROM, EEPROM, flash memory, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that may be used to carry or store desired program code in the form of machine-executable instructions or data structures and that can be accessed by a general-purpose or special-purpose computer or other machine with a processor. When information is transferred or provided via a network or another communication connection (wired, wireless, or a combination of wired and wireless) to a machine, the machine properly views the connection as a machine-readable medium. Such a connection is thus properly termed a machine-readable medium. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions include, for example, instructions and data that cause a general-purpose computer, a special-purpose computer, or special-purpose processing machines to perform a particular function or group of functions.

Embodiments of the invention are described in the general context of method steps, which in one embodiment can be implemented by means of a program product comprising machine-executable instructions, such as program code, for example in the form of program modules executed by machines in networked environments. In general, program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. Machine-executable instructions, associated data structures and program modules represent examples of program code for performing the steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps.

FIG. 8 shows an example of an imaging and tracking system used according to an embodiment of the invention. Certain embodiments can be used in conjunction with an imaging and tracking system, such as the example shown in FIG. 8. The system shown includes an imaging device 810, a table 820, a patient 830, a tracking sensor 840, a medical device or implant 850, follower electronics 860, an image processor 870, and a display device 880. The imaging device 810 is shown as a C-arm useful for obtaining X-ray images of an anatomy of the patient 830, but may be any imaging device 810 useful in a tracking system. The imaging device 810 is in communication with the image processor 870. The image processor 870 is in communication with the follower electronics 860 and the display device 880.

Embodiments of the invention can be practiced in a network environment using logical connections to one or more remote computers having processors. Logical connections can include a local area network (LAN) and a wide area network (WAN), which are presented here by way of example and not as a limitation. Such network environments are common in office-wide or enterprise-wide computer networks, intranets and the Internet, and can use a wide variety of different communication protocols. Those skilled in the art will recognize that such network computing environments typically encompass many types of computer system configurations, including personal computers, handheld devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, and mainframe computers.

Embodiments of the invention can also be implemented in distributed computing environments, in which tasks are performed by local and remote processing devices that are connected to each other via a communication network (by means of wired connections, wireless connections, or a combination of wired and wireless connections). In a distributed computing environment, program modules may be located in both local and remote memory storage devices.


An example of a system for implementing all or parts of the invention may be a general-purpose computing device in the form of a computer, including a processing unit, a system memory, and a system bus that couples various system components, including the system memory, to the processing unit. The system memory can contain read-only memory (ROM) and random access memory (RAM). The computer may also include a magnetic hard disk drive for reading from or writing to a magnetic hard disk, a magnetic disk drive for reading from or writing to a removable magnetic disk, and an optical disk drive for reading from or writing to a removable optical disk, such as a CD-ROM or other optical media. The drives and their associated machine-readable media provide non-volatile storage of machine-executable instructions, data structures, program modules, and other data for the computer.

The foregoing description of embodiments of the invention has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed, and modifications and variations are possible in light of the above description or may be acquired from practice of the invention. The embodiments were selected and described in order to explain the principles of the invention and its practical application, and to enable a person skilled in the art to use the invention in various embodiments and with various modifications as are suited to the particular use contemplated.

Those skilled in the art will recognize that the embodiments disclosed herein may be applied to the formation of any medical navigation system. Certain features of the embodiments of the claimed subject matter of the invention are shown and described herein; however, many modifications, substitutions, changes and equivalents will now occur to those skilled in the art. Moreover, although various functional blocks and relationships between them are described in detail, it is contemplated that several of the operations may be performed without the use of others, or that additional functions or relationships between functions may be established and still be in accordance with the claimed subject matter of the invention. It is therefore to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the claimed subject matter of the invention.


PART LIST

10 Medical navigation system
12 Portable computer
14 Display
16 Navigation link
18 Second display
20 Electromagnetic field generator
22 Electromagnetic sensor
24 Device
26 Printed circuit board receiver array
30 Table
40 Patient
50 Cable
60 Mobile cart
100 Medical navigation system
160 Navigation link
200 Processor
210 System control unit
214, 218 Display
215 Local link
220 Memory
222 Electromagnetic sensor
230 Display control unit
240 Disk control unit
245 Disk
250 Tracking module
260 Navigation module
300 Medical navigation system
315 Local link
342 Processor
344 System control unit
346 Memory
350 Follower link
352 Processor
354 Memory
356 Follower module
360 Visualization link
362 Processor
364 Memory
366 Navigation module
370 Navigation link
372 Electromagnetic sensor
380 Display control unit
382 Display
390 Disk control unit
392 Disk
500 Image
510 Tool/implant
520 External part
530 Internal part
550 Anatomy

FIG. 6 Illustration

FIG. 7 / 700 Flow chart
710 Indicate boundary point(s)
720 Measure tool/implant position
730 Determine tool position and shading with respect to boundary
740 Display image
810 Imaging device
820 Table
830 Patient
840 Tracking sensor
850 Medical device or implant
860 Follower electronics
870 Image processor
880 Display device

Claims (10)

  1. A method (700) for displaying a tool or implant (510) with respect to an image (500), the method (700) comprising: determining (710) a surface boundary for an area of interest (550) displayed in an image (500); determining (720, 730) a position of a tool or implant (510) with respect to the surface boundary; and displaying (740) a representation of the tool or implant (510) on the image (500), wherein a portion (530) of the tool or implant (510) within the surface boundary is shown in the representation with a degree of transparency compared with a portion (520) of the tool or implant (510) outside the surface boundary.
  2. The method (700) of claim 1, wherein the portion (530) of the tool or implant (510) within the surface boundary is represented in the representation with a number of degrees of transparency based on a distance within the surface boundary.
  3. The method (700) of claim 1 or 2, wherein the representation displays a three-dimensional view of the tool or implant (510) on a two-dimensional image.
  4. The method (700) according to any of the preceding claims, wherein the surface boundary is determined using a tracking device (22, 222, 372).
  5. The method (700) of any one of the preceding claims, wherein the step of determining a surface boundary for an area of interest (550) shown in an image (500) further comprises determining the surface boundary for the area of interest (550) shown in the image (500) based on identification of a location of one or more points on a surface of the area of interest (550).
  6. A user interface system for displaying a representation of a tool or implant (510) with respect to an image (500), the system comprising: a processor (12, 342, 352, 362) arranged to determine a surface boundary for an area of interest (550) shown in the image (500) and to determine a position of the tool or implant (510) with respect to the surface boundary, wherein the processor (12, 342, 352, 362) generates a representation of the tool or implant (510) based on the position with respect to the surface boundary, and wherein a portion (530) of the tool or implant (510) within the surface boundary is displayed in the representation with a degree of transparency compared to a portion (520) of the tool or implant (510) outside the surface boundary; and a display (14, 214, 218, 382) arranged to dynamically display the image (500) and the representation to a user.
  7. The system of claim 6, wherein the portion (530) of the tool or implant (510) within the surface boundary is displayed in the representation with a plurality of degrees of transparency based on a distance within the surface boundary.
  8. The system of claim 6 or 7, wherein the representation displays a three-dimensional view of the tool or implant (510) on a two-dimensional image.
  9. The system of claim 6, 7 or 8, wherein the surface boundary is determined using a tracking device (22, 222, 372).
  10. The system of any of claims 6-9, wherein the surface boundary is determined based on identification of a location of one or more points on a surface of the area of interest (550).
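Claims 2 and 7 recite a plurality of degrees of transparency based on distance within the surface boundary. One way to realize such a mapping is sketched below, purely as an illustration; the linear fade and the depth and opacity limits are assumed values, not prescribed by the claims.

import numpy as np

def alpha_from_depth(depth_mm, max_depth_mm=50.0, min_alpha=0.15, max_alpha=0.85):
    # Map how far a point of the tool or implant lies inside the surface
    # boundary to an opacity value: points just beneath the surface stay
    # nearly opaque, deeper points fade further. All thresholds here are
    # illustrative assumptions, not values taken from the patent.
    t = np.clip(np.asarray(depth_mm, dtype=float) / max_depth_mm, 0.0, 1.0)
    return max_alpha - t * (max_alpha - min_alpha)

For example, with these assumed limits a point 10 mm inside the boundary would be drawn at an opacity of about 0.71, while a point 50 mm or deeper inside would be drawn at 0.15.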
NL1034672A 2006-11-17 2007-11-12 Medical navigation system with tool and / or implant integration in fluoroscopic image projections and method for their use. NL1034672C2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US56116206 2006-11-17
US11/561,162 US7831096B2 (en) 2006-11-17 2006-11-17 Medical navigation system with tool and/or implant integration into fluoroscopic image projections and method of use

Publications (2)

Publication Number Publication Date
NL1034672A1 NL1034672A1 (en) 2008-05-20
NL1034672C2 true NL1034672C2 (en) 2009-04-03

Family

ID=39416991

Family Applications (1)

Application Number Title Priority Date Filing Date
NL1034672A NL1034672C2 (en) 2006-11-17 2007-11-12 Medical navigation system with tool and / or implant integration in fluoroscopic image projections and method for their use.

Country Status (3)

Country Link
US (1) US7831096B2 (en)
JP (1) JP5328137B2 (en)
NL (1) NL1034672C2 (en)

Families Citing this family (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8944070B2 (en) * 1999-04-07 2015-02-03 Intuitive Surgical Operations, Inc. Non-force reflecting method for providing tool force information to a user of a telesurgical system
US9469034B2 (en) 2007-06-13 2016-10-18 Intuitive Surgical Operations, Inc. Method and system for switching modes of a robotic system
US9138129B2 (en) 2007-06-13 2015-09-22 Intuitive Surgical Operations, Inc. Method and system for moving a plurality of articulated instruments in tandem back towards an entry guide
US20080065106A1 (en) * 2006-06-13 2008-03-13 Intuitive Surgical, Inc. Minimally invasive surgical apparatus with side exit instruments
US8620473B2 (en) 2007-06-13 2013-12-31 Intuitive Surgical Operations, Inc. Medical robotic system with coupled control modes
US20090192523A1 (en) 2006-06-29 2009-07-30 Intuitive Surgical, Inc. Synthetic representation of a surgical instrument
US9789608B2 (en) * 2006-06-29 2017-10-17 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical robot
US10008017B2 (en) 2006-06-29 2018-06-26 Intuitive Surgical Operations, Inc. Rendering tool information as graphic overlays on displayed images of tools
US9718190B2 (en) * 2006-06-29 2017-08-01 Intuitive Surgical Operations, Inc. Tool position and identification indicator displayed in a boundary area of a computer display screen
US8165360B2 (en) * 2006-12-06 2012-04-24 Siemens Medical Solutions Usa, Inc. X-ray identification of interventional tools
EP1959391A1 (en) * 2007-02-13 2008-08-20 BrainLAB AG Determination of the three dimensional contour path of an anatomical structure
DE102007041912A1 (en) * 2007-09-04 2009-03-05 Siemens Ag A method for displaying image data of a plurality of image data volumes in at least one common image representation and associated medical device
US9168173B2 (en) 2008-04-04 2015-10-27 Truevision Systems, Inc. Apparatus and methods for performing enhanced visually directed procedures under low ambient light conditions
AT548712T (en) * 2008-06-25 2012-03-15 Koninkl Philips Electronics Nv Localization of a relevant object in a person
US9089256B2 (en) * 2008-06-27 2015-07-28 Intuitive Surgical Operations, Inc. Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide
US10258425B2 (en) * 2008-06-27 2019-04-16 Intuitive Surgical Operations, Inc. Medical robotic system providing an auxiliary view of articulatable instruments extending out of a distal end of an entry guide
US8864652B2 (en) 2008-06-27 2014-10-21 Intuitive Surgical Operations, Inc. Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the positioning and orienting of its tip
US10117721B2 (en) 2008-10-10 2018-11-06 Truevision Systems, Inc. Real-time surgical reference guides and methods for surgical applications
JP5566657B2 (en) * 2008-10-15 2014-08-06 株式会社東芝 3D image processing apparatus and X-ray diagnostic apparatus
EP2421461B1 (en) * 2009-04-25 2015-07-15 Siemens Aktiengesellschaft System for assessing the relative pose of an implant and a bone of a creature
US8693628B2 (en) * 2009-04-27 2014-04-08 Lindsay S. Machan X-ray system
US9084623B2 (en) 2009-08-15 2015-07-21 Intuitive Surgical Operations, Inc. Controller assisted reconfiguration of an articulated instrument during movement into and out of an entry guide
US9492927B2 (en) 2009-08-15 2016-11-15 Intuitive Surgical Operations, Inc. Application of force feedback on an input device to urge its operator to command an articulated instrument to a preferred pose
US8903546B2 (en) 2009-08-15 2014-12-02 Intuitive Surgical Operations, Inc. Smooth control of an articulated instrument across areas with different work space conditions
WO2011020505A1 (en) 2009-08-20 2011-02-24 Brainlab Ag Integrated surgical device combining instrument; tracking system and navigation system
US8784443B2 (en) 2009-10-20 2014-07-22 Truevision Systems, Inc. Real-time surgical reference indicium apparatus and methods for astigmatism correction
GB2475722B (en) * 2009-11-30 2011-11-02 Mirada Medical Measurement system for medical images
US8918211B2 (en) * 2010-02-12 2014-12-23 Intuitive Surgical Operations, Inc. Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument
US8903144B2 (en) * 2010-12-01 2014-12-02 Olympus Corporation Endoscope apparatus and method of measuring object
WO2012090148A1 (en) * 2010-12-30 2012-07-05 Mediguide Ltd System and method for registration of fluoroscopic images in a coordinate system of a medical system
CN103429158B (en) 2011-03-15 2017-12-26 皇家飞利浦有限公司 For medical imaging equipment to provide positioning support interventional device image representation
WO2013034175A1 (en) 2011-09-06 2013-03-14 Ezono Ag Imaging probe and method of obtaining position and/or orientation information
JP5954762B2 (en) * 2011-11-29 2016-07-20 東芝メディカルシステムズ株式会社 X-ray diagnostic imaging equipment
US10325522B2 (en) * 2012-01-27 2019-06-18 University of Pittsburgh—of the Commonwealth System of Higher Education Medical training system and method of employing
US9552660B2 (en) 2012-08-30 2017-01-24 Truevision Systems, Inc. Imaging system and methods displaying a fused multidimensional reconstructed image
US9459087B2 (en) 2013-03-05 2016-10-04 Ezono Ag Magnetic position detection system
US9257220B2 (en) 2013-03-05 2016-02-09 Ezono Ag Magnetization device and method
GB201303917D0 (en) 2013-03-05 2013-04-17 Ezono Ag System for image guided procedure
US9295372B2 (en) * 2013-09-18 2016-03-29 Cerner Innovation, Inc. Marking and tracking an area of interest during endoscopy
CN106489152A (en) * 2014-04-10 2017-03-08 Sync-Rx有限公司 Image analysis in the presence of a medical device
EP3151750B1 (en) * 2014-06-06 2017-11-08 Koninklijke Philips N.V. Imaging system for a vertebral level
US9986983B2 (en) 2014-10-31 2018-06-05 Covidien Lp Computed tomography enhanced fluoroscopic system, device, and method of utilizing the same
GB2534359A (en) * 2015-01-15 2016-07-27 Corin Ltd System and method for patient implant alignment
KR20190019131A (en) * 2016-07-14 2019-02-26 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 Systems and methods for displaying a device navigator in a remote operating system
US10299880B2 (en) 2017-04-24 2019-05-28 Truevision Systems, Inc. Stereoscopic visualization camera and platform
US20190066314A1 (en) * 2017-08-23 2019-02-28 Kamyar ABHARI Methods and systems for updating an existing landmark registration

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5829444A (en) * 1994-09-15 1998-11-03 Visualization Technology, Inc. Position tracking and imaging system for use in medical applications
US6016439A (en) * 1996-10-15 2000-01-18 Biosense, Inc. Method and apparatus for synthetic viewpoint imaging
US6167296A (en) * 1996-06-28 2000-12-26 The Board Of Trustees Of The Leland Stanford Junior University Method for volumetric image navigation
US20020049375A1 (en) * 1999-05-18 2002-04-25 Mediguide Ltd. Method and apparatus for real time quantitative three-dimensional image reconstruction of a moving organ and intra-body navigation

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5542003A (en) * 1993-09-13 1996-07-30 Eastman Kodak Method for maximizing fidelity and dynamic range for a region of interest within digitized medical image display
US7194117B2 (en) * 1999-06-29 2007-03-20 The Research Foundation Of State University Of New York System and method for performing a three-dimensional virtual examination of objects, such as internal organs
US6424332B1 (en) * 1999-01-29 2002-07-23 Hunter Innovations, Inc. Image comparison apparatus and method
US6415171B1 (en) * 1999-07-16 2002-07-02 International Business Machines Corporation System and method for fusing three-dimensional shape data on distorted images without correcting for distortion
US6834122B2 (en) * 2000-01-22 2004-12-21 Kairos Scientific, Inc. Visualization and processing of multidimensional data using prefiltering and sorting criteria
US6823207B1 (en) * 2000-08-26 2004-11-23 Ge Medical Systems Global Technology Company, Llc Integrated fluoroscopic surgical navigation and imaging workstation with command protocol
DE10108547B4 (en) * 2001-02-22 2006-04-20 Siemens Ag Operating system for controlling surgical instruments based on intra-operative X-ray images
US6990220B2 (en) * 2001-06-14 2006-01-24 Igo Technologies Inc. Apparatuses and methods for surgical navigation
AU2003218010A1 (en) * 2002-03-06 2003-09-22 Z-Kat, Inc. System and method for using a haptic device in combination with a computer-assisted surgery system
WO2004017836A2 (en) * 2002-08-26 2004-03-04 Orthosoft Inc. Computer aided surgery system and method for placing multiple implants
TW558689B (en) * 2002-08-30 2003-10-21 Univ Taipei Medical Three-dimensional surgery simulation system and method
GB2393625B (en) * 2002-09-26 2004-08-18 Internet Tech Ltd Orthopaedic surgery planning
US7492930B2 (en) * 2003-02-04 2009-02-17 Aesculap Ag Method and apparatus for capturing information associated with a surgical procedure performed using a localization device
PT1444993E (en) * 2003-02-10 2007-01-31 Heraeus Gmbh W C Improved metal alloy for medical devices and implants
US20070276488A1 (en) * 2003-02-10 2007-11-29 Jurgen Wachter Medical implant or device
US7154985B2 (en) * 2003-05-13 2006-12-26 Medical Insight A/S Method and system for simulating X-ray images
US7641660B2 (en) * 2004-03-08 2010-01-05 Biomet Manufacturing Corporation Method, apparatus, and system for image guided bone cutting
US20060004274A1 (en) * 2004-06-30 2006-01-05 Hawman Eric G Fusing nuclear medical images with a second imaging modality
US8515527B2 (en) * 2004-10-13 2013-08-20 General Electric Company Method and apparatus for registering 3D models of anatomical regions of a heart and a tracking system with projection images of an interventional fluoroscopic system
EP1816961A1 (en) * 2004-11-23 2007-08-15 Philips Electronics N.V. Image processing system and method for displaying images during interventional procedures
JP4855085B2 (en) * 2006-01-27 2012-01-18 興和株式会社 Perimeter
WO2007092159A2 (en) * 2006-02-02 2007-08-16 Wake Forest University Health Sciences Cardiac visualization systems for displaying 3-d images of cardiac voxel intensity distributions with optional physician interactive boundary tracing tools
US20070265595A1 (en) * 2006-05-09 2007-11-15 Olympus Medical Systems Corp. Treatment tool inserting/withdrawing auxiliary device and medical procedure through endoscope

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5829444A (en) * 1994-09-15 1998-11-03 Visualization Technology, Inc. Position tracking and imaging system for use in medical applications
US6167296A (en) * 1996-06-28 2000-12-26 The Board Of Trustees Of The Leland Stanford Junior University Method for volumetric image navigation
US6016439A (en) * 1996-10-15 2000-01-18 Biosense, Inc. Method and apparatus for synthetic viewpoint imaging
US20020049375A1 (en) * 1999-05-18 2002-04-25 Mediguide Ltd. Method and apparatus for real time quantitative three-dimensional image reconstruction of a moving organ and intra-body navigation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WOLLF ET AL: "Real-time endoscope and intraoperative ultrasound integration in computer assisted navigated surgery", INTERNATIONAL CONGRESS SERIES, EXCERPTA MEDICA, AMSTERDAM, vol. 1281, 1 May 2005 (2005-05-01), pages 606 - 611, XP005081738, ISSN: 0531-5131 *

Also Published As

Publication number Publication date
JP2008126063A (en) 2008-06-05
JP5328137B2 (en) 2013-10-30
US7831096B2 (en) 2010-11-09
US20080118115A1 (en) 2008-05-22
NL1034672A1 (en) 2008-05-20

Similar Documents

Publication Publication Date Title
US6347240B1 (en) System and method for use in displaying images of a body part
US8160677B2 (en) Method for identification of anatomical landmarks
US6511418B2 (en) Apparatus and method for calibrating an endoscope
US8706185B2 (en) Method and apparatus for surgical navigation of a multiple piece construct for implantation
US8150498B2 (en) System for identification of anatomical landmarks
CA2161126C (en) System for locating relative positions of objects
CA2140786C (en) Process for imaging the interior of bodies
CN105025799B (en) Three-dimensional mapping display system for diagnostic ultrasound machine
CA2003497C (en) Probe-correlated viewing of anatomical image data
US8165660B2 (en) System and method for selecting a guidance mode for performing a percutaneous procedure
JP5065783B2 (en) Stereotaxic treatment apparatus and method
US8934961B2 (en) Trackable diagnostic scope apparatus and methods of use
US9867674B2 (en) Automatic identification of tracked surgical devices using an electromagnetic localization system
US8725235B2 (en) Method for planning a surgical procedure
US7715898B2 (en) System and method for employing multiple coil architectures simultaneously in one electromagnetic tracking system
US7344307B2 (en) System and method for integration of a calibration target into a C-arm
EP2405846B1 (en) System for navigating a surgical instrument
JP4204109B2 (en) Real-time positioning system
US5823958A (en) System and method for displaying a structural data image in real-time correlation with moveable body
US8320991B2 (en) Portable electromagnetic navigation system
JP5227027B2 (en) Method and apparatus for calibrating linear instruments
US6714810B2 (en) Fluoroscopic registration system and method
Navab et al. Camera augmented mobile C-arm (CAMC): calibration, accuracy study, and clinical applications
US6259943B1 (en) Frameless to frame-based registration system
US20190223689A1 (en) Apparatus and Method for Four Dimensional Soft Tissue Navigation Including Endoscopic Mapping

Legal Events

Date Code Title Description
AD1A A request for search or an international type search has been filed
PD2B A search report has been drawn up
V1 Lapsed because of non-payment of the annual fee

Effective date: 20150601