WO2010067267A1 - Head-mounted wireless camera and display unit


Info

Publication number: WO2010067267A1
Application number: PCT/IB2009/055462
Authority: WO
Grant status: Application
Other languages: French (fr)
Inventor: Joerg Sabczynski
Original assignees: Philips Intellectual Property & Standards GmbH; Koninklijke Philips Electronics N.V.

Classifications

    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B20/00 Signal processing not specific to the method of recording or reproducing; Circuits therefor
    • G06F19/00 Digital computing or data processing equipment or methods, specially adapted for specific applications
    • G16H50/50 ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for simulation or modelling of medical disorders
    • H04N5/2251 Television cameras; Constructional details
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2017/00212 Electrical control of surgical instruments using remote controls
    • A61B2017/00221 Electrical control of surgical instruments with wireless transmission of data, e.g. by infrared radiation or radiowaves
    • A61B2017/00694 With means correcting for movement of or for synchronisation with the body
    • A61B2017/00699 Correcting for movement caused by respiration, e.g. by triggering
    • A61B2017/00703 Correcting for movement of heart, e.g. ECG-triggered
    • A61B2017/00734 Battery operated
    • A61B2034/2055 Optical tracking systems
    • A61B2090/502 Headgear, e.g. helmet, spectacles
    • A61B5/0059 Detecting, measuring or recording for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B6/12 Devices for detecting or locating foreign bodies

Abstract

The present invention refers to a head-mounted wireless camera and display unit (100) equipped with a processing unit (103) for carrying out optical position measurements, wherein said camera and display unit is intended to be worn by a surgeon as a headset, goggle or helmet and can advantageously be used for supporting the process of navigating and tracking a surgical tool during a stereotactic operation. The proposed system comprises, aside from said processing unit, a display (102), camera units (101a, 101b) and at least one rechargeable or non-rechargeable battery (104). For data exchange and battery recharging, a docking station (106) or cable is used.

Description

HEAD-MOUNTED WIRELESS CAMERA AND DISPLAY UNIT

FIELD OF THE INVENTION

The present invention generally relates to the field of image-guided surgery and surgical navigation systems with optical position measurement capability developed for supporting a surgeon in navigating and tracking navigation motions of at least one surgical instrument during a stereotactic surgical procedure by visualizing a preoperatively generated image (e.g. a CT image) or an intra-operatively acquired image (e.g. a fluoroscopy or X-ray image) and graphically displaying the exact position and angular orientation of the at least one surgical instrument relative to the locations of identified tissue anomalies, lesions or anatomical structures of interest (such as e.g. an organ or a part thereof) in a surgical field in the interior of a patient's body to be treated by cardiology, interventional radiology or stereotactic surgery or relative to a fixed reference position. In particular, the invention refers to a head-mounted wireless camera and display unit equipped with a processing unit for carrying out optical position measurements, wherein said camera and display unit is intended to be worn by a surgeon as a headset, goggle or helmet and can advantageously be used for supporting the process of navigating and tracking a surgical tool during a stereotactic operation. The proposed system is a complete but miniaturized surgical navigation system which comprises, aside from said processing unit, a display, camera units and at least one rechargeable or non-rechargeable battery. For data exchange and battery recharging, a docking station or cable is used. Compared with conventional navigation systems as commonly used today in clinical setups, especially in the scope of surgical procedures, the proposed navigation system is able to overcome problems which inhibit a faster growth of the surgical navigation market.
This is (a) because the navigation system as claimed does not consume any space in the vicinity of the patient, (b) because said system does not need any cables or wires in the operating room or in the surgical field, (c) as it provides ergonomic working conditions to the surgeon and (d) due to the fact that the claimed system can easily and rapidly be set up.

BACKGROUND OF THE INVENTION

Navigation systems are used in many surgical disciplines, such as e.g. in orthopedic surgery, spine surgery and stereotactic neurosurgery of the brain and the central nervous system as well as in interventional radiology and surgery. They are used to accurately implement pre-operative surgical plans during an operation and to allow for advanced image guidance of a surgical instrument.

From the relevant literature, computer-assisted navigation systems are known which are based on stereotactic recording and have long been used by neurosurgeons as a surgical guide for minimal craniotomy or for assisting in localizing tumors that are hard to locate. Two main types of image guidance systems have been developed. Optical-based image guidance systems use an infrared camera array to monitor instruments and head position. Unlike electromagnetic-based systems, these devices do not have problems of signal interference from metallic objects near the surgical field. However, a clear line of sight must be maintained between the infrared camera and light-emitting diodes during surgery. Over the past few years, these systems have been developed in functional endoscopic sinus surgery (FESS) as a surgical control in difficult cases or as a surgical guide for minimal conservative surgery. Stereotactic localization is a method for locating a target within a three-dimensional object which is commonly used to locate an anatomical structure or pathological anomaly in the human body (particularly in a patient's brain or spine) for medical and surgical treatment. According to one approach, fiducial markers ("fiducials") are attached to a patient's body in one of a variety of manners, e.g. by using an attachable frame or attaching said markers to the skin with an adhesive. A scan is then taken of the patient's anatomy or at least one part thereof, e.g. of the head, to reconstruct a three-dimensional image of e.g. the patient's brain. Scanning can thereby be done using a variety of techniques including CT, MRI, PET and SPECT. Images of the fiducial markers that may be located around the patient's body are then located in the three-dimensional image at fiducial image points. Points of interest, such as e.g. the location of a cancerous tumor or carcinoma, are located in the three-dimensional image with reference to these fiducial image points.
A detailed survey on stereotactic surgery can be found in "Textbook of Stereotactic and Functional Neurosurgery" (McGraw-Hill, June 1997, ISBN: 0070236046) by P. L. Gildenberg and R. R. Tasker (editors).

Typical surgical navigation systems consist of a computer, a monitor, a position measurement device and tracked surgical instruments. Thereby, preoperatively acquired image data of the patient (which may e.g. include relevant structures segmented from a preoperatively generated CT or MRI scan, such as e.g. tumors and bone surfaces, surgical instruments, etc.) are loaded together with a preoperatively elaborated and defined surgical plan into the computer prior to the operation. The aforementioned surgical plan may e.g. include geometric data such as target markers, fiducials and planned trajectories for navigating the surgical instruments. After performing a registration procedure which registers the locations of anatomical landmarks or fiducial markers ("fiducials") attached to a patient's body with the coordinates of said landmarks or fiducial image points in the preoperatively acquired image, the coordinate systems of the position measurement device and the preoperatively generated image are matched. It is then possible to show the preoperatively defined surgical plans as well as the current positions of tracked surgical instruments within said image data. During a surgical stereotactic operation, this allows for an easy navigation of said instruments.
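
The registration procedure described above amounts to a least-squares rigid fit between paired fiducial coordinates. The following Python/NumPy sketch uses the Kabsch (SVD-based) method with made-up fiducial positions; it illustrates the general technique only and is not an implementation taken from this patent:

```python
import numpy as np

def register_rigid(points_src, points_dst):
    """Least-squares rigid fit: find R, t such that R @ p_src + t ~ p_dst.

    points_src: fiducial coordinates measured by the position measurement
    device; points_dst: the same fiducials located in the preoperative image.
    """
    src = np.asarray(points_src, dtype=float)
    dst = np.asarray(points_dst, dtype=float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Hypothetical data: four fiducials in tracker space and in CT image space.
rng = np.random.default_rng(0)
fiducials_tracker = rng.uniform(-50.0, 50.0, size=(4, 3))
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([10.0, -5.0, 20.0])
fiducials_ct = fiducials_tracker @ R_true.T + t_true

R, t = register_rigid(fiducials_tracker, fiducials_ct)
# Fiducial registration error after the fit (should be ~0 for noise-free data).
fre = np.linalg.norm(fiducials_tracker @ R.T + t - fiducials_ct, axis=1).max()
```

Once the coordinate systems are matched this way, any position reported by the position measurement device can be mapped into image coordinates with `R @ p + t`, which is what makes the in-image display of tracked instruments possible.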

In an approach to stereotactic brain surgery, a three-dimensional frame is screwed to the patient's skull prior to scanning the head. This frame serves as a mechanical reference mechanism that supports scanning fiducial markers at fiducial points around the body. The frame remains attached to the patient's skull from before scanning until after surgery is complete. Prior to surgery, a mechanical guide assembly is attached to the frame. The relative location in the image of the point of interest with respect to the fiducial image points is determined, and this relationship is used to adjust the mechanical guide assembly with respect to the fiducial points on the frame. Using the adjusted mechanical guide assembly, a surgical instrument is then guided to a location in the body that corresponds to the point of interest in the image.

In another form of stereotactic surgery, in the relevant literature known generally as "image-guided stereotactic surgery", rather than relying on mechanical adjustment of a guide assembly, visual feedback is provided to a surgeon by displaying a composite image formed from a scanned three-dimensional image and a synthesized image of a hand-held surgical instrument. The surgeon guides the hand-held instrument into a patient's body using the visual feedback. In this form of surgery, a frame is attached to the patient and a scan is taken as described above. After scanning, the head and frame are secured in a fixed position, for example, fixed to an operating table. In order to display the image of the surgical instrument in a proper relationship to the scanned image, the position and angular orientation of the instrument is sensed using a localization apparatus that remains in a fixed position relative to the body. The localization apparatus can be coupled to the surgical instrument using an articulated mechanical arm on which the surgical instrument is attached. Sensors in the joints of the arm provide signals that are used to determine the location and orientation of the instrument relative to a fixed base of the mechanical arm. Some more recent systems do not use mechanical coupling between the surgical instrument and the localization apparatus and instead rely on remote sensing of small localized energy emitters (e.g., sources or transducers of energy) fixed to the instrument. For example, a camera array is used to locate light-emitting diodes (LEDs) or passive markers made of a light reflecting material that are attached to the instrument. The locations of the LEDs or passive markers in the camera images are used to determine the three-dimensional physical locations of the LEDs or passive markers relative to the camera array.
The locations of multiple LEDs or passive markers attached to the instrument are then used to determine the location and orientation of the instrument. Another example of remote sensing uses sound generators and a microphone array and relies on the relative time of arrival of acoustical signals to determine the three-dimensional locations of the sound generators.
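
Recovering the three-dimensional location of an LED or passive marker from its pixel coordinates in two camera images is a standard triangulation problem. The sketch below uses linear (DLT) triangulation with a hypothetical stereo rig; the focal length, principal point and baseline are invented for illustration and do not describe any particular camera array:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation: recover the 3-D position of one marker
    from its pixel coordinates (u, v) in two calibrated camera views."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                     # homogeneous solution
    return X[:3] / X[3]

def project(P, X):
    """Project a 3-D point through a 3x4 projection matrix."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Hypothetical stereo rig: identical intrinsics, 100 mm baseline along x.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-100.0], [0.0], [0.0]])])

led = np.array([30.0, -20.0, 500.0])   # true marker position in mm
recovered = triangulate(P1, P2, project(P1, led), project(P2, led))
```

With noise-free pixel measurements the DLT solution is exact; in practice, marker positions from several such triangulations are then fed to a rigid fit to obtain the instrument pose.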

Before a synthesized image of the instrument can be combined with the scanned image in a proper relationship, some form of registration is required. For example, the tip of the surgical instrument can be placed at each of several fiducial markers for which corresponding images have been located in the three-dimensional scanned image. Registration of the synthesized image of the instrument and the scanned image can thereby be established. According to a variant of image-guided stereotactic surgery, in the relevant literature generally referred to as "dynamic referencing", a patient's head and said frame are secured in a fixed position such as in the image-guided approach. However, unlike other image-guided techniques, the sensors (e.g. cameras) of the localization apparatus are not at a fixed location. In order to compensate for motions of the sensors, energy emitters are fixed to the frame as well as to the instrument. At any point in time, the location and orientation of the frame relative to the sensors as well as the location and orientation of the instrument relative to the sensors are both determined, and the differences in their locations and orientations are used to compute the location and orientation of the instrument relative to the frame. This computed location of the instrument is then used to display the synthesized image of the surgical instrument in an appropriate relationship to the scanned image. Still another approach to stereotactic surgery, generally known as "image-guided frameless stereotactic surgery", does not rely on attaching a frame to the body before scanning. Instead, adhesive fiducial markers are applied to the scalp, or small screws are inserted into the skull, and the patient is scanned as in the techniques described above. During surgery, the patient is immobilized and locked in place using a head clamp or a frame.
The image-guided stereotactic approach described above is then followed, including the registration procedure described above to establish the locations of the fiducial markers relative to the instrument. Image-guided frameless stereotaxy has also been applied to spine surgery. A reference frame is attached to an exposed spinous process during open spine surgery, and a probe is used to register the patient's spine with a scanned image of the spine. Anatomical landmarks are used as fiducial points which are located in the scanned image, and visual feedback is provided to manually guide placement of instruments, such as insertion of pedicle screws into the spinal structures.
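
The "dynamic referencing" computation described above reduces to composing rigid transforms: if the sensor measures both the reference frame and the instrument, the instrument pose expressed relative to the frame is independent of sensor motion. A minimal sketch with purely illustrative pose values:

```python
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(deg):
    a = np.radians(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Poses measured by the (moving) camera at one instant: the patient-fixed
# reference frame and the instrument, both in camera coordinates (invented).
T_cam_frame = pose(rot_z(30), [100.0, 50.0, 400.0])
T_cam_inst = pose(rot_z(75), [120.0, 40.0, 380.0])

# Instrument pose relative to the reference frame: camera motion cancels out.
T_frame_inst = np.linalg.inv(T_cam_frame) @ T_cam_inst

# Simulate a camera (head) movement: both measurements change, but the
# relative pose of instrument to frame does not.
T_motion = pose(rot_z(-20), [5.0, -30.0, 10.0])
T_frame_inst_moved = (np.linalg.inv(T_motion @ T_cam_frame)
                      @ (T_motion @ T_cam_inst))
```

This invariance is exactly why a localization apparatus that is itself moving (such as a head-mounted camera) can still deliver a stable instrument pose, provided the reference frame stays visible.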

A method for augmented reality navigation during an image-guided medical intervention based on a stereoscopic head-mounted display is disclosed in US 2005/0203380 A1. The display includes a pair of stereo viewing cameras (and at least one tracking camera) and a stereoscopic guidance display. During a surgical intervention of a patient, the patient's body pose is determined from a rigid body transformation between the tracking camera and frame markers on a scanning table, and the pose of an intervention instrument with respect to the table is determined. A visual representation of the patient overlaid with an image of the intervention target, the instrument and a path for guiding the instrument to perform said medical intervention is displayed on the stereoscopic guidance display at a position at which the surgeon would see it ("augmented reality").

SUMMARY OF THE INVENTION

Although surgical navigation systems as commonly used today offer increased surgical accuracy and allow preoperatively defined surgical plans to be implemented, these systems have a number of drawbacks, as mentioned below:

- Required space in the operating room: First of all, the computer system with the monitor is usually built into a trolley on wheels. Such a trolley consumes much space in the operating room, because it must be placed close to the operating surgeon in order to allow a good view onto the computer screen. Moreover, the position measurement system consumes space close to the operating field. If a surgical navigation system comprising an optical position measurement subsystem is used, the line of sight to the operating field must be kept clear. Sometimes, this may be difficult to achieve.

- Ergonomics: Secondly, conventional position measurement systems are connected with a cable to the computer, which is difficult to set up and increases the risk of stumbling within the operating room. Aside therefrom, looking onto a computer screen placed somewhere in the operating room while working on the patient is not ergonomic, tiring for the surgeon and thus error-prone.

- Setup procedure: Thirdly, the setup of a conventional surgical navigation system is relatively time-consuming and prolongs the time for the setup of the operating room, which has to be done by experienced assisting personnel.

The time-consuming setup procedure, working conditions not up to ergonomic standards and the large space consumption in the operating room are among the biggest problems inhibiting a faster growth of the surgical navigation market.

In view of the above-described facts, it is an objective of the present invention to provide a navigation system which is less space-consuming and more convenient than conventional systems, by allowing more ergonomic work and a simpler setup procedure.

In this context, a first exemplary embodiment of the present invention is directed to a head-mounted wireless camera and display unit serving as a navigation system for supporting a surgeon in navigating and tracking navigation motions of at least one surgical instrument during a stereotactic surgical procedure by visualizing a preoperatively generated image (e.g. a CT image) or an intra-operatively acquired image (e.g. a fluoroscopy or X-ray image) and graphically displaying the exact position and/or angular orientation of the at least one surgical instrument relative to the locations of identified tissue anomalies, lesions or anatomical structures of interest within a preoperatively elaborated surgical plan and/or within a preoperatively acquired sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image of the patient.

The proposed camera and display unit may e.g. be realized in the form of a headset, a goggle or a helmet to be worn by the surgeon during the surgical stereotactic operation. Therefore, it does not consume any space in the vicinity of the patient's body and is easy to set up during the preparation of the surgical stereotactic operation. Aside from tracking navigation motions of the at least one surgical instrument, it may further be provided that the proposed navigation system is adapted to simultaneously track cardiac, respiratory and body motions of the patient and compensate for motions of the camera and display unit relative to the patient or relative to the locations of identified tissue anomalies, lesions or anatomical structures of interest. According to this embodiment, said camera and display unit may comprise an integrated processing unit, supplied with the video output signals of one or more camera units or optical position sensors (e.g. line cameras) integrated in the head-mounted camera and display unit, wherein said processing unit may be configured for registering the locations of anatomical landmarks or fiducial markers attached to the patient's body, the coordinates of said landmarks or fiducials being detected by a pointer instrument tracked by means of at least one camera of the head-mounted camera and display unit, with the coordinates of said landmarks or fiducial image points in the preoperatively acquired sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image and/or in a preoperatively elaborated surgical plan and displaying a registered graphical representation of the at least one surgical instrument and/or a segmented anatomy object from the preoperatively or intra-operatively acquired sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image within said image on an integrated display of the head-mounted camera and display unit. 
Due to the fact that this display is located in the field of view of the surgeon, he/she does not need to turn his/her head in order to look at a computer display. This allows a much more ergonomic way of working.
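
Displaying such a registered graphical representation requires mapping tracked positions into the voxel grid of the preoperative image. In the sketch below, the registration transform, voxel spacing and image origin are all assumed values chosen only for illustration:

```python
import numpy as np

# Assumed registration from tracker space to scan (image) space, e.g. as
# produced by a point-based fiducial registration (illustrative values).
R = np.eye(3)
t = np.array([12.0, -4.0, 30.0])             # mm

# Hypothetical scan geometry: voxel spacing and image origin in mm.
spacing = np.array([0.5, 0.5, 1.0])
origin = np.array([-120.0, -120.0, 0.0])

def tip_to_voxel(tip_tracker_mm):
    """Map a tracked instrument tip into voxel indices of the scan,
    i.e. where to draw the instrument overlay on the display."""
    p_image = R @ tip_tracker_mm + t          # tracker -> image space (mm)
    return np.round((p_image - origin) / spacing).astype(int)

voxel = tip_to_voxel(np.array([0.0, 0.0, 50.0]))
```

A real system would additionally handle out-of-volume positions and interpolate the rendered slice, but the coordinate chain (tracker space, image space in millimetres, voxel indices) is the essential part.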

In addition to that, said processing unit may further be configured for calculating the current position and angular orientation of the at least one surgical instrument relative to the locations of said tissue anomalies, lesions or anatomical structures of interest in the interior of the patient's body to be treated by cardiology, interventional radiology or surgery or relative to said fixed reference position within the preoperatively elaborated surgical plan and/or within the preoperatively acquired sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image of the patient.

Aside therefrom, the head-mounted camera and display unit may advantageously comprise an infrared or Bluetooth interface for wirelessly communicating with and being remotely controllable by an input device. Therefore, no cables are necessary in the operating field.

Furthermore, said camera and display unit may be equipped with at least one integrated rechargeable or non-rechargeable battery or battery set which serves as a power supply unit for operating the head-mounted camera and display unit. A second exemplary embodiment of the present invention refers to a surgical navigation system comprising an optical position measurement subsystem for supporting a surgeon in navigating and tracking navigation motions of at least one surgical instrument during a stereotactic surgical procedure by visualizing a preoperatively generated image (e.g. a CT image) or an intra-operatively acquired image (e.g. a fluoroscopy or X-ray image) and graphically displaying the exact position and/or angular orientation of the at least one surgical instrument relative to the locations of identified tissue anomalies, lesions or anatomical structures of interest within a preoperatively elaborated surgical plan and/or within a preoperatively acquired sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image of the patient. According to the present invention, said surgical navigation system thereby comprises a head-mounted wireless camera and display unit as described with reference to said first exemplary embodiment.

According to this embodiment, the proposed surgical navigation system may thereby comprise a docking station or cable for battery recharging. Aside therefrom, said docking station may also be configured for uploading preoperatively acquired sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image data and/or geometric data of the preoperatively elaborated surgical plan to an integrated data memory of the head-mounted camera and display unit, wherein said geometric data may include target markers, fiducials and planned trajectories for navigating the at least one surgical instrument to the locations of the tissue anomalies, lesions or anatomical structures of interest in the interior of said patient's body to be treated by cardiology, interventional radiology or surgery.

Instead of being integrated in the head-mounted camera and display unit, it may also be provided that said processing unit and/or the at least one rechargeable battery or battery set are integrated in a separate device worn somewhere else on the body of the surgeon and wirelessly or electrically connected to the head-mounted camera and display unit. Moreover, the head-mounted camera and display unit and/or said separate device may comprise a connector for external energy supply in case of an unexpected low battery.

Besides, it may also be foreseen that the head-mounted camera and display unit comprises means for wireless reception of intra-operatively acquired sonography, fluoroscopy, X-ray or other image data and non-image data from an imaging system, from a hospital information system or from an operating room information system.

If wireless data transmission should not be possible, the head-mounted camera and display unit as proposed in this embodiment may comprise a connector for a cable to a hospital information system, to an imaging system, to the docking station or to an operating room information system.

It may also be provided that the surgical navigation system comprises imaging means for providing intra-operative X-ray or ultrasound data used for localization of tissue anomalies, lesions or anatomical structures of interest.

A third exemplary embodiment of the present invention relates to a camera-assisted image guidance and navigation method for supporting a surgeon in navigating and tracking navigation motions of at least one surgical instrument during a stereotactic surgical procedure by visualizing a preoperatively generated image (e.g. a CT image) or an intra-operatively acquired image (e.g. a fluoroscopy or X-ray image) and graphically displaying the exact position and/or angular orientation of the at least one surgical instrument relative to the locations of identified tissue anomalies, lesions or anatomical structures of interest within a preoperatively elaborated surgical plan and/or within a preoperatively acquired sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image of the patient. The proposed method may thereby comprise the steps of registering the locations of anatomical landmarks or fiducial markers attached to the patient's body, the coordinates of said landmarks or fiducials being detected by a pointer instrument tracked by means of at least one camera of a head-mounted wireless camera and display unit worn by the surgeon during said operation, with the coordinates of the anatomical landmarks or fiducial image points in the preoperatively acquired sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image and/or preoperatively elaborated surgical plan and displaying a registered graphical representation of the at least one surgical instrument and/or a segmented anatomy object from the preoperatively or intra-operatively acquired sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image within said image on an integrated display of the head-mounted camera and display unit.
Finally, a fourth exemplary embodiment of the present invention refers to a computer software product configured for performing a method as described with reference to said third exemplary embodiment when running on a processing unit of a surgical navigation system such as disclosed with reference to said second exemplary embodiment.

The head-mounted camera and display unit as claimed and disclosed in this application overcomes all the problems mentioned above: the head-mounted wireless camera and display unit does not consume any space in the vicinity of the patient, it does not need any cables or wires in the operating room or in the surgical field, it provides ergonomic working conditions to the surgeon and, finally, the entire system can easily and rapidly be set up. Furthermore, a navigation system as proposed in this application can advantageously be applied to overcome those problems which are caused by an obstructed line of sight when using conventional ultrasonic position measurement systems.

BRIEF DESCRIPTION OF THE DRAWINGS

These and other advantageous aspects of the invention will be elucidated by way of example with respect to the embodiments described hereinafter and with respect to the accompanying drawings. Therein,

Fig. 1 shows a block diagram of a head-mounted navigation system as proposed by the present invention,

Fig. 2 shows a headset with displays and camera units as used in the scope of the present invention with a computer and a battery set (not shown) being attached to the rear side of the headset, and

Fig. 3 shows a flow chart for illustrating the proposed method according to the present application.

DETAILED DESCRIPTION OF THE PRESENT INVENTION

In the following, the proposed wireless navigation system according to the present invention will be explained in more detail with respect to special refinements and referring to the accompanying drawings.

As described above, the proposed navigation system comprises a head- mounted camera and display unit 100 (i.e., a headset, goggle or helmet), a docking station 106, set up anywhere in the operating room, and a wireless input device 107 (see Fig. 1). For enabling dynamic referencing, the above-mentioned system further comprises a dynamic reference tracker 108.

The head-mounted camera and display unit 100 of the system comprises a surgical navigation system consisting of one or two camera units 101a and 101b whose video outputs are connected to an integrated processing unit 103 (in Fig. 1 also referred to as "computer") of said headset, a small display 102 (or two small displays) for visualization, wherein said display(s) is/are mounted in the field of view of the surgeon 20 without obstructing his view onto the surgical field, and at least one battery 104 or a battery set as required for wireless operation (see Fig. 2). Said camera units 101a and 101b are positioned such that they look at the surgical field when the headset is worn by a surgeon. In order to look at the video signal, the surgeon just has to look at the display(s) without moving his/her head. The camera video output signals are sent to the processing unit 103, which is programmed to calculate the current position and orientation of at least one wirelessly tracked surgical instrument 105 relative to a surgical plan 111b or relative to the locations of tissue anomalies, lesions or anatomical structures of interest (such as e.g. an organ) in the interior of a patient's body to be treated by cardiology, interventional radiology or surgery. If necessary for the tracking procedure, the claimed headset 100 also contains lighting means to illuminate the surgical field for the purpose of position measurement (not shown). As processing unit 103 may be programmed as described above, it acts in the same way as a computer system of an ordinary surgical navigation system. Images which are recorded by said camera units 101a and 101b are used for calculating the position of the at least one surgical instrument 105. Since display 102, camera units 101a+b and processing unit 103 are preferably assembled within a single device, no obstructing cables are needed during an operation.
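The application does not state how processing unit 103 derives a 3-D position from the two camera images 109a and 109b. One standard technique, assuming two calibrated camera units with known 3x4 projection matrices, is linear (DLT) triangulation of each tracked marker. The following is a hedged sketch under that assumption; all names are of our own choosing, not taken from the disclosure.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Illustrative linear (DLT) triangulation of one marker from two views.

    P1, P2 : 3x4 projection matrices of the two head-mounted camera units.
    x1, x2 : (u, v) pixel coordinates of the same marker in each image.
    Returns the marker's 3-D position in the common headset coordinate frame.
    """
    # Each view contributes two linear constraints on the homogeneous point.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Least-squares solution: right singular vector for the smallest
    # singular value of A.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

Triangulating three or more such markers on instrument 105 then yields its full position and angular orientation.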

According to a special aspect of the present invention, the at least one wirelessly tracked instrument 105 can contain markers of a retro-reflective material which reflects light back into its direction of incidence. A light source (not shown) positioned in the vicinity of one of said camera units 101a and 101b can thus create a high-intensity reflection in the displayed camera image, which allows for easy image processing. Optical position measurement subsystems for wireless instruments based on this principle are state of the art and can be used in the scope of the present invention.
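The "easy image processing" made possible by such high-intensity reflections can be illustrated by a plain intensity threshold followed by a weighted centroid, a common first stage of optical marker localization. This is our own minimal sketch, not an algorithm disclosed in the application.

```python
import numpy as np

def marker_centroid(image, threshold=200):
    """Locate the bright reflection of one retro-reflective marker in a
    grayscale camera image by intensity thresholding.

    Returns the intensity-weighted (row, col) centroid of all pixels above
    `threshold`, or None if no pixel exceeds it (marker not visible).
    """
    ys, xs = np.nonzero(image > threshold)
    if ys.size == 0:
        return None
    # Weighting by intensity gives sub-pixel accuracy.
    w = image[ys, xs].astype(float)
    return float((ys * w).sum() / w.sum()), float((xs * w).sum() / w.sum())
```

A complete tracker would additionally separate multiple markers (e.g. by connected-component labelling) and reject ambient highlights, for instance by pulsing the light source.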

Docking station 106 allows the at least one battery 104 in the headset to be recharged between two operations. Furthermore, said docking station may be connected to a hospital information system. Prior to an operation, image data and the surgical plan can be transferred via the docking station to the headset 100, e.g. from a personal computer or workstation in the surgeon's office on which the preoperative surgical plan was created.

During an operation it might become necessary that the surgeon interacts with the integrated processing unit 103 of the headset 100. For this purpose, a sterilizable input device 107 with one or more buttons, a trackball or a mouse wheel can be used. The input device has the capability to wirelessly communicate with the headset (e.g. via an infrared or Bluetooth link, etc.). Alternatively, the surgeon can point to a tracked user interaction plate as known from some navigation systems on the market.

According to a variation of the present invention, the processing unit 103 of the navigation system and the at least one battery 104 or battery set can be either integrated into the headset 100 as described above or can be worn somewhere else on the body of the surgeon and connected to the headset. It may also be provided that the proposed navigation system contains a connector for external energy supply in case of an unexpected low battery.

Furthermore, said navigation system can also contain means for wireless transmission of image data (e.g. intra-operative X-ray or ultrasound data used for localization of tissue anomalies, lesions or anatomical structures of interest) and non-image data. If wireless data transmission is not possible, said system can contain a connector for a cable to the hospital information system, the docking station 106, or to an operating room information system.

A flow chart for illustrating the proposed method according to the present application is shown in Fig. 3. The method begins with the step of visualizing (S1) a preoperatively generated sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image or an intra-operatively acquired fluoroscopy, X-ray or other type of image of an anatomy region of interest of a patient's body to be treated by cardiology, interventional radiology or surgery on at least one integrated display 102 of a head-mounted camera and display unit 100 serving as a navigation system for supporting a surgeon in navigating and tracking navigation motions of at least one surgical instrument 105 during a stereotactic surgical procedure. The at least one display may also be used for graphically displaying (S2) the exact position and/or angular orientation of the at least one surgical instrument relative to the locations of identified tissue anomalies, lesions or anatomical structures of interest in the interior of the patient's body to be treated by cardiology, interventional radiology or surgery or relative to a fixed reference position within a preoperatively elaborated surgical plan and/or within the preoperatively generated sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image of the patient.
After that, locations of anatomical landmarks or fiducial markers attached to the patient's body, the coordinates of said landmarks or fiducials being detected by a pointer instrument tracked by means of at least one camera 101a+b of the head-mounted camera and display unit 100 worn by the surgeon during said operation, may be registered (S3) with the coordinates of the anatomical landmarks or fiducial image points in the preoperatively acquired sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image and/or preoperatively elaborated surgical plan. Finally, a registered graphical representation of the at least one surgical instrument and/or a segmented anatomy object from the preoperatively or intra-operatively acquired sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image may be displayed (S4a) on the at least one integrated display 102 of the head-mounted wireless camera and display unit 100. As an alternative thereto, a segmented representation of the at least one surgical instrument 105 may be faded (S4b) into the preoperatively acquired image on the at least one display of the head-mounted camera and display unit.

APPLICATIONS OF THE PRESENT INVENTION

The proposed navigation system can be used in many surgical applications, such as e.g. in the fields of neurosurgery of the central nervous system, spine surgery and orthopaedic surgery. Furthermore, it can be used in interventional radiology, such as e.g. for assisting needle-based procedures. Aside therefrom, said system can also be used by non-medical professions, e.g. for performing complex tasks in repairing technical devices etc.

While the present invention has been illustrated and described in detail in the drawings and in the foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive, which means that the invention is not limited to the disclosed embodiments. Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/distributed on a suitable medium, such as e.g. an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as e.g. via the Internet or other wired or wireless telecommunication systems. Furthermore, any reference signs in the claims should not be construed as limiting the scope of the present invention.

Table of used reference numbers and their meanings

20 Surgeon

100 Headset

101a First camera unit

101b Second camera unit

102 Display

103 Computer

104 Battery

105 Tracked instrument

106 Docking station

107 Input device

108 Dynamic reference tracker

109a Image data, acquired by first camera unit 101a

109b Image data, acquired by second camera unit 101b

110 Output data, provided by computer 103

111a Image data, supplied via docking station 106

111b Surgical plan, supplied via docking station 106

112 Input data, entered on input device 107

Claims

CLAIMS:
1. A head-mounted wireless camera and display unit (100) serving as a navigation system for supporting a surgeon in navigating and tracking navigation motions of at least one surgical instrument (105) during a stereotactic surgical procedure by visualizing a preoperatively generated image or an intra-operatively acquired image of an anatomy region of interest of a patient's body to be treated by cardiology, interventional radiology or surgery and graphically displaying the exact position and/or angular orientation of the at least one surgical instrument relative to the locations of identified tissue anomalies, lesions or anatomical structures of interest within a preoperatively elaborated surgical plan and/or within a preoperatively acquired sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image of the patient.
2. The head-mounted wireless camera and display unit (100) according to claim 1, realized in the form of a headset, a goggle or a helmet to be worn by the surgeon during the surgical stereotactic operation.
3. The head-mounted wireless camera and display unit (100) according to claim 1, adapted to simultaneously track cardiac, respiratory and body motions of the patient and compensate for motions of the camera and display unit (100) relative to the patient or relative to the locations of identified tissue anomalies, lesions or anatomical structures of interest aside from tracking navigation motions of the at least one surgical instrument (105).
4. The head-mounted wireless camera and display unit (100) according to claim 3, comprising an integrated processing unit (103), supplied with the video output signals of one or more camera units (101a+b) or optical position sensors integrated in the head-mounted camera and display unit (100), with said processing unit (103) being configured for registering the locations of anatomical landmarks or fiducial markers attached to the patient's body, the coordinates of said landmarks or fiducials being detected by a pointer instrument tracked by means of at least one camera (101a, 101b) of the head-mounted camera and display unit (100), with the coordinates of said landmarks or fiducial image points in the preoperatively acquired sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image and/or in a preoperatively elaborated surgical plan and displaying a registered graphical representation of the at least one surgical instrument (105) and/or a segmented anatomy object from the preoperatively or intra-operatively acquired sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image within said image on an integrated display (102) of the head-mounted camera and display unit (100).
5. The head-mounted wireless camera and display unit (100) according to claim 4, wherein said processing unit (103) is further configured for calculating the current position and angular orientation of the at least one surgical instrument (105) relative to the locations of said tissue anomalies, lesions or anatomical structures of interest in the interior of the patient's body to be treated by cardiology, interventional radiology or surgery or relative to said fixed reference position within the preoperatively elaborated surgical plan and/or within the preoperatively acquired sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image of the patient.
6. The head-mounted wireless camera and display unit (100) according to claim 5, comprising an infrared or Bluetooth interface for wirelessly communicating with and being remotely controllable by an input device (107).
7. The head-mounted wireless camera and display unit (100) according to claim 5, equipped with at least one integrated rechargeable or non-rechargeable battery (104) or battery set which serves as a power supply unit for operating the head-mounted camera and display unit (100).
8. A surgical navigation system comprising an optical position measurement subsystem for supporting a surgeon in navigating and tracking navigation motions of at least one surgical instrument (105) during a stereotactic surgical procedure by visualizing a preoperatively generated image or an intra-operatively acquired image of an anatomy region of interest of a patient's body to be treated by cardiology, interventional radiology or surgery and graphically displaying the exact position and/or angular orientation of the at least one surgical instrument (105) relative to the locations of identified tissue anomalies, lesions or anatomical structures of interest within a preoperatively elaborated surgical plan and/or within a preoperatively acquired sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image of the patient, wherein said surgical navigation system comprises a head-mounted wireless camera and display unit (100) according to any one of claims 1 to 7.
9. The surgical navigation system according to claim 8, comprising a docking station (106) or cable for battery recharging.
10. The surgical navigation system according to claim 9, wherein said docking station (106) is configured for uploading preoperatively acquired sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image data and/or geometric data of the preoperatively elaborated surgical plan to an integrated data memory of the head-mounted camera and display unit (100), said geometric data including target markers, fiducials and planned trajectories for navigating the at least one surgical instrument (105) to the locations of the tissue anomalies, lesions or anatomical structures of interest in the interior of said patient's body to be treated by cardiology, interventional radiology or surgery.
11. The surgical navigation system according to claim 10, wherein said processing unit (103) and/or the at least one rechargeable battery (104) or battery set, instead of being integrated in the head-mounted wireless camera and display unit (100), are integrated in a separate device worn somewhere else on the body of the surgeon and wirelessly or electrically connected to the head-mounted camera and display unit (100).
12. The surgical navigation system according to claim 11, wherein the head-mounted camera and display unit (100) and/or said separate device comprises a connector for external energy supply in case of an unexpected low battery.
13. The surgical navigation system according to claim 12, wherein the head-mounted camera and display unit (100) comprises means for wireless reception of intra-operatively acquired sonography, fluoroscopy, X-ray or other image data and non-image data from an imaging system, from a hospital information system or from an operating room information system.
14. The surgical navigation system according to claim 12, wherein the head-mounted camera and display unit (100) comprises, if wireless data transmission is not possible, a connector for a cable to a hospital information system, to an imaging system, to the docking station (106) or to an operating room information system.
15. The surgical navigation system according to claim 14, comprising imaging means for providing intra-operative X-ray or ultrasound data used for localization of tissue anomalies, lesions or anatomical structures of interest.
16. A method for supporting a surgeon in navigating and tracking navigation motions of at least one surgical instrument (105) during a stereotactic surgical procedure by visualizing (S1) a preoperatively generated image or an intra-operatively acquired image of an anatomy region of interest of a patient's body to be treated by cardiology, interventional radiology or surgery and graphically displaying (S2) the exact position and/or angular orientation of the at least one surgical instrument relative to the locations of identified tissue anomalies, lesions or anatomical structures of interest in the interior of a patient's body to be treated by cardiology, interventional radiology or surgery or relative to a fixed reference position within a preoperatively elaborated surgical plan and/or within a preoperatively acquired sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image of the patient, said method comprising the steps of registering (S3) the locations of anatomical landmarks or fiducial markers attached to the patient's body, the coordinates of said landmarks or fiducials being detected by a pointer instrument tracked by means of at least one camera (101a, 101b) of a head-mounted camera and display unit (100) worn by the surgeon during said operation, with the coordinates of the anatomical landmarks or fiducial image points in the preoperatively acquired sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image and/or preoperatively elaborated surgical plan and displaying (S4a) a registered graphical representation of the at least one surgical instrument (105) and/or a segmented anatomy object from the preoperatively or intra-operatively acquired sonography, fluoroscopy, X-ray, CT, 3DRA, MR, PET or SPECT image within said image on an integrated display (102) of the head-mounted wireless camera and display unit (100).
17. A computer software product configured for performing a method according to claim 16 when running on a processing unit (103) of a surgical navigation system according to any one of claims 8 to 15.
PCT/IB2009/055462 2008-12-09 2009-12-02 Head-mounted wireless camera and display unit WO2010067267A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP08171024 2008-12-09
EP08171024.6 2008-12-09

Publications (1)

Publication Number Publication Date
WO2010067267A1 true true WO2010067267A1 (en) 2010-06-17

Family

ID=41510538

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2009/055462 WO2010067267A1 (en) 2008-12-09 2009-12-02 Head-mounted wireless camera and display unit

Country Status (1)

Country Link
WO (1) WO2010067267A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014099494A1 (en) * 2012-12-17 2014-06-26 Alcon Research, Ltd. Wearable user interface for use with ocular surgical console
WO2014120909A1 (en) * 2013-02-01 2014-08-07 Sarment David Apparatus, system and method for surgical navigation
WO2015110859A1 (en) * 2014-01-21 2015-07-30 Trophy Method for implant surgery using augmented visualization
US9498231B2 (en) 2011-06-27 2016-11-22 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
WO2017005897A1 (en) 2015-07-08 2017-01-12 Sirona Dental Systems Gmbh System and method for scanning anatomical structures and for displaying a scanning result
CN106456145A (en) * 2014-05-05 2017-02-22 维卡瑞斯外科手术股份有限公司 Virtual reality surgical device
EP3161828A4 (en) * 2014-05-27 2017-08-09 Chase, Stephen Video headphones, systems, helmets, methods and video content files
US9928629B2 (en) 2015-03-24 2018-03-27 Augmedics Ltd. Combining video-based and optic-based augmented reality in a near eye display
US10105149B2 (en) 2013-03-15 2018-10-23 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5493595A (en) * 1982-02-24 1996-02-20 Schoolman Scientific Corp. Stereoscopically displayed three dimensional medical imaging
WO1998038908A1 (en) * 1997-03-03 1998-09-11 Schneider Medical Technologies, Inc. Imaging device and method
US20020082498A1 (en) * 2000-10-05 2002-06-27 Siemens Corporate Research, Inc. Intra-operative image-guided neurosurgery with augmented reality visualization
US20050203380A1 (en) * 2004-02-17 2005-09-15 Frank Sauer System and method for augmented reality navigation in a medical intervention procedure
US20070236514A1 (en) * 2006-03-29 2007-10-11 Bracco Imaging Spa Methods and Apparatuses for Stereoscopic Image Guided Surgical Navigation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Textbook of Stereotactic and Functional Neurosurgery", June 1997, MCGRAW-HILL

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10080617B2 (en) 2011-06-27 2018-09-25 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
US9498231B2 (en) 2011-06-27 2016-11-22 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
WO2014099494A1 (en) * 2012-12-17 2014-06-26 Alcon Research, Ltd. Wearable user interface for use with ocular surgical console
CN104869922A (en) * 2012-12-17 2015-08-26 爱尔康研究有限公司 Wearable user interface for use with ocular surgical console
CN104869922B (en) * 2012-12-17 2018-05-15 爱尔康研究有限公司 The wearable user interface for use with an ophthalmic surgical console
US9681982B2 (en) 2012-12-17 2017-06-20 Alcon Research, Ltd. Wearable user interface for use with ocular surgical console
WO2014120909A1 (en) * 2013-02-01 2014-08-07 Sarment David Apparatus, system and method for surgical navigation
US10105149B2 (en) 2013-03-15 2018-10-23 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
WO2015110859A1 (en) * 2014-01-21 2015-07-30 Trophy Method for implant surgery using augmented visualization
CN106456145A (en) * 2014-05-05 2017-02-22 维卡瑞斯外科手术股份有限公司 Virtual reality surgical device
EP3139843A4 (en) * 2014-05-05 2018-05-30 Vicarious Surgical Inc. Virtual reality surgical device
EP3161828A4 (en) * 2014-05-27 2017-08-09 Chase, Stephen Video headphones, systems, helmets, methods and video content files
US9928629B2 (en) 2015-03-24 2018-03-27 Augmedics Ltd. Combining video-based and optic-based augmented reality in a near eye display
WO2017005897A1 (en) 2015-07-08 2017-01-12 Sirona Dental Systems Gmbh System and method for scanning anatomical structures and for displaying a scanning result
DE102015212806A1 (en) * 2015-07-08 2017-01-12 Sirona Dental Systems Gmbh System and method for scanning of anatomical structures and showing a scanning result

Similar Documents

Publication Publication Date Title
Leven et al. DaVinci canvas: a telerobotic surgical system with integrated, robot-assisted, laparoscopic ultrasound capability
US7835778B2 (en) Method and apparatus for surgical navigation of a multiple piece construct for implantation
US5517990A (en) Stereotaxy wand and tool guide
US6019724A (en) Method for ultrasound guidance during clinical procedures
Bucholz et al. Intraoperative localization using a three-dimensional optical digitizer
US7313430B2 (en) Method and apparatus for performing stereotactic surgery
US20080200926A1 (en) Automatic identification of instruments used with a surgical navigation system
US20100030063A1 (en) System and method for tracking an instrument
US20080089566A1 (en) Systems and methods for implant virtual review
US20110046483A1 (en) Methods, systems, and computer readable media for image guided ablation
Baumhauer et al. Navigation in endoscopic soft tissue surgery: perspectives and limitations
US6546279B1 (en) Computer controlled guidance of a biopsy needle
US6314310B1 (en) X-ray guided surgical location system with extended mapping volume
US20100290690A1 (en) System And Method For Automatic Registration Between An Image And A Subject
US5732703A (en) Stereotaxy wand and tool guide
US6402762B2 (en) System for translation of electromagnetic and optical localization systems
US20080161682A1 (en) System and method for tracking positions of uniform marker geometries
US7302288B1 (en) Tool position indicator
US20070078334A1 (en) DC magnetic-based position and orientation monitoring system for tracking medical instruments
US20070276234A1 (en) Systems and Methods for Intraoperative Targeting
US6517478B2 (en) Apparatus and method for calibrating an endoscope
US6016439A (en) Method and apparatus for synthetic viewpoint imaging
Navab et al. Camera augmented mobile C-arm (CAMC): calibration, accuracy study, and clinical applications
US20050054895A1 (en) Method for using variable direction of view endoscopy in conjunction with image guided surgical systems
US20020010384A1 (en) Apparatus and method for calibrating an endoscope

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09774747

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09774747

Country of ref document: EP

Kind code of ref document: A1