WO2006095027A1 - Methods and apparatus for surgical navigation and visualization with a microscope - Google Patents

Methods and apparatus for surgical navigation and visualization with a microscope

Info

Publication number
WO2006095027A1
WO2006095027A1 (PCT/EP2006/060654)
Authority
WO
WIPO (PCT)
Prior art keywords
microscope
probe
image
camera
virtual
Prior art date
Application number
PCT/EP2006/060654
Other languages
English (en)
Inventor
Chuanggui Zhu
Kusuma Agusanto
Original Assignee
Bracco Imaging S.P.A.
Priority date
Filing date
Publication date
Priority claimed from PCT/SG2005/000244 (WO2007011306A2)
Application filed by Bracco Imaging S.P.A. filed Critical Bracco Imaging S.P.A.
Priority to JP2008500215A (JP2008532602A)
Priority to EP06708740A (EP1861035A1)
Priority to CA002600731A (CA2600731A1)
Publication of WO2006095027A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/101: Computer-aided simulation of surgical operations
    • A61B 2034/105: Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/107: Visualisation of planned trajectories or target regions
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2055: Optical tracking systems
    • A61B 2034/2068: Surgical navigation systems using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • A61B 34/25: User interfaces for surgical systems
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/20: Surgical microscopes characterised by non-optical aspects
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/361: Image-producing devices, e.g. surgical cameras
    • A61B 90/37: Surgical systems with images on a monitor during operation
    • A61B 2090/364: Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/365: Correlation of different images: augmented reality, i.e. correlating a live optical image with another image
    • A61B 2090/371: Surgical systems with images on a monitor during operation, with simultaneous use of two cameras

Definitions

  • the present invention relates to image-based surgical guidance and visualization systems.
  • Neurosurgery is routinely conducted in two operational modes: a macroscopic mode and a microscopic mode.
  • In the former, the surgical field is generally viewed with the naked eye, and in the latter the surgical field is viewed through a microscope.
  • image based navigation and visualization systems have been used with success in aiding physicians to perform a wide variety of delicate surgical procedures.
  • images depicting the internal anatomy of a patient are generated, usually from magnetic resonance imaging (MRI), computed tomography (CT), and a variety of other technologies, prior to or during a surgery.
  • a three-dimensional (3D) representation of the patient is generated from the images.
  • the representation can take various forms, from volume images and 3D models of various anatomical structures of the patient reconstructed from the images, to drawings, annotations and measurements added to illustrate a surgical plan, or a combination of these.
  • the 3D representation is aligned with the patient by registration.
  • U.S. Patent Application Publication No. 2005/0015005 describes an improved navigation system in which the probe includes a micro camera. This enables augmented-reality-enhanced navigation within a given operative field by viewing real-time images acquired by the micro camera overlaid on the 3D representation of the patient.
  • an operation microscope is often used to provide a magnification of the surgical field within which a surgeon is working.
  • the microscope can be tracked for navigation purposes and its focal point can usually be shown in the 3D representation in place of the probe tip.
  • magnification of a microscope is usually set at a high level during the operation.
  • an integrated system can include a computer which has stored three dimensional representations of a patient's internal anatomy, a display, a probe and an operation microscope.
  • reference markers can be attached to the probe and the microscope, and the system can also include a tracking system which can track the 3D position and orientation of each of the probe and microscope.
  • a system can include means for detecting changes in the imaging parameters of the microscope, such as, for example, magnification and focus, which occur as a result of user adjustment and operation of the microscope.
  • the microscope can have, for example, a focal point position relative to the markers attached to the microscope and can, for example, be calibrated in the full range of microscope focus.
  • the position of the microscope can be obtained from the tracking data regarding the microscope and the focus can be obtained from, for example, a sensor integrated with the microscope.
  • a tip position of the probe can also be obtained from the tracking data of the reference markers on the probe, and means can be provided for registration of virtual representations of patient anatomical data with real images from one or more cameras on each of the probe and the microscope.
  • visualization and navigation images can be provided by each of the microscope and the probe, and when both are active the system can intelligently display either a microscopic or a macroscopic (probe based) real, virtual or augmented image according to defined rules.
  • FIGs. 1A-1C illustrate digital zooming of an augmented reality image according to an exemplary embodiment of the present invention
  • FIG. 1D depicts an exemplary navigation system according to an exemplary embodiment of the present invention
  • FIG. 2 shows a schematic depiction of a real image of an exemplary patient head according to an exemplary embodiment of the present invention
  • FIG. 3 shows a schematic depiction of a virtual image of a tumor and blood vessel according to an exemplary embodiment of the present invention
  • FIG. 4 shows a schematic depiction of a combined (augmented reality) image according to an exemplary embodiment of the present invention
  • FIG. 5 shows a schematic depiction of a magnified augmented reality view according to an exemplary embodiment of the present invention
  • FIG. 6 shows a schematic depiction of a magnified microscopic view according to an exemplary embodiment of the present invention
  • Fig. 7 shows a schematic depiction of a digitally zoomed-out (magnified) microscopic view according to an exemplary embodiment of the present invention
  • FIG. 8 shows a schematic depiction of an exemplary navigational view from a probe according to an exemplary embodiment of the present invention
  • FIG. 9 shows an exemplary navigational view from a surgical microscope according to an exemplary embodiment of the present invention.
  • Fig. 10 shows the exemplary view of Fig. 9 after digitally zooming-in according to an exemplary embodiment of the present invention.
  • FIG. 11 shows an exemplary augmented reality navigational view from an exemplary probe according to an exemplary embodiment of the present invention.
  • an augmented-reality-enhanced navigation system can be provided which can, for example, provide a surgeon with both microscopic and macroscopic navigational information on three-dimensional (3D) anatomic structures of the patient, without the need to move the microscope off of, or away from, as the case may be, the surgical field.
  • a video camera can, for example, be rigidly attached to a microscope.
  • a computer can, for example, store a virtual microscope camera model having the same imaging properties and pose (position and orientation) as the corresponding actual video camera, said imaging properties including focal length, field of view and distortion parameters, zoom and focus.
  • means can be provided to generate an augmented view for microscopic navigation by overlaying the video images from the camera, or cameras, as the case may be, on the microscope with virtual rendered images of the patient's 3D anatomical structures generated by the computer according to the corresponding virtual microscope camera model, in response to the position and orientation data of the microscope from the tracking device as well as magnification and focus data obtained from the microscope itself by, for example, an integrated sensor.
  • a video camera can be integrated with a probe, such as, for example, the probe described in the Camera-probe Application.
  • a virtual model of the video camera having the same imaging properties and pose (position and orientation) as the actual video camera, said imaging properties including focal length, field of view and distortion parameters, can be provided.
  • means can be provided to generate an augmented view for macroscopic navigation by overlaying video images from the camera in the probe with rendered images of the patient's 3D anatomical structures generated by the computer according to the virtual camera model in response to the position and orientation data of the probe from the tracking device.
  • an augmented microscopic view can be digitally zoomed so that a magnified view of microscopic navigation can be obtained without requiring a change of the position and settings (magnification and focus) of the microscope.
  • An anatomic structure outside of the optical field of the microscope at its current settings can thus be displayed in such a zoomed-out display, overlaid only partly by the real-time video image coming from the microscope's camera in the center of the display.
  • a user need not change the settings of the microscope, or move it away, to obtain a macroscopic navigation view.
  • a user need only move the probe, which can image the surgical field from any arbitrary viewpoint.
  • the microscopic image can be digitally zoomed.
  • a change of magnification or zoom in an AR image works by changing the field of view of the virtual camera (i.e., its frustum shape) together with the real image, by ensuring that the video image plane remains aligned with the frustum of the virtual camera.
  • This concept is illustrated with reference to Figs. 1A-1C. It is noted that the original figures were in color, and the following description makes reference to those colors. However, the referents are easily discernable even in grayscale images.
  • Fig. 1A depicts a virtual camera (red axes at left of left image), and its frustum, represented by a near plane (dark blue; left side of left image) connected to a far plane (dark grey; right edge of left image), together with a virtual object.
  • the video image (pink rectangle) has its image center aligned to the center of the frustum.
  • the video image size is set to be the same as the near plane.
  • the full video image covers the screen-view (or viewport), and there is no zooming effect.
  • in Fig. 1B the frustum has been changed such that the virtual object is projected with a magnification or zooming-in effect.
  • such a change in frustum changes what is visible in the screen space for the video image: because only part of the video image now lies inside the projection plane (near plane) covering the screen view, there is a zooming-in effect in the video image as well.
  • in Fig. 1C the frustum is changed such that the virtual object is projected with a zooming-out effect (appearing smaller).
  • this change in frustum causes the whole video image inside the projection plane (near plane) to cover only a part of the screen view, so the video image appears smaller in the screen view.
  • the change of frustum can be achieved by changing the parameters of the perspective matrix of the virtual camera that produce the perspective projection.
  • a 4x4 perspective projection matrix defined in an OpenGL context can, for example, be defined with the following parameters, where the elements are read left to right, top to bottom, and elements 1, 3, 4, 7, 8, 9, 12 and 13 have the value 0:
  • ProjMat[0] = 2*Near / (Right - Left)
  • ProjMat[2] = (Right + Left) / (Right - Left)
  • ProjMat[5] = 2*Near / (Top - Bottom)
  • ProjMat[6] = (Top + Bottom) / (Top - Bottom)
  • ProjMat[10] = -(Far + Near) / (Far - Near)
  • ProjMat[11] = -2 * Far * Near / (Far - Near)
  • ProjMat[14] = -1
  • ProjMat[15] = 0
  • the parameters Left, Right, Top and Bottom are functions of the microscope model, based on the intrinsic camera calibration parameters together with the focus and zoom settings of the microscope.
  • the parameters Near and Far can, for example, be set to constant values.
  • the parameter zoomFactor is the factor that determines the zooming-in or zooming-out effect. When its value is below 1 the effect is zooming-out, and when greater than 1 the effect is zooming-in. No zoom effect is applied when the value is 1.
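  • As a minimal illustrative sketch (not the patent's own code), the matrix above can be assembled in C after scaling the calibrated frustum extents by the zoomFactor; the function name, the float interface, and the exact way zoomFactor scales Left/Right/Top/Bottom are assumptions for illustration:

        /* Sketch: fill the row-major 4x4 perspective matrix listed above,
           with the frustum window scaled by 1/zoomFactor (zoomFactor > 1
           zooms in, < 1 zooms out, 1 leaves the calibrated view unchanged).
           The row-major layout matches the left-to-right, top-to-bottom
           reading above; transpose before handing it to glLoadMatrixf. */
        void buildZoomedProjection(float m[16],
                                   float Left, float Right,
                                   float Bottom, float Top,
                                   float Near, float Far,
                                   float zoomFactor)
        {
            float l = Left / zoomFactor, r = Right / zoomFactor;
            float b = Bottom / zoomFactor, t = Top / zoomFactor;
            for (int i = 0; i < 16; i++)
                m[i] = 0.0f;            /* elements 1,3,4,7,8,9,12,13 and 15 are 0 */
            m[0]  = 2.0f * Near / (r - l);
            m[2]  = (r + l) / (r - l);
            m[5]  = 2.0f * Near / (t - b);
            m[6]  = (t + b) / (t - b);
            m[10] = -(Far + Near) / (Far - Near);
            m[11] = -2.0f * Far * Near / (Far - Near);
            m[14] = -1.0f;
        }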
  • the video image can be displayed as a texture map with an orthographic projection.
  • an OpenGL viewport can be adjusted with the following parameters:
  • GLfloat cx = fabs(Left) / (Right - Left);
  • GLfloat cy = fabs(Bottom) / (Top - Bottom);
  • glViewport((1 - zoomFactor) * screenWidth * cx + originX, (1 - zoomFactor) * screenHeight * cy + originY, screenWidth * zoomFactor, screenHeight * zoomFactor); this basically scales the size of the screen view by the zoomFactor and shifts the origin of the viewport according to the zoomFactor, the video-image center (cx and cy), and the origin of the OpenGL window, such that the visible video image is overlaid correctly with the virtual image.
  • a probe can be used during microscopic surgery to obtain navigational views from varying orientations and locations.
  • the anatomic structures around the surgical field, together with the focal points and optical axis of the microscope can be displayed from the point of view of the probe camera.
  • the anatomic structures around the surgical area from various view points can, for example, thus be presented to the surgeon without the need of changing the microscope.
  • Operation microscope 115 has a camera 105, which can, for example, be a color camera, installed on its imaging port and reference markers 110 can be mounted to it.
  • the microscope 115 can, for example, have a built-in sensor to detect changes in imaging parameters of the microscope occurring as a result of adjustment of the microscope wherein said imaging parameters include parameters comprising microscope magnification and focus.
  • a sensor can be, for example, an encoder.
  • the adjustment of focus and zoom involves mechanical movement of the lenses and such an encoder can, for example, measure such movement.
  • the parameters can be available from a serial port of the microscope.
  • the data format can be, for example, of the form Zoom: +120; Focus: 362.
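  • A minimal sketch of reading such a line, assuming exactly the "Zoom: +120; Focus: 362" format quoted above (the function name and integer units are illustrative assumptions, not the patent's code):

        #include <stdio.h>

        /* Parse one microscope status line of the form "Zoom: +120; Focus: 362"
           as read from the serial port; returns 1 on success, 0 on a malformed
           line. sscanf's %d accepts the leading '+' sign. */
        int parseMicroscopeStatus(const char *line, int *zoom, int *focus)
        {
            return sscanf(line, "Zoom: %d; Focus: %d", zoom, focus) == 2;
        }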
  • the microscope can also have an optical axis 111 and a focal point 112 which is defined as the intersection point of the optical axis and the focus plane of the microscope.
  • the focus plane is perpendicular to the optical axis; the clearest image is obtained on the focus plane.
  • the focus plane changes with focus adjustment.
  • the focal point's position relative to reference markers 110 can be calibrated over the full range of microscope focus and can therefore be derived from the tracking data.
  • in Fig. 1D the microscope is being used by a surgeon, and in the microscope's light path there is a patient's head 152.
  • the exemplary patient has a tumor 155 (which is the target object of the operation) and a blood vessel structure 150 (which should be avoided during the operation) close to tumor 155.
  • a position tracking system 100 (such as, for example, NDI Polaris) can receive commands and can send tracking data to a computer 120, either, for example, wirelessly or through a cable linked with the computer.
  • Computer 120 can have 3D models 125 of the tumor 155 and blood vessel structure 150 stored in its memory prior to the navigation, such as for example, after pre-operative scanning and processing of such scan data into a volumetric data set containing various segmentations and planning data.
  • a probe 140 can, for example, contain a video camera 135, and a pointer with a tip 136 can be attached to its front end.
  • the probe 140 can be placed within easy reach of a surgeon to facilitate its use during the surgery.
  • the probe can, for example, be of the type as disclosed in the Camera-probe Application.
  • the position tracking system 100 can provide continual real time tracking data of the microscope 115 to the computer.
  • the position tracking system 100 can also provide continual real time tracking data of the probe 140 to the computer.
  • the computer can be connected to (i) a display 130, (ii) a camera and sensor of microscope 115, and (iii) a mini camera of the probe.
  • the system can further include software to detect position and orientation data of the microscope and probe from the tracking data, and from such position data to automatically select one (probe or microscope) to be used as a basis of images for navigation and/or visualization. Such automatic selection can be according to defined priority rules or various algorithms as may be appropriate to a given application and a given user's preferences.
  • a given user may prefer to get his general bearings via a macroscopic view, and then when he gets close to delicate structures, use a microscopic view. If an operation has multiple stages, it can easily be seen that such a surgeon would cycle through using the probe, then the microscope, then again the probe and then again the microscope.
  • the system could realize that for an initial period the main implement is a probe, and then once a microscope has been engaged it is the main implement until a new microscope position has been chosen, when the probe is once again used at the beginning of another stage.
  • the system could, as a result, generate a combined image on the display corresponding to a view from whichever implement was then prioritized.
  • Many alternative rules could be implemented, and a surgeon could always override such priority settings by actuating a switch or voice controlled interface.
  • the computer 120 can receive a real-time video image of a surgical scene acquired by microscope camera 105.
  • Microscope camera 105 can, for example, have a microscope virtual camera model which can be provided and stored in computer 120.
  • a microscope virtual camera model can have a set of intrinsic parameters and extrinsic parameters, wherein said intrinsic parameters can include, for example, focal length, image center and distortion, and said extrinsic parameters can include, for example, the position and orientation of the virtual microscope camera model relative to a reference coordinate system.
  • a reference coordinate system can be, for example, the coordinate system of markers 110 which are rigidly linked to microscope 115.
  • the intrinsic and extrinsic parameters of the microscope camera model can change according to changes of the microscope's magnification and focus.
  • the intrinsic and extrinsic parameters of the microscope camera model can be described as bivariate polynomial functions of the microscope magnification and focus.
  • a parameter p represents one of the intrinsic and extrinsic parameters
  • a parameter p can be modeled as a q-th order bivariate polynomial function of the values of focus (f) and zoom (z) of the microscope, of the form p(f, z) = sum over m and n (with m + n <= q) of a_{m,n} * f^m * z^n.
  • the microscope can be calibrated as a number of fixed cameras (with fixed focal lengths) across the full range of the microscope's focus and zoom. After a sufficient number of fixed-camera calibrations under different zoom and focus settings, a group of calibration data can be obtained.
  • the coefficients a_{m,n} of the polynomial functions can then be solved for, for example, by bivariate polynomial fitting.
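  • For illustration, such a least-squares fit can be sketched in C as follows, assuming the six-term basis {1, f, f^2, z, f*z, f^2*z} that matches the exemplary model below; the function names and the normal-equation solver are assumptions, not the patent's implementation:

        #include <math.h>

        #define NTERMS 6   /* basis {1, f, f^2, z, f*z, f^2*z}, as in the exemplary model below */

        /* Evaluate the six basis monomials at (focus f, zoom z). */
        static void basis(double f, double z, double phi[NTERMS])
        {
            phi[0] = 1.0; phi[1] = f;     phi[2] = f * f;
            phi[3] = z;   phi[4] = f * z; phi[5] = f * f * z;
        }

        /* Least-squares fit of the coefficients a[0..5] of one camera parameter
           from nSamples fixed-camera calibrations at focus fv[i], zoom zv[i] with
           measured parameter value pv[i]. Solves the 6x6 normal equations
           (A^T A) a = A^T p by Gauss-Jordan elimination; returns 0 if the sample
           set is degenerate. A production system would use a numerics library. */
        int fitBivariatePolynomial(const double *fv, const double *zv,
                                   const double *pv, int nSamples,
                                   double a[NTERMS])
        {
            double ata[NTERMS][NTERMS] = {{0.0}}, atb[NTERMS] = {0.0}, phi[NTERMS];
            int i, j, k;

            for (k = 0; k < nSamples; k++) {            /* accumulate A^T A and A^T p */
                basis(fv[k], zv[k], phi);
                for (i = 0; i < NTERMS; i++) {
                    atb[i] += phi[i] * pv[k];
                    for (j = 0; j < NTERMS; j++)
                        ata[i][j] += phi[i] * phi[j];
                }
            }
            for (i = 0; i < NTERMS; i++) {              /* Gauss-Jordan elimination */
                double piv = ata[i][i];
                if (fabs(piv) < 1e-12)
                    return 0;                           /* not enough distinct samples */
                for (j = 0; j < NTERMS; j++) ata[i][j] /= piv;
                atb[i] /= piv;
                for (k = 0; k < NTERMS; k++) {
                    double factor;
                    if (k == i) continue;
                    factor = ata[k][i];
                    for (j = 0; j < NTERMS; j++) ata[k][j] -= factor * ata[i][j];
                    atb[k] -= factor * atb[i];
                }
            }
            for (i = 0; i < NTERMS; i++) a[i] = atb[i]; /* fitted coefficients */
            return 1;
        }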
  • An exemplary microscope camera model for an exemplary microscope in an augmented reality microscope system can be as follows:
  • Twcy = -0.000000001*F^2*Z + (-0.000001826)*F^2 + (0.000002707)*F*Z + (-0.004741056)*F + (-0.003616348)*Z + 5.606256436
  • Twcz = 0.000000302*F^2*Z + 0.000014187*F^2 + (-0.000088499)*F*Z + (-0.018100412)*F + 0.061825291*Z + 422.480480324
  • Owcx, Owcy and Owcz are rotation-vector components from which the rotation matrix from the microscope camera to the reference coordinate system can be calculated
  • Twcx, Twcy and Twcz are translations in x, y and z from which the translation part of the transform matrix to the reference coordinate system can be constructed
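  • As an illustrative sketch of recovering the rotation matrix from such a rotation vector, the Rodrigues formula can be used, assuming the common convention that the vector's direction is the rotation axis and its norm the rotation angle in radians (the patent does not spell this convention out):

        #include <math.h>

        /* Rodrigues formula: rotation vector (ox, oy, oz) -> 3x3 rotation matrix R,
           assuming axis = vector direction and angle = vector norm in radians. */
        void rotationFromVector(double ox, double oy, double oz, double R[3][3])
        {
            double theta = sqrt(ox * ox + oy * oy + oz * oz);
            double kx, ky, kz, c, s, v;
            if (theta < 1e-12) {                        /* near-zero rotation: identity */
                R[0][0] = R[1][1] = R[2][2] = 1.0;
                R[0][1] = R[0][2] = R[1][0] = R[1][2] = R[2][0] = R[2][1] = 0.0;
                return;
            }
            kx = ox / theta; ky = oy / theta; kz = oz / theta;
            c = cos(theta); s = sin(theta); v = 1.0 - c;
            R[0][0] = kx * kx * v + c;       R[0][1] = kx * ky * v - kz * s;  R[0][2] = kx * kz * v + ky * s;
            R[1][0] = ky * kx * v + kz * s;  R[1][1] = ky * ky * v + c;       R[1][2] = ky * kz * v - kx * s;
            R[2][0] = kz * kx * v - ky * s;  R[2][1] = kz * ky * v + kx * s;  R[2][2] = kz * kz * v + c;
        }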
  • a corresponding virtual microscope camera can be created and can be used to generate a virtual image of the virtual objects.
  • computer 120 receives the current magnification and focus values for the microscope.
  • Intrinsic and extrinsic parameters of a virtual microscope camera can thus be calculated from the stored microscope camera model.
  • the virtual microscope camera's position and orientation in the position tracking system can be derived using the tracking data of the markers on the microscope.
  • the microscope has an optical axis 111 and a focal point 112.
  • the position of the focal point changes relative to the reference markers according to the changes of the microscope focus.
  • the position of the focal point of the microscope relative to the reference markers can be calibrated before navigation.
  • An exemplary calibrated result of the focal point for an exemplary microscope from an augmented reality microscope system is presented below.
  • FocusPoint(x, y, z) = (Fpx, Fpy, Fpz), wherein
  • Fpx = -0.000001113*F^2 + 0.001109120*F + 116.090108990;
  • Fpy = 0.000002183*F^2 + (-0.000711078)*F + (-27.066366422);
  • Fpz = -0.000073468*F^2 + (-0.154217215)*F + (-369.813473763);
  • F represents the focus value
  • the calibration result of the focal point can be stored in the computer.
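  • For illustration, evaluating the stored calibration for a given focus value is a direct computation; the following C sketch uses the exemplary coefficients above (with the third component read as Fpz), and the function name is an assumption:

        /* Evaluate the exemplary focal-point calibration above for focus value F,
           giving the focal point position relative to reference markers 110. */
        void focalPointFromFocus(double F, double *Fpx, double *Fpy, double *Fpz)
        {
            *Fpx = -0.000001113 * F * F + 0.001109120 * F + 116.090108990;
            *Fpy =  0.000002183 * F * F - 0.000711078 * F -  27.066366422;
            *Fpz = -0.000073468 * F * F - 0.154217215 * F - 369.813473763;
        }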
  • the position of the focal point can thus be derived from the tracking data of the reference markers.
  • the optical axis can be, for example, a line linking the focal points of various microscope focal values.
  • image data of a patient can be mapped to the patient using one of the generally known registration techniques.
  • one such registration technique maps the image data of a patient to the patient using a number of anatomical features (at least 3) on the body surface of the patient, by matching their positions as identified and located in the scan images with their corresponding positions on the patient as determined using a tracked probe.
  • the registration accuracy may be further improved by mapping a surface of a body part of the patient generated from the imaging data to the surface data of the corresponding body part generated on the operating table.
  • the position and orientation of the patient head 152 and the position and orientation of the microscope video camera 105 can be transformed into a common coordinate system, for example the coordinate system of the position tracking system.
  • the relative position and orientation between the head 152 and the microscope video camera 105 can thus be determined dynamically using the position tracking system 100.
  • the microscope camera can capture a video image of patient head 152.
  • the tumor 155 and blood vessel 150 may not be visible in the video image (as they may be visually occluded by an as yet closed portion of the head).
  • the computer can generate a virtual image of tumor 155 and blood vessel 150 based on the intrinsic and extrinsic parameters of the virtual microscope camera and the stored model of the tumor and blood vessel.
  • real image 201 and virtual image 301 can be combined to generate an augmented reality image.
  • the augmented reality image can then, for example, be shown on display device 130.
  • Display 130 can be a monitor, an HMD, a display built into the microscope for "image injection", etc.
  • the 3D model of the tumor and blood vessel may be generated from three-dimensional (3D) images of the patient, for example from MRI or CT images of the patient's head.
  • such 3D data can be generated using hardware and software provided by Volume Interactions Pte Ltd., such as, for example, the Dextroscope system running RadioDexter software.
  • the augmented reality image can be displayed in various ways.
  • the real image can be overlaid on the virtual image (the real image on top of the virtual image), or be overlaid by the virtual image (the virtual image on top of the real image).
  • the transparency of the overlay image can be changed so that the augmented reality image can be displayed in various ways: with the virtual image only, the real image only, or a combined view.
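  • The varying-transparency combination can be expressed as per-pixel alpha blending, out = alpha * real + (1 - alpha) * virtual; the following C sketch shows the idea (the names and RGB8 buffer layout are illustrative assumptions):

        /* Blend real (video) and virtual (rendered) RGB8 images of nPixels pixels:
           alpha = 1.0 shows the real image only, alpha = 0.0 the virtual image
           only, and intermediate values give a combined augmented reality view. */
        void blendAugmentedImage(const unsigned char *real, const unsigned char *virt,
                                 unsigned char *out, int nPixels, float alpha)
        {
            int i;
            for (i = 0; i < 3 * nPixels; i++)
                out[i] = (unsigned char)(alpha * real[i] + (1.0f - alpha) * virt[i] + 0.5f);
        }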
  • axial, coronal and sagittal planes of the 3D models, following the changing position of the focal point, can be displayed in three separate windows.
  • augmented reality in microscopic navigation can be provided at various microscope settings across the full magnification and focus range.
  • Fig. 5 shows an exemplary augmented reality view of the patient head in a different (greater, relative to Figs. 3-4) magnification setting.
  • digital zoom can be used to virtually change the magnification of the augmented reality image.
  • the zoom ratio can be input by the user.
  • the zoomed field of view can, for example, be centered at the center of the window by default.
  • Fig. 6 shows an exemplary virtual image only navigation view of the surgical field through the microscope at a higher magnification.
  • the surgeon is operating on the tumor, so part of the tumor is visible in the optical view of the microscope. However, most of the tumor and all of the blood vessel are either hidden under the exposed surface or out of the field of view of the microscope, so the surgeon cannot see them directly.
  • a rendered image of tumor and blood vessel generated by the computer can be displayed to the surgeon, but because of the magnification, only a small part of the tumor and blood vessel can be shown.
  • FIG. 7 shows a virtually enlarged view of the microscope in which the whole structure of the tumor and blood vessel are visible.
  • this can be achieved by digital zooming.
  • Digital zooming virtually changes the field of view of the virtual microscope camera model, so that the 3D models in the virtual camera's field of view can be rendered from the same viewpoint but a different field of view.
  • Digital zooming enables the surgeon to see beyond the microscope's field of view without changing the microscope's actual settings.
  • the video signal can also be zoomed, and thus a zoomed image can have video (real) images, virtual images or any combination of both, with varying transparency of either.
  • Fig. 7 is zoomed out relative to the view of Fig. 6, but is obviously at a much greater magnification (zoomed in) relative to the view of Fig. 5, and of course that of Fig. 3.
  • a user may frequently change zoom values, zooming in and out repeatedly over the course of a given procedure or operation.
  • FIG. 8 depicts the exemplary scene of FIG. 7 from the point of view of the mini-camera inside the probe.
  • the focal point as well as the optical path of the microscope can be shown together with the tumor and blood vessels, indicating the 3D relationship of the microscope, the surgical field and the virtual objects (tumor and blood vessels).
  • Figs. 9-11 are actual screenshots from an exemplary embodiment of the present invention.
  • Fig. 9 shows an exemplary navigational view from a surgical microscope according to an exemplary embodiment of the present invention.
  • Fig. 10 shows the exemplary view of Fig. 9 after digitally zooming-out according to an exemplary embodiment of the present invention, using the techniques described above as in connection with Fig. 7.
  • FIG. 11 shows an exemplary augmented reality navigational view from an exemplary probe according to an exemplary embodiment of the present invention, corresponding somewhat to that shown in Fig. 8, without the optical path and focal point of the microscope.
  • the selection between the microscope and probe can be performed automatically.
  • the automatic selection can be based upon (i.e., be a function of) the tracking data. In exemplary embodiments according to the present invention this can be achieved by assigning a higher priority to the probe. If only the microscope tracking data is available, the microscope will be selected as the navigation instrument and its AR image will be displayed. If both the microscope and the probe are tracked, the probe will be selected and its AR view displayed; the microscope can, in such a situation, for example, be ignored. When the probe is not tracked, the microscope can, for example, be selected automatically for navigation. The video image can also be changed automatically accordingly.
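  • The priority rule just described reduces to a small selector; the following C sketch captures it (the enum, the names, and the explicit user-override parameter are illustrative assumptions):

        typedef enum { SOURCE_NONE, SOURCE_MICROSCOPE, SOURCE_PROBE } NavSource;

        /* Selection rule described above: a user override (switch or voice
           command) wins; otherwise the probe has priority whenever it is
           tracked; otherwise the microscope is used if it is tracked. */
        NavSource selectNavigationSource(int probeTracked, int microscopeTracked,
                                         NavSource userOverride)
        {
            if (userOverride != SOURCE_NONE) return userOverride;
            if (probeTracked)                return SOURCE_PROBE;
            if (microscopeTracked)           return SOURCE_MICROSCOPE;
            return SOURCE_NONE;
        }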
  • the system, methods and apparatus of the present invention thus enable a user to see “beyond the normal field of view” during both macroscopic and microscopic surgery. This allows a user to always be aware of just how near he or she is to highly sensitive or important hidden structures.

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Robotics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pathology (AREA)
  • Microscopes, Condensers (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides an improved system and method for macroscopic and microscopic surgical navigation and visualization. An exemplary embodiment comprises an integrated system including a computer in which three-dimensional (3D) representations of a patient's internal anatomy are stored, a display, a probe and an operation microscope. In exemplary embodiments, reference markers can be attached to the probe and to the microscope, and the system can also include a tracking system that can track the 3D position and orientation of each of the probe and the microscope. In exemplary embodiments, the system can include means for detecting changes in the imaging parameters of the microscope, such as, for example, its magnification and focus, occurring as a result of adjustment and operation of the microscope by the user, such as the position of the focal point relative to the markers attached to the microscope, which can be calibrated over the full range of microscope focus. In exemplary embodiments, the position of the microscope can be obtained from tracking data relating to the microscope, and the focus can be obtained from, for example, a sensor integrated with the microscope; the tip position of the probe can likewise be obtained from the tracking data of the reference markers attached to the probe; and means can be provided for registering virtual representations of the patient's anatomical data with real images acquired by one or more cameras attached to the probe and to the microscope. In exemplary embodiments, visualization and navigation images can be provided by both the microscope and the probe, and when both are active the system can intelligently display either a microscopic or a macroscopic (probe-based) image according to defined rules.
PCT/EP2006/060654 2005-03-11 2006-03-13 Methods and apparatus for surgical navigation and visualization with a microscope WO2006095027A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2008500215A JP2008532602A (ja) 2005-03-11 2006-03-13 Method and apparatus for surgical navigation and visualization with a microscope
EP06708740A EP1861035A1 (fr) 2005-03-11 2006-03-13 Methods and apparatus for surgical navigation and visualization with a microscope
CA002600731A CA2600731A1 (fr) 2005-03-11 2006-03-13 Methods and apparatus for surgical navigation and visualization with a microscope

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US66084505P 2005-03-11 2005-03-11
US60/660,845 2005-03-11
PCT/SG2005/000244 WO2007011306A2 (fr) 2005-07-20 2005-07-20 Method and apparatus for mapping a virtual model of an object onto the object
SGPCT/SG/2005/00244 2005-07-20

Publications (1)

Publication Number Publication Date
WO2006095027A1 (fr) 2006-09-14

Family

ID=36405966

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2006/060654 WO2006095027A1 (fr) 2005-03-11 2006-03-13 Methods and apparatus for surgical navigation and visualization with a microscope

Country Status (5)

Country Link
US (1) US20060293557A1 (fr)
EP (1) EP1861035A1 (fr)
JP (1) JP2008532602A (fr)
CA (1) CA2600731A1 (fr)
WO (1) WO2006095027A1 (fr)

Families Citing this family (68)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8560047B2 (en) 2006-06-16 2013-10-15 Board Of Regents Of The University Of Nebraska Method and apparatus for computer aided surgery
GB0613576D0 (en) * 2006-07-10 2006-08-16 Leuven K U Res & Dev Endoscopic vision system
KR100877114B1 (ko) * 2007-04-20 2009-01-09 한양대학교 산학협력단 Medical image providing system and medical image providing method
US8180396B2 (en) 2007-10-18 2012-05-15 Yahoo! Inc. User augmented reality for camera-enabled mobile devices
US20090289955A1 (en) * 2008-05-22 2009-11-26 Yahoo! Inc. Reality overlay device
US8711176B2 (en) 2008-05-22 2014-04-29 Yahoo! Inc. Virtual billboards
DE102009010592B4 (de) * 2009-02-25 2014-09-04 Carl Zeiss Meditec Ag Method and device for recording and evaluating digital image data with a surgical microscope
DE102009040430B4 (de) * 2009-09-07 2013-03-07 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Device, method and computer program for overlaying an intraoperative live image of an operating field, or the operating field itself, with a preoperative image of the operating field
US9498231B2 (en) 2011-06-27 2016-11-22 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
CA2840397A1 (fr) 2011-06-27 2013-04-11 Board Of Regents Of The University Of Nebraska Integrated tool tracking system and methods for computer-assisted surgery
US11911117B2 (en) 2011-06-27 2024-02-27 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
US8792693B2 (en) * 2011-07-09 2014-07-29 Gauss Surgical System and method for estimating extracorporeal blood volume in a physical sample
KR20130121521A (ko) * 2012-04-27 2013-11-06 주식회사 고영테크놀러지 Method for tracking an affected area and a surgical instrument
IN2014DN08500A (fr) * 2012-05-25 2015-05-15 Surgical Theater LLC
US9642606B2 (en) 2012-06-27 2017-05-09 Camplex, Inc. Surgical visualization system
US9615728B2 (en) 2012-06-27 2017-04-11 Camplex, Inc. Surgical visualization system with camera tracking
ES2813625T3 * 2012-08-30 2021-03-24 Alcon Inc Imaging system and methods displaying a fused multidimensional reconstructed image
KR101449830B1 (ko) 2013-01-28 2014-10-14 동국대학교 산학협력단 Diagnostic position tracking device, diagnostic position tracking method, and diagnostic instrument
KR101442953B1 (ko) * 2013-01-28 2014-09-23 동국대학교 산학협력단 Method for providing a GUI for driving a diagnostic position tracking device, and computer-readable recording medium storing a program for executing the method on a computer
US9386908B2 (en) * 2013-01-29 2016-07-12 Gyrus Acmi, Inc. (D.B.A. Olympus Surgical Technologies America) Navigation using a pre-acquired image
US10105149B2 (en) 2013-03-15 2018-10-23 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
CA2892554C (fr) * 2013-03-15 2017-04-18 Synaptive Medical (Barbados) Inc. Systeme et procede de validation dynamique et de correction d'enregistrement pour une navigation chirurgicale
CN105142561B (zh) 2013-04-30 2018-01-30 株式会社高永科技 Optical tracking system and tracking method using the same
WO2014189969A1 (fr) 2013-05-21 2014-11-27 Camplex, Inc. Surgical visualization systems
EP3047326A4 (fr) 2013-09-20 2017-09-06 Camplex, Inc. Surgical visualization systems and displays
WO2015042483A2 (fr) 2013-09-20 2015-03-26 Camplex, Inc. Surgical visualization systems
DE102014205038B4 (de) * 2014-02-19 2015-09-03 Carl Zeiss Meditec Ag Visualization devices with calibration of a display, and calibration method for a display in a visualization device
US10567660B2 (en) * 2014-03-14 2020-02-18 Brainlab Ag Overlay of anatomical information in a microscope image
GB2524498A (en) * 2014-03-24 2015-09-30 Scopis Gmbh Electromagnetic navigation system for microscopic surgery
US10026015B2 (en) * 2014-04-01 2018-07-17 Case Western Reserve University Imaging control to facilitate tracking objects and/or perform real-time intervention
EP3128897B1 (fr) 2014-04-03 2018-11-28 Brainlab AG Method and system for supporting a medical brain mapping procedure
EP3811891A3 (fr) 2014-05-14 2021-05-05 Stryker European Holdings I, LLC Navigation system and processor arrangement for tracking the position of a work target
DE102014210053A1 (de) * 2014-05-27 2015-12-03 Carl Zeiss Meditec Ag Surgical microscope with data unit and method for overlaying images
JP6290723B2 (ja) * 2014-06-23 2018-03-07 公立大学法人公立はこだて未来大学 Surgery support device and surgery support system
DE102014216511A1 (de) * 2014-08-20 2016-02-25 Carl Zeiss Meditec Ag Creating chapter structures for video data with images from a surgical microscope object area
WO2016090336A1 (fr) 2014-12-05 2016-06-09 Camplex, Inc. Surgical visualization systems and displays
US9645379B2 (en) * 2014-12-29 2017-05-09 Novartis Ag Magnification in ophthalmic procedures and associated devices, systems, and methods
US10013808B2 (en) 2015-02-03 2018-07-03 Globus Medical, Inc. Surgeon head-mounted display apparatuses
EP3277152A4 (fr) 2015-03-25 2018-12-26 Camplex, Inc. Surgical visualization systems and displays
US10966798B2 (en) 2015-11-25 2021-04-06 Camplex, Inc. Surgical visualization systems and displays
IL245339A (en) * 2016-04-21 2017-10-31 Rani Ben Yishai Method and system for verification of registration
KR101812001B1 (ko) 2016-08-10 2017-12-27 주식회사 고영테크놀러지 Apparatus and method for 3D data registration
JP6795744B2 (ja) * 2016-09-21 2020-12-02 学校法人自治医科大学 Medical support method and medical support device
US11490985B2 (en) * 2017-03-29 2022-11-08 Sony Olympus Medical Solutions Inc. Medical observation apparatus and control method
WO2018208691A1 (fr) 2017-05-08 2018-11-15 Camplex, Inc. Variable light source
US20190254753A1 (en) 2018-02-19 2019-08-22 Globus Medical, Inc. Augmented reality navigation systems for use with robotic surgical systems and methods of their use
TWI741196B (zh) * 2018-06-26 2021-10-01 華宇藥品股份有限公司 Surgical navigation method and system integrating augmented reality
US11564678B2 (en) 2018-07-16 2023-01-31 Cilag Gmbh International Force sensor through structured light deflection
US11992373B2 (en) 2019-12-10 2024-05-28 Globus Medical, Inc Augmented reality headset with varied opacity for navigated robotic surgery
US11284963B2 (en) 2019-12-30 2022-03-29 Cilag Gmbh International Method of using imaging devices in surgery
US11744667B2 (en) 2019-12-30 2023-09-05 Cilag Gmbh International Adaptive visualization by a surgical system
US12002571B2 (en) 2019-12-30 2024-06-04 Cilag Gmbh International Dynamic surgical visualization systems
US11648060B2 (en) 2019-12-30 2023-05-16 Cilag Gmbh International Surgical system for overlaying surgical instrument data onto a virtual three dimensional construct of an organ
US11896442B2 (en) 2019-12-30 2024-02-13 Cilag Gmbh International Surgical systems for proposing and corroborating organ portion removals
US11219501B2 (en) 2019-12-30 2022-01-11 Cilag Gmbh International Visualization systems using structured light
US11832996B2 (en) 2019-12-30 2023-12-05 Cilag Gmbh International Analyzing surgical trends by a surgical system
US11776144B2 (en) 2019-12-30 2023-10-03 Cilag Gmbh International System and method for determining, adjusting, and managing resection margin about a subject tissue
US11759283B2 (en) 2019-12-30 2023-09-19 Cilag Gmbh International Surgical systems for generating three dimensional constructs of anatomical organs and coupling identified anatomical structures thereto
US11464581B2 (en) 2020-01-28 2022-10-11 Globus Medical, Inc. Pose measurement chaining for extended reality surgical navigation in visible and near infrared spectrums
US11382699B2 (en) 2020-02-10 2022-07-12 Globus Medical Inc. Extended reality visualization of optical tool tracking volume for computer assisted navigation in surgery
US11207150B2 (en) 2020-02-19 2021-12-28 Globus Medical, Inc. Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment
CN111462005B (zh) * 2020-03-30 2023-01-06 腾讯医疗健康(深圳)有限公司 Method and apparatus for processing microscopic images, computer device, and storage medium
US11607277B2 (en) 2020-04-29 2023-03-21 Globus Medical, Inc. Registration of surgical tool with reference array tracked by cameras of an extended reality headset for assisted navigation during surgery
US11382700B2 (en) 2020-05-08 2022-07-12 Globus Medical Inc. Extended reality headset tool tracking and control
US11153555B1 (en) 2020-05-08 2021-10-19 Globus Medical Inc. Extended reality headset camera system for computer assisted navigation in surgery
US11510750B2 (en) 2020-05-08 2022-11-29 Globus Medical, Inc. Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications
CN115697233A (zh) * 2020-05-29 2023-02-03 柯惠Lp公司 Systems and methods for integrated control of 3D visualization through a surgical robotic system
US11737831B2 (en) 2020-09-02 2023-08-29 Globus Medical Inc. Surgical object tracking template generation for computer assisted navigation during surgical procedure

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5999840A (en) * 1994-09-01 1999-12-07 Massachusetts Institute Of Technology System and method of registration of three-dimensional data sets
US6483948B1 (en) * 1994-12-23 2002-11-19 Leica Ag Microscope, in particular a stereomicroscope, and a method of superimposing two images
US6317616B1 (en) * 1999-09-15 2001-11-13 Neil David Glossop Method and system to facilitate image guided surgery
EP1430427A1 (fr) * 2001-08-28 2004-06-23 Volume Interactions Pte. Ltd. Procedes et systemes d'interaction avec des modeles informatiques tridimensionnels

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6006126A (en) * 1991-01-28 1999-12-21 Cosman; Eric R. System and method for stereotactic registration of image scan data
US20020151784A1 (en) * 1998-11-10 2002-10-17 Olympus Optical Co., Ltd. Surgical microscope
US20040254454A1 (en) * 2001-06-13 2004-12-16 Kockro Ralf Alfons Guide system and a probe therefor
WO2003105709A1 * 2002-06-13 2003-12-24 Möller-Wedel GmbH Method and instrument for surgical navigation
US20050015005A1 (en) 2003-04-28 2005-01-20 Kockro Ralf Alfons Computer enhanced surgical navigation imaging system (camera probe)

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012033552A1 (fr) * 2010-09-10 2012-03-15 The Johns Hopkins University Visualization of registered subsurface anatomy, and related applications
EP2632382B2 (fr) 2010-10-28 2024-06-26 Intersect ENT International GmbH Navigation attachment for optical devices in medicine, and associated method
EP2632382B1 (fr) 2010-10-28 2017-09-20 Fiagon AG Medical Technologies Navigation attachment for optical devices in medicine, and associated method
EP2667779A1 (fr) * 2011-01-26 2013-12-04 Inria Institut National de Recherche en Informatique et en Automatique Method and system for assisting the positioning of a medical tool on the head of a subject
US9974615B2 (en) 2011-09-28 2018-05-22 Brainlab Ag Determining a position of a medical device to be localized
EP3238649A1 (fr) * 2011-09-28 2017-11-01 Brainlab AG Self-localizing medical devices
CN104470458A (zh) * 2012-07-17 2015-03-25 皇家飞利浦有限公司 Augmented reality imaging system for surgical instrument guidance
US9668768B2 (en) 2013-03-15 2017-06-06 Synaptive Medical (Barbados) Inc. Intelligent positioning system and methods therefore
US11357574B2 (en) 2013-10-31 2022-06-14 Intersect ENT International GmbH Surgical instrument and method for detecting the position of a surgical instrument
CN103892919A (zh) * 2014-03-27 2014-07-02 中国科学院光电技术研究所 Optical-coherence-tomography-guided microsurgery system and navigation method
EP2949285A1 (fr) * 2014-05-27 2015-12-02 Carl Zeiss Meditec AG Surgical microscope
US9933606B2 (en) 2014-05-27 2018-04-03 Carl Zeiss Meditec Ag Surgical microscope
WO2018076094A1 (fr) * 2016-10-31 2018-05-03 Synaptive Medical (Barbados) Inc. 3D navigation system and methods
US11207139B2 (en) 2016-10-31 2021-12-28 Synaptive Medical Inc. 3D navigation system and methods
GB2571857A (en) * 2016-10-31 2019-09-11 Synaptive Medical Barbados Inc 3D navigation system and methods
GB2571857B (en) * 2016-10-31 2022-05-04 Synaptive Medical Inc 3D navigation system and methods
EP4389055A1 (fr) * 2016-11-11 2024-06-26 Intuitive Surgical Operations, Inc. Surgical system with multi-modality image display
CN109982657A (zh) * 2016-11-11 2019-07-05 直观外科手术操作公司 Surgical system with multi-modality image display
EP3538015A4 (fr) * 2016-11-11 2020-12-09 Intuitive Surgical Operations Inc. Surgical system with multi-modality image display
CN109982657B (zh) * 2016-11-11 2023-06-30 直观外科手术操作公司 Surgical system with multi-modality image display
WO2018089827A1 (fr) 2016-11-11 2018-05-17 Intuitive Surgical Operations, Inc. Surgical system with multi-modality image display
CN106327587B (zh) * 2016-11-16 2019-06-28 北京航空航天大学 Accurate laparoscopic video fusion method for augmented reality surgical navigation
CN106327587A (zh) * 2016-11-16 2017-01-11 北京航空航天大学 Accurate laparoscopic video fusion method for augmented reality surgical navigation
CN107157588A (zh) * 2017-05-08 2017-09-15 上海联影医疗科技有限公司 Data processing method for an imaging device, and imaging device
US10977866B2 (en) 2017-06-09 2021-04-13 Siemens Healthcare Gmbh Output of position information of a medical instrument
EP3412242A1 (fr) * 2017-06-09 2018-12-12 Siemens Healthcare GmbH Output of position data of a medical instrument
CN109833092A (zh) 2017-11-29 2019-06-04 上海复拓知达医疗科技有限公司 In-vivo navigation system and method
US11430139B2 (en) 2019-04-03 2022-08-30 Intersect ENT International GmbH Registration method and setup
WO2022033656A1 (fr) 2020-08-10 2022-02-17 Brainlab Ag Microscope camera calibration

Also Published As

Publication number Publication date
EP1861035A1 (fr) 2007-12-05
CA2600731A1 (fr) 2006-09-14
US20060293557A1 (en) 2006-12-28
JP2008532602A (ja) 2008-08-21

Similar Documents

Publication Publication Date Title
US20060293557A1 (en) Methods and apparati for surgical navigation and visualization with microscope ("Micro Dex-Ray")
US7491198B2 (en) Computer enhanced surgical navigation imaging system (camera probe)
CN101170961A Method and apparatus for surgical navigation and visualization using a microscope
US9289267B2 (en) Method and apparatus for minimally invasive surgery using endoscopes
US11026747B2 (en) Endoscopic view of invasive procedures in narrow passages
US9615772B2 (en) Global endoscopic viewing indicator
JP7460631B2 Endoscope having dual image sensors
EP2641561A1 System and method for determining camera angles using virtual planes derived from real images
JP2006320722A Method for extending the display range of 2D imaging of a target region
AU2022205690A1 (en) Registration degradation correction for surgical navigation procedures
WO2009027088A9 Improved visualization in two-dimensional images
Paloc et al. Computer-aided surgery based on auto-stereoscopic augmented reality
CN115623163A System and method for acquisition and fused display of two-dimensional and three-dimensional images
Salb et al. INPRES (intraoperative presentation of surgical planning and simulation results): augmented reality for craniofacial surgery
Akatsuka et al. Navigation system for neurosurgery with PC platform
US12023208B2 (en) Method for operating a visualization system in a surgical application, and visualization system for a surgical application
US20220175485A1 (en) Method for operating a visualization system in a surgical application, and visualization system for a surgical application
US20230032791A1 (en) Measuring method and a measuring device
EP4193957A1 Device for providing a video of a surgical application
Jannin et al. A ray-traced texture mapping for enhanced virtuality in image-guided neurosurgery
EP4221581A1 Self-navigating digital surgical microscope
Sauer et al. A head-mounted display system for augmented reality image guidance: towards clinical evaluation for iMRI-guided neurosurgery

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase; ref document number: 2008500215; country of ref document: JP
WWE Wipo information: entry into national phase; ref document number: 2600731; country of ref document: CA
NENP Non-entry into the national phase; ref country code: DE
NENP Non-entry into the national phase; ref country code: RU
WWE Wipo information: entry into national phase; ref document number: 2006708740; country of ref document: EP
WWW Wipo information: withdrawn in national office; country of ref document: RU
WWE Wipo information: entry into national phase; ref document number: 200680014960.7; country of ref document: CN
WWP Wipo information: published in national office; ref document number: 2006708740; country of ref document: EP