CN101170961A - Methods and devices for surgical navigation and visualization with microscope - Google Patents


Info

Publication number
CN101170961A
Authority
CN
China
Prior art keywords
microscope
image
detector
patient
video camera
Prior art date
Legal status
Pending
Application number
CNA2006800149607A
Other languages
Chinese (zh)
Inventor
朱传贵
库苏马·阿古桑托
Current Assignee
Bracco Imaging SpA
Original Assignee
Bracco Imaging SpA
Priority date
Filing date
Publication date
Application filed by Bracco Imaging SpA filed Critical Bracco Imaging SpA
Publication of CN101170961A

Landscapes

  • Microscopes, Condenser (AREA)

Abstract

An improved system and method for macroscopic and microscopic surgical navigation and visualization are presented. In exemplary embodiments of the present invention an integrated system can include a computer which has stored three dimensional representations of a patient's internal anatomy, a display, a probe and an operation microscope. In exemplary embodiments of the present invention reference markers can be attached to the probe and the microscope, and the system can also include a tracking system which can track the 3D position and orientation of each of the probe and microscope. In exemplary embodiments of the present invention a system can include means for detecting changes in the imaging parameters of the microscope, such as, for example, magnification and focus, which occur as a result of user adjustment and operation of the microscope. The microscope can have, for example, a focal point position relative to the markers attached to the microscope and can, for example, be calibrated in the full range of microscope focus. In exemplary embodiments of the present invention, the position of the microscope can be obtained from the tracking data regarding the microscope and the focus can be obtained from, for example, a sensor integrated with the microscope. Additionally, a tip position of the probe can also be obtained from the tracking data of the reference markers on the probe, and means can be provided for registration of virtual representations of patient anatomical data with real images from one or more cameras on each of the probe and the microscope. In exemplary embodiments of the present invention visualization and navigation can be provided by each of the microscope and the probe, and when both are active the system can intelligently display a microscopic or a macroscopic (probe based) augmented image according to defined rules.

Description

Methods and devices for surgical navigation and visualization with a microscope
Cross-reference to related applications
This application claims priority to commonly assigned U.S. Provisional Patent Application No. 60/660,845, filed on March 11, 2005, which is hereby incorporated herein by reference. This application also claims the benefit of PCT/SG2005/00244, entitled Systems and Methods For Mapping A Virtual Model Of An Object To The Object ("Multipoint Registration"), filed on July 20, 2005, which is also incorporated herein by reference. Also incorporated herein by reference is U.S. Patent Application No. 10/832,902, filed on April 27, 2004 and published as U.S. Published Patent Application No. 20050015005 (the "Camera-probe Application").
Technical field
The present invention relates to image-based surgical navigation and visualization systems.
Background
Neurosurgery is typically performed in two modes of operation: macroscopic and microscopic. In the former, the surgeon usually views the surgical field with the naked eye; in the latter, through a microscope. In each mode of operation, image-based navigation and visualization systems have been successfully applied to assist surgeons in performing a variety of delicate surgical procedures.
In image-based navigation and visualization, images depicting a patient's internal anatomy are generated before or during surgery by magnetic resonance imaging (MRI), computed tomography (CT) and various other techniques. From these images, a three-dimensional (3D) representation of the patient is produced. This representation can take various forms, ranging from volumetric images to 3D models of the patient's various anatomical structures reconstructed from the images, augmented with drawings, annotations and measurements added to document the surgical plan, as well as combinations thereof. During surgery, the 3D representation is matched to the patient by image registration. By combining images of the internal anatomy with the actual surgical field, a navigation system can improve the surgeon's ability to locate different anatomical features within the patient during an operation.
In macroscopic navigation, the user (a surgeon) holds a probe which is tracked by a tracking device. When the probe is introduced into the surgical field, the position of the probe tip, represented as an icon, is drawn on views of the patient's 3D representation. Navigation helps the surgeon decide the entry point of the operation, maintain orientation toward the target anatomy, and avoid critical structures along the surgical path.
An improved navigation system is described in U.S. Published Patent Application No. 20050015005, in which the probe contains a micro camera. By displaying the patient's 3D representation overlaid on the real-time images acquired by the micro camera, that system supports augmented reality enhanced navigation in a given surgical field.
In microsurgery, an operating microscope is commonly used to provide a magnified view of the surgical field in which the surgeon is operating. The microscope can be tracked for navigation purposes, and its focal point is usually displayed in the 3D representation in place of a probe tip.
To avoid diverting the surgeon's line of sight from the operative site to a monitor, "image injection" microscopes have been developed, in which navigation images generated by a computer workstation are superimposed on the microscope's optical image. Such superposition requires that the image seen through the microscope be geometrically consistent with the superimposed image data.
The superimposed image in a microscope-based navigation system consists of two-dimensional contours projected onto the optical image plane. To obtain a three-dimensional impression, the surgeon must scroll between different image planes and mentally fuse the injected contours into a 3D model.
These conventional techniques allow surgeons to navigate within the surgical field in both macroscopic and microscopic surgery. However, they also have the following significant disadvantages.
First, in microsurgery the surgeon often wishes to switch between microscope-based navigation and visualization and probe-based navigation and visualization. To do so, the surgeon must typically raise the microscope and/or move it away from the surgical field and then bring the navigation probe into the field, seriously interrupting the normal surgical workflow.
Second, to enable the surgeon to perform precise procedures on fine structures such as nerves and blood vessels, the microscope's magnification is usually set to a high level during the operation.
Although such high magnification does make these fine structures visible, it also narrows the field of view. Because the superimposed virtual image has the same magnification, the display of virtual objects is restricted as well. As a result, the surgeon may be unable to unambiguously identify the actual location on the patient's body of the region being viewed through the microscope. In short, the observable region is too small. Moreover, the overlaid image may not provide much useful information, because the anatomy surrounding the surgical field lies outside the field of view and is therefore invisible. Furthermore, in this situation the surgeon cannot observe the 3D anatomical structures of interest from different viewpoints.
Third, in microsurgery the surgeon usually wishes to be fully aware of all the structures surrounding the surgical field. In legacy systems, navigation images are superimposed on the microscope's optical image. Although this technique has the advantage that the surgeon can observe the navigation view without looking away from the microscope, its drawbacks are that only limited information can be shown in the navigation view, that the display may seriously obstruct the surgeon's optical view, and that image injection increases the cost of the system.
There is therefore a need in the art for surgical navigation and visualization methods and systems that reduce the need to move the magnified view away from the surgical field for navigation during microsurgical procedures.
There is a further need in the art for surgical imaging methods and systems that can provide integrated, augmented reality enhanced microscopic and macroscopic navigation and visualization, and that support seamless and efficient switching between the two.
Summary of the invention
The present invention relates to an improved system and method for macroscopic and microscopic surgical navigation and visualization. In exemplary embodiments of the present invention, an integrated system can include a computer storing a three-dimensional representation of a patient's internal anatomy, a display, a probe and an operating microscope. In exemplary embodiments of the present invention, reference markers can be attached to the probe and the microscope, and the system can also include a tracking system that can track the 3D position and orientation of each of the probe and the microscope. In exemplary embodiments of the present invention, the system can include means for detecting changes in the imaging parameters of the microscope, such as, for example, magnification and focus, which occur as a result of user adjustment and operation of the microscope. The microscope can, for example, have a focal point position relative to the markers attached to the microscope, and this position can, for example, be calibrated over the full range of microscope focus. In exemplary embodiments of the present invention, the position of the microscope can be obtained from the tracking data for the microscope, and the focus can be obtained from, for example, a sensor integrated with the microscope. Additionally, the tip position of the probe can be obtained from the tracking data of the reference markers on the probe, and means can be provided for registering virtual representations of patient anatomical data with real images from one or more cameras on each of the probe and the microscope. In exemplary embodiments of the present invention, visualization and navigation can be provided by each of the microscope and the probe, and when both are active the system can, according to defined rules, intelligently display either a microscopic or a macroscopic (probe-based) real, virtual or augmented image.
Brief description of the drawings
Figures 1A-1C illustrate digital zoom of an augmented reality image according to an exemplary embodiment of the present invention;
Figure 1D shows an exemplary navigation system according to an exemplary embodiment of the present invention;
Figure 2 is a schematic view of a real image of an exemplary patient's head according to an exemplary embodiment of the present invention;
Figure 3 is a schematic view of a virtual image of a tumor and blood vessels according to an exemplary embodiment of the present invention;
Figure 4 is a schematic view of a combined (augmented reality) image according to an exemplary embodiment of the present invention;
Figure 5 is a schematic view of a zoomed-in augmented reality view according to an exemplary embodiment of the present invention;
Figure 6 is a schematic view of a zoomed-in microscopic view according to an exemplary embodiment of the present invention;
Figure 7 is a schematic view of a digitally zoomed-out microscopic view according to an exemplary embodiment of the present invention;
Figure 8 is a schematic view of an exemplary navigation view from a probe according to an exemplary embodiment of the present invention;
Figure 9 shows an exemplary navigation view from a surgical microscope according to an exemplary embodiment of the present invention;
Figure 10 shows the exemplary view of Figure 9 after digital zoom-in according to an exemplary embodiment of the present invention; and
Figure 11 shows an exemplary augmented reality navigation view from an exemplary probe according to an exemplary embodiment of the present invention.
Detailed description
In exemplary embodiments of the present invention, navigation and visualization in macroscopic and microscopic surgery can be provided in a smoothly integrated fashion. Thus, in such exemplary embodiments, to perform macroscopic navigation or visualization during microsurgery, it is not necessary to disengage the surgical microscope or move it away from the surgical field. Moreover, in exemplary embodiments of the present invention, a zoomable augmented reality navigation system can be provided which can, for example, present the surgeon with microscopic and macroscopic navigation information on the patient's three-dimensional (3D) anatomy as circumstances require, without disengaging the microscope or moving it away from the surgical field.
In exemplary embodiments of the present invention, a video camera can, for example, be rigidly attached to the microscope. A virtual microscope-camera model having the same imaging attributes and pose (position and orientation) as the corresponding real camera can, for example, be stored in the computer, the imaging attributes including focal length, field of view and distortion parameters, zoom and focus. In exemplary embodiments of the present invention, an augmented view for microscopic navigation can be generated as follows: in response to microscope position and orientation data from the tracking device, together with magnification and focus data obtained from the microscope itself, for example via an integrated sensor, a rendering of the patient's 3D anatomy generated according to the corresponding virtual microscope-camera model is overlaid on the video image from the single camera, or multiple cameras as the case may be, on the microscope.
In exemplary embodiments of the present invention, there can also be a video camera integrated with the probe, for example as described in the Camera-probe Application. As described there, a virtual model of the camera having the same imaging attributes and pose (position and orientation) as the real camera can be provided, the imaging attributes including focal length, field of view and distortion parameters. Additionally, an augmented view for macroscopic navigation can be generated as follows: in response to probe position and orientation data from the tracking device, a rendering of the patient's 3D anatomy, produced by the computer according to the virtual camera model, is overlaid on the video image from the camera in the probe.
In exemplary embodiments of the present invention, the augmented microscopic view can be digitally zoomed, so that a zoomed view for microscopic navigation can be obtained without changing the microscope's position or settings (magnification and focus). Anatomy outside the microscope's optical field of view at the current settings can thus be shown in a zoomed-out display, in which only the center of the display is covered by the real-time video image from the microscope's camera. Additionally, in exemplary embodiments of the present invention, the user can obtain a macroscopic navigation view without changing the microscope's settings or moving it away: the user need only move the probe, which can image the surgical field from any viewpoint.
As noted above, the microscopic image can be digitally zoomed. This is described next. Zooming in, i.e. changing the magnification of the AR image together with the real image, is achieved by changing the field of view (i.e., the frustum) of the virtual camera and ensuring that the video image plane remains aligned with the frustum of the virtual camera. This concept is illustrated with reference to Figures 1A-1C. Note that the original figures are in color, and the following description refers to those colors; nevertheless, the referenced objects can easily be distinguished even in a grayscale image.
Figure 1A shows a virtual camera (the red axes at the left of the left panel) and the frustum of the virtual camera, represented by a near plane (dark blue, at the left of the left panel) connected to a far plane (dark gray, at the right of the left panel), together with a virtual object.
The image center of the video image (the pink rectangle) is aligned with the center of the frustum. In this configuration, the video image is, for example, sized to match the near plane exactly. The whole video image therefore covers the screen window (or viewport), and there is no zoom effect.
In Figure 1B, the frustum is changed so that the virtual object is projected with a magnifying, or zoom-in, effect. This change in the frustum changes the part of the video image that is visible in screen space. Because only part of the video image now lies within the projection plane (the near plane) and covers the screen window, the video image is zoomed in as well.
In Figure 1C, the frustum is changed so that the virtual object is projected with a zoom-out effect (it appears smaller). This change in the frustum causes the whole video image in the projection plane (the near plane) to cover only part of the screen window, so the video image appears smaller in the screen window.
In exemplary embodiments of the present invention, the change of the frustum can be achieved by changing the parameters of the virtual camera's perspective matrix, which produces the perspective projection. In particular, a 4 x 4 perspective projection matrix as defined in the OpenGL environment can, for example, be defined with the following parameters:
ProjMat[0]=2*Near/(Right-Left)*zoomFactor
ProjMat[2]=(Right+Left)/(Right-Left)
ProjMat[5]=2*Near/(Top-Bottom)*zoomFactor
ProjMat[6]=(Top+Bottom)/(Top-Bottom)
ProjMat[10]=-(Far+Near)/(Far-Near)
ProjMat[11]=-2*Far*Near/(Far-Near)
ProjMat[14]=-1
ProjMat[15]=0
where elements 1, 3, 4, 7, 8, 9, 12 and 13 have the value 0 (reading left to right, top to bottom).
The parameters Left, Right, Top and Bottom are functions of the microscope model, based on the intrinsic camera calibration parameters together with the microscope's focus and zoom settings. The parameters Near and Far can, for example, be set to constant values.
The parameter zoomFactor is the coefficient that determines the zoom-in or zoom-out effect. When its value is less than 1, the effect is zoom-out; when its value is greater than 1, the effect is zoom-in. When its value equals 1, there is no zoom effect.
In exemplary embodiments of the present invention, the video image can be displayed as a texture map with an orthographic projection. While zooming in or out, the OpenGL viewport can be adjusted with the following parameters so that the virtual objects are overlaid correctly and consistently on the video image:
GLfloat cx = fabs(Left)/(Right-Left);
GLfloat cy = fabs(Bottom)/(Top-Bottom);
glViewport((1-zoomFactor)*screenWidth*cx+originX,
(1-zoomFactor)*screenHeight*cy+originY,
screenWidth*zoomFactor,
screenHeight*zoomFactor);
Essentially, zoomFactor is used to resize the viewport, and the viewport origin is moved according to zoomFactor, the center of the video image (cx and cy) and the origin of the OpenGL window, so that the visible video image is correctly overlaid by the virtual image.
In exemplary embodiments of the present invention, during microsurgery, the probe can be used to obtain navigation images from varying orientations and positions. The anatomy surrounding the surgical field can be displayed from the viewpoint of the probe camera, together with the microscope's focal point and optical axis. Thus, for example, the anatomy around the surgical field can be presented to the surgeon from different observation points without changing the microscope.
With reference to Figure 1D, a surgical navigation system according to an exemplary embodiment of the present invention is shown in use during a neurosurgical procedure. In this figure, the operation is in microscopic mode. An operating microscope 115 includes a camera 105, which can, for example, be a color camera mounted on the microscope's imaging port, and reference markers 110 can be mounted on the camera. The microscope 115 can, for example, include built-in sensors to detect changes in the microscope's imaging parameters that occur as a result of adjustments to the microscope, the imaging parameters including microscope magnification and focus. Such a sensor can, for example, be an encoder: adjusting the focus and zoom involves mechanical movement of the lenses, which the encoder can, for example, measure. The parameters can be read from the microscope's serial port, in a data format such as, for example: zoom: +120; focus: 362. The microscope also has an optical axis 111 and a focal point 112, the focal point 112 being defined as the intersection of the microscope's optical axis with the focal plane. The focal plane is perpendicular to the optical axis; objects in the focal plane appear in sharp focus, and the focal plane shifts as the focus is adjusted. In exemplary embodiments of the present invention, the position of the focal point can be calibrated over the full range of microscope focus, so that this position can be described relative to the reference markers 110 on the basis of tracking data.
In Figure 1D, the surgeon is viewing through the microscope, with the patient's head 152 positioned along the microscope's optical path. The exemplary patient has a tumor 155 (the target object of the operation) and a vascular structure 150 near the tumor 155 (which should be avoided during the operation). A position tracking system 100 (for example, an NDI Polaris) can receive commands and can send tracking data to a computer 120 wirelessly or via a cable connected to the computer.
Before navigation, for example after a pre-operative scan has been acquired and the scan data processed into a volumetric data set including various segmentation and planning data, the computer 120 can store 3D models 125 of the tumor 155 and the vascular structure 150 in its memory. A probe 140 can, for example, include a video camera 135, and a pointer with a tip 136 can be attached to its front end. The probe 140 can be placed in a position convenient for the surgeon, for easy use during surgery. The probe can, for example, be of the type disclosed in the Camera-probe Application. The position tracking system 100 can provide the computer with continuous real-time tracking data for the microscope 115 and, when the probe 140 is introduced into the surgical field, continuous real-time tracking data for the probe 140 as well. The computer can be connected to (i) the display 130, (ii) the camera and sensors of the microscope 115, and (iii) the micro camera of the probe. The system can further include software to detect the positions and orientations of the microscope and the probe from the tracking data, and to automatically select one of them (probe or microscope) as the basis for the navigation and/or visualization image. This automatic selection can follow defined priority rules, or any of various algorithms appropriate to a given application and a given user's preferences.
For example, a given user may prefer to establish his general orientation macroscopically, then use the microscopic image as he approaches fine structures. If an operation comprises multiple stages, it is easy to see that such a surgeon will cycle through probe, then microscope, then probe, then microscope again. For such a surgeon, the system can implement the following behavior: in the initial stage of the operation the main tool is the probe; once the microscope has been used, the microscope becomes the main tool until a new microscope position is chosen, at which point a new stage begins and the probe is used again. The system can thus generate on the display a composite image corresponding to the view of whichever tool has priority at that moment. Many alternative rules can be implemented, and the surgeon can always override the priority setting, for example by a footswitch or a voice-controlled interface.
Referring again to Figure 1D, the computer 120 can receive real-time video images of the surgical site acquired by the microscope camera 105. The microscope camera 105 can, for example, have a virtual microscope-camera model, which can be provided and stored in the computer 120.
In exemplary embodiments of the present invention, the virtual microscope-camera model can comprise a set of intrinsic and extrinsic parameters, where the intrinsic parameters can include, for example, focal length, image center and distortion, and the extrinsic parameters can include, for example, the position and orientation of the virtual microscope-camera model relative to a reference frame.
In exemplary embodiments of the present invention, the reference frame can, for example, be a coordinate system rigidly attached to the markers 110 on the microscope 115.
In exemplary embodiments of the present invention, the intrinsic and extrinsic parameters of the microscope-camera model can vary with changes in the microscope's magnification and focus.
In exemplary embodiments according to the present invention, the intrinsic and extrinsic parameters of the microscope-camera model can be described as bivariate polynomial functions of microscope magnification and focus. For example, a parameter ρ (where ρ denotes one of the intrinsic or extrinsic parameters) can be modeled as a bivariate polynomial function of order q of the microscope's focus value (f) and zoom level (z), as follows:
ρ(z, f) = Σ_{m,n} a_{m,n} z^m f^n    (m, n ≥ 0; m + n ≤ q)
To solve for the coefficients a_{m,n}, the microscope can be calibrated as a number of fixed cameras (each with a fixed focal length) spanning the full range of microscope focus and zoom. After calibrating a sufficient number of such fixed cameras at different zoom and focus settings, a set of calibration data is obtained, and the coefficients a_{m,n} of the polynomial function can then be solved, for example by bivariate polynomial fitting.
An exemplary microscope-camera model for an exemplary type of microscope in an augmented reality microscope system can be expressed as follows:
Intrinsic parameters
Image size: Nx=768, Ny=576
Image center: Cx=384, Cy=288
Focal length:
fx=-0.000000008*F*Z^3+(-0.000004613)*F*Z^2+(-0.001289058)*F*Z+(-0.022283345)*F+0.000039765*Z^3+0.042230380*Z^2+21.010557606*Z+4970.548674307
fy=-0.000000010*F*Z^3+(-0.000001564)*F*Z^2+(-0.001287695)*F*Z+(-0.020680795)*F+0.000034475*Z^3+0.040391899*Z^2+20.227847227*Z+4767.03789957
Extrinsic parameters
Owcx=0.000008797*F+(-0.058476064)
Owcy=-0.000016119*F+(-0.781894036)
Owcz=-0.000004200*F+(-0.078145268)
Twcx=0.000000000*F^2*Z+(-0.000000747)*F^2+(-0.000002558)*F*Z+(-0.006475870)*F+0.000141871*Z+0.271534556
Twcy=-0.000000001*F^2*Z+(-0.000001826)*F^2+(0.000002707)*F*Z+(-0.004741056)*F+(-0.003616348)*Z+5.606256436
Twcz=0.000000302*F^2*Z+0.000014187*F^2+(-0.000088499)*F*Z+(-0.018100412)*F+0.061825291*Z+422.480480324
where Owcx, Owcy and Owcz form a rotation vector, from which a rotation matrix can be computed; Twcx, Twcy and Twcz are the translations along x, y and z; and from these the transformation matrix from the microscope camera to the reference frame can be built.
Thus, for any given zoom level and focus value of the microscope, a corresponding virtual microscope camera can be created and used to generate virtual images of the virtual objects.
As shown in Figure 1D, the computer 120 receives the microscope's current magnification and focus values. The intrinsic and extrinsic parameters of the virtual microscope camera can therefore be computed from the stored microscope-camera model. The tracking data of the markers on the microscope can be used to express the position and orientation of the virtual microscope camera in the coordinate system of the position tracking system.
As shown in Figure 1D, the microscope has an optical axis 111 and a focal point 112. In exemplary embodiments according to the present invention, the position of the focal point relative to the reference markers changes as the microscope focus changes.
In exemplary embodiments according to the present invention, the position of the microscope's focal point relative to the reference markers can be calibrated before navigation. An exemplary calibration result for the focal point of an exemplary microscope in an augmented reality microscope system is given below:
FocusPoint(x, y, z) = (Fpx, Fpy, Fpz), where
Fpx=-0.000001113*F^2+0.001109120*F+116.090108990;
Fpy=0.000002183*F^2+(-0.000711078)*F+(-27.066366422);
Fpz=-0.000073468*F^2+(0.154217215)*F+(369.813473763); and
F denotes the focus value.
The calibration result for the focal point can be stored in the computer. Thus, for any given focus value of the microscope, the position of the focal point can be expressed in terms of the tracking data of the reference markers.
In exemplary embodiments according to the present invention, the optical axis can, for example, be the line connecting the focal points obtained at different microscope focus values.
In exemplary embodiments of the present invention, known image registration techniques can be used to map the patient's image data to the patient. For example, one such image registration technique uses a number of anatomical features (at least 3) on the surface of the patient's body: the positions of these features, as identified and located in the scan images, are matched with the corresponding positions on the patient as determined using the tracked probe, thereby mapping the patient's image data to the patient. The accuracy of the image registration can be further improved by mapping the surface of the patient's body part generated from the imaging data to surface data of the corresponding body part acquired on the operating table. Such methods are described in detail in, for example, International Application PCT/SG2005/00224, entitled Systems and Methods For Mapping A Virtual Model Of An Object To The Object (the "Multipoint Registration" application), filed on July 20, 2005. The image registration methods described in that PCT application can be applied directly to microscopic navigation in the exemplary embodiments here. The purpose of image registration is to match the patient's imaging data to the patient; it can, for example, be completed in a macroscopic stage before the microscope is brought in, so that the registration result can then be used in microscopic navigation. After image registration, the patient's image data, including all segmented objects and other objects associated with the imaging data that were produced during surgical planning, are registered to the actual patient. For example, in Figure 1D, the models of the tumor and blood vessels stored in the computer 120 are registered to the actual tumor 155 and blood vessels 150 in the patient's head.
The position and orientation of the patient's head 152 and of the microscope camera 105 can be transformed into a common coordinate system, for example the coordinate system of the tracking system. The relative position and orientation between head 152 and microscope camera 105 can therefore be determined dynamically using tracking system 100.
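Expressing both tracked poses in the tracker's frame and deriving the head-to-microscope relation can be sketched with 4x4 homogeneous transforms. This is a simplified illustration; the function names and the sample tracker readings are hypothetical.

```python
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def relative_pose(T_tracker_head, T_tracker_scope):
    """Pose of the microscope camera expressed in the patient-head frame:
    T_head_scope = inv(T_tracker_head) @ T_tracker_scope."""
    return np.linalg.inv(T_tracker_head) @ T_tracker_scope

# Hypothetical tracker readings, both in the tracker's coordinate system:
T_head = pose(np.eye(3), [0.0, 0.0, 0.0])
T_scope = pose(np.eye(3), [0.0, 0.0, 300.0])  # scope 300 mm along z from the head origin
T_rel = relative_pose(T_head, T_scope)
```

Because both poses live in one common frame, the relative transform stays valid as either the head or the microscope moves, as long as both remain tracked.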
As shown in Fig. 2, in exemplary embodiments of the present invention, the microscope camera can capture video images of the patient's head 152. The tumor 155 and blood vessels 150 may be invisible in the video image (the still-unopened part of the head may visually occlude them).
As shown in Fig. 3, in exemplary embodiments of the present invention, the computer can generate a virtual image of the tumor 155 and blood vessels 150 based on the intrinsic and extrinsic parameters of the virtual microscope camera and the stored tumor and vessel models.
As shown in Fig. 4, in exemplary embodiments of the present invention, the real image 201 and the virtual image 301 can be combined to produce an augmented reality image, which can then be displayed on display device 130. Display 130 can be a monitor, an HMD, a display mounted inside the microscope for "image injection", etc.
The 3D models of the tumor and blood vessels can be generated from three-dimensional (3D) images of the patient, for example from MRI or CT images of the patient's head. In exemplary embodiments according to the present invention, such data can be produced, for example, with hardware and software provided by Volume Interaction Pte Ltd, such as the Dextroscope system running RadioDexter software.
In exemplary embodiments according to the present invention, the augmented reality image can be displayed in multiple ways. The real image can be overlaid on the virtual image (real image on top), or covered by the virtual image (virtual image on top). The transparency of the overlay can be varied, so that the augmented reality image can be shown in various ways: virtual image only, real image only, or a composite of the two. At the same time, for example, axial, coronal and sagittal planes of the 3D model, updated according to the focal position, can be displayed in three separate windows.
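The variable-transparency overlay described above amounts to per-pixel alpha blending of the real video frame and the rendered virtual image. A minimal sketch, with hypothetical function names and stand-in image data:

```python
import numpy as np

def compose_ar(real, virtual, alpha):
    """Blend a real video frame with a rendered virtual image.
    alpha = 0.0 -> real image only; alpha = 1.0 -> virtual image only;
    intermediate values give the composite augmented reality image."""
    blended = (1.0 - alpha) * real.astype(float) + alpha * virtual.astype(float)
    return blended.astype(np.uint8)

real = np.full((4, 4, 3), 200, np.uint8)     # stand-in for a video frame
virtual = np.full((4, 4, 3), 100, np.uint8)  # stand-in for a rendering
half = compose_ar(real, virtual, 0.5)        # 50/50 composite
```

Sweeping `alpha` continuously lets the user fade between the video-only, composite, and virtual-only views without any change to the microscope itself.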
In exemplary embodiments according to the present invention, augmented reality in microscopic navigation can work with multiple microscope settings across the full ranges of magnification and focus.
Fig. 5 shows an exemplary augmented reality view of the patient's head at a different (higher, relative to Figs. 3 and 4) magnification setting.
In exemplary embodiments according to the present invention, digital zoom can be used to change the effective magnification of the augmented reality image. The zoom ratio can be a user input. The default zoom field of view can, for example, be centered at the center of the window.
Fig. 6 shows an exemplary virtual image of the navigation view produced by the microscope at higher magnification, which includes only the surgical area. The surgeon is operating on the tumor, so part of the tumor is visible in the microscope's optical image. Most of the tumor and all of the blood vessels, however, are hidden below the exposed surface or lie outside the microscope's field of view, so the surgeon cannot see them directly. Computer-generated renderings of the tumor and blood vessels can be displayed to the surgeon, but because of the magnification only a small fraction of the tumor and vessels can be shown.
It is crucial to understand the precise 3D structure and position of the tumor and blood vessels outside the microscopic field of view without changing the microscope's magnification and position. Accordingly, Fig. 7 shows the microscope's actual magnified image in which the entire structure of the tumor and blood vessels is visible. In exemplary embodiments of the present invention this can be achieved by digital zoom. Digital zoom effectively changes the field of view of the virtual microscope camera model, so that the 3D model is rendered from the same viewpoint but with a field of view different from that of the virtual camera. Digital zoom therefore lets the surgeon see outside the microscope's field of view without changing the microscope's actual settings. In exemplary embodiments of the present invention, the video signal can also be zoomed, so the zoomed image can be the video (real) image, the virtual image, or a blend of the two at varying transparency. Fig. 7 is zoomed out relative to the view of Fig. 6, but is of course still at noticeably higher magnification (zoomed in) relative to Figs. 3 and 5. The user can thus change the zoom level repeatedly during a given procedure or operation, zooming in and out as often as needed.
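One common way to model the digital zoom described above is to rescale the virtual camera's field-of-view angle while leaving the viewpoint fixed. This is an illustrative sketch of that approach, with hypothetical names and values; the patent does not specify this exact formula.

```python
import math

def zoomed_fov(base_fov_deg, digital_zoom):
    """Effective field-of-view angle of the virtual camera under digital zoom.
    zoom > 1 narrows the FOV (zoom in); zoom < 1 widens it (zoom out),
    revealing model geometry outside the optical field of view.
    The viewpoint itself never moves."""
    half = math.radians(base_fov_deg) / 2.0
    return 2.0 * math.degrees(math.atan(math.tan(half) / digital_zoom))

wide = zoomed_fov(40.0, 0.5)   # zoom out: wider than the optical 40-degree FOV
tight = zoomed_fov(40.0, 2.0)  # zoom in: narrower FOV
```

Scaling the tangent of the half-angle, rather than the angle itself, matches how a pinhole projection's image scales, so the real and virtual images stay geometrically aligned at every zoom level.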
In a neurosurgical procedure, the surgeon may, for example, use probe 140 to perform image registration and to select an entry point by navigating with the probe. The microscope is then brought in for fine navigation and guidance. During surgery, because navigating by moving probe 140 is easier than navigating by moving the microscope, the surgeon may need to navigate with probe 140 frequently. For this exemplary use case, the system supports fast and smooth switching between the two navigation methods.
Fig. 8 shows the exemplary scenario of Fig. 7 from the viewpoint of the miniature camera inside the probe. The microscope's focal point and optical path can be displayed together with the tumor and blood vessels, showing the 3D relationship among the microscope, the surgical area, and the virtual objects (vessels and tumor).
Figs. 9-11 are actual screenshots from an exemplary embodiment of the present invention. Fig. 9 shows an exemplary navigation view from the surgical microscope according to an exemplary embodiment of the present invention.
Fig. 10 shows an exemplary view of Fig. 9 after digital zoom-out using the technique described above in connection with Fig. 7, according to an exemplary embodiment of the present invention.
Fig. 11 shows an exemplary augmented reality navigation image from an exemplary probe according to an exemplary embodiment of the present invention, somewhat like that shown in Fig. 8 but without the microscope's optical path and focal point.
In exemplary embodiments according to the present invention, the selection between the microscope and the probe can be performed automatically. Automatic selection can be based on tracking data (i.e., be a function of the tracking data). In exemplary embodiments according to the present invention, automatic selection can be implemented by giving the probe higher priority. If only the microscope's tracking data is available, the microscope is selected as the navigation tool and its AR image is displayed. If both the microscope and the probe are tracked, the probe is selected and its AR image is displayed; in this case the microscope can, for example, be ignored. When the probe is not tracked, the microscope can, for example, be selected automatically for navigation. The video image can be switched automatically accordingly.
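The probe-over-microscope priority rule just described can be sketched in a few lines; the function name and the string return values are hypothetical.

```python
def select_navigation_tool(probe_tracked, scope_tracked):
    """Automatic tool selection based on tracking availability:
    the probe wins whenever its markers are tracked; otherwise fall back
    to the microscope; with neither tracked there is no navigation source."""
    if probe_tracked:
        return "probe"       # probe has higher priority; microscope is ignored
    if scope_tracked:
        return "microscope"  # probe not tracked: microscope navigates
    return None              # nothing tracked: no AR image to display
```

The display and video feed then follow whichever tool this rule returns, so switching between macroscopic (probe) and microscopic navigation requires no explicit user action.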
The systems, methods and devices of the present invention thus let the user see "outside the normal view" in both macroscopic surgery and microsurgery. This allows the user to always know how close he or she actually is to highly sensitive or critical concealed structures.

Claims (19)

1. An integrated surgical navigation and visualization system, comprising:
a microscope;
at least one camera attached to said microscope;
a computer;
a microscope camera model stored in said computer;
a probe;
a camera attached to said probe;
a probe camera model stored in said computer;
a tracking device for determining the pose (position and orientation) of said probe and of said microscope;
three-dimensional patient image data pre-stored in said computer; and
a display;
wherein said computer automatically selects the pose data and video image associated with either said probe or said microscope for use in surgical navigation.
2. The system of claim 1, wherein said automatic selection is based on the tracking data and a defined priority algorithm relating the probe view and the microscope view.
3. The system of claim 1, wherein the magnification and focus of said microscope are adjustable, and wherein sensors detect the magnification value and the focus value and pass these data to said computer.
4. The system of claim 1, wherein a virtual microscope camera, having imaging attributes, position and orientation matching those of the camera attached to said microscope, can be generated from said microscope camera model, the microscope tracking data, and the microscope's zoom level and focus value.
5. The system of claim 1, wherein said microscope has a focal point, and the position of the focal point relative to the patient can be determined from said microscope's focus value and said microscope's tracking data.
6. The system of claim 4, wherein the video image from the camera attached to said microscope is augmented by a virtual image, said virtual image being generated by said computer from said three-dimensional patient image data according to said virtual microscope camera, and the composite image is shown on the display.
7. The system of claim 1, wherein a virtual probe camera, having imaging attributes, position and orientation matching those of the camera attached to said probe, can be generated from the probe camera model and the probe tracking data.
8. The system of claim 4, wherein the video image from the camera attached to said probe is augmented by said virtual image, said virtual image being generated by said computer from said three-dimensional patient image data according to said virtual probe camera, and the composite image is shown on said display.
9. The system of claim 2, wherein the user can control said selection by activating a physical or voice interface.
10. A surgical navigation and visualization method, comprising: obtaining three-dimensional image data from a patient; storing said three-dimensional image data in a data processor; registering said three-dimensional image data with said patient; obtaining a real-time video image of the patient from a camera attached to a microscope; tracking the position and orientation of said microscope; receiving said microscope's zoom level and focus value; constructing a virtual microscope camera from a microscope camera model and the tracking data, zoom level and focus value of said microscope; generating a virtual image of the patient's three-dimensional image data; producing an augmented reality image by superimposing said real-time video image on said virtual image; and displaying said augmented reality image on one or more displays.
11. The method of claim 10, wherein said augmented reality image can be digitally zoomed without changing said microscope's position, zoom level, or focus value.
12. The method of claim 11, wherein, in said digitally zoomed augmented reality image, the real image and the virtual image are geometrically aligned with each other.
13. The method of claim 12, wherein said augmented reality image is zoomed out, generating and displaying the virtual image of the patient's three-dimensional image data outside the field of view of said real image, with said real image from said camera partially overlaid thereon.
14. The method of claim 13, comprising automatically selecting a probe with an attached camera to replace said microscope for surgical navigation.
15. The method of claim 14, comprising: obtaining a real-time video image of the patient from the camera attached to said probe; tracking the position and orientation of said probe; constructing a virtual probe camera from a probe camera model and the tracking data; generating a virtual image of the patient's three-dimensional image data according to said virtual probe camera; and producing an augmented reality image by superimposing said real-time video image on said virtual image.
16. The method of claim 15, wherein said augmented reality image can be digitally zoomed without changing said probe's position.
17. The method of claim 16, comprising obtaining a real-time video image of the patient from the camera attached to said probe while said microscope remains in the surgical environment; generating a virtual image of the focal point, the optical axis, and the patient's three-dimensional image data according to said virtual probe camera; and producing an augmented reality image by superimposing said real-time video image on said virtual image.
18. The method of claim 10, comprising obtaining navigation views during microsurgery according to changes in the position and orientation of said probe.
19. The method of claim 18, wherein the anatomical structures around the surgical area can be displayed from the viewpoint of said probe camera, together with said microscope's focal point and optical axis.
CNA2006800149607A 2005-03-11 2006-03-13 Methods and devices for surgical navigation and visualization with microscope Pending CN101170961A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US66084505P 2005-03-11 2005-03-11
US60/660,845 2005-03-11
SGPCT/SG/2005/00244 2005-07-20

Publications (1)

Publication Number Publication Date
CN101170961A true CN101170961A (en) 2008-04-30

Family

ID=39391306

Family Applications (1)

Application Number Title Priority Date Filing Date
CNA2006800149607A Pending CN101170961A (en) 2005-03-11 2006-03-13 Methods and devices for surgical navigation and visualization with microscope

Country Status (1)

Country Link
CN (1) CN101170961A (en)


Cited By (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102448398A (en) * 2009-06-01 2012-05-09 皇家飞利浦电子股份有限公司 Distance-based position tracking method and system
WO2011069469A1 (en) * 2009-12-11 2011-06-16 Hospital Authority Stereoscopic visualization system for surgery
CN102946784A (en) * 2010-06-22 2013-02-27 皇家飞利浦电子股份有限公司 System and method for real-time endoscope calibration
US10290076B2 (en) 2011-03-03 2019-05-14 The United States Of America, As Represented By The Secretary, Department Of Health And Human Services System and method for automated initialization and registration of navigation system
CN103402453A (en) * 2011-03-03 2013-11-20 皇家飞利浦有限公司 System and method for automated initialization and registration of navigation system
CN103957834B (en) * 2011-12-03 2017-06-30 皇家飞利浦有限公司 Rolled for the automatic depth of semi-automatic path planning and direction regulation
US10758212B2 (en) 2011-12-03 2020-09-01 Koninklijke Philips N.V. Automatic depth scrolling and orientation adjustment for semi-automated path planning
CN103957834A (en) * 2011-12-03 2014-07-30 皇家飞利浦有限公司 Automatic depth scrolling and orientation adjustment for semiautomated path planning
CN104582624A (en) * 2012-06-29 2015-04-29 儿童国家医疗中心 Automated surgical and interventional procedures
CN104582624B (en) * 2012-06-29 2018-01-02 儿童国家医疗中心 Automatic surgical operation and intervention procedure
US10675040B2 (en) 2012-06-29 2020-06-09 Children's National Medical Center Automated surgical and interventional procedures
CN103006332B (en) * 2012-12-27 2015-05-27 广东圣洋信息科技实业有限公司 Scalpel tracking method and device and digital stereoscopic microscope system
CN103006332A (en) * 2012-12-27 2013-04-03 广东圣洋信息科技实业有限公司 Scalpel tracking method and device and digital stereoscopic microscope system
CN104104862B (en) * 2013-04-04 2018-12-07 索尼公司 Image processing apparatus and image processing method
CN104104862A (en) * 2013-04-04 2014-10-15 索尼公司 Image processing device and image processing method
US10307210B2 (en) 2013-04-30 2019-06-04 Koh Young Technology Inc. Optical tracking system and tracking method using the same
CN105142561A (en) * 2013-04-30 2015-12-09 株式会社高永科技 Optical tracking system and tracking method using same
CN105264571A (en) * 2013-05-30 2016-01-20 查尔斯·安东尼·史密斯 Hud object design and method
CN105264571B (en) * 2013-05-30 2019-11-08 查尔斯·安东尼·史密斯 HUD object designs and method
EP2949285A1 (en) * 2014-05-27 2015-12-02 Carl Zeiss Meditec AG Surgical microscope
US9933606B2 (en) 2014-05-27 2018-04-03 Carl Zeiss Meditec Ag Surgical microscope
CN106456273A (en) * 2014-06-20 2017-02-22 索尼奥林巴斯医疗解决方案公司 Medical observation device and medical observation system
CN106456273B (en) * 2014-06-20 2019-08-02 索尼奥林巴斯医疗解决方案公司 Medical observation device and medical observing system
CN107105972B (en) * 2014-11-30 2019-02-01 埃尔比特系统公司 Model register system and method
CN107105972A (en) * 2014-11-30 2017-08-29 埃尔比特系统公司 Model register system and method
CN107111122A (en) * 2014-12-29 2017-08-29 诺华股份有限公司 Amplification and associated device in ophthalmologic operation, system and method
CN107530133A (en) * 2015-05-14 2018-01-02 诺华股份有限公司 Surigical tool is tracked to control surgical system
US10986990B2 (en) 2015-09-24 2021-04-27 Covidien Lp Marker placement
CN106551696B (en) * 2015-09-24 2019-12-06 柯惠有限合伙公司 Mark placement
US11672415B2 (en) 2015-09-24 2023-06-13 Covidien Lp Marker placement
US10925675B2 (en) 2015-09-24 2021-02-23 Synaptive Medical Inc. Motorized full field adaptive microscope
CN106551696A (en) * 2015-09-24 2017-04-05 柯惠有限合伙公司 Labelling is placed
CN108348295A (en) * 2015-09-24 2018-07-31 圣纳普医疗(巴巴多斯)公司 Motor-driven full visual field adaptability microscope
CN109478346A (en) * 2016-08-04 2019-03-15 诺华股份有限公司 It is experienced using the ophthalmologic operation that virtual reality head-mounted display enhances
CN109478346B (en) * 2016-08-04 2024-02-13 爱尔康公司 Enhanced ophthalmic surgical experience using virtual reality head mounted display
CN106772996A (en) * 2017-01-23 2017-05-31 清华大学 A kind of augmented reality operating method and system
CN110494921A (en) * 2017-03-30 2019-11-22 诺瓦拉德公司 Utilize the RUNTIME VIEW of three-dimensional data enhancing patient
CN110494921B (en) * 2017-03-30 2023-11-28 诺瓦拉德公司 Enhancing real-time views of a patient with three-dimensional data
CN114488501A (en) * 2018-04-25 2022-05-13 卡尔蔡司医疗技术股份公司 Microscope system and method for operating a microscope system
CN112584789A (en) * 2018-06-19 2021-03-30 托尼尔公司 Mixed reality surgical system with physical markers registering virtual models
CN109300387A (en) * 2018-12-05 2019-02-01 济南大学 A kind of virtual microscopic material object interaction suite and its application
CN109326166B (en) * 2018-12-05 2020-11-06 济南大学 Virtual microscope object kit and application thereof
CN109326166A (en) * 2018-12-05 2019-02-12 济南大学 A kind of virtual microscopic material object external member and its application
CN109801368A (en) * 2019-02-26 2019-05-24 浙江未来技术研究院(嘉兴) A kind of microscope visual area light field image merges display methods and device
CN109801368B (en) * 2019-02-26 2023-06-13 浙江未来技术研究院(嘉兴) Microscopic operation field light field image fusion display method and device
CN114730082A (en) * 2019-05-29 2022-07-08 S·B·墨菲 System and method for utilizing augmented reality in surgery
CN110264504B (en) * 2019-06-28 2021-03-30 北京国润健康医学投资有限公司 Three-dimensional registration method and system for augmented reality
CN110264504A (en) * 2019-06-28 2019-09-20 北京国润健康医学投资有限公司 A kind of three-dimensional registration method and system for augmented reality
CN110490130A (en) * 2019-08-16 2019-11-22 腾讯科技(深圳)有限公司 Intelligent optical data processing method, device and computer readable storage medium
CN113413207A (en) * 2021-06-22 2021-09-21 南京康友医疗科技有限公司 3D visual medical operation system

Similar Documents

Publication Publication Date Title
CN101170961A (en) Methods and devices for surgical navigation and visualization with microscope
US20060293557A1 (en) Methods and apparati for surgical navigation and visualization with microscope ("Micro Dex-Ray")
Wang et al. Augmented reality navigation with automatic marker-free image registration using 3-D image overlay for dental surgery
US6483948B1 (en) Microscope, in particular a stereomicroscope, and a method of superimposing two images
ES2292593T3 (en) GUIDING SYSTEM
US7912532B2 (en) Method and instrument for surgical navigation
CN110638527B (en) Operation microscopic imaging system based on optical coherence tomography augmented reality
US7491198B2 (en) Computer enhanced surgical navigation imaging system (camera probe)
US20130250081A1 (en) System and method for determining camera angles by using virtual planes derived from actual images
US20040254454A1 (en) Guide system and a probe therefor
CN106456271A (en) Alignment of q3d models with 3d images
JP7460631B2 (en) ENDOSCOPY HAVING DUAL IMAGE SENSORS - Patent application
CN105555221A (en) Medical needle path display
EP0711421B1 (en) Operating microscope
Liao et al. Intra-operative real-time 3-D information display system based on integral videography
CN117413060A (en) Augmented reality system for real space navigation and surgical system using the same
CN114365214A (en) System and method for superimposing virtual image on real-time image
Paloc et al. Computer-aided surgery based on auto-stereoscopic augmented reality
Salb et al. INPRES (intraoperative presentation of surgical planning and simulation results): augmented reality for craniofacial surgery
US11628037B2 (en) System and method for viewing a subject
Liao et al. Real-time 3D-image-guided navigation system based on integral videography
Akatsuka et al. Navigation system for neurosurgery with PC platform
García et al. Calibration of a surgical microscope with automated zoom lenses using an active optical tracker
US20230181262A1 (en) Devices and Methods for Imaging and Surgical Applications
US20230363830A1 (en) Auto-navigating digital surgical microscope

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication