Tool Navigator
The present invention relates to a system for providing a user with data indicating the location of the tip of a tool and the orientation of the tip portion of the tool relative to an ultrasound image plane, and also to methods and apparatus for providing that data to a user.

Ultrasound imaging is a useful technique for guiding tools in situations where they are not externally visible. In particular, ultrasound imaging techniques are often used in surgery and are continuing to be developed for this application. One problem with known ultrasound imaging techniques, however, is that a surgeon may not always be able to tell from the ultrasound images obtained where the tool is located in space. This is of vital importance in surgery in particular.

International patent application WO 96/25881 describes an ultrasound guided surgical system in which the point at which the trajectory of a tool intersects the image plane of a 2-dimensional ultrasound image may be superimposed onto the ultrasound image. However, in this instance, the surgeon will only be provided with data indicating the location and orientation of the tool when the trajectory of the tool intersects the portion of the image plane that is displayed. Additionally, the data is provided to the surgeon in the ultrasound image itself, so that it will be relatively difficult to understand.

Ideally, a means should be provided of allowing a surgeon or other user of ultrasound imagery to ascertain the location of the tip of a tool and the orientation of the tool relative to an ultrasound image plane quickly and easily, without having to digest the complex information contained in the ultrasound image.
Therefore, from a first aspect, the present invention provides a system for providing a user with data indicating the location of the tip of a tool and the orientation of the tip portion of the tool relative to an ultrasound image plane, wherein the data is presented separately from the ultrasound image so that the location and orientation of the tool are easily discernible regardless of the location and orientation of the tool relative to the ultrasound image plane.
Although the location of the tool may be derived from the ultrasound data by identifying data corresponding to the tool within the data set, preferably the location of the tool is determined independently. Various suitable positioning systems are known and several are discussed in WO 96/25881. This approach is fast and enables the location of the tool to be determined even when it is some distance from the imaged region.
It will be appreciated that the data could be provided to the user in a separate and easily understandable manner using many different methods. For example, the system may comprise means for representing the data in graphic form, which can be presented to the user in many different ways. For example, the data could be provided on a monitor which was separate from the rest of the ultrasound imaging system. Preferably, however, an ultrasound image is provided on a first part of a monitor screen and the data in graphic form is provided on a second part of that monitor screen. For example, the graphic data could be provided in one corner of the monitor screen showing the ultrasound image. This would allow the user to ascertain the exact position and orientation of the tip of the tool at regular intervals without having to turn away from the ultrasound image itself. The graphic data may be provided to the user in many different forms. For example, the numerical values of the distance of the tool tip from the image plane and the angle between the tool tip portion and the image plane could simply be provided. However, this would clearly not be particularly easy for a user to understand at a glance. Thus, in one embodiment, to display the data to the user, the ultrasound image plane is indicated by a line; the shortest distance between the tool tip and the image plane is represented by an indicia spaced from that line by a distance representing the tip-to-plane distance; and the angle of the tool tip portion trajectory relative to the image plane is represented by a colour or grey scale coded area.
In one embodiment, the line indicating the ultrasound image plane may extend vertically. Thus, the indicia may be a line or block spaced horizontally from the vertical line. Further, the colour or grey scale area may be a block extending to one side or the other of the vertical line so as to show the direction in which the tool is facing.
Alternatively, in another preferred embodiment, the data is displayed to the user by means of virtual objects representing the tool and the ultrasound image plane in 3-dimensional space, in which the virtual objects are arranged relative to one another as in the real situation, and wherein the virtual objects may be viewed from any chosen location and in any direction to produce a 2-dimensional image. As will be immediately apparent, either of these embodiments provides the data to the user in a way which allows them to obtain the necessary information from only a quick glance. Although the graphical feedback to the user as described above would work well, it is also possible to provide the data to the user in acoustic form. This could be used either alone or in combination with the graphical data representation. The acoustic representation of the data has the advantage that the user could be made aware of the location and orientation of the probe even when they were not free to look at the graphical representation.
Any form of varying acoustic signal could be used to represent the data to the user. Preferably, however, the pitch of the signal is used to indicate the angle between the tool and the image plane, and the signal may pulse at a rate which indicates the location of the tool tip. (The opposite arrangement could also be used.) For example, the pitch of the acoustic signal may be relatively low when the tip portion of the tool is parallel to the image plane and increase as the angle between the tip portion of the tool and the image plane increases. The frequency of the pulses may be relatively low when the tip of the tool is located on the image plane and increase as the distance between the tip of the tool and the image plane increases.
In some situations, the user might wish to vary the maximum distance between the tip of the tool and the image plane for which they are provided with data. Consequently, the system may include first control means for setting the maximum distance between the tool tip and the image plane for which the data is provided to the user to correspond to a maximum distance chosen by the user.
Also, although it would often be useful to the user to know when the tool is at its maximum angle to the ultrasound image plane (i.e. at 90°), the system may also comprise second control means for setting the maximum angle between the tool tip portion and the image plane for which the data is provided to the user to correspond to a maximum angle chosen by the user.

It is further believed that each of the above described means of providing data to a user is inventive in its own right. Thus, from a second aspect, the present invention provides a method of graphically representing the position of the tip of a tool and the orientation of the tip portion of a tool relative to an ultrasound image plane to a user, comprising the steps of: determining the shortest distance between the tool tip and the image plane and the angle between the tool tip portion trajectory and the image plane; displaying the image plane as a line on a monitor screen; representing the distance between the tool tip and the image plane as an indicia spaced from the line by a distance representing that distance; and representing the angle of the tool tip portion trajectory relative to the image plane by a colour or grey scale coded area.
Likewise, from a third aspect, the present invention provides an apparatus for graphically representing the position of the tip of a tool and the orientation of the tip portion of a tool relative to an ultrasound image plane to a user, comprising: means for determining the shortest distance between the tool tip and the image plane and the angle between the tool tip portion trajectory and the image plane; means for displaying the image plane as a line on a monitor screen; means for representing the distance between the tool tip and the image plane as an indicia spaced by a distance representing that distance from the line; and means for representing the angle of the tool tip portion trajectory relative to the image plane by a colour or grey scale coded area.
In one embodiment, the line indicating the ultrasound image plane may extend vertically. Thus, the indicia may be a line or block spaced horizontally from the vertical line. Further, the colour or grey scale area may be a block extending to one side or the other of the vertical line so as to show the direction in which the tool is facing.
From a fourth aspect, the present invention provides a method of providing a user with data indicating the position of the tip of a tool and the orientation of the tip portion of a tool relative to an ultrasound image plane produced by an ultrasound probe, comprising the steps of: representing the tool and image plane by virtual objects in 3-dimensional space; controlling the movement of the virtual objects in 3-dimensional space so that the virtual objects follow the movements of the tool and the image plane in real space; and viewing the virtual objects so as to produce a 2-dimensional image of the virtual objects.
Preferably, the virtual objects may be viewed from any chosen position to produce the 2-dimensional image. Thus, the user may vary the image in order to obtain the optimum information at all times.
In one useful embodiment, the direction of viewing the virtual objects corresponds to the user's direction of view of the real objects. Thus a realistic view of the relationship between the objects may be obtained. Alternatively, the direction of viewing may correspond to the direction of the longitudinal extent of the ultrasound probe. This would allow the user to see directly how the tool is oriented relative to the image plane, and also how to move either the tool or the ultrasound probe in order to obtain a desired relative orientation between the tool and image plane.
In order to further assist the user, the ultrasound probe and/or a position sensor attached to the ultrasound probe may also be represented by a virtual object.
From a further aspect, the present invention provides an apparatus for providing a user with data indicating the position of the tip of a tool and the orientation of the tip portion of a tool relative to an ultrasound image plane produced by an ultrasound probe, comprising: means for representing the tool and image plane by virtual objects in 3-dimensional space; means for controlling the movement of the virtual objects in 3-dimensional space so that the virtual objects follow the movements of the tool and the image plane in real space; and means for viewing the virtual objects so as to produce a 2-dimensional image of the virtual objects.

From a yet further aspect, the present invention provides a method of acoustically providing a user with data representing the location of the tip of a tool and the orientation of the tip portion of a tool relative to an ultrasound image plane, comprising the steps of: determining the shortest distance between the tool tip and the image plane and the angle between the tool tip portion trajectory and the image plane; providing an acoustic signal to the user; controlling the frequency of pulses of the acoustic signal so as to indicate the distance of the tool tip from the image plane; and controlling the pitch of the signal so as to indicate the angle between the tool tip portion trajectory and the image plane.
From another aspect, the present invention provides an apparatus for acoustically providing a user with data representing the location of the tip of a tool and the orientation of the tip portion of a tool relative to an ultrasound image plane, comprising: means for determining the shortest distance between the tool tip and the image plane and the angle between the tool tip portion trajectory and the image plane; means for providing an acoustic signal to the user; means for controlling the frequency of pulses of the acoustic signal so as to indicate the distance of the tool tip from the image plane; and means for controlling the pitch of the signal so as to indicate the angle between the tool tip portion trajectory and the image plane.

When the data concerning the location and orientation of the tool tip are to be determined independently of the ultrasound data, the location and orientation of the tool may conveniently be determined using a sensor attached to the tool. If this is done, in order to obtain the data representing the location of the tip of the tool and the orientation of the tool tip portion, it is necessary to calibrate the tool to ascertain the exact location of the tip of the tool with respect to a position sensor on the tool.
Thus, from a yet further aspect, the present invention provides a method of determining the location of the tip of a tool and the orientation of the tip portion of a tool relative to a position sensor attached to the tool, the method comprising the steps of: placing the tool in a calibration device such that the locations in a global coordinate system of the tip of the tool and of another point on the tool removed from the tip are known; measuring the position of the position sensor in the global coordinate system; and establishing a mathematical transformation for calibrating the tip of the tool relative to the position sensor.

The apparatus with which the calibration is carried out is also believed to be novel and inventive and so, from another aspect, the invention provides an apparatus for determining the location of the tip of a tool and the orientation of the tip portion of a tool relative to a position sensor attached to the tool, the apparatus comprising: a calibration device for holding the tool, the calibration device comprising means for determining the location in a global coordinate system of the tip of the tool and of another point on the tool which is removed from the tip.
The apparatus may further comprise means for establishing a mathematical transformation for calibrating the tip of the tool relative to the known location in the global coordinate system of the position sensor.
In one preferred embodiment, the calibration device comprises a main body part having a passage extending through it, such that the tool may be supported in the passage. In one possible means for determining the location of the tip of the tool, the means comprises a first support in the passage through which the tool passes, which is in a known position, and a substantially vertically extending planar surface at one end of the passage against which the tool tip abuts.
Preferably, the means for determining the location of the other point on the tool comprises a second support provided at a known position in the passage through which the tool passes, wherein the second support is removed from the first support.
Preferably, the apparatus further comprises a chuck mechanism which can be clamped together to hold the tool in a known position within the passage. Thus, the tool can be easily inserted into the calibration device and then, once the chuck mechanism is closed, will be securely held in a known position.

Preferred embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 shows a cross section through a calibration device according to the invention; Figures 2a to 2f show the appearance of part of a monitor screen which includes a navigation panel for displaying data to a user according to a first embodiment of the invention for various locations and orientations of a tool relative to the ultrasound image plane;
Figures 3a and 3b show a control panel including a tool navigator according to one embodiment of the invention;
Figures 4a to 4c show a monitor screen displaying data to a user according to an alternative embodiment of the invention;
Figure 5 is a schematic view showing the use of ultrasound in surgery; and
Figure 6 is a schematic representation of the angle and distance between the tool tip and the image plane.

As can be seen in Figure 5, an ultrasound probe 1 is positioned so as to obtain real-time ultrasound
images of a part of the body which is being operated on. A user can move a handheld surgical tool 2 freely so as to perform the surgery.
Figure 1 illustrates a tool, the position and orientation of which is to be determined and displayed to a surgeon.
As shown in the Figure, the surgical tool 2 has a position sensor 4 attached to the end of it remote from the tip. The position and orientation of the sensor in space may readily be determined in the conventional manner with reference to a global co-ordinate frame. However, to determine the position of the tip 6 of the tool and the orientation of the tip portion of the tool relative to the position sensor 4, the tool must be calibrated before use.
Thus, a calibration device 8 is provided. This device is provided on a board 10 having a reference position sensor 12, the position of which is also determined in the global coordinate frame. Two separate chuck mechanisms 14 and 16 are provided on the board. The first of these chuck mechanisms, 14, has an aperture 18 within it through which the tool passes. This aperture is blocked at one end by a vertically extending planar member 20. The tool is located such that the tip 6 of the tool abuts the planar member 20.
The second chuck 16 is spaced from the first chuck 14 by a variable distance along the board. The second chuck 16 has an aperture 22 through which the tool passes. The board is constructed so that the direction D from the centre of the first chuck 14 to the centre of the second chuck 16 will always be known. The second chuck 16 is moveable relative to the first chuck 14 in this known direction D so that tools of different sizes can be calibrated by the device. The direction D can be made to correspond to one of the axes of the reference sensor frame system. Thus, when both the first 14 and
second 16 chucks have been closed, the tool is held securely in such a way that both the location of the tip 6 and the orientation of the tip portion are known. The location of the position sensor is then measured and the necessary calculations are performed to calculate the position of the tool tip and the orientation of the tip portion relative to the position sensor.
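The calibration calculation described above may be sketched as follows. This is an illustrative sketch only: the function name, the argument layout and the construction of the two free axes of the tip frame are assumptions, with poses held as 4×4 homogeneous matrices.

```python
import numpy as np

def calibrate_tool_tip(g_M_lo, tip_global, dir_global):
    """Illustrative tool calibration: given the measured sensor pose g_M_lo
    (4x4 homogeneous matrix, sensor frame -> global) and the known global
    tip location and tool-axis direction D from the calibration device,
    return lo_M_tt, the pose of the tip frame in the sensor frame."""
    d = dir_global / np.linalg.norm(dir_global)
    # Build an orthonormal frame with one axis along the tool axis; the
    # other two axes are arbitrary, as the tip is symmetric.
    tmp = np.array([0.0, 0.0, 1.0]) if abs(d[2]) < 0.9 else np.array([0.0, 1.0, 0.0])
    y = np.cross(d, tmp)
    y /= np.linalg.norm(y)
    z = np.cross(d, y)
    g_M_tt = np.eye(4)
    g_M_tt[:3, 0], g_M_tt[:3, 1], g_M_tt[:3, 2] = d, y, z
    g_M_tt[:3, 3] = tip_global
    # lo_M_tt = (g_M_lo)^-1 . g_M_tt
    return np.linalg.inv(g_M_lo) @ g_M_tt
```

In use, the device supplies `tip_global` (the tip abutting the planar member 20) and `dir_global` (the known direction D), while the position sensing system supplies `g_M_lo`.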
The calibration device 8 is formed as a rigid body. Thus, the apertures 18, 22 in the chucks 14 and 16 can be made to point in a known direction relative to the coordinate system of the reference frame, and this direction will not alter, due to the rigidity of the device.
In an alternative embodiment, a single moveable chuck is provided on the device. This chuck is moveable in a known direction relative to a vertically extending planar member which is fixed relative to the sensor frame.
A co-ordinate system with its origin at the tool tip is defined. The definition is arbitrary except for one axis, which is aligned with the tip portion, since the tip is symmetric. However, it is convenient to set the orientation coplanar with or normal to the sensor frame if the orientation of the sensor frame relative to the tool is such that a coplanar or normal orientation would be comfortable for the surgeon using the navigator. The co-ordinate system for the sensor is read directly from the position sensing system. The result of the instrument calibration is the transformation matrix
$^{lo}M_{tt}$, where tt represents the tip of the tool and lo represents the sensor coordinate system (the local system, as opposed to the global reference coordinate system). Equation 1 shows the matrix in its standard homogeneous form:

$$^{lo}M_{tt} = \begin{bmatrix} R_{3\times 3} & \vec{t} \\ 0 & 1 \end{bmatrix} \qquad (1)$$
The tip of the tool in the global system is given by the following transformation:

$$\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix}_{g} = {}^{g}M_{lo} \cdot {}^{lo}M_{tt} \cdot \begin{bmatrix} 0 \\ 0 \\ 0 \\ 1 \end{bmatrix} \qquad (2)$$

The matrix $^{g}M_{lo}$ is the position of the sensor provided by the position sensing system, and the rightmost vector represents the position of the tool tip in its own co-ordinate system (the tip is the origin of that system).
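Equation 2 amounts to composing two homogeneous transformations and applying them to the origin; a minimal sketch (the function name is an assumption) is:

```python
import numpy as np

def tool_tip_global(g_M_lo, lo_M_tt):
    """Tip of the tool in the global system, per equation (2): the tip is
    the origin of its own co-ordinate frame, i.e. the homogeneous point
    (0, 0, 0, 1)."""
    tip = g_M_lo @ lo_M_tt @ np.array([0.0, 0.0, 0.0, 1.0])
    return tip[:3]
```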
One advantage of the calibration device described above is that it allows a surgeon to move the position sensor on the tool during a surgical procedure and merely to recalibrate the tool quickly before continuing his work. Thus, the position sensor may be located on the tool at whatever point is most convenient to the surgeon.
In order to ascertain the location and orientation of the tool relative to an ultrasound image, it is also necessary to calibrate the ultrasound probe used to acquire the image. This is a more complex procedure than the calibration of the tool. However, several methods of calibrating ultrasound probes are known in the art and so will not be further discussed here. A general discussion of probe calibration methods is included in the paper by Prager et al. entitled "Rapid calibration for 3-D freehand ultrasound", Ultrasound in Medicine and Biology, Volume 24, No. 6, pages 855 to 869.
In order to avoid having to recalibrate the ultrasound probe for each use, a clip-on adapter is glued to the probe. The position sensor can then be placed in the adapter and removed after use without any need for recalibration. Similarly, the entire sensor frame may be fixed to certain tools, such as biopsy forceps, prior to use and is sterilised with the tool. For some other surgical tools, however, it is preferable that the surgeon is able to decide where on the tool the sensor frame should be attached during the procedure, and so in this instance the sensor frame is not permanently attached to the tool.
The result of the calibration is a matrix

$$^{lo,probe}M_{us} \qquad (3)$$

analogous to the instrument calibration matrix. A point in the ultrasound image is then given by the following transformation:

$$\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix}_{g} = {}^{g}M_{lo,probe} \cdot {}^{lo,probe}M_{us} \cdot \begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix}_{us} \qquad (4)$$
These, then, are the calibrations necessary in order to know the position and orientation of the tip of the surgical instrument relative to the sensor attached to the instrument, and the position and orientation of the ultrasound image relative to the sensor attached to the ultrasound probe.
In order to obtain data showing the position of the tip of the tool and the orientation of the tip portion of the tool relative to the ultrasound image plane, a navigator program is employed. The positions of the sensors attached to the ultrasound probe and the surgical instrument are read continuously. The navigator reads one position from each sensor as many times as possible within a short enough time interval that the tool does not move significantly while the positions are read and averaged. For example, the navigator may calculate the average of the last three positions for each sensor before further calculations. This average is represented by the matrices $^{g}M_{lo,probe}$ and $^{g}M_{lo,instr}$.
The following matrix multiplications give the position and orientation in the global coordinate system of the ultrasound image and the tip portion of the tool respectively:

$$^{g}M_{us} = {}^{g}M_{lo,probe} \cdot {}^{lo,probe}M_{us} \qquad (5)$$

$$^{g}M_{tt} = {}^{g}M_{lo,instr} \cdot {}^{lo,instr}M_{tt}$$
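These multiplications, together with the averaging of recent sensor readings, might be sketched as follows. The names are illustrative, and the element-wise averaging of poses is only an approximation, justified here by the assumption in the text that the tool barely moves within the sampling interval:

```python
import numpy as np

def averaged_pose(poses):
    """Naive element-wise average of the last few 4x4 sensor poses.
    Only approximately a valid pose, acceptable because the sensors move
    negligibly between the averaged readings."""
    return np.mean(poses, axis=0)

def navigator_matrices(g_M_lo_probe, lo_probe_M_us, g_M_lo_instr, lo_instr_M_tt):
    """Equations (5): the ultrasound image and the tool tip portion
    expressed in the global coordinate system."""
    g_M_us = g_M_lo_probe @ lo_probe_M_us
    g_M_tt = g_M_lo_instr @ lo_instr_M_tt
    return g_M_us, g_M_tt
```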
It is now possible to utilise this information to calculate various properties, which are presented to the surgeon. The two most obvious properties to be calculated are the distance (normal) between the ultrasound image and the tip of the tool, and the angle between the tip portion of the tool and the image plane. With these properties, and continuous updates on them, the surgeon will be able to adjust either the probe or the instrument in order to obtain a longitudinal cross-sectional view of the instrument in the image plane.
To calculate the distance between the tool tip and the ultrasound plane, firstly the shortest distance between the tip of the tool and the ultrasound image plane, which is equal to the length of a vector extending perpendicular to the image plane from the tool tip, is determined. This distance can be calculated as follows. Let the image plane be defined by three points on the plane, P, Q and R, and let A be a point not on the image plane. The shortest distance d between the point A and the image plane can be calculated from equation 6:

$$d = \left| (\vec{A} - \vec{P}) \cdot \hat{u} \right| \qquad (6)$$
The point A defines the tool tip in the global coordinate system; P is set to the image plane origin; Q is a point along the y-axis of the image plane (some distance greater than zero from P); R is a point along the z-axis of the image plane (some distance greater than zero from P); and $\hat{u}$ is a unit vector along the x-axis of the image plane, perpendicular to the image plane. All these data (A, P, Q, R and $\hat{u}$) are readily available from the position readings of the position sensing system and the calibration matrices.
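The distance of equation 6 is the projection of the vector from P to A onto the plane normal; a minimal sketch (the function name is an assumption) is:

```python
import numpy as np

def tip_plane_distance(A, P, u_hat):
    """Shortest distance between tool tip A and the image plane, per
    equation (6): the length of the projection of (A - P) onto the unit
    plane normal u_hat (the image-plane x-axis)."""
    return abs(np.dot(A - P, u_hat))
```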
The distance between the tool tip and the ultrasound image origin is simply equal to the length of the vector between points A and P. Next, the angle φ between the tool tip extension trajectory line T and the image plane is determined. This angle can be found from the equation for the dot product between two vectors, given by:
$$\vec{T} \cdot \hat{u} = |\vec{T}| \, |\hat{u}| \cos(\alpha) \qquad (7)$$

where α is the angle between the x-axis of the image plane and the tool tip extension trajectory line.
There are three possible situations: 1) the tool is pointing completely or partially along the negative direction of the image plane x-axis (illustrated in Figure 6); 2) the tool is pointing completely or partially along the positive direction of the image plane x-axis; or 3) the tool is pointing perpendicularly to the image plane x-axis, in which case the tool is parallel with the image plane. Which case applies can be determined from the angle α: α > 90° corresponds to case 1), α < 90° corresponds to case 2), while α = 90° corresponds to case 3). This means that it is possible to determine which way the tip portion of the tool is pointing with respect to the image plane, e.g. by differentiating between whether the tool is pointing towards the sensor side of the probe or vice versa. The angle of interest for the tool navigator is the angle φ in Figure 6, which is always less than or equal to 90°. The angle is given by (differentiating between the three situations):
Case 1) $\varphi = \alpha - 90°$ (8a)

Case 2) $\varphi = 90° - \alpha$ (8b)

Case 3) $\varphi = 0°$ (8c)
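Equations 7 and 8a to 8c together might be implemented as in the following sketch (the function name is an assumption):

```python
import numpy as np

def tip_plane_angle(T, u_hat):
    """Angle phi between the tool-tip trajectory vector T and the image
    plane, via the dot product of equation (7) and the three cases of
    equations (8a)-(8c)."""
    cos_a = np.dot(T, u_hat) / (np.linalg.norm(T) * np.linalg.norm(u_hat))
    alpha = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    if alpha > 90.0:        # case 1: pointing along the negative x-axis
        return alpha - 90.0
    elif alpha < 90.0:      # case 2: pointing along the positive x-axis
        return 90.0 - alpha
    return 0.0              # case 3: parallel with the image plane
```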
Various methods of displaying data representing the position and orientation of the tool relative to an ultrasound image plane will now be described. A first way of displaying this data is shown in Figures 2a to 2f. The dotted line in these figures represents the actual image plane and the arrow represents the location of the tool relative to it.
In this embodiment, a navigation panel is located in the bottom corner of the same monitor screen used to display the ultrasound image to the surgeon. The navigation panel occupies about 5% of the area of the monitor screen. In the navigation panel, the ultrasound image plane is represented by a vertically extending straight line. A rectangle extends horizontally outward from the line by the distance of the tool tip from the image plane. A second rectangle is provided extending from the vertical line. This rectangle is colour coded depending on the angle at which the tool tip portion extends to the image plane.
The operator can at any time change the maximum distance between the tool and image plane to be shown by the navigator. This means that if the distance is greater than this maximum, the navigator indicates that the tool and image are too far away from each other to be of interest for navigation. Likewise, the angle can be set to some maximum value, although sometimes it can be of value to know if the instrument is pointing at a normal angle (the maximum value angle) to the image plane. In the navigator feedback display shown on the navigation panel, the operator can flip the entire set-up if he should happen to switch position from one side of the patient to the other, so that left and right in the display keep their correspondence to the patient.
Figure 3a shows an example of the panel for setting the desired maximum values and the colour panel to be used in the navigation panel, as well as the probe and surgical tool to be used. The operator can also call the program for calibrating a tool if the desired tool has not been calibrated previously. Figure 3b shows an example of the navigator feedback display in a situation where the tool tip is 34.6 mm away from the image plane and is oriented at a 3.8° angle relative to the ultrasound scan plane.
The navigator can be used in five different ways for this method:
1. The first, and maybe the most useful to the clinician, is to first attach the ultrasound probe in a position that gives him the best view of the structure of interest and then use the navigator as feedback in order to adjust the instrument during insertion into the body.
2. The second method is to attach the tool in a certain position due, for instance, to limitations in the anatomy, and then adjust the ultrasound probe according to the feedback from the navigator. In this way the surgeon achieves the best possible longitudinal view of the cross section of the instrument.
3. The third method is to allow both the instrument and the ultrasound probe to be adjusted during insertion of the instrument into the body, and also during the procedure. This could be advantageous if the tool needs to be moved a lot.
4. The tool may be removed and the position sensor may be a sensor without three-dimensional axis definition (a position locator giving just position in space, not orientation; for instance a small piezoelectric element in acoustic position sensing). In this case the navigator would indicate just the distance (orthogonal distance from point onto plane) between the position sensor and the ultrasound image. The position sensor could be located inside or outside the body of the patient (depending on the type of position sensing system). The probe can then be adjusted to an optimal view of the area of interest in relation to the position sensor.
5. A robot may control the position of the probe according to the orientation and position in space of the tool, or the position of a position sensor. The robot could then make sure that the ultrasound image always displays the best possible or preferred view of the tool or position locator, for instance a longitudinal cross-sectional view, or any other preferred view (depending on the specific anatomical and/or surgical situation).
The navigation panel could, for instance, be located at the bottom corner of the same screen as the navigation computer uses to present processed data during the surgical procedure. It should be small enough not to obstruct any important data or images on the screen, but at the same time not so small that the operator cannot see the necessary information with a quick glance at the screen.
An alternative embodiment of the method of providing the data to the surgeon, which is used alone or in conjunction with the graphical display methods is acoustic feedback. The acoustic feedback needs to be of such a kind that it does not irritate or annoy the operators present in the operating room. This means that when the ultrasound probe and the surgical tool are too far away from each other to be interesting for navigation purposes, the system is silent. This is natural since this will be the situation during most of the time in the operating room.
The different situations are summarised as follows:

1. Silence: Tool and probe too far away from each other for the situation to be interesting for navigation.

2. Low pitch, low frequency signal: Situation is close to optimal. That is, the tip portion of the tool line is close to being in the image plane.

3. High pitch, low frequency signal: Tool tip is close to being in the image plane, but the angle between the tip portion of the tool and the image plane is significant.

4. High pitch, high frequency signal: Tool tip is far away and the tip portion of the tool is at a significant angle relative to the image plane in space.

5. Low pitch, high frequency signal: Tip portion of the tool is close to parallel with the image plane but the tool tip is at a significant distance from the image plane in space.

6. Lowest pitch, lowest frequency signal: Perfect situation. The tool tip portion is in the image plane. In this case a tone that is distinct from the tone of the signal for the other cases is used, so that there is no doubt about the fact that the situation is optimal (easier for the operator to understand).

7. Highest pitch, highest frequency signal: Worst case. The tip portion is at a 90° angle to the image plane and the tip is located at the maximum distance from the image plane for the navigator to run.
The pitches and frequencies should not be at annoying levels. A signal consisting of a 'ping' about once a second could for instance mean that the tool tip is about 1 cm from the image plane and that the angle between the plane and the tip portion of the tool is about 10°. The frequency for the optimal situation could for instance be a 'ping' every 5 seconds. When the situation just reaches the worst case (maximum values, or limits for when to start the feedback) and goes beyond these, the system could present a continuous signal for 2 seconds and then be silent until the tool and image are within the limits again. The difference between minimum and maximum pitch and between minimum and maximum frequency should not be too large. Too large a dynamic range for frequency and pitch could cause irritation in the operating room (acoustic noise). The volume and the tones will of course be adjustable, just like the colour scale in the graphical display method.
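The scheme above can be sketched as a mapping in which the angle to the image plane drives the pitch and the distance from the plane drives the 'ping' repetition rate. All numeric ranges below (maximum distance, pitch band, interval band) are assumptions chosen for illustration; the document leaves them adjustable.

```python
def acoustic_feedback(distance_mm, angle_deg,
                      max_distance_mm=50.0, max_angle_deg=90.0):
    """Sketch of the acoustic feedback mapping described above.

    Returns (pitch_hz, interval_s) for the next 'ping', or None for silence
    when the tool is outside the navigation range (case 1 in the list).
    The specific constants are illustrative assumptions.
    """
    if distance_mm > max_distance_mm:
        return None                            # case 1: silence, not interesting
    d = min(distance_mm / max_distance_mm, 1.0)   # 0..1, distance from plane
    a = min(angle_deg / max_angle_deg, 1.0)       # 0..1, angle to plane
    pitch_hz = 300.0 + 500.0 * a               # larger angle -> higher pitch
    interval_s = 5.0 - 4.0 * d                 # larger distance -> faster pings
    return pitch_hz, interval_s
```

With these constants the optimal situation (tool tip in the plane, zero angle) gives the lowest pitch at a ping every 5 seconds, matching case 6, while the dynamic range stays deliberately small to avoid acoustic noise.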
In another embodiment of the invention, the graphical display method is used in a 3-dimensional ultrasound image. In this embodiment, a 3D volume is acquired using an ultrasound probe and is then transferred to a visualisation computer together with position data from the position sensor.
The volume is visualised stereoscopically or as three orthogonal slices. The surgical tool with a position sensor is then inserted into the area that has just been scanned. As the surgical procedure progresses, new volumes are acquired to update the scene.
In a first embodiment, the tool tip position and orientation control the visualisation of the volume and a line (or some other geometric structure) representing the tool is drawn in the volume at the pixels determined by the position readings. That is, the pixels where the tip portion of the tool is located in the volume are replaced by some bright colour. One could for example make sure that the tool tip is positioned in the centre of the visualised volume. The orthogonal slices can be picked according to the coordinate system on the tip of the tool (determined in the calibration process). The tip itself can be indicated by a different colour than the rest of the tip portion of the tool.
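Drawing the tip portion into the volume as described above amounts to overwriting voxels along a line segment with bright values, with the tip itself given its own value. This is a minimal sketch; it assumes the tip position and direction have already been transformed from sensor space into voxel indices, and all names and values are illustrative.

```python
import numpy as np

def draw_tool_in_volume(volume, tip_voxel, direction, length_voxels,
                        tip_value=255, shaft_value=200):
    """Overwrite voxels along the tip portion of the tool with bright values.

    tip_voxel and direction are assumed to be in voxel coordinates already
    (i.e. after the calibration/registration step); names are illustrative.
    """
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    tip = np.asarray(tip_voxel, dtype=float)
    # Sample densely along the tip portion, stepping back from the tip
    for t in np.linspace(0.0, length_voxels, int(length_voxels) * 2 + 1):
        i, j, k = np.round(tip - t * d).astype(int)
        if all(0 <= v < s for v, s in zip((i, j, k), volume.shape)):
            volume[i, j, k] = shaft_value
    # The tip itself gets a different (brighter) value than the shaft
    i, j, k = np.round(tip).astype(int)
    if all(0 <= v < s for v, s in zip((i, j, k), volume.shape)):
        volume[i, j, k] = tip_value
    return volume
```

A renderer would then display these overwritten voxels as the bright tool line, with the tip in a distinct colour.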
In a second embodiment, in any-plane visualisation mode, the tool is coded with colour and shown in the image on the screen. This means that the image of the tool itself (or at least the tip portion, or just a small part of it) becomes a line on either side of which colour-coded blocks are displayed, representing the distance between the tool tip and the image plane and the angle between the tool tip portion and the image plane. The surgeon is then able to see where the tip portion of the tool is located in relation to the current image from the 3D volume. In other words, the tool, incorporated with a coloured bar coding the distance and angle measures, is projected onto the image. The image can be picked according to the position of the ultrasound probe. In this manner, the surgeon will have an any-plane view from a 3D volume with the tool position and orientation information included, in addition to the real-time 2D ultrasound image on the scanner.
With stereoscopic visualisation, the navigator is able to indicate with coloured arrows on the surface of the volume where the tip would enter if it were moved into the volume. The tip portion of the tool (except the tip) is coloured with a bright colour in the images, but with a different colour than the tip.
A yet further embodiment of the invention, in which the position and orientation of the tool relative to an ultrasound image plane are shown by means of virtual objects in 3-dimensional space, is described with reference to Figure 4.
In one very useful embodiment of the invention, the surgeon operates based on a realistic and integrated 3D scene. This scene consists of model objects (surgical tools and ultrasound probe with scan plane), preoperative objects (collected from imaging modalities such as MRI, functional MRI and CT), and real-time intra-operative objects (from ultrasound, interventional MRI and microscope/video). However, a scene consisting of only the model objects would be sufficient to visualise the position and orientation of the surgical tool relative to the ultrasound scan plane in a direct and simple way.
Assuming a polygon description of the interesting model objects (which could be made arbitrarily accurate and realistic), these virtual objects follow the same movements as the real objects (by means of the position data from the positioning system). This will create a virtual scene of all the surgical tool and ultrasound probe activity in surgical space. At the same time, the surgeon may view the scene from an arbitrary position and orientation. By setting the viewpoint in the proximity of the surgeon one would obtain a realistic overview of the relationship between the objects that constitute the scene (Figure 4, view 1 in a to c). In this scene, left/right, up/down, and in front/behind on the screen are in correspondence with what is happening on the operating table, as seen from the surgeon. However, in order to navigate a tool into the image plane (or the other way around), it is more convenient to constantly let the viewpoint follow the probe. This could be done by letting the viewpoint be located a short distance into the probe casing. The view-forward direction could be along the image plane radial direction (i.e. parallel to the image plane and along the centre beam), and the view-up direction along the second axis of the image plane. The axis from left to right in this view situation would be normal to the image plane. In this manner one would directly see how a tool is oriented in relation to the image plane, as well as how to move one of them (position and orientation) in order to obtain the desired relative orientation (Figure 4, view 2).
Figure 4. This figure shows three (a to c) screen snap-shots from a prototype application with four different views (1-4). The two objects in the scene are a tool and an ultrasound probe. The position sensor spheres attached to the probe are indicated as yellow points, the sensor on the tool is indicated with green colour, while the reference system, or global coordinate system, is shown as three red dots. View 1 is an overview situation, view 2 shows the parallel view, view 3 is the normal view from one side of the probe, while view 4 is the normal view from the opposite side. More than one view can be shown at once (as in this figure) or the surgeon can toggle the active view, with voice recognition control for instance. In the parallel view (view 2), the image plane coordinate system can be seen in the middle, the tool at the top, and the reference system at the bottom right of the view.
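The probe-following viewpoint described above (eye set back into the probe casing, forward along the centre beam, up along the plane's second axis, left-right normal to the plane) can be sketched as a small camera-frame construction. The function name, argument conventions and the offset value are assumptions for illustration.

```python
import numpy as np

def probe_following_view(probe_pos, radial_dir, plane_second_axis,
                         offset_into_casing=0.02):
    """Build a viewpoint that follows the probe, as in the parallel view.

    View-forward is the image-plane radial (centre-beam) direction, view-up
    is the second axis of the image plane, and the resulting left-right axis
    is normal to the image plane. Units and the 2 cm offset are assumptions.
    Returns (eye, forward, up, right).
    """
    fwd = np.asarray(radial_dir, dtype=float)
    fwd = fwd / np.linalg.norm(fwd)
    up = np.asarray(plane_second_axis, dtype=float)
    up = up / np.linalg.norm(up)
    right = np.cross(fwd, up)                      # normal to the image plane
    # Place the eye a short distance "into" the probe casing, behind the face
    eye = np.asarray(probe_pos, dtype=float) - offset_into_casing * fwd
    return eye, fwd, up, right
```

The returned frame can be fed directly to a standard look-at camera, so the rendered scene rotates with the probe and the image plane always appears edge-on.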
The visualisation can be flipped (manually or automatically) and supplemented with accurate numeric measures of distance and angle coded in colour or sound as described previously. For instance, a parallel view gives little or no parallel information, for example to show where in the image plane the surgical tool is located, or whether the tool is above or below the image plane. But by letting the colour of the tool reflect the parallel distance from known reference points (e.g. the middle of the image plane depth), less important data like this could be incorporated into a single view.
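Letting the colour of the tool reflect a distance, as suggested above, is a simple scalar-to-intensity mapping. The sketch below uses a grey scale (which the closing paragraph notes is an acceptable alternative to colour); the range constant and function name are assumptions.

```python
def distance_to_grey(distance_mm, max_distance_mm=20.0):
    """Map a distance from a known reference point (e.g. the middle of the
    image plane depth) to a grey level for colouring the virtual tool.

    Bright (255) at the reference, fading to black at max_distance_mm and
    beyond. The 20 mm range is an illustrative assumption.
    """
    t = min(abs(distance_mm) / max_distance_mm, 1.0)
    return int(round(255 * (1.0 - t)))
```

A colour version would map the same normalised value through a colour scale instead, and the scale would be adjustable like the rest of the display.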
Alternatively, the viewpoint can be positioned in such a way that one is looking in a direction normal to the image plane. In this situation the viewpoint should follow the probe and thus allow the surgeon to directly see where in the image plane the tool is located (Figure 4, views 3 and 4 in snap-shots a to c). In many ways this view shows the same as what can be seen on the ultrasound scanner (if the tool lies in the plane). Therefore, this view could to a certain degree be superfluous. However, this view could be used as a qualitative verification of the tool calibration (and also the probe calibration). It could be utilised to visualise how well the real world (the tool in a real-time 2D image on the ultrasound scanner) maps over to the virtual representation (the objects in this normal view). By texture-mapping a real-time 2D ultrasound image on the virtual image plane in a parallel view one would see, assuming a high degree of accuracy, that the virtual tool would coincide with the real tool seen in the real-time 2D image.
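Seeing "where in the image plane the tool is located", as in the normal view above, reduces to projecting the 3D tool-tip position into the plane's own 2D coordinate system. A minimal sketch, assuming the plane is given by an origin and two orthonormal in-plane axes (all names illustrative):

```python
import numpy as np

def project_onto_image_plane(point, plane_origin, axis_u, axis_v):
    """Express a 3D point (e.g. the tool tip) in 2D image-plane coordinates.

    plane_origin is a point on the image plane; axis_u and axis_v are the
    plane's orthonormal in-plane axes. Returns (u, v) in the same units as
    the input. Naming conventions are assumptions for illustration.
    """
    p = np.asarray(point, dtype=float) - np.asarray(plane_origin, dtype=float)
    return float(np.dot(p, axis_u)), float(np.dot(p, axis_v))
```

Comparing these coordinates against where the tool appears in the real-time 2D image gives exactly the qualitative calibration check described above.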
Although a number of preferred embodiments of the invention have been described above, it will be readily apparent to the skilled person that various modifications to these would be possible. For example, the colour-coded block showing the angle at which the tool extends relative to the image plane could use a grey scale rather than a colour scale.