WO2001039683A1 - Tool navigator - Google Patents

Tool navigator

Info

Publication number
WO2001039683A1
WO2001039683A1 (PCT/GB2000/004648)
Authority
WO
WIPO (PCT)
Prior art keywords
tool
image plane
tip
tip portion
system
Prior art date
Application number
PCT/GB2000/004648
Other languages
French (fr)
Inventor
Thomas Lango
Frank Lindseth
Steinar Ommedal
Atle Kleven
Age Gronningsater
Geirmund Unsgard
Original Assignee
Sinvent As
Jackson, Robert, Patrick
Priority date
Filing date
Publication date
Priority to GB9928695.7 priority Critical
Priority to GBGB9928695.7A priority patent/GB9928695D0/en
Application filed by Sinvent As, Jackson, Robert, Patrick filed Critical Sinvent As
Publication of WO2001039683A1 publication Critical patent/WO2001039683A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08: Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0833: Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
    • A61B 8/0841: Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating instruments
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00: Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 2017/00681: Aspects not otherwise provided for
    • A61B 2017/00725: Calibration or performance testing
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/107: Visualisation of planned trajectories or target regions
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 2090/364: Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/365: Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37: Surgical systems with images on a monitor during operation
    • A61B 2090/378: Surgical systems with images on a monitor during operation using ultrasound

Abstract

The present invention provides a system for providing a user with data indicating the location of the tip (6) of a tool (2) and the orientation of the tip portion of the tool relative to an ultrasound image plane, wherein the data is presented separately from the ultrasound image so that the location and orientation of the tool are easily discernible regardless of the location and orientation of the tool relative to the ultrasound image plane. The system may comprise means for presenting the data in graphical or acoustic form. The invention also extends to a method of presenting data relating to the position of the tool (2) relative to an ultrasound image plane to a user.

Description

Tool Navigator

The present invention relates to a system for providing a user with data indicating the location of the tip of a tool and the orientation of the tip portion of a tool relative to an ultrasound image plane, and also to methods and apparatus for providing that data to a user.

Ultrasound imaging is a useful technique for guiding tools in situations where they are not externally visible. In particular, ultrasound imaging techniques are often used in surgery and are continuing to be developed for this application. One problem with known ultrasound imaging techniques, however, is that a surgeon may not always be able to tell from the ultrasound images obtained where the tool is located in space. This is of vital importance in surgery in particular.

International patent application WO96/25881 describes an ultrasound guided surgical system in which the point at which the trajectory of a tool intersects the image plane of a 2-dimensional ultrasound image may be superimposed onto the ultrasound image. However, in this instance, the surgeon will only be provided with data indicating the location and orientation of the tool when the trajectory of the tool intersects the portion of the image plane that is displayed. Additionally, the data is provided to the surgeon in the ultrasound image, so that it will be relatively difficult to understand. Ideally, a means should be provided of allowing a surgeon or other user of ultrasound imagery to ascertain the location of the tip of a tool and the orientation of the tool relative to an ultrasound image plane quickly and easily, and without having to digest the complex information contained in the ultrasound image.

Therefore, from a first aspect, the present invention provides a system for providing a user with data indicating the location of the tip of a tool and the orientation of the tip portion of the tool relative to an ultrasound image plane, wherein the data is presented separately from the ultrasound image so that the location and orientation of the tool are easily discernible regardless of the location and orientation of the tool relative to the ultrasound image plane.

Although the location of the tool may be derived from the ultrasound data by identifying data corresponding to the tool within the data set, preferably the location of the tool is determined independently. Various suitable positioning systems are known and several are discussed in WO96/25881. This approach is fast and enables the location of the tool to be determined even when it is some distance from the imaged region.

It will be appreciated that the data could be provided to the user in a separate and easily understandable manner using many different methods. For example, the system may comprise means for representing the data in graphic form, which can be presented to the user in many different ways. For example, the data could be provided on a monitor which is separate from the rest of the ultrasound imaging system. Preferably, however, an ultrasound image is provided on a first part of a monitor screen and the data in graphic form is provided on a second part of that monitor screen. For example, the graphic data could be provided in one corner of the monitor screen showing the ultrasound image. This would allow the user to ascertain the exact position and orientation of the tip of the tool at regular intervals without having to turn away from the ultrasound image itself. The graphic data may be provided to the user in many different forms. For example, the numerical values of the distance of the tool tip from the image plane and the angle between the tool tip portion and the image plane could simply be provided. However, this would clearly not be particularly easy for a user to understand at a glance. Thus, in one embodiment, to display the data to the user, the ultrasound image plane is indicated by a line; the shortest distance between the tool tip and the image plane is represented by an indicia spaced from the line by a distance representing that distance; and the angle of the tool tip portion trajectory relative to the image plane is represented by a colour or grey scale coded area.

In one embodiment, the line indicating the ultrasound image plane may extend vertically. Thus, the indicia may be a line or block spaced horizontally from the vertical line. Further, the colour or grey scale area may be a block extending to one side or the other of the vertical line so as to show the direction in which the tool is facing.
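As a concrete illustration, the mapping from tip pose to panel geometry might be sketched as follows. This is a hypothetical Python sketch: the function name, the pixel scale, the default maximum distance and the grey-scale coding are all illustrative assumptions, not values taken from the patent.

```python
# Hypothetical sketch of the navigation-panel geometry described above.
# Panel width, maximum distance and grey-scale coding are illustrative
# assumptions, not values from the patent.

def panel_layout(distance_mm, angle_deg, max_distance_mm=50.0, panel_width_px=100):
    """Map tool-tip distance and angle to an indicia offset and a grey level."""
    # Horizontal offset of the distance indicia from the line representing
    # the image plane, clipped at the user-set maximum distance.
    frac = min(abs(distance_mm), max_distance_mm) / max_distance_mm
    offset_px = round(frac * panel_width_px)
    # Grey-scale code for the angle: light when the tip portion is parallel
    # to the plane, dark when it is perpendicular to it.
    grey = round(255 * (1.0 - min(abs(angle_deg), 90.0) / 90.0))
    # The sign of the distance picks which side of the line the block
    # extends to, showing the direction in which the tool is facing.
    side = "left" if distance_mm < 0 else "right"
    return offset_px, grey, side
```

A signed distance is assumed here so that the block can extend to either side of the line, matching the "one side or the other" behaviour described above.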

Alternatively, in another preferred embodiment, the data is displayed to the user by means of virtual objects representing the tool and the ultrasound image plane in 3-dimensional space, in which the virtual objects are arranged relative to one another as in the real situation, wherein the virtual objects may be viewed from any chosen location and in any direction to produce a 2-dimensional image. As will be immediately apparent, either of these embodiments provides the data to the user in a way which allows them to obtain the necessary information from only a quick glance. Although the graphical feedback to the user as described above would work well, it is also possible to provide the data to the user in acoustic form. This could be used either alone or in combination with the graphical data representation. The acoustic representation of the data has the advantage that the user could be made aware of the location and orientation of the probe even when they were not free to look at the graphical representation.

Any form of varying acoustic signal could be used to represent the data to the user. Preferably, however, the pitch of the signal is used to indicate the angle between the tool and the image plane and the signal may pulse at a rate which indicates the location of the tool tip. (The opposite arrangement could also be used.) For example, the pitch of the acoustic signal may be relatively low when the tip portion of the tool is parallel to the image plane and increase as the angle between the tip portion of the tool and the image plane increases. The frequency of the pulses may be relatively low when the tip of the tool is located on the image plane and increase as the distance between the tip of the tool and the image plane increases.
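For instance, the mapping from tip pose to signal could look like the following sketch, with pulse rate encoding distance and pitch encoding angle as described above. The frequency ranges and the function name are assumptions for illustration, not values from the patent.

```python
# Illustrative mapping from tip pose to the acoustic signal described
# above: pulse rate encodes distance, pitch encodes angle. The parameter
# ranges are assumptions, not values from the patent.

def acoustic_feedback(distance_mm, angle_deg,
                      max_distance_mm=50.0,
                      pulse_hz_range=(1.0, 10.0),
                      pitch_hz_range=(200.0, 2000.0)):
    """Return (pulse_rate_hz, pitch_hz) for the given tip pose."""
    d = min(abs(distance_mm), max_distance_mm) / max_distance_mm
    a = min(abs(angle_deg), 90.0) / 90.0
    # Pulse rate is low when the tip lies on the image plane and rises
    # as the distance to the plane increases.
    pulse = pulse_hz_range[0] + d * (pulse_hz_range[1] - pulse_hz_range[0])
    # Pitch is low when the tip portion is parallel to the plane and
    # rises as the angle increases.
    pitch = pitch_hz_range[0] + a * (pitch_hz_range[1] - pitch_hz_range[0])
    return pulse, pitch
```

The `max_distance_mm` parameter corresponds to the user-set maximum distance discussed below; beyond it the feedback simply saturates.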

In some situations, the user might wish to vary the maximum distance between the tip of the tool and the image plane for which they are provided with data. Consequently, the system may include first control means for setting the maximum distance between the tool tip and the image plane for which the data is provided to the user to correspond to a maximum distance chosen by the user.

Also, although it would often be useful to the user to know when the tool is at its maximum angle to the ultrasound image plane (i.e. at 90°), the system may also comprise second control means for setting the maximum angle between the tool tip portion and the image plane for which the data is provided to the user to correspond to a maximum angle chosen by the user.

It is further believed that each of the above described means of providing data to a user is inventive in its own right. Thus, from a second aspect, the present invention provides a method of graphically representing the position of the tip of a tool and the orientation of the tip portion of a tool relative to an ultrasound image plane to a user, comprising the steps of: determining the shortest distance between the tool tip and the image plane and the angle between the tool tip portion trajectory and the image plane; displaying the image plane as a line on a monitor screen; representing the distance between the tool tip and the image plane as an indicia spaced from the line by a distance representing that distance; and representing the angle of the tool tip portion trajectory relative to the image plane by a colour or grey scale coded area.

Likewise, from a third aspect, the present invention provides an apparatus for graphically representing the position of the tip of a tool and the orientation of the tip portion of a tool relative to an ultrasound image plane to a user, comprising: means for determining the shortest distance between the tool tip and the image plane and the angle between the tool tip portion trajectory and the image plane; means for displaying the image plane as a line on a monitor screen; means for representing the distance between the tool tip and the image plane as an indicia spaced by a distance representing that distance from the line; and means for representing the angle of the tool tip portion trajectory relative to the image plane by a colour or grey scale coded area.

In one embodiment, the line indicating the ultrasound image plane may extend vertically. Thus, the indicia may be a line or block spaced horizontally from the vertical line. Further, the colour or grey scale area may be a block extending to one side or the other of the vertical line so as to show the direction in which the tool is facing.

From a fourth aspect, the present invention provides a method of providing a user with data indicating the position of the tip of a tool and the orientation of the tip portion of a tool relative to an ultrasound image plane produced by an ultrasound probe, comprising the steps of: representing the tool and image plane by virtual objects in 3-dimensional space; controlling the movement of the virtual objects in 3-dimensional space so that the virtual objects follow the movements of the tool and the image plane in real space; and viewing the virtual objects so as to produce a 2-dimensional image of the virtual objects.

Preferably, the virtual objects may be viewed from any chosen position to produce the 2-dimensional image. Thus, the user may vary the image in order to obtain the optimum information at all times.

In one useful embodiment the direction of viewing the virtual objects corresponds to the user's direction of view of the real objects. Thus a realistic view of the relationship between the objects may be obtained. Alternatively, the direction of viewing may correspond to the direction of the longitudinal extent of the ultrasound probe. This would allow the user to directly see how the tool is oriented relative to the image plane and also how to move either the tool or the ultrasound probe in order to obtain a desired relative orientation between the tool and image plane.

In order to further assist the user, the ultrasound probe and/or a position sensor attached to the ultrasound probe may also be represented by a virtual object.

From a further aspect, the present invention provides an apparatus for providing a user with data indicating the position of the tip of a tool and the orientation of the tip portion of a tool relative to an ultrasound image plane produced by an ultrasound probe, comprising: means for representing the tool and image plane by virtual objects in 3-dimensional space; means for controlling the movement of the virtual objects in 3-dimensional space so that the virtual objects follow the movements of the tool and the image plane in real space; and means for viewing the virtual objects so as to produce a 2-dimensional image of the virtual objects.

From a yet further aspect, the present invention provides a method of acoustically providing a user with data representing the location of the tip of a tool and the orientation of the tip portion of a tool relative to an ultrasound image plane, comprising the steps of: determining the shortest distance between the tool tip and the image plane and the angle between the tool tip portion trajectory and the image plane; providing an acoustic signal to the user; controlling the frequency of pulses of the acoustic signal so as to indicate the distance of the tool tip from the image plane; and controlling the pitch of the signal so as to indicate the angle between the tool tip portion trajectory and the image plane.

From another aspect, the present invention provides an apparatus for acoustically providing a user with data representing the location of the tip of a tool and the orientation of the tip portion of a tool relative to an ultrasound image plane, comprising: means for determining the shortest distance between the tool tip and the image plane and the angle between the tool tip portion trajectory and the image plane; means for providing an acoustic signal to the user; means for controlling the frequency of pulses of the acoustic signal so as to indicate the distance of the tool tip from the image plane; and means for controlling the pitch of the signal so as to indicate the angle between the tool tip portion trajectory and the image plane.

When the data concerning the location and orientation of the tool tip are to be determined independently of the ultrasound data, the location and orientation of the tool may conveniently be determined using a sensor attached to the tool. If this is done, in order to obtain the data representing the location of the tip of the tool and the orientation of the tool tip portion, it is necessary to calibrate the tool to ascertain the exact location of the tip of the tool with respect to a position sensor on the tool.

Thus, from a yet further aspect, the present invention provides a method of determining the location of the tip of a tool and the orientation of the tip portion of a tool relative to a position sensor attached to the tool, the method comprising the steps of: placing the tool in a calibration device such that the locations in a global coordinate system of the tip of the tool and of another point on the tool removed from the tip are known; measuring the position of the position sensor in the global coordinate system; and establishing a mathematical transformation for calibrating the tip of the tool relative to the position sensor.

The apparatus with which the calibration is carried out is also believed to be novel and inventive and so, from another aspect, the invention provides an apparatus for determining the location of the tip of a tool and the orientation of the tip portion of a tool relative to a position sensor attached to the tool, the apparatus comprising: a calibration device for holding the tool, the calibration device comprising means for determining the location in a global coordinate system of the tip of the tool and of another point on the tool which is removed from the tip.

The apparatus may further comprise means for establishing a mathematical transformation for calibrating the tip of the tool relative to the known location in the global coordinate system of the position sensor.

In one preferred embodiment, the calibration device comprises a main body part having a passage extending through it, such that the tool may be supported in the passage. In one possible means for determining the location of the tip of the tool, the means comprises a first support in the passage through which the tool passes, which is in a known position, and a substantially vertically extending planar surface at one end of the passage against which the tool tip abuts.

Preferably, the means for determining the location of the other point on the tool comprises a second support provided at a known position in the passage through which the tool passes, wherein the second support is removed from the first support.

Preferably, the apparatus further comprises a chuck mechanism which can be clamped together to hold the tool in a known position within the passage. Thus, the tool can be easily inserted into the calibration device and then, once the chuck mechanism is closed, will be securely held in a known position.

Preferred embodiments of the invention will now be described by way of example only and with reference to the accompanying drawings in which:

Figure 1 shows a cross-section through a calibration device according to the invention;

Figures 2a to 2f show the appearance of part of a monitor screen which includes a navigation panel for displaying data to a user according to a first embodiment of the invention, for various locations and orientations of a tool relative to the ultrasound image plane;

Figures 3a and 3b show a control panel including a tool navigator according to one embodiment of the invention;

Figures 4a to 4c show a monitor screen displaying data to a user according to an alternative embodiment of the invention;

Figure 5 is a schematic view showing the use of ultrasound in surgery; and

Figure 6 is a schematic representation of the angle and distance between the tool tip and the image plane.

As can be seen in Figure 5, an ultrasound probe 1 is positioned so as to obtain real-time ultrasound images of a part of the body which is being operated on. A user can move a handheld surgical tool 2 freely so as to perform the surgery.

Figure 1 illustrates a tool, the position and orientation of which is to be determined and displayed to a surgeon.

As shown in the Figure, the surgical tool 2 has a position sensor 4 attached to the end of it remote from the tip. The position and orientation of the sensor in space may readily be determined in the conventional manner with reference to a global co-ordinate frame. However, to determine the position of the tip 6 of the tool relative to the position sensor 4, and the orientation of the tip portion of the tool relative to the position sensor, the tool must be calibrated before use.

Thus, a calibration device 8 is provided. This device is provided on a board 10 having a reference position sensor 12, the position of which is also determined in the global coordinate frame. Two separate chuck mechanisms 14 and 16 are provided on the board. The first of these chuck mechanisms, 14, has an aperture 18 within it through which the tool passes. This aperture is blocked at one end by a vertically extending planar member 20. The tool is located such that the tip 6 of the tool abuts the planar member 20.

The second chuck 16 is spaced from the first chuck 14 by a variable distance along the board. The second chuck 16 has an aperture 22 through which the tool passes. The board is constructed so that the direction D from the centre of the first chuck 14 to the centre of the second chuck 16 will always be known. The second chuck 16 is moveable relative to the first chuck 14 in this known direction D so that tools of different sizes can be calibrated by the device. The direction D can be made to correspond to one of the axes of the reference sensor frame system. Thus, when both the first 14 and second 16 chucks have been closed, the tool is held securely in such a way that both the location of the tip 6 and the orientation of the tip portion are known. The location of the position sensor is then measured and the necessary calculations are performed to calculate the position of the tool tip and the orientation of the tip portion relative to the position sensor.
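The computation performed once both chucks are closed can be sketched as follows. This is a minimal illustration assuming 4x4 homogeneous pose matrices and NumPy; the function and variable names are assumptions, not taken from the patent. The tip position and tool axis direction are taken as known in the global frame (from the planar member and the direction D), and the sensor pose is measured, so the fixed tip-to-sensor transform follows by composing with the inverse sensor pose.

```python
# A minimal sketch (assumed 4x4 homogeneous matrices, NumPy) of the
# calibration step described above. Names are illustrative.

import numpy as np

def calibrate_tool(g_M_lo, tip_g, direction_g):
    """Return lo_M_tt, the tool-tip frame expressed in the sensor frame.

    g_M_lo      : 4x4 pose of the tool sensor in the global frame
    tip_g       : (3,) known tip position in the global frame
    direction_g : (3,) known tool axis direction in the global frame
    """
    x = direction_g / np.linalg.norm(direction_g)      # tool axis
    # Two further vectors completing a right-handed frame; the choice is
    # arbitrary apart from the axis, since the tip is symmetric.
    tmp = np.array([0.0, 0.0, 1.0]) if abs(x[2]) < 0.9 else np.array([0.0, 1.0, 0.0])
    y = np.cross(tmp, x); y /= np.linalg.norm(y)
    z = np.cross(x, y)
    g_M_tt = np.eye(4)
    g_M_tt[:3, 0], g_M_tt[:3, 1], g_M_tt[:3, 2] = x, y, z
    g_M_tt[:3, 3] = tip_g
    # lo_M_tt = inv(g_M_lo) @ g_M_tt, constant for as long as the sensor
    # stays fixed on the tool.
    return np.linalg.inv(g_M_lo) @ g_M_tt
```

Because the tip frame is only constrained along the tool axis, the remaining two axes here are just a convenient completion of the frame, consistent with the arbitrary definition discussed below.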

The calibration device 8 is formed as a rigid body. Thus, the apertures 18, 22 in the chucks 14 and 16 can be made to point in a known direction relative to the coordinate system of the reference frame, and this direction will not alter due to the rigidity of the device.

In an alternative embodiment, a single moveable chuck is provided on the device. This chuck is moveable in a known direction relative to a vertically extending planar member which is fixed relative to the sensor frame.

A co-ordinate system with its origin at the tool tip is defined. The definition is arbitrary, except for one axis, which is aligned with the tip portion, since the tip is symmetric. However, it is convenient to set the orientation coplanar with or normal to the sensor frame if the orientation of the sensor frame relative to the tool is such that a coplanar or normal orientation would be comfortable for the surgeon using the navigator. The co-ordinate system for the sensor is read directly from the position sensing system. The result of the instrument calibration is the transformation matrix loMtt, where tt represents the tip of the tool and lo represents the sensor coordinate system (the local system, as opposed to the global reference coordinate system). Equation 1 shows the matrix.

           | R   t |
  loMtt =  |       |   (1)
           | 0   1 |

where R is the 3x3 rotation of the tip co-ordinate system relative to the sensor co-ordinate system and t is the position of the tip in the sensor co-ordinate system.

The tip of the tool in the global system is given by the following transformation:

  [x y z 1]T(gl) = glMlo · loMtt · [0 0 0 1]T   (2)

The matrix glMlo is the position of the sensor provided by the position sensing system and the rightmost vector represents the position of the tool tip in its own co-ordinate system.
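Transformation (2) amounts to composing the measured sensor pose with the calibration matrix and applying the result to the origin of the tip's own co-ordinate system. A hedged NumPy sketch (names are illustrative):

```python
# A minimal sketch (NumPy, 4x4 homogeneous matrices assumed) of
# transformation (2): the sensor pose composed with the calibration
# matrix, applied to the origin of the tip's own co-ordinate system.
import numpy as np

def tip_in_global(g_M_lo, lo_M_tt):
    """Position of the tool tip in the global co-ordinate system."""
    tip_in_own_frame = np.array([0.0, 0.0, 0.0, 1.0])  # the rightmost vector
    return (g_M_lo @ lo_M_tt @ tip_in_own_frame)[:3]
```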

One advantage of the calibration device described above is that it allows a surgeon to move the position sensor on the tool during a surgical procedure and merely to recalibrate the tool quickly before continuing his work. Thus, the position sensor may be located on the tool at whatever point is most convenient to the surgeon.

In order to ascertain the location and orientation of the tool relative to an ultrasound image it is also necessary to calibrate the ultrasound probe used to acquire the image. This is a more complex procedure than the calibration of the tool. However, several methods of calibrating ultrasound probes are known in the art and so will not be further discussed here. A general discussion of probe calibration methods is included in a paper entitled "Rapid Calibration for 3-D Freehand Ultrasound", Ultrasound in Medicine and Biology, Volume 24, No. 6, pages 855 to 869, by Prager et al.

In order to avoid having to recalibrate the ultrasound probe for each use, a clip-on adapter is glued to the probe. The position sensor can then be placed in the adapter and removed after use without any need for recalibration. Similarly, the entire sensor frame may be fixed to certain tools such as biopsy forceps prior to use and is sterilised with the tool. For some other surgical tools, however, it is preferable that the surgeon is able to decide where on the tool the sensor frame should be attached during the procedure, and so in this instance the sensor frame is not permanently attached to the tool.

The result of the calibration is a matrix lo,probeMus, analogous to the instrument calibration, and given by:

                 | R   t |
  lo,probeMus =  |       |   (3)
                 | 0   1 |

A point in the ultrasound image is then given by the following transformation:

  [x y z 1]T(gl) = glMlo,probe · lo,probeMus · p(us)   (4)

where p(us) is the point expressed in homogeneous co-ordinates in the ultrasound image frame.

Thus, these are the necessary calibrations in order to know the position and orientation of the tip of the surgical instrument relative to the sensor attached to the instrument, and the position and orientation of the ultrasound image relative to the sensor attached to the ultrasound probe.

In order to obtain data showing the position of the tip of the tool and the orientation of the tip portion of the tool relative to the ultrasound image plane, a navigator program is employed. The positions of the sensors attached to the ultrasound probe and the surgical instrument are read continuously. The navigator reads one position from each sensor as many times as possible within a short enough time interval that the tool does not move significantly while the positions are read and averaged. For example, the navigator may calculate the average of the last three positions for each sensor before further calculations. This average is represented by the matrices glMlo,probe and glMlo,instr.

The following matrix multiplications give the position and orientation in the global co-ordinate system of the ultrasound image and the tip portion of the tool respectively:

  glMus = glMlo,probe · lo,probeMus   (5)

  glMtt = glMlo,instr · lo,instrMtt
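The averaging and composition steps above can be sketched as follows. Averaging homogeneous pose matrices element-wise does not in general yield a valid rotation, so this sketch projects the rotation part back onto the nearest proper rotation with an SVD; this is one possible approach, an assumption rather than the patent's stated method, and the names are illustrative.

```python
# One possible way (an assumption, not the patent's stated method) to
# average the last few 4x4 sensor poses before composing them with the
# calibration matrices: average element-wise, then project the rotation
# part back onto the nearest proper rotation with an SVD.

import numpy as np

def average_poses(poses):
    """Average a list of 4x4 homogeneous pose matrices."""
    M = np.mean(poses, axis=0)
    U, _, Vt = np.linalg.svd(M[:3, :3])
    R = U @ Vt                      # nearest rotation to the mean
    if np.linalg.det(R) < 0:        # guard against reflections
        U[:, -1] *= -1
        R = U @ Vt
    avg = np.eye(4)
    avg[:3, :3] = R
    avg[:3, 3] = M[:3, 3]           # translations average directly
    return avg

def compose(g_M_lo, lo_M_x):
    """Pose of the image (or tool tip) in the global frame: g_M_lo @ lo_M_x."""
    return g_M_lo @ lo_M_x
```

For the small pose changes expected over three consecutive readings, this projection stays very close to the element-wise mean.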

It is now possible to utilise this information to calculate various properties, which are presented to the surgeon. The two most obvious properties to be calculated are the (normal) distance between the ultrasound image and the tip of the tool, and the angle between the tip portion of the tool and the image plane. With these properties, continuously updated, the surgeon will be able to adjust either the probe or the instrument in order to obtain a longitudinal cross-sectional view of the instrument in the image plane.

To calculate the distance between the tool tip and the ultrasound plane, firstly the shortest distance between the tip of the tool and the ultrasound image plane, which is equal to the length of a vector extending perpendicular to the image plane from the tool tip, is determined. This distance can be calculated as follows. Let the image plane be defined by three points on the plane, P, Q and R, and let A be a point not on the image plane. The shortest distance d between the point A and the image plane can be calculated from equation 6.

  d = | (A - P) · ((Q - P) × (R - P)) | / | (Q - P) × (R - P) |   (6)

The point A defines the tool tip in the global coordinate system, P is set to the image plane origin, Q is a point along the y-axis of the image plane (some distance greater than zero from P), R is a point along the z-axis of the image plane (some distance greater than zero from P), and û is a unit vector along the x-axis of the image plane, perpendicular to the image plane. All these data (A, P, Q, R and û) are readily available from the position readings of the position sensing system and the calibration matrices.

The distance between the tool tip and the ultrasound image origin is simply equal to the length of the vector between point A and P. Next, the angle φ between the tool tip extension trajectory line T and the image plane is determined. This angle can be found from the equation for the dot product between two vectors, given by:

  cos(α) = (T · û) / ( |T| |û| )   (7)

where α is the angle between the x-axis of the image plane and the tool tip extension trajectory line T. There are three possible situations: 1) the tool is pointing completely or partially along the negative direction of the image plane x-axis (illustrated in Figure 6); 2) the tool is pointing completely or partially along the positive direction of the image plane x-axis; or 3) the tool is pointing perpendicularly to the image plane x-axis, in which case the tool is parallel with the image plane. Which case is true can be determined from the angle α: α > 90° corresponds to case 1), α < 90° corresponds to case 2), while α = 90° corresponds to case 3). This means that it is possible to determine which way the tip portion of the tool is pointing with respect to the image plane, e.g. by differentiating between whether the tool is pointing towards the sensor side of the probe or vice versa. The angle of interest for the tool navigator is the angle φ in Figure 6, which is always less than or equal to 90°. The angle is given by (differentiating between the three situations):

  Case 1)  φ = α - 90°   (8a)

  Case 2)  φ = 90° - α   (8b)

  Case 3)  φ = 0°        (8c)
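The distance and angle computations of equations (6) to (8) can be sketched together as follows (a hedged NumPy sketch; the function and argument names are illustrative, and the plane normal is passed in directly as the unit vector along the image-plane x-axis):

```python
# A sketch of the distance and angle computations of equations (6) to (8),
# assuming NumPy; the function and argument names are illustrative.
import numpy as np

def distance_and_angle(A, P, u_hat, T):
    """Return (d, phi_deg): shortest tip-plane distance and tool-plane angle.

    A     : tool tip position in the global frame
    P     : image plane origin
    u_hat : unit vector along the image-plane x-axis (the plane normal)
    T     : tool tip extension trajectory direction
    """
    # Equation (6): shortest distance from A to the plane through P.
    d = abs(np.dot(A - P, u_hat))
    # Equation (7): angle alpha between the trajectory and the plane normal.
    cos_a = np.dot(T, u_hat) / (np.linalg.norm(T) * np.linalg.norm(u_hat))
    alpha = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    # Equations (8a) to (8c): fold alpha into the angle phi <= 90 degrees
    # between the trajectory and the image plane itself.
    if alpha > 90.0:        # case 1: pointing along the negative x direction
        phi = alpha - 90.0
    elif alpha < 90.0:      # case 2: pointing along the positive x direction
        phi = 90.0 - alpha
    else:                   # case 3: parallel with the image plane
        phi = 0.0
    return d, phi
```

A tool aimed straight at the plane gives phi = 90°, while a tool held parallel to the plane gives phi = 0°, matching the three cases above.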

Various methods of displaying data representing the position and orientation of the tool relative to an ultrasound image plane will now be described. A first way of displaying this data is shown in Figures 2a to 2f. The dotted line in these figures represents the actual image plane and the arrow represents the location of the tool relative to it.

In this embodiment a navigation panel is located in the bottom corner of the same monitor screen used to display the ultrasound image to the surgeon. The navigation panel occupies about 5% of the area of the monitor screen. In the navigation panel, the ultrasound image plane is represented by a vertically extending straight line. A rectangle extends horizontally outward from the line by a distance representing the distance of the tool tip from the image plane. A second rectangle is provided extending from the vertical line. This rectangle is colour coded depending on the angle at which the tool tip portion extends to the image plane.

The operator can at any time change the maximum distance between the tool and image plane to be shown by the navigator. This means that if the distance is greater than this maximum, the navigator indicates that the tool and image are too far away from each other to be of interest for navigation. Likewise, the angle can be set to some maximum value, although it can sometimes be of value to know if the instrument is pointing at a normal angle (the maximum value angle) to the image plane. In the navigator feedback display shown on the navigation panel, the operator can flip the entire set-up if he should happen to switch position from one side of the patient to the other, so that left and right in the display keep their correspondence to the patient.
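As a sketch (not part of the original disclosure) of how the operator-set maxima and the left/right flip might be applied when building the panel display, with hypothetical names and colour bands:

```python
def navigator_panel(dist_mm, angle_deg, side, max_dist_mm=50.0, flipped=False):
    """Map the measurements to navigation-panel geometry: a rectangle whose
    width encodes the tip-to-plane distance and a colour band encoding the
    angle. side is +1 or -1 for the side of the image plane the tip lies on;
    max_dist_mm, the 30-degree colour bands and the return shape are all
    illustrative assumptions, not taken from the patent."""
    if dist_mm > max_dist_mm:
        return None                          # too far away to be of interest
    if flipped:
        side = -side                         # operator switched patient sides
    width_frac = dist_mm / max_dist_mm       # rectangle extent from the plane line
    colour_band = min(int(angle_deg // 30), 2)   # e.g. green / yellow / red
    return side, width_frac, colour_band
```

Returning `None` corresponds to the navigator indicating that the tool and image are too far apart for navigation.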

Figure 3a shows an example of the panel for setting the desired maximum values and the colour panel to be used in the navigation panel, as well as the probe and surgical tool to be used. The operator can also call the program for calibrating a tool if the desired tool has not been calibrated previously. Figure 3b shows an example of the navigator feedback display in a situation where the tool tip is 34.6 mm away from the image plane and is oriented at a 3.8° angle relative to the ultrasound scan plane.

The navigator can be used in five different ways for this method: 1. The first, and maybe the most useful to the clinician, is to first attach the ultrasound probe in a position that gives him the best view of the structure of interest and then use the navigator as feedback in order to adjust the instrument during insertion into the body.

2. The second method is to attach the tool in a certain position due, for instance, to limitations in the anatomy, and then adjust the ultrasound probe according to the feedback from the navigator. In this way the surgeon achieves the best possible longitudinal view of the cross section of the instrument.

3. The third method is to allow both the instrument and the ultrasound probe to be adjusted during insertion of the instrument into the body, and also during the procedure. This could be advantageous if the tool needs to be moved a lot.

4. The tool may be removed and the position sensor may be a sensor without three-dimensional axis definition (a position locator giving just position in space, not orientation, for instance a small piezoelectric element in acoustic position sensing). In this case the navigator would indicate just the distance (the orthogonal distance from the point onto the plane) between the position sensor and the ultrasound image. The position sensor could be located inside or outside the body of the patient (depending on the type of position sensing system). The probe can then be adjusted to an optimal view of the area of interest in relation to the position sensor.

5. A robot may control the position of the probe according to the orientation and position in space of the tool, or the position of a position sensor. The robot could then make sure that the ultrasound image always displays the best possible or preferred view of the tool or position locator, for instance a longitudinal cross sectional view, or any other preferred view (depending on the specific anatomical and/or surgical situation).
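The orthogonal point-onto-plane distance used in method 4 above can be sketched as follows (illustrative NumPy code, not part of the original disclosure; argument names are assumptions):

```python
import numpy as np

def orthogonal_distance(sensor_pos, P, u):
    """Orthogonal distance from a position-only sensor onto the image plane.
    P is the image plane origin and u the unit normal of the plane (the
    image plane x-axis obtained from the calibration)."""
    diff = np.asarray(sensor_pos, dtype=float) - np.asarray(P, dtype=float)
    # Project the offset onto the plane normal; the magnitude is the distance.
    return float(abs(np.dot(diff, np.asarray(u, dtype=float))))
```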

The navigation panel could for instance be located at the bottom corner of the same screen as the navigation computer uses to present processed data during the surgical procedure. It should be small enough not to obstruct any important data or images on the screen, but at the same time not so small that the operator cannot see the necessary information with a quick glance at the screen.

An alternative embodiment of the method of providing the data to the surgeon, which may be used alone or in conjunction with the graphical display methods, is acoustic feedback. The acoustic feedback needs to be of such a kind that it does not irritate or annoy the operators present in the operating room. This means that when the ultrasound probe and the surgical tool are too far away from each other to be interesting for navigation purposes, the system is silent. This is natural since this will be the situation during most of the time in the operating room.

The different situations are summarised as follows:

1. Silence: Tool and probe too far away from each other for the situation to be interesting for navigation.

2. Low pitch, low frequency signal: Situation is close to optimal. That is, the tip portion of the tool is close to being in the image plane.

3. High pitch, low frequency signal: Tool tip is close to being in the image plane, but the angle between the tip portion of the tool and the image plane is significant.

4. High pitch, high frequency signal: Tool tip is far away and the tip portion of the tool is at a significant angle relative to the image plane in space.

5. Low pitch, high frequency signal: Tip portion of the tool is close to parallel with the image plane but the tool tip is at a significant distance from the image plane in space.

6. Lowest pitch, lowest frequency signal: Perfect situation. The tool tip portion is in the image plane. In this case a tone that is distinct from the tone of the signal for the other cases is used, so that there is no doubt about the fact that the situation is optimal (easier for the operator to understand).

7. Highest pitch, highest frequency signal: Worst case. The tip portion is at a 90° angle to the image plane and the tip is located at the maximum distance from the image plane for the navigator to run.

The pitches and frequencies should not be at annoying levels. A signal consisting of a 'ping' about once a second could for instance mean that the tool tip is about 1 cm from the image plane and that the angle between the plane and the tip portion of the tool is about 10°. The frequency for the optimal situation could for instance be a 'ping' every 5 seconds. When the situation just reaches the worst case (maximum values, or the limits for when to start the feedback) and goes beyond these, the system could present a continuous signal for 2 seconds and then be silent until the tool and image are within the limits again. The difference between minimum and maximum pitch and between minimum and maximum frequency should not be too large. Too large a dynamic range for frequency and pitch could cause irritation in the operating room (acoustic noise). The volume and the tones will of course be adjustable, just like the colour scale in the graphical display method.
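One possible mapping of the seven situations above to an acoustic signal can be sketched as follows (illustrative only; the concrete pitch range, thresholds and 5-second optimal interval are assumptions consistent with the examples given, not specified values):

```python
def acoustic_feedback(dist_mm, angle_deg,
                      max_dist_mm=50.0, max_angle_deg=90.0,
                      min_pitch_hz=300.0, max_pitch_hz=900.0):
    """Map the tip-to-plane distance to ping rate and the angle to pitch.
    Returns None (silence) outside the operator-set limits, a distinct
    low slow tone for the optimal case, and (pitch_hz, ping_interval_s)
    otherwise. All numeric choices are hypothetical."""
    if dist_mm > max_dist_mm or angle_deg > max_angle_deg:
        return None                              # situation 1: silence
    if dist_mm < 1.0 and angle_deg < 1.0:
        return (0.5 * min_pitch_hz, 5.0)         # situation 6: distinct slow tone
    # Pitch rises with angle (situations 2/3, claims 12-13)...
    pitch_hz = min_pitch_hz + (max_pitch_hz - min_pitch_hz) * angle_deg / max_angle_deg
    # ...and pings come faster as the distance grows (claims 14-15).
    interval_s = max(0.2, 5.0 * (1.0 - dist_mm / max_dist_mm))
    return (pitch_hz, interval_s)
```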

In another embodiment of the invention, the graphical display method is used in a 3-dimensional ultrasound image. In this embodiment, a 3D volume is acquired using an ultrasound probe and is then transferred to a visualisation computer together with position data from the position sensor.

The volume is visualised stereoscopically or as three orthogonal slices. The surgical tool with a position sensor is then inserted into the area that has just been scanned. As the surgical procedure progresses, new volumes are acquired to update the scene.

In a first embodiment, the tool tip position and orientation control the visualisation of the volume, and a line (or some other geometric structure) representing the tool is drawn in the volume at the pixels determined by the position readings. That is, the pixels where the tip portion of the tool is located in the volume are replaced by some bright colour. One could, for example, make sure that the tool tip is positioned in the centre of the visualised volume. The orthogonal slices can be picked according to the coordinate system on the tip of the tool (determined in the calibration process). The tip itself can be indicated by a different colour than the rest of the tip portion of the tool.
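A minimal sketch (not part of the original disclosure) of drawing the tip portion into the volume as described, assuming voxel coordinates and illustrative intensity values for the shaft and the differently coloured tip:

```python
import numpy as np

def draw_tool_into_volume(vol, tip_vox, direction, length_vox,
                          tip_value=255, shaft_value=200):
    """Replace the voxels along the tip portion of the tool with bright
    values, giving the tip itself a different value than the shaft.
    tip_vox is the tip position in voxel coordinates and direction the
    trajectory of the tip portion; all names and values are illustrative."""
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    tip = np.asarray(tip_vox, dtype=float)
    for t in range(length_vox):
        p = np.round(tip - t * d).astype(int)    # step back along the shaft
        if all(0 <= p[i] < vol.shape[i] for i in range(3)):
            vol[tuple(p)] = shaft_value
    tp = np.round(tip).astype(int)
    if all(0 <= tp[i] < vol.shape[i] for i in range(3)):
        vol[tuple(tp)] = tip_value               # tip coloured differently
    return vol
```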

In a second embodiment, in any-plane visualisation mode, the tool is coded with colour and shown in the image on the screen. This means that the image of the tool itself (or at least the tip portion, or just a small part of it) becomes a line, on either side of which colour coded blocks representing the distance between the tool tip and the image plane and the angle between the tool tip portion and the image plane are displayed. The surgeon is then able to see where the tip portion of the tool is located in relation to the current image from the 3D volume. In other words, the tool, incorporated with a coloured bar coding the distance and angle measures, is projected onto the image. The image can be picked according to the position of the ultrasound probe. In this manner, the surgeon will have an any-plane view from a 3D volume with the tool position and orientation information included, in addition to the real-time 2D ultrasound image on the scanner.

With stereoscopic visualisation, the navigator is able to indicate with coloured arrows on the surface of the volume where the tip would enter if it were moved into the volume. The tip portion of the tool (except the tip) is coloured with a bright colour in the images, but with a different colour than the tip.

A yet further embodiment of the invention, in which the position and orientation of the tool relative to an ultrasound image plane is shown by means of virtual objects in 3-dimensional space, is described with reference to Figure 4.

In one very useful embodiment of the invention, the surgeon operates based on a realistic and integrated 3D scene. This scene consists of model objects (surgical tools and ultrasound probe with scan plane), preoperative objects (collected from imaging modalities such as MRI, functional MRI and CT), and real-time intra-operative objects (from ultrasound, interventional MRI and microscope/video). However, a scene consisting of only the model objects would be sufficient to visualise the position and orientation of the surgical tool relative to the ultrasound scan plane in a direct and simple way.

Assuming a polygon description of the interesting model objects (which could be made arbitrarily accurate and realistic), these virtual objects follow the same movements as the real objects (by means of the position data from the positioning system). This will create a virtual scene of all the surgical tool and ultrasound probe activity in surgical space. At the same time, the surgeon may view the scene from an arbitrary position and orientation. By setting the viewpoint in the proximity of the surgeon, one would obtain a realistic overview of the relationship between the objects that constitute the scene (Figure 4, view 1 in a to c). In this scene, left/right, up/down, and in front/back on the screen are in correspondence with what is happening on the operating table, as seen from the surgeon. However, in order to navigate a tool into the image plane (or the other way around), it is more convenient to constantly let the viewpoint follow the probe. This could be done by letting the viewpoint be located a short distance into the probe casing. The view-forward direction could be along the image plane radial direction (i.e. parallel to the image plane and along the centre beam), and the view-up direction along the second axis of the image plane. The axis from left to right in this view situation would be normal to the image plane. In this manner one would directly see how a tool is oriented in relation to the image plane, as well as how to move one of them (position and orientation) in order to obtain the desired relative orientation (Figure 4).
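The probe-following viewpoint described above can be sketched as a camera basis built from the probe pose (illustrative code, not from the patent; the function name, back-off distance and argument conventions are assumptions consistent with the stated view-forward/view-up choices):

```python
import numpy as np

def probe_following_view(P, beam_dir, up_dir, back_off=20.0):
    """Camera pose that follows the probe: the eye sits back_off units
    behind the image plane origin P along the centre beam; forward is the
    beam direction, up is the plane's second axis, and the left-right axis
    (their cross product) comes out normal to the image plane."""
    f = np.asarray(beam_dir, dtype=float)
    f /= np.linalg.norm(f)                 # view-forward: along the centre beam
    u = np.asarray(up_dir, dtype=float)
    u /= np.linalg.norm(u)                 # view-up: second axis of the plane
    r = np.cross(f, u)                     # left-right: normal to the image plane
    eye = np.asarray(P, dtype=float) - back_off * f
    return eye, f, u, r
```

The returned basis could be fed to any standard look-at camera so the virtual view stays locked to the probe as it moves.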

Figure 4. This figure shows three screen snap-shots (a to c) from a prototype application with four different views (1-4). The two objects in the scene are a tool and an ultrasound probe. The position sensor spheres attached to the probe are indicated as yellow points, the sensor on the tool is indicated with green colour, while the reference system, or global coordinate system, is shown as three red dots. View 1 is an overview situation, view 2 shows the parallel view, view 3 is the normal view from one side of the probe, while view 4 is the normal view from the opposite side. More than one view can be shown at once (as in this figure) or the surgeon can toggle the active view, for instance with voice recognition control. In the parallel view (view 2), the image plane coordinate system can be seen in the middle, the tool at the top, and the reference system at the bottom right of the view.

The visualisation can be flipped (manually or automatically) and supplemented with accurate numeric measures of distance and angle coded in colour or sound as described previously. For instance, a parallel view gives little or no parallel (in-plane) information, such as where in the image plane the surgical tool is located, or whether the tool is above or below the image plane. But by letting the colour of the tool reflect the parallel distance from known reference points (i.e. the middle of the image plane depth, for instance), less important data like this could be incorporated into a single view.

Alternatively, the viewpoint can be positioned in such a way that one is looking in a direction normal to the image plane. In this situation the viewpoint should follow the probe and thus allow the surgeon to directly see where in the image plane the tool is located (Figure 4, views 3 and 4 in snap-shots a to c). In many ways this view shows the same as what can be seen on the ultrasound scanner (if the tool lies in the plane). Therefore, this view could to a certain degree be superfluous. However, this view could be used as a qualitative verification of the tool calibration (and also the probe calibration). It could be utilised to visualise how well the real world (the tool in a real-time 2D image on the ultrasound scanner) maps over to the virtual representation (the objects in this normal view). By texture-mapping a real-time 2D ultrasound image onto the virtual image plane in a parallel view, one would see, assuming a high degree of accuracy, that the virtual tool would coincide with the real tool seen in the real-time 2D image.

Although a number of preferred embodiments of the invention have been described above, it will be readily apparent to the skilled person that various modifications to these would be possible. For example, the colour coded block showing the angle at which the tool extends to the image plane could use a grey scale rather than colour scale.

Claims

1. A system for providing a user with data indicating the location of the tip of a tool and the orientation of the tip portion of the tool relative to an ultrasound image plane, wherein the data is presented separately from the ultrasound image so that the location and orientation of the tool are easily discernible regardless of the location and orientation of the tool relative to the ultrasound image plane.
2. A system as claimed in claim 1, the system comprising means for representing the data in graphic form.
3. A system as claimed in claim 1 or 2, wherein an ultrasound image is provided on a first part of a monitor screen and the data in graphic form is provided on a second part of the monitor screen.
4. A system as claimed in any of claims 1 to 3, wherein, to display the data to the user, the ultrasound image plane is indicated by a line; the shortest distance between the tool tip and the image plane is represented by an indicia spaced from the line by a distance representing that distance; and the angle of the tool tip portion trajectory relative to the image plane is represented by a colour or grey scale coded area.
5. A system as claimed in claim 4, wherein the line extends substantially vertically.
6. A system as claimed in claim 4 or 5, wherein the indicia comprises a line or block extending parallel to the line representing the image plane.
7. A system as claimed in claim 6, wherein the coded area extends to one side or the other of the line representing the image plane so as to show the direction in which the tool is facing.
8. A system as claimed in any preceding claim, wherein the data is displayed to the user by means of virtual objects representing the tool and the ultrasound image plane in three dimensional space, in which the virtual objects are arranged relative to one another as in the real situation, wherein the virtual objects may be viewed from any chosen location and in any direction to produce a two dimensional image.
9. A system as claimed in claim 8, wherein the virtual objects further represent an ultrasound probe by which the ultrasound images are obtained.
10. A system as claimed in claim 9, wherein the virtual objects further represent a position sensor attached to the ultrasound probe.
11. A system as claimed in any preceding claim, the system comprising means for representing the data in the form of an acoustic signal.
12. A system as claimed in claim 11, wherein the pitch of the acoustic signal is relatively low when the tip portion of the tool is parallel to the image plane.
13. A system as claimed in claim 12, wherein the pitch of the acoustic signal increases as the angle between the tip portion of the tool and the image plane increases.
14. A system as claimed in any of claims 11 to 13, wherein the frequency of the acoustic signal is relatively low when the tip of the tool is located on the image plane.
15. A system as claimed in claim 14, wherein the frequency of the acoustic signal increases as the distance between the tip of the tool and the image plane increases.
16. A system as claimed in any preceding claim, the system comprising first control means for setting the maximum distance between the tool tip and the image plane for which the data is provided to the user, to correspond to a maximum distance chosen by the user.
17. A system as claimed in any preceding claim, the system comprising second control means for setting the maximum angle between the tool tip portion and the image plane for which the data is provided to the user, to correspond to a maximum angle chosen by the user.
18. A method of graphically representing the position of the tip of a tool and the orientation of the tip portion of a tool relative to an ultrasound image plane to a user, comprising the steps of: determining the shortest distance between the tool tip and the image plane and the angle between the tool tip portion trajectory and the image plane; displaying the image plane as a line on a monitor screen; representing the distance between the tool tip and the image plane as an indicia spaced by a distance representing that distance from the line; and representing the angle of the tool tip portion trajectory relative to the image plane by a colour or grey scale coded region.
19. An apparatus for graphically displaying the position of the tip of a tool and the orientation of the tip portion of a tool relative to an ultrasound image plane to a user, comprising: means for determining the shortest distance between the tool tip and the image plane and the angle between the tool tip portion trajectory and the image plane; means for representing the image plane as a line on a monitor screen; means for representing the distance between the tool tip and the image plane as an indicia spaced by a distance representing that distance from the line; and means for representing the angle of the tool tip portion trajectory relative to the image plane by a colour or grey scale coded region.
20. A method or apparatus as claimed in claim 18 or 19, wherein the line representing the image plane is substantially vertical.
21. A method or apparatus as claimed in claim 18, 19 or 20 wherein the indicia is a line or block extending substantially parallel to the line representing the image plane.
22. A method or apparatus as claimed in any of claims 18 to 21, wherein the colour or grey-scale coded region extends to one side or the other of the line representing the image plane so as to show the direction in which the tool extends.
23. A method of providing a user with data indicating the position of the tip of a tool and the orientation of the tip portion of a tool relative to an ultrasound image plane produced by an ultrasound probe, comprising the steps of: representing the tool and image plane by virtual objects in 3-dimensional space; controlling the movement of the virtual objects in 3-dimensional space so that the virtual objects follow the movements of the tool and the image plane in real space; and viewing the virtual objects so as to produce a 2-dimensional image of the virtual objects.
24. A method as claimed in claim 23, wherein the virtual objects may be viewed from any chosen position to produce the 2-dimensional image.
25. A method as claimed in claim 24, wherein the direction of viewing the virtual objects corresponds to the user's direction of view of the real objects.
26. A method as claimed in claim 24, wherein the direction of viewing the virtual objects corresponds to the direction of the longitudinal extent of the ultrasound probe.
27. A method as claimed in any of claims 23 to 26, wherein the ultrasound probe is also represented by a virtual object.
28. A method as claimed in any of claims 23 to 27, wherein a position sensor provided on the ultrasound probe is also represented by a virtual object.
29. An apparatus for providing a user with data indicating the position of the tip of a tool and the orientation of the tip portion of a tool relative to an ultrasound image plane produced by an ultrasound probe, comprising: means for representing the tool and image plane by virtual objects in 3-dimensional space; means for controlling the movement of the virtual objects in 3-dimensional space so that the virtual objects follow the movements of the tool and the image plane in real space; and means for viewing the virtual objects so as to produce a 2-dimensional image of the virtual objects.
30. A method of acoustically providing a user with data representing the location of the tip of a tool and the orientation of the tip portion of the tool relative to an ultrasound image plane, comprising the steps of: determining the shortest distance between the tool tip and the image plane and the angle between the tool tip portion trajectory and the image plane; and representing the distance by the pitch of the signal or the frequency of pings of the signal, and representing the angle by the other of the pitch of the signal and the frequency of pings of the signal.
31. A method of acoustically providing a user with data representing the location of the tip of a tool and the orientation of the tip portion of a tool relative to an ultrasound image plane, comprising the steps of: determining the shortest distance between the tool tip and the image plane and the angle between the tool tip portion trajectory and the image plane; providing an acoustic signal to the user; controlling the frequency of pings of the acoustic signal so as to increase with the increasing distance of the tool tip from the image plane; and controlling the pitch of the signal so as to increase with the increasing angle between the tool tip portion trajectory and the image plane.
32. An apparatus for acoustically providing a user with data representing the location of the tip of a tool and the orientation of the tip portion of a tool relative to an ultrasound image plane, comprising: means for determining the shortest distance between the tool tip and the image plane and the angle between the tool tip portion trajectory and the image plane; means for providing an acoustic signal to the user; means for controlling the frequency of pings of the acoustic signal so as to increase with the increasing distance of the tool tip from the image plane; and means for controlling the pitch of the signal so as to increase with the increasing angle between the tool tip portion trajectory and the image plane.
33. A method of determining the location of the tip of a tool and the orientation of the tip portion of a tool relative to a position sensor attached to the tool, the method comprising the steps of: placing the tool in a calibration device such that the location in a global coordinate system of the tip of the tool and the direction in which the tip portion of the tool extends are known; measuring the position of the position sensor in the global coordinate system; and establishing a mathematical transformation for calibrating the tip of the tool relative to the position sensor.
34. A calibration device for holding a tool, the device comprising means for determining the location in a global coordinate system of the tip of the tool and the direction in which the tip portion of the tool extends.
35. An apparatus for determining the location of the tip of a tool and the orientation of the tip portion of a tool relative to a position sensor attached to the tool, the apparatus comprising: a calibration device as claimed in claim 34; and means for establishing a mathematical transformation for calibrating the tip of the tool relative to the known location in the global coordinate system of the position sensor.
36. An apparatus as claimed in claim 34 or 35, wherein the means for determining the location of the tip of the tool comprises a first support in the passage through which the tool passes, which is in a known position, and a substantially vertically extending planar surface at one end of the passage against which the tool tip abuts.
37. An apparatus as claimed in any of claims 34 to 36, wherein the means for determining the direction in which the tip portion of the tool extends comprises a second support provided in the passage through which the tool passes, wherein the second support is removed from the first support by a variable distance, the second support being moveable only in a known direction relative to the first support.
38. An apparatus as claimed in any of claims 35 to 37, wherein the calibration device further comprises a main body part having a passage extending through it, and a chuck mechanism which can be clamped together to hold the tool in a known position within the passage.
39. A system, method or apparatus as claimed in any preceding claim, wherein the tool is a tool for use during surgery and the user is a surgeon or surgeon's assistant.
PCT/GB2000/004648 1999-12-03 2000-12-04 Tool navigator WO2001039683A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB9928695.7 1999-12-03
GBGB9928695.7A GB9928695D0 (en) 1999-12-03 1999-12-03 Tool navigator

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU17196/01A AU1719601A (en) 1999-12-03 2000-12-04 Tool navigator

Publications (1)

Publication Number Publication Date
WO2001039683A1 true WO2001039683A1 (en) 2001-06-07

Family

ID=10865720

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2000/004648 WO2001039683A1 (en) 1999-12-03 2000-12-04 Tool navigator

Country Status (3)

Country Link
AU (1) AU1719601A (en)
GB (1) GB9928695D0 (en)
WO (1) WO2001039683A1 (en)

Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1364183A1 (en) * 2001-01-30 2003-11-26 Z-Kat Inc. Tool calibrator and tracker system
WO2004049954A2 (en) * 2002-12-02 2004-06-17 Aesculap Ag & Co. Kg Localization device display method and apparatus
DE10313829A1 (en) * 2003-03-21 2004-10-07 Aesculap Ag & Co. Kg Medical navigation system and method for use thereof, whereby the surgical instrument being used has an attached marker element so that the imaged operation area can be centered on the instrument
EP1768568A2 (en) * 2004-05-07 2007-04-04 Johns Hopkins University Image guided interventions with interstitial or transmission ultrasound
EP1937153A2 (en) * 2005-06-21 2008-07-02 Traxtal Inc. Device and method for a trackable ultrasound
WO2012141913A1 (en) * 2011-04-11 2012-10-18 Imacor, Inc. Ultrasound guided positioning of cardiac replacement valves
US8690776B2 (en) 2009-02-17 2014-04-08 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US8781555B2 (en) 2007-11-26 2014-07-15 C. R. Bard, Inc. System for placement of a catheter including a signal-generating stylet
US8784336B2 (en) 2005-08-24 2014-07-22 C. R. Bard, Inc. Stylet apparatuses and methods of manufacture
US8849382B2 (en) 2007-11-26 2014-09-30 C. R. Bard, Inc. Apparatus and display methods relating to intravascular placement of a catheter
US8858455B2 (en) 2006-10-23 2014-10-14 Bard Access Systems, Inc. Method of locating the tip of a central venous catheter
US9107698B2 (en) 2010-04-12 2015-08-18 Inneroptic Technology, Inc. Image annotation in image-guided medical procedures
US9125578B2 (en) 2009-06-12 2015-09-08 Bard Access Systems, Inc. Apparatus and method for catheter navigation and tip location
US9265572B2 (en) 2008-01-24 2016-02-23 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for image guided ablation
US9265443B2 (en) 2006-10-23 2016-02-23 Bard Access Systems, Inc. Method of locating the tip of a central venous catheter
US9339206B2 (en) 2009-06-12 2016-05-17 Bard Access Systems, Inc. Adaptor for endovascular electrocardiography
US9364294B2 (en) 2009-02-17 2016-06-14 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US9415188B2 (en) 2010-10-29 2016-08-16 C. R. Bard, Inc. Bioimpedance-assisted placement of a medical device
US9456766B2 (en) 2007-11-26 2016-10-04 C. R. Bard, Inc. Apparatus for use with needle insertion guidance system
US9492097B2 (en) 2007-11-26 2016-11-15 C. R. Bard, Inc. Needle length determination and calibration for insertion guidance system
US9521961B2 (en) 2007-11-26 2016-12-20 C. R. Bard, Inc. Systems and methods for guiding a medical instrument
US9532724B2 (en) 2009-06-12 2017-01-03 Bard Access Systems, Inc. Apparatus and method for catheter navigation using endovascular energy mapping
WO2017004029A1 (en) * 2015-06-29 2017-01-05 Medtronic Navigation, Inc. Method and apparatus for identification of multiple navigated instruments
US9554716B2 (en) 2007-11-26 2017-01-31 C. R. Bard, Inc. Insertion guidance system for needles and medical components
US9636031B2 (en) 2007-11-26 2017-05-02 C.R. Bard, Inc. Stylets for use with apparatus for intravascular placement of a catheter
US9649048B2 (en) 2007-11-26 2017-05-16 C. R. Bard, Inc. Systems and methods for breaching a sterile field for intravascular placement of a catheter
US9659345B2 (en) 2006-08-02 2017-05-23 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US9675319B1 (en) 2016-02-17 2017-06-13 Inneroptic Technology, Inc. Loupe display
US9681823B2 (en) 2007-11-26 2017-06-20 C. R. Bard, Inc. Integrated system for intravascular placement of a catheter
US9839372B2 (en) 2014-02-06 2017-12-12 C. R. Bard, Inc. Systems and methods for guidance and placement of an intravascular device
US9901406B2 (en) 2014-10-02 2018-02-27 Inneroptic Technology, Inc. Affected region display associated with a medical device
US9901714B2 (en) 2008-08-22 2018-02-27 C. R. Bard, Inc. Catheter assembly including ECG sensor and magnetic assemblies
US9907513B2 (en) 2008-10-07 2018-03-06 Bard Access Systems, Inc. Percutaneous magnetic gastrostomy
US9949700B2 (en) 2015-07-22 2018-04-24 Inneroptic Technology, Inc. Medical device approaches
US10046139B2 (en) 2010-08-20 2018-08-14 C. R. Bard, Inc. Reconfirmation of ECG-assisted catheter tip placement
US10188467B2 (en) 2014-12-12 2019-01-29 Inneroptic Technology, Inc. Surgical guidance intersection display
US10278778B2 (en) 2016-10-27 2019-05-07 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space
US10314559B2 (en) 2013-03-14 2019-06-11 Inneroptic Technology, Inc. Medical device guidance
US10349890B2 (en) 2015-06-26 2019-07-16 C. R. Bard, Inc. Connector interface for ECG-based catheter positioning system

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8670816B2 (en) 2012-01-30 2014-03-11 Inneroptic Technology, Inc. Multiple medical device guidance

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4168707A (en) * 1977-06-13 1979-09-25 Douvas Nicholas G Control apparatus for microsurgical instruments
US5748767A (en) * 1988-02-01 1998-05-05 Faro Technology, Inc. Computer-aided surgery apparatus
WO1996025881A1 (en) 1995-02-22 1996-08-29 Groenningsaeter Aage Method for ultrasound guidance during clinical procedures
US5817022A (en) * 1995-03-28 1998-10-06 Sonometrics Corporation System for displaying a 2-D ultrasound image within a 3-D viewing environment
WO1997003609A1 (en) * 1995-07-16 1997-02-06 Ultra-Guide Ltd. Free-hand aiming of a needle guide
US5638819A (en) * 1995-08-29 1997-06-17 Manwaring; Kim H. Method and apparatus for guiding an instrument to a target
EP0919203A2 (en) * 1997-11-28 1999-06-02 Picker International, Inc. Frameless stereotactic surgical apparatus
EP0986991A1 (en) * 1998-09-18 2000-03-22 Howmedica Leibinger GmbH & Co KG Calibration device

Cited By (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1364183A4 (en) * 2001-01-30 2008-06-18 Z Kat Inc Tool calibrator and tracker system
EP1364183A1 (en) * 2001-01-30 2003-11-26 Z-Kat Inc. Tool calibrator and tracker system
US7319897B2 (en) 2002-12-02 2008-01-15 Aesculap Ag & Co. Kg Localization device display method and apparatus
WO2004049954A2 (en) * 2002-12-02 2004-06-17 Aesculap Ag & Co. Kg Localization device display method and apparatus
WO2004049954A3 (en) * 2002-12-02 2004-08-12 Aesculap Ag & Co Kg Localization device display method and apparatus
DE10313829A1 (en) * 2003-03-21 2004-10-07 Aesculap Ag & Co. Kg Medical navigation system and method for use thereof, whereby the surgical instrument being used has an attached marker element so that the imaged operation area can be centered on the instrument
DE10313829B4 (en) * 2003-03-21 2005-06-09 Aesculap Ag & Co. Kg Method and device for selecting an image section of an operational area
EP1768568A2 (en) * 2004-05-07 2007-04-04 Johns Hopkins University Image guided interventions with interstitial or transmission ultrasound
EP1768568A4 (en) * 2004-05-07 2009-03-18 Univ Johns Hopkins Image guided interventions with interstitial or transmission ultrasound
EP1937153A4 (en) * 2005-06-21 2010-02-10 Traxtal Inc Device and method for a trackable ultrasound
EP1937153A2 (en) * 2005-06-21 2008-07-02 Traxtal Inc. Device and method for a trackable ultrasound
US10004875B2 (en) 2005-08-24 2018-06-26 C. R. Bard, Inc. Stylet apparatuses and methods of manufacture
US8784336B2 (en) 2005-08-24 2014-07-22 C. R. Bard, Inc. Stylet apparatuses and methods of manufacture
US10127629B2 (en) 2006-08-02 2018-11-13 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US9659345B2 (en) 2006-08-02 2017-05-23 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
US9265443B2 (en) 2006-10-23 2016-02-23 Bard Access Systems, Inc. Method of locating the tip of a central venous catheter
US9833169B2 (en) 2006-10-23 2017-12-05 Bard Access Systems, Inc. Method of locating the tip of a central venous catheter
US8858455B2 (en) 2006-10-23 2014-10-14 Bard Access Systems, Inc. Method of locating the tip of a central venous catheter
US9345422B2 (en) 2006-10-23 2016-05-24 Bard Access Systems, Inc. Method of locating the tip of a central venous catheter
US9554716B2 (en) 2007-11-26 2017-01-31 C. R. Bard, Inc. Insertion guidance system for needles and medical components
US10165962B2 (en) 2007-11-26 2019-01-01 C. R. Bard, Inc. Integrated systems for intravascular placement of a catheter
US9681823B2 (en) 2007-11-26 2017-06-20 C. R. Bard, Inc. Integrated system for intravascular placement of a catheter
US8849382B2 (en) 2007-11-26 2014-09-30 C. R. Bard, Inc. Apparatus and display methods relating to intravascular placement of a catheter
US8781555B2 (en) 2007-11-26 2014-07-15 C. R. Bard, Inc. System for placement of a catheter including a signal-generating stylet
US9649048B2 (en) 2007-11-26 2017-05-16 C. R. Bard, Inc. Systems and methods for breaching a sterile field for intravascular placement of a catheter
US9636031B2 (en) 2007-11-26 2017-05-02 C.R. Bard, Inc. Stylets for use with apparatus for intravascular placement of a catheter
US10231753B2 (en) 2007-11-26 2019-03-19 C. R. Bard, Inc. Insertion guidance system for needles and medical components
US9456766B2 (en) 2007-11-26 2016-10-04 C. R. Bard, Inc. Apparatus for use with needle insertion guidance system
US9492097B2 (en) 2007-11-26 2016-11-15 C. R. Bard, Inc. Needle length determination and calibration for insertion guidance system
US9521961B2 (en) 2007-11-26 2016-12-20 C. R. Bard, Inc. Systems and methods for guiding a medical instrument
US9526440B2 (en) 2007-11-26 2016-12-27 C.R. Bard, Inc. System for placement of a catheter including a signal-generating stylet
US10342575B2 (en) 2007-11-26 2019-07-09 C. R. Bard, Inc. Apparatus for use with needle insertion guidance system
US9999371B2 (en) 2007-11-26 2018-06-19 C. R. Bard, Inc. Integrated system for intravascular placement of a catheter
US9549685B2 (en) 2007-11-26 2017-01-24 C. R. Bard, Inc. Apparatus and display methods relating to intravascular placement of a catheter
US10238418B2 (en) 2007-11-26 2019-03-26 C. R. Bard, Inc. Apparatus for use with needle insertion guidance system
US10105121B2 (en) 2007-11-26 2018-10-23 C. R. Bard, Inc. System for placement of a catheter including a signal-generating stylet
US9265572B2 (en) 2008-01-24 2016-02-23 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for image guided ablation
US9901714B2 (en) 2008-08-22 2018-02-27 C. R. Bard, Inc. Catheter assembly including ECG sensor and magnetic assemblies
US9907513B2 (en) 2008-10-07 2018-03-06 Bard Access Systems, Inc. Percutaneous magnetic gastrostomy
US9398936B2 (en) 2009-02-17 2016-07-26 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US9364294B2 (en) 2009-02-17 2016-06-14 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US8690776B2 (en) 2009-02-17 2014-04-08 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US10136951B2 (en) 2009-02-17 2018-11-27 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US10271762B2 (en) 2009-06-12 2019-04-30 Bard Access Systems, Inc. Apparatus and method for catheter navigation using endovascular energy mapping
US9339206B2 (en) 2009-06-12 2016-05-17 Bard Access Systems, Inc. Adaptor for endovascular electrocardiography
US9125578B2 (en) 2009-06-12 2015-09-08 Bard Access Systems, Inc. Apparatus and method for catheter navigation and tip location
US9532724B2 (en) 2009-06-12 2017-01-03 Bard Access Systems, Inc. Apparatus and method for catheter navigation using endovascular energy mapping
US10231643B2 (en) 2009-06-12 2019-03-19 Bard Access Systems, Inc. Apparatus and method for catheter navigation and tip location
US9107698B2 (en) 2010-04-12 2015-08-18 Inneroptic Technology, Inc. Image annotation in image-guided medical procedures
US10046139B2 (en) 2010-08-20 2018-08-14 C. R. Bard, Inc. Reconfirmation of ECG-assisted catheter tip placement
US9415188B2 (en) 2010-10-29 2016-08-16 C. R. Bard, Inc. Bioimpedance-assisted placement of a medical device
WO2012141914A1 (en) * 2011-04-11 2012-10-18 Imacor, Inc. Ultrasound guided positioning of cardiac replacement valves with 3d visualization
WO2012141913A1 (en) * 2011-04-11 2012-10-18 Imacor, Inc. Ultrasound guided positioning of cardiac replacement valves
US10314559B2 (en) 2013-03-14 2019-06-11 Inneroptic Technology, Inc. Medical device guidance
US9839372B2 (en) 2014-02-06 2017-12-12 C. R. Bard, Inc. Systems and methods for guidance and placement of an intravascular device
US9901406B2 (en) 2014-10-02 2018-02-27 Inneroptic Technology, Inc. Affected region display associated with a medical device
US10188467B2 (en) 2014-12-12 2019-01-29 Inneroptic Technology, Inc. Surgical guidance intersection display
US10349890B2 (en) 2015-06-26 2019-07-16 C. R. Bard, Inc. Connector interface for ECG-based catheter positioning system
WO2017004029A1 (en) * 2015-06-29 2017-01-05 Medtronic Navigation, Inc. Method and apparatus for identification of multiple navigated instruments
US9949700B2 (en) 2015-07-22 2018-04-24 Inneroptic Technology, Inc. Medical device approaches
US9675319B1 (en) 2016-02-17 2017-06-13 Inneroptic Technology, Inc. Loupe display
US10278778B2 (en) 2016-10-27 2019-05-07 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space

Also Published As

Publication number Publication date
GB9928695D0 (en) 2000-02-02
AU1719601A (en) 2001-06-12

Similar Documents

Publication Publication Date Title
US6690960B2 (en) Video-based surgical targeting system
JP5433032B2 (en) Ultrasonic diagnostic apparatus
US5638819A (en) Method and apparatus for guiding an instrument to a target
US6442416B1 (en) Determination of the position and orientation of at least one object in space
US9101397B2 (en) Real-time generation of three-dimensional ultrasound image using a two-dimensional ultrasound transducer in a robotic system
US7107090B2 (en) Devices and methods for presenting and regulating auxiliary information on an image display of a telesurgical system to assist an operator in performing a surgical procedure
Prager et al. Rapid calibration for 3-D freehand ultrasound
US5572999A (en) Robotic system for positioning a surgical instrument relative to a patient's body
EP1523940B1 (en) Ultrasound diagnosis apparatus
Fuchs et al. Augmented reality visualization for laparoscopic surgery
US6675032B2 (en) Video-based surgical targeting system
Trobaugh et al. Frameless stereotactic ultrasonography: method and applications
JP5705403B2 (en) Method and apparatus for tracking a predetermined point in an ultrasound image
US6167296A (en) Method for volumetric image navigation
US5515160A (en) Method and apparatus for representing a work area in a three-dimensional structure
JP4828802B2 (en) Ultrasonic diagnostic apparatus for puncture treatment
US6689067B2 (en) Method and apparatus for ultrasound guidance of needle biopsies
US5678546A (en) Method for displaying moveable bodies
DE60018247T2 (en) System and method for use with imaging techniques to facilitate the planning of surgical procedures
US20090306509A1 (en) Free-hand three-dimensional ultrasound diagnostic imaging with position and angle determination sensors
US6416476B1 (en) Three-dimensional ultrasonic diagnosis apparatus
US5891158A (en) Method and system for directing an instrument to a target
US6478802B2 (en) Method and apparatus for display of an image guided drill bit
JP5146692B2 (en) System for optical position measurement and guidance of a rigid or semi-flexible needle to a target
CA2140786C (en) Process for imaging the interior of bodies

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the EPO has been informed by WIPO that EP was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase in:

Ref country code: JP