WO2017103142A1 - Navigation assistance system - Google Patents
Navigation assistance system
- Publication number
- WO2017103142A1 (PCT/EP2016/081477, EP2016081477W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vessel
- interventional
- opening
- model
- implanted object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/102—Modelling of surgical devices, implants or prosthesis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2061—Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3983—Reference marker arrangements for use with image guided surgery
Definitions
- the invention relates to a navigation assistance system, method and computer program for assisting in navigating an interventional instrument within a subject.
- the invention further relates to an interventional system comprising the navigation assistance system.
- WO 2014/191262 A3 discloses an assisting apparatus for assisting a user in moving an insertion element to a target element within an object.
- the assisting apparatus comprises a target element image providing unit for providing a target element image showing the target element and a target element representation generating unit for generating a target element representation representing the target element within the object in its three-dimensional position and three-dimensional orientation and with its size based on the target element image.
- the target element comprises at least one opening, wherein the target element representation generating unit is adapted to generate a target element representation comprising at least one ring representing the at least one opening of the target element within the object in the three-dimensional position, the three-dimensional orientation and size of the at least one opening of the target element.
- the assisting apparatus further comprises a tracking unit for tracking a three-dimensional position of the insertion element within the object, while the insertion element is moved to the target element, wherein the tracked insertion element has at least one opening, and a display for displaying the at least one ring of the target element representation and at least one ring representing the at least one opening of the insertion element.
- WO 2015/177012 A1 discloses an imaging apparatus for imaging a first object within a second object.
- the imaging apparatus comprises a representation providing unit for providing a three-dimensional representation of the second object, wherein the three-dimensional representation includes a representation of a surface of the second object.
- the imaging apparatus further comprises a position providing unit for providing the position of the first object relative to the position of the second object and a projection unit for determining a projection of the first object onto the representation of the surface of the second object based on the provided position of the first object. The projection of the first object on the representation of the surface of the second object is finally displayed.
- a navigation assistance system for assisting in navigating an interventional instrument within a subject is presented, wherein the navigation assistance system comprises:
- an interventional image data set providing unit for providing an interventional image data set showing an implanted object with an opening and a vessel with an opening, a position providing unit for providing the position of the interventional instrument in a frame of reference,
- a model generation unit for generating an implanted object opening model and a vessel opening model based on the provided interventional image data set, wherein the implanted object opening model defines the position, shape and dimension of the opening of the implanted object in the frame of reference and wherein the vessel opening model defines the position, shape and dimension of the opening of the vessel in the frame of reference, and
- a graphical representation generation unit for generating a graphical representation of the implanted object opening model, the vessel opening model and the provided position of the interventional instrument. Since an interventional image data set showing an implanted object with an opening and a vessel with an opening is provided and since this interventional image data set is used for generating the implanted object opening model and the vessel opening model, i.e.
- a position defines a location and optionally also an orientation; for instance, the provided position of the interventional instrument preferentially defines the location of the interventional instrument, especially of its tip, and optionally also its orientation.
- the position is preferentially a three-dimensional position.
- the position providing unit can be adapted to provide the position of the tip of the interventional instrument only. However, the position providing unit can also be adapted to provide the position of a larger part of the interventional instrument, for instance, the position of the tip and of a part of the interventional instrument being adjacent to the tip. The position providing unit may also be adapted to provide the shape of this larger part such that the graphical representation can also show this shape.
- the graphical representation may comprise a three-dimensional curve representing the position and shape of this part of the interventional instrument. In an embodiment the position providing unit may provide the position and shape of the entire interventional instrument, wherein the graphical representation may show a three-dimensional curve representing the entire interventional instrument.
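- by way of illustration only (the patent text does not prescribe any particular data representation, and all names below are hypothetical), the opening models and the tracked instrument state could be held in simple structures such as the following Python sketch:

```python
from dataclasses import dataclass
from typing import Optional
import numpy as np

@dataclass
class OpeningModel:
    """Planar ring approximating an opening (fenestration or ostium) in the common frame of reference."""
    center: np.ndarray   # 3-vector, position of the opening centre
    normal: np.ndarray   # unit 3-vector, orientation of the opening plane
    radius: float        # dimension of the opening

@dataclass
class InstrumentState:
    """Tracked position, and optionally the shape, of the interventional instrument."""
    tip: np.ndarray                     # 3-vector, tip position in the frame of reference
    shape: Optional[np.ndarray] = None  # optional (N, 3) polyline for the tracked distal part
```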
- the interventional image data set preferentially comprises one or several interventional images which have been acquired while a contrast agent was within the vessel, in order to enhance the visibility of the vessel in the interventional image data set.
- the interventional image data set may comprise one or several images showing the implanted object and the vessel without a contrast agent.
- the tracked position of the interventional instrument and the interventional image data set are registered to each other such that the spatial relationship between the generated implanted object opening model, the generated vessel opening model and the tracked position of the interventional instrument is known and can be provided in the same frame of reference.
- an interventional image data set generation unit generating the interventional image data set like an x-ray C-arm system and a tracking unit like an optical shape sensing tracking unit can be registered to each other such that the interventional image data set generated by the interventional image data set generation unit and the position of the interventional instrument tracked by the tracking unit are registered to each other.
- the model generation unit is adapted to use the provided position of the interventional instrument for generating the implanted object opening model and/or for generating the vessel opening model. For instance, if the model generation unit is adapted to generate the vessel opening model based on a segmentation of one or several vessels in two-dimensional x-ray projection images, the application of a corresponding segmentation algorithm may be confined to a region surrounding a virtual projection of the provided position of the interventional instrument onto an imaging plane of the respective two-dimensional x-ray projection image, thereby allowing for a faster and maybe more accurate segmentation of the desired structure.
- a segmentation of the opening of the implanted object for instance, by segmenting markers surrounding the opening of the implanted object, can be confined to a region surrounding the virtual projection of the provided position of the interventional instrument, in order to facilitate the segmentation of the opening of the implanted object.
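- as a minimal sketch of this confinement idea (assuming a standard 3x4 pinhole projection matrix describes the C-arm acquisition geometry; the function names are hypothetical and not taken from the text), the tracked tip could be projected into the two-dimensional image and a region of interest cropped around it before running marker detection or vessel segmentation:

```python
import numpy as np

def project_point(P, x_world):
    """Project a 3D point into the 2D image using a 3x4 projection matrix P."""
    x_h = P @ np.append(x_world, 1.0)   # homogeneous projection
    return x_h[:2] / x_h[2]             # pixel coordinates (u, v)

def roi_around_tip(image, P, tip_world, half_size=64):
    """Crop a square region of interest centred on the projected instrument tip."""
    u, v = np.round(project_point(P, tip_world)).astype(int)
    h, w = image.shape[:2]
    u0, u1 = max(u - half_size, 0), min(u + half_size, w)
    v0, v1 = max(v - half_size, 0), min(v + half_size, h)
    return image[v0:v1, u0:u1], (u0, v0)  # ROI and its offset in the full image
```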
- the interventional image data set providing unit can be adapted to provide the interventional image data set such that it comprises at least one first interventional image showing the implanted object and the vessel without a contrast agent and at least one second interventional image showing the implanted object and the vessel with a contrast agent, wherein the model generation unit is adapted to generate the implanted object opening model based on the at least one first interventional image and to generate the vessel opening model based on the at least one second interventional image.
- the generation of the implanted object opening model is therefore not disturbed by a contrast agent, thereby allowing for an improved generation of the implanted object opening model.
- since the vessel opening model is determined based on the at least one second interventional image showing the vessel with a contrast agent, the vessel opening model can be more reliably determined.
- the implanted object preferentially comprises markers having a known spatial relation to the opening of the implanted object, wherein the interventional image data set providing unit is adapted to provide the interventional image data set such that it shows the markers of the implanted object, wherein the model generation unit is adapted to detect the positions of the markers in the interventional image data set and to generate the implanted object opening model based on the detected positions of the markers and the known spatial relation.
- the implanted object can comprise several markers surrounding the opening of the implanted object such that by detecting the markers in the interventional image data set the opening can be detected, wherein this detection can be used for generating the implanted object opening model.
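- one possible way of turning detected three-dimensional marker positions into such a ring-shaped opening model, sketched here under the assumption that the markers lie roughly on a circle around the fenestration (this is an illustration, not the claimed algorithm), is to fit a plane to the marker positions and to take the mean distance from their centroid as the ring radius:

```python
import numpy as np

def opening_model_from_markers(markers):
    """Fit a planar ring (centre, normal, radius) to (N, 3) marker positions surrounding an opening."""
    markers = np.asarray(markers, dtype=float)
    center = markers.mean(axis=0)
    # Plane normal = direction of smallest variation of the centred marker cloud (last SVD direction).
    _, _, vt = np.linalg.svd(markers - center)
    normal = vt[-1]
    radius = np.linalg.norm(markers - center, axis=1).mean()
    return center, normal, radius
```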
- the model generation unit may be adapted to determine the position of at least a part of the implanted object in the interventional image data set, to generate the implanted object opening model by using the determined position of at least the part of the implanted object, to determine the position, dimensions and shape of at least a part of the vessel in the interventional image data set by using the determined position of at least the part of the implanted object and to generate the vessel opening model based on the determined position, dimension and shape of at least the part of the vessel in the image data set.
- a segmentation of at least a part of the vessel for determining its position, dimensions and shape in the two-dimensional x-ray projection image can be confined to a region surrounding the determined positions of the markers. This can also facilitate the generation of the vessel opening model.
- the navigation assistance system preferentially further comprises a path determination unit for determining a path along which the interventional instrument is movable for moving the interventional instrument through the opening of the implanted object and through the opening of the vessel, wherein the path determination unit is adapted to determine the path based on the generated implanted object opening model, the generated vessel opening model and the provided position of the interventional instrument, wherein the graphical representation generation unit is adapted to generate the graphical representation such that it also includes the determined path.
- the path determination unit can be adapted to determine the path by defining a line starting from the provided position of the interventional instrument and traversing the two openings of the implanted object and the vessel, especially traversing the centers of these openings as closely as possible.
- the path determination unit can be adapted to use fitting algorithms for determining the path, wherein constraints can be used like a maximum degree of curvature of the interventional instrument which cannot be exceeded.
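- purely as an illustrative sketch of such a path determination (assuming the path is approximated by a polyline from the tip through the two opening centres, with the curvature constraint checked as a maximum bending angle; function names hypothetical):

```python
import numpy as np

def plan_path(tip, stent_opening_center, vessel_opening_center, samples=50):
    """Polyline from the instrument tip through the stent fenestration centre and the vessel ostium centre."""
    waypoints = np.vstack([tip, stent_opening_center, vessel_opening_center])
    path = [waypoints[0]]
    for a, b in zip(waypoints[:-1], waypoints[1:]):
        t = np.linspace(0.0, 1.0, samples)[1:, None]  # piecewise-linear interpolation
        path.append(a + t * (b - a))
    return np.vstack(path)

def max_turn_angle(path):
    """Largest bending angle (radians) between consecutive path segments, a simple curvature proxy
    that can be compared against a maximum allowed value for the interventional instrument."""
    seg = np.diff(path, axis=0)
    seg = seg / np.linalg.norm(seg, axis=1, keepdims=True)
    cosang = np.clip(np.einsum('ij,ij->i', seg[:-1], seg[1:]), -1.0, 1.0)
    return float(np.arccos(cosang).max())
```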
- the graphical representation generation unit is adapted to generate the graphical representation in accordance with representation parameters defining how the implanted object opening model, the vessel opening model and the provided position of the interventional instrument are to be presented, wherein the representation parameters depend on the provided position of the interventional instrument.
- the representation parameters can define the size of the graphical representation and hence the magnification and/or the viewing direction.
- the representation parameters can define whether the graphical representation should represent the different elements in a lateral view, in which the position of the interventional instrument and the opening models are shown from the side, or in a so-called bull's eye view, in which the opening models are in the viewing plane and the position of the interventional instrument is seen from the top.
- a navigation assistance system for assisting in navigating the interventional instrument as defined in claim 1.
- a navigation assistance method for assisting in navigating an interventional instrument within a subject comprises: providing an interventional image data set showing an implanted object with an opening and a vessel with an opening by an interventional image data set providing unit, providing the position of the interventional instrument by a position providing unit in a frame of reference,
- a computer program for assisting in navigating an interventional instrument comprises program code means for causing a navigation assistance system as defined in claim 1 to carry out the navigation assistance method as defined in claim 13, when the computer program is run on the navigation assistance system.
- the navigation assistance system of claim 1, the interventional system of claim 12, the navigation assistance method of claim 13 and the computer program of claim 14 have similar and/or identical preferred embodiments, in particular, as defined in the dependent claims.
- Fig. 4 illustrates schematically and exemplarily an endovascular aneurysm repair (EVAR) procedure.
- Fig. 1 shows schematically and exemplarily an embodiment of an interventional system for performing an interventional procedure.
- the interventional system 1 comprises an interventional instrument 10 like a catheter or guidewire for being navigated within a patient 7 arranged on a support means like a table 9.
- the interventional instrument 10 can comprise a handle 31 allowing a physician to navigate the interventional instrument 10 within the patient 7, especially within the heart 8 of the patient 7.
- the handle 31 can be adapted to allow the physician to push and pull the interventional instrument 10 and to deflect the distal tip of the interventional instrument 10.
- the handle 31 can particularly be used for moving the distal tip of the interventional instrument 10 through an opening of a fenestrated stent, which has been implanted in a vessel, and through an ostium of a further vessel, in order to navigate the distal tip of the interventional instrument 10 into the further vessel.
- the further vessel is denoted as first vessel and the vessel, in which the fenestrated stent is implanted, is denoted as second vessel.
- the interventional system 1 further comprises an interventional image data set providing unit 2 for providing an interventional image data set showing the fenestrated stent and at least the first vessel.
- the interventional image data set providing unit 2 is an x-ray C-arm system for acquiring two-dimensional x-ray projection images in different acquisition directions.
- the x-ray C-arm system comprises an x-ray source 3 for emitting x-rays 6 and a detector 4 for detecting the x-rays 6 after having traversed the patient 7 and for generating a two-dimensional x-ray projection image based on the detected x-rays 6.
- the x-ray source 3 and the detector 4 are arranged at opposing ends of a C-arm 5 which is rotatable around the patient 7, in order to provide two-dimensional x-ray projection images in different acquisition directions, which are provided to a control and processing device 11.
- a contrast agent is injected at least into the first vessel, in order to enhance the detectability of at least the first vessel in the two-dimensional x-ray projection images.
- the model generation unit 29 can be adapted to detect the markers in the two-dimensional x-ray projection images.
- the model generation unit 29 can further be adapted to determine the three-dimensional positions of the markers by using known localization techniques which may be based on, for instance, an intersection of rays defined by the respective two-dimensional position of the respective marker in the respective two-dimensional x-ray projection image and the respective position of the x-ray source.
- the implanted object opening model can have a predefined or selectable appearance, i.e. for example, having a predefined or selectable line width and/or color.
- the model generation unit 29 may be adapted to generate the implanted object opening model in another way.
- a segmentation algorithm may be used, which is adapted to directly segment the opening of the fenestrated stent in the two-dimensional x-ray projection images, in order to determine the two-dimensional dimensions and positions of the opening of the fenestrated stent in the respective two-dimensional x-ray projection images, wherein also these two-dimensional dimensions and positions of the opening of the fenestrated stent in the different two-dimensional x-ray projection images can be used for generating the implanted object opening model 21.
- a pair of two-dimensional x-ray projection images and optionally one or more further two-dimensional x-ray projection images are used, wherein each of these two-dimensional x-ray projection images shows a projection of the element.
- the two-dimensional x-ray projection images of the pair have been acquired in different acquisition directions, wherein the angular difference between these acquisition directions is preferentially at least 30 degrees.
- the position of the element, i.e. of the projection of the element, is identified and these positions are used together with the known acquisition geometry for defining for each two-dimensional x-ray projection image a corresponding projection line in three dimensions, on which the respective element projection is located.
- the intersection of these projection lines defines the three-dimensional position of the element. If the projection lines do not intersect, for instance, because of an inaccuracy in the data defining the acquisition geometry or because of patient motion between the acquisitions of the different two-dimensional x-ray projection images, the three-dimensional position being closest to the projection lines can be determined as defining the three-dimensional position of the element.
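- a compact sketch of this localization step (assuming each projection is described by the three-dimensional x-ray source position and a unit ray direction towards the detected element; the least-squares point closest to all rays is returned when they do not intersect exactly; at least two non-parallel rays are needed):

```python
import numpy as np

def triangulate(sources, directions):
    """Three-dimensional point closest, in the least-squares sense, to a set of projection lines.

    sources:    (N, 3) x-ray source positions, one per projection image
    directions: (N, 3) vectors from the source towards the detected 2D element
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for s, d in zip(np.asarray(sources, float), np.asarray(directions, float)):
        d = d / np.linalg.norm(d)
        M = np.eye(3) - np.outer(d, d)  # projector onto the plane orthogonal to the ray
        A += M
        b += M @ s
    return np.linalg.solve(A, b)
```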
- known detection techniques can be used for detecting the element in the two-dimensional x-ray projection images.
- the element is detected in one of the two-dimensional x-ray projection images, wherein the detected element in the projection image is used together with the acquisition geometry for defining a corresponding projection line in three dimensions.
- the counterpart projection of the element can be detected by using a pairing operation, wherein this pairing operation can be realized by using the epipolar geometry and by resorting to a similarity criterion.
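- as a hedged sketch of such a pairing operation (assuming the fundamental matrix F relating the two acquisition geometries is known and that candidate detections in the second image are given; an optional per-candidate appearance similarity score may be supplied; names are illustrative), the counterpart can be chosen as the candidate closest to the epipolar line:

```python
import numpy as np

def pick_counterpart(F, point_img1, candidates_img2, similarity=None):
    """Select the detection in image 2 that best matches a detection in image 1.

    F: 3x3 fundamental matrix mapping image-1 points to epipolar lines in image 2.
    candidates_img2: (N, 2) candidate pixel positions in image 2.
    similarity: optional length-N array, higher meaning more similar appearance.
    """
    l = F @ np.append(point_img1, 1.0)                           # epipolar line a*u + b*v + c = 0
    cands = np.asarray(candidates_img2, dtype=float)
    dist = np.abs(cands @ l[:2] + l[2]) / np.hypot(l[0], l[1])   # point-to-line distance per candidate
    score = dist if similarity is None else dist - np.asarray(similarity, float)
    return int(np.argmin(score))                                 # index of the best candidate
```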
- the two-dimensional x-ray projection images can be used for determining the three-dimensional positions of the markers. However, they can of course also be used for determining three-dimensional positions of other elements like a branching point between two vessels, which may be detected in the two-dimensional x-ray projection images. In this way the three-dimensional position of the ostium of a vessel can be determined.
- the model generation unit 29 is further adapted to generate a model of the opening of the first vessel, i.e. of the ostium of the first vessel, based on the provided interventional image data set.
- the model generation unit 29 preferentially uses two-dimensional x-ray projection images which have been generated in different acquisition directions, after a contrast agent has been injected such that it is present in the first and second vessels.
- the model generation unit 29 may be further adapted to determine the connection area where the first and second vessels are connected, wherein the circumference of the connection area can define the three-dimensional vessel opening model.
- other techniques can be used for generating the model of the ostium of the first vessel based on the provided interventional image data set.
- the ostium may directly be segmented in the two-dimensional x-ray projection images, in order to determine the two-dimensional dimensions and positions of the ostium in the two-dimensional x-ray projection images, and the three-dimensional model of the ostium may be determined based on the determined two-dimensional dimensions and positions by using known techniques which may be based, for instance, on intersections between rays defined by respective two-dimensional dimensions and positions in respective two-dimensional x-ray projection images and by respective three-dimensional positions of the x-ray source.
- the model generation unit 29 is adapted to generate the vessel opening model such that it models at least the ostium of the first vessel and preferentially also an adjacent part of the first vessel.
- the adjacent part of the first vessel, i.e. the part of the first vessel being adjacent to the ostium, can be determined based on the two-dimensional dimensions and positions of the first vessel detected in the respective two-dimensional x-ray projection images and the epipolar geometry.
- the interventional system 1 further comprises a position providing unit 12 for providing the position of the interventional instrument 10.
- the interventional instrument 10 is enabled to allow for a determination of the position of the interventional instrument 10 by optical shape sensing.
- the interventional instrument 10 comprises optical fibers with Bragg gratings and the position providing unit 12 comprises a light source and a light detector for emitting light into the optical fibers and for detecting light received from the optical fibers, wherein the detected light is used for determining the three-dimensional shape of the interventional instrument 10 and the three-dimensional position of this shape.
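- the shape reconstruction itself is not detailed in the text; purely as an illustrative simplification (assuming the fibre delivers per-segment curvature and bending-direction samples, which glosses over real FORS processing; all names are hypothetical), the three-dimensional shape could be obtained by integrating a local frame along the fibre:

```python
import numpy as np

def _rotate(v, axis, angle):
    """Rotate vector v about a unit axis by the given angle (Rodrigues' formula)."""
    axis = axis / np.linalg.norm(axis)
    return (v * np.cos(angle) + np.cross(axis, v) * np.sin(angle)
            + axis * np.dot(axis, v) * (1 - np.cos(angle)))

def integrate_shape(curvatures, bend_angles, ds=1.0):
    """Integrate per-segment curvature (1/mm) and bending direction (rad) into a 3D polyline.

    Simplified model: each step rotates the local tangent by curvature*ds about an axis
    lying in the normal plane, whose direction is given by the bending angle.
    """
    p = np.zeros(3)
    t, n, b = np.eye(3)          # tangent, normal, binormal of the local frame
    points = [p.copy()]
    for kappa, phi in zip(curvatures, bend_angles):
        axis = np.cos(phi) * b + np.sin(phi) * n   # rotation axis in the normal plane
        t = _rotate(t, axis, kappa * ds)
        n = _rotate(n, axis, kappa * ds)
        b = np.cross(t, n)
        p = p + t * ds
        points.append(p.copy())
    return np.array(points)
```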
- the optical shape sensing technology has been described in U.S. Patent Application Publications 2006/0013523 A1 and 2007/0065077 A1 and has been proposed for integration into medical instruments (e.g., guidewires and catheters) in U.S. Patent Application Publication US 2008/0285909 A1.
- the optical shape sensing technique is sometimes called FORS (Fiber-Optic RealShape) technique.
- the model generation unit 29 can be adapted to use the known position of the interventional instrument 10 for generating the model of the fenestration of the fenestrated stent and/or the model of the ostium of the first vessel.
- a user may indicate via an input unit 32, when the generation of the model of the fenestration and/or of the model of the ostium should start, wherein the user may provide this indication, when the tip of the interventional instrument is close to the openings as recognizable by the user based on, for example, a two-dimensional x-ray projection image showing the tip and the markers surrounding the fenestration.
- the navigation assistance system therefore uses interventional, i.e. intra-operative, image data, but no pre-interventional image data for generating the models.
- the interventional image data set providing unit 2 and the position providing unit 12, i.e. the x-ray C-arm system and the optical shape sensing tracking system, are registered to each other by using known registration techniques such that the position of the interventional instrument 10 as provided by the position providing unit 12 is known relative to a coordinate system defined by the interventional image data set providing unit 2.
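- the registration is only referred to as using known registration techniques; one common choice, sketched here under the assumption that a few corresponding three-dimensional points are available in both frames (for instance the instrument tip observed at several poses), is a rigid Kabsch/Procrustes alignment mapping tracking coordinates into the C-arm frame:

```python
import numpy as np

def rigid_registration(points_tracking, points_xray):
    """Rigid transform (R, t) mapping tracking-frame points onto the x-ray C-arm frame.

    Both inputs are (N, 3) arrays of corresponding 3D points, N >= 3 and not collinear.
    """
    A = np.asarray(points_tracking, float)
    B = np.asarray(points_xray, float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid a reflection solution
    R = Vt.T @ D @ U.T
    t = cb - R @ ca
    return R, t  # a tracked point x maps to R @ x + t in the x-ray frame
```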
- the interventional image data set providing unit 2 can be adapted to provide the interventional image data set such that it comprises first interventional images showing the fenestrated stent without a contrast agent and second interventional images showing the fenestrated stent and the first and second vessels with a contrast agent, wherein the model generation unit 29 may be adapted to generate the model of the fenestration of the fenestrated stent based on the first interventional images and to generate the model of the ostium of the first vessel based on the second interventional images, wherein the first interventional images and the second interventional images are registered to each other by being acquired by using the same interventional image data set providing unit 2.
- the interventional system 1 further comprises a path determination unit 30 for determining a path along which the interventional instrument 10 is movable for moving the interventional instrument 10 through the fenestration of the fenestrated stent and through the ostium of the first vessel, wherein the path determination unit 30 is adapted to determine the path based on the generated model of the fenestration of the fenestrated stent, the generated model of the ostium of the first vessel and the provided position of the interventional instrument 10.
- the interventional system 1 further comprises a graphical representation generation unit 34 for generating a graphical representation including the implanted object opening model 21, the vessel opening model 27 showing the vessel opening 40, i.e. the ostium, and the adjacent part 28, the provided position and shape 25 of the distal end of the interventional instrument and the determined path 26.
- the graphical representation can optionally further include the markers 24 as schematically and exemplarily shown in Fig. 2.
- the graphical representation can be displayed on a display 33.
- the input unit 32 allows a user to input data, information, indications et cetera into the interventional system 1.
- the input unit 32 may be used for allowing the user to indicate when an interventional image data set to be used for the generation of the models should be acquired. The user may provide this indication when the distal end of the interventional instrument is close to the fenestration of the stent and/or close to the ostium of the first vessel. It may also be used to allow the user to indicate that the navigation assistance, i.e. the generation and displaying of the graphical representation, should start or stop.
- the input unit 32 may comprise, for instance, a keyboard, a computer mouse, a touch pad, a foot switch, a button to be actuated by hand, et cetera.
- since the interventional image data set providing unit, the position providing unit, the model generation unit, the graphical representation generation unit and the path determination unit are adapted to finally provide a representation of the spatial relationships between the ostium of the first vessel, the fenestration of the fenestrated stent and the current, real-time position of the distal end of the interventional instrument and also a representation of an optimal path, these components can be regarded as being components of a navigation assistance system for assisting in navigating the interventional instrument within the patient.
- in step 101 the interventional instrument 10 is introduced into the patient 7 and moved such that the distal tip of the interventional instrument 10 is close to the fenestration of the fenestrated stent and/or to the ostium of the first vessel.
- the interventional image data set providing unit 2 can provide interventional images, especially two-dimensional x-ray projection images, which can be shown on the display 33, in order to provide some guidance for the physician navigating the interventional instrument 10.
- the interventional image data set providing unit 2 provides an interventional image data set showing at least the first vessel and the fenestrated stent.
- the interventional image data set can comprise contrast agent images, in order to allow for a simplified detection of the first vessel in the interventional image data set.
- in step 103 the provided interventional image data set is used by the model generation unit 29 for generating an implanted object opening model and a vessel opening model.
- the path determination unit 30 determines a path along which the interventional instrument 10 is movable for moving the interventional instrument 10 through the fenestration of the fenestrated stent and through the ostium of the first vessel based on the generated implanted object opening model, the generated vessel opening model and the current position of the interventional instrument 10.
- a graphical representation is generated and displayed, wherein the graphical representation comprises the vessel opening model, the implanted object opening model, the determined path and the current position of the interventional instrument, in order to visualize their spatial relationship.
- the position of the interventional instrument 10 and the determination of the path which considers, inter alia, the current position of the interventional instrument 10
- the updated position of the interventional instrument 10 and the updated path are shown together with the models determined in step 103 on the display 33, in order to allow the physician to monitor the movement of the interventional instrument 10 and to always show an optimal path considering the current, real-time position of the interventional instrument 10.
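- the workflow of these steps could be summarised, purely as an illustrative sketch with hypothetical collaborator objects (none of these names appear in the text), as the following loop:

```python
def navigation_assistance_loop(image_provider, position_provider, model_generator,
                               path_planner, renderer, stop_requested):
    """Illustrative main loop mirroring the described workflow."""
    # Steps 101/102: acquire the interventional image data set once the tip is near the openings.
    image_data = image_provider.acquire()
    # Step 103: generate the implanted object opening model and the vessel opening model.
    stent_opening, vessel_opening = model_generator.generate(image_data)
    # Remaining steps: continuously update position, path and graphical representation.
    while not stop_requested():
        tip_pose = position_provider.current_position()
        path = path_planner.plan(tip_pose, stent_opening, vessel_opening)
        renderer.show(stent_opening, vessel_opening, tip_pose, path)
```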
- the graphical representation generation unit 34 is adapted to generate the graphical representation in accordance with representation parameters defining how the implanted object opening model 21, the vessel opening model 27, the determined path 26 and the provided position and shape 25 of the interventional instrument 10 are to be presented, wherein the representation parameters depend on the provided position of the interventional instrument.
- the representation parameters can define the size of the graphical representation and hence the magnification and/or the viewing direction.
- the representation parameters can define whether the graphical representation should represent the different elements in a lateral view or in a bull's eye view.
- the representation parameters can depend on the distance between a) the position of the interventional instrument and b) the position of the opening of the implanted object and/or the position of the opening of the vessel.
- if this distance is larger, the size of the different elements may be smaller, i.e. the magnification may be smaller, and, if this distance is smaller, the size of the different elements may be larger, i.e. the magnification may be larger.
- if this distance is larger, the lateral view may be shown, and, if this distance is smaller, the bull's eye view may be shown. It is also possible that the lateral view is always shown and that the bull's eye view is shown only if the distance is smaller than a predefined or selectable threshold.
- both views or multiple other views may be shown independently of the position of the interventional instrument and only the magnification may be modified depending on the distance.
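- a small sketch of how such distance-dependent representation parameters could be chosen (the threshold value and the mapping from distance to magnification are illustrative assumptions, not taken from the text):

```python
import numpy as np

def representation_parameters(tip, opening_center, view_switch_mm=20.0):
    """Choose view type and magnification from the tip-to-opening distance."""
    distance = float(np.linalg.norm(np.asarray(tip, float) - np.asarray(opening_center, float)))
    view = "bulls_eye" if distance < view_switch_mm else "lateral"
    # Larger magnification when the instrument is close, smaller when it is far away.
    magnification = float(np.clip(100.0 / max(distance, 1.0), 1.0, 10.0))
    return {"view": view, "magnification": magnification, "distance_mm": distance}
```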
- the interventional system can be adapted to be used in an EVAR procedure.
- a fenestrated stent may be implanted in the aorta such that fenestrations of the stent are aligned with ostia of branching arteries.
- a wire which has been inserted in the aorta 13 may be threaded into the renal artery 14.
- This threading requires steering through two obstacles, namely the gate, i.e. the fenestration 19, and the renal ostium 20.
- Generally, such a steering may be achieved under the guidance of two-dimensional x-ray projection images showing at least the gate.
- finding the three-dimensional path through the gate and the possibly misaligned arterial ostium based on the two-dimensional x-ray projection image is very difficult.
- the navigation assistance system creates the virtual navigation view, i.e. the graphical representation containing at least the three main elements being a modeling of the fenestration complex, i.e. a modeling of the gate which may be regarded as being an implanted object opening model, a modeling of the targeted vessel, i.e. of the anatomy of the targeted vessel which includes at least the ostium and which may therefore also be regarded as being a vessel opening model, and a live-tracked position of the threaded system, i.e. of the wire being in this example the interventional instrument, wherein, for instance, a curve or silhouette representing the tip of the wire and a part adjacent to the tip may be shown.
- modeling of both the gate and the anatomy occurs at a very late stage, i.e.
- the respective side branch, i.e. the respective artery, for instance the renal and the internal iliac arteries, may be targeted and selectively injected.
- the position providing unit can be adapted to provide a live localization of the wire to be used for stenting also the renal arteries.
- the position providing unit can be adapted to provide an accurate live three-dimensional wire tracking, wherein preferentially optical shape sensing is used, in order to integrate a curve representing a distal part of the wire in the virtual view.
- if another tracking technique is used, like electromagnetic tracking, in which only the tip of the wire may be tracked, only the position of the tip of the wire may be integrated in the virtual view.
- In order to perform the super-selective injection of the contrast agent, a catheter may be used, which is equipped with optical shape sensing technology, in order to allow for a determination of the position and shape of the catheter by optical shape sensing.
- the model generation unit is adapted to generate the implanted object opening model, wherein this generation of this model may also be regarded as being a gate modeling.
- this gate modeling is achieved from the gate markers, i.e. the markers surrounding the respective fenestration, detected under both angulations, and classical modeling.
- this gate modeling is preferentially only started when the localized device, i.e. the interventional instrument, is situated close to the gate. This can facilitate the distinction of the gate markers from cluttered material which may be produced by the rest of the stent-body. It is also possible to start marker detection in images without a contrast agent and to propagate this detection to images which have been acquired by using a contrast agent. Another alternative is to rely on the injection of the contrast agent to localize the gate markers.
- the presence of the nearby localized device, for instance of the localized wire or the localized catheter used for super-selective injection, can be used to facilitate local vessel segmentation in the images.
- the live device localization, i.e. the live provision of the position of the interventional instrument, which might be, for instance, a catheter, a guidewire, et cetera
- the resulting three-dimensional information about the gate, the vessel including the ostium and the interventional instrument can be used to compute an optimal path for optimal threading.
- the interventional system may be applied in interventional suites, hybrid rooms and catheter labs with x-ray systems.
- a computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium, supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.
- the invention relates to a navigation assistance system for assisting in navigating an interventional instrument within a subject.
- An implanted object opening model and a vessel opening model are generated based on a provided interventional image data set, wherein the models define a respective position, shape and dimension in a frame of reference.
- These models and a position, which is also provided in the frame of reference, and optionally also a shape of the interventional instrument are used for generating a graphical representation of the interventional instrument.
Landscapes
- Health & Medical Sciences (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Robotics (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Pathology (AREA)
- Apparatus For Radiation Diagnosis (AREA)
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201680073899.7A CN108366834B (zh) | 2015-12-17 | 2016-12-16 | 导航辅助系统 |
| EP16822944.1A EP3389540B1 (en) | 2015-12-17 | 2016-12-16 | Navigation assistance system |
| JP2018531105A JP7299701B2 (ja) | 2015-12-17 | 2016-12-16 | ナビゲーション支援システム |
| US15/779,173 US11980423B2 (en) | 2015-12-17 | 2016-12-16 | Navigation assistance system |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP15307042 | 2015-12-17 | ||
| EP15307042.0 | 2015-12-17 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017103142A1 (en) | 2017-06-22 |
Family
ID=55069783
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2016/081477 Ceased WO2017103142A1 (en) | 2015-12-17 | 2016-12-16 | Navigation assistance system |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US11980423B2 (en) |
| EP (1) | EP3389540B1 (en) |
| JP (1) | JP7299701B2 (en) |
| CN (1) | CN108366834B (en) |
| WO (1) | WO2017103142A1 (en) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2019119194A1 (zh) * | 2017-12-18 | 2019-06-27 | 中国科学院深圳先进技术研究院 | 柔性手术器械跟踪方法、装置、设备及存储介质 |
| JP2019181213A (ja) * | 2018-04-16 | 2019-10-24 | キヤノンメディカルシステムズ株式会社 | 画像処理装置、x線診断装置及びプログラム |
Families Citing this family (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP3677212B1 (en) * | 2019-01-04 | 2023-07-26 | Siemens Healthcare GmbH | Method and system for determining a navigation pathway for invasive medical instrument in blood vessels |
| US12478436B2 (en) * | 2021-11-24 | 2025-11-25 | Siemens Medical Solutions Usa, Inc. | Smart image navigation for intracardiac echocardiography |
| EP4343686A1 (en) * | 2022-09-23 | 2024-03-27 | Siemens Healthineers AG | Method and system for vascular catheter tip detection in medical images |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2014191262A2 (en) * | 2013-05-31 | 2014-12-04 | Koninklijke Philips N.V. | Assisting apparatus for assisting a user during an interventional procedure |
| WO2015177012A1 (en) * | 2014-05-23 | 2015-11-26 | Koninklijke Philips N.V. | Imaging apparatus for imaging a first object within a second object |
Family Cites Families (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8442618B2 (en) | 1999-05-18 | 2013-05-14 | Mediguide Ltd. | Method and system for delivering a medical device to a selected position within a lumen |
| US7343195B2 (en) * | 1999-05-18 | 2008-03-11 | Mediguide Ltd. | Method and apparatus for real time quantitative three-dimensional image reconstruction of a moving organ and intra-body navigation |
| US7778688B2 (en) * | 1999-05-18 | 2010-08-17 | MediGuide, Ltd. | System and method for delivering a stent to a selected position within a lumen |
| US7356367B2 (en) * | 2000-06-06 | 2008-04-08 | The Research Foundation Of State University Of New York | Computer aided treatment planning and visualization with image registration and fusion |
| US7607440B2 (en) * | 2001-06-07 | 2009-10-27 | Intuitive Surgical, Inc. | Methods and apparatus for surgical planning |
| US7998062B2 (en) * | 2004-03-29 | 2011-08-16 | Superdimension, Ltd. | Endoscope structures and techniques for navigating to a target in branched structure |
| US7781724B2 (en) | 2004-07-16 | 2010-08-24 | Luna Innovations Incorporated | Fiber optic position and shape sensing device and method relating thereto |
| US20060013523A1 (en) | 2004-07-16 | 2006-01-19 | Luna Innovations Incorporated | Fiber optic position and shape sensing device and method relating thereto |
| JP5122743B2 (ja) * | 2004-12-20 | 2013-01-16 | ゼネラル・エレクトリック・カンパニイ | インターベンショナルシステム内で3d画像を位置合わせするシステム |
| US7967742B2 (en) * | 2005-02-14 | 2011-06-28 | Karl Storz Imaging, Inc. | Method for using variable direction of view endoscopy in conjunction with image guided surgical systems |
| US8050523B2 (en) | 2007-04-20 | 2011-11-01 | Koninklijke Philips Electronics N.V. | Optical fiber shape sensing systems |
| ES2900584T3 (es) * | 2010-12-23 | 2022-03-17 | Bard Access Systems Inc | Sistema para guiar un instrumento rígido |
| EP2800534B1 (en) * | 2012-01-03 | 2021-04-28 | Koninklijke Philips N.V. | Position determining apparatus |
| WO2014151651A1 (en) | 2013-03-15 | 2014-09-25 | The Cleveland Clinic Foundation | Method and system to facilitate intraoperative positioning and guidance |
| DE102013213727A1 (de) * | 2013-07-12 | 2015-01-15 | Siemens Aktiengesellschaft | Interventionelles Bildgebungssystem |
-
2016
- 2016-12-16 WO PCT/EP2016/081477 patent/WO2017103142A1/en not_active Ceased
- 2016-12-16 JP JP2018531105A patent/JP7299701B2/ja active Active
- 2016-12-16 US US15/779,173 patent/US11980423B2/en active Active
- 2016-12-16 CN CN201680073899.7A patent/CN108366834B/zh active Active
- 2016-12-16 EP EP16822944.1A patent/EP3389540B1/en active Active
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2014191262A2 (en) * | 2013-05-31 | 2014-12-04 | Koninklijke Philips N.V. | Assisting apparatus for assisting a user during an interventional procedure |
| WO2015177012A1 (en) * | 2014-05-23 | 2015-11-26 | Koninklijke Philips N.V. | Imaging apparatus for imaging a first object within a second object |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2019119194A1 (zh) * | 2017-12-18 | 2019-06-27 | 中国科学院深圳先进技术研究院 | 柔性手术器械跟踪方法、装置、设备及存储介质 |
| JP2019181213A (ja) * | 2018-04-16 | 2019-10-24 | キヤノンメディカルシステムズ株式会社 | 画像処理装置、x線診断装置及びプログラム |
| JP7297507B2 (ja) | 2018-04-16 | 2023-06-26 | キヤノンメディカルシステムズ株式会社 | 画像処理装置、x線診断装置及びプログラム |
Also Published As
| Publication number | Publication date |
|---|---|
| EP3389540B1 (en) | 2024-02-28 |
| CN108366834B (zh) | 2021-04-09 |
| JP7299701B2 (ja) | 2023-06-28 |
| US11980423B2 (en) | 2024-05-14 |
| US20180353240A1 (en) | 2018-12-13 |
| JP2019501703A (ja) | 2019-01-24 |
| EP3389540A1 (en) | 2018-10-24 |
| CN108366834A (zh) | 2018-08-03 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP3003194B1 (en) | Assisting apparatus for assisting a user during an interventional procedure | |
| US9098899B2 (en) | Determining the specific orientation of an object | |
| US8295577B2 (en) | Method and apparatus for guiding a device in a totally occluded or partly occluded tubular organ | |
| JP6581598B2 (ja) | カテーテルの特定の位置を決定するための装置 | |
| EP3389540B1 (en) | Navigation assistance system | |
| US20080275467A1 (en) | Intraoperative guidance for endovascular interventions via three-dimensional path planning, x-ray fluoroscopy, and image overlay | |
| EP2751726B1 (en) | Vessel annotator | |
| JP6480938B2 (ja) | ナビゲーションシステム | |
| WO2006103644A1 (en) | Method and apparatus for positioning a device in a tubular organ | |
| EP3145432B1 (en) | Imaging apparatus for imaging a first object within a second object | |
| US11887236B2 (en) | Animated position display of an OSS interventional device | |
| WO2008050315A2 (en) | Method and apparatus for guiding a device in a totally occluded or partly occluded tubular organ |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16822944 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2018531105 Country of ref document: JP Kind code of ref document: A |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2016822944 Country of ref document: EP |