WO2022048984A1 - Control of a medical intervention based on device type identification


Info

Publication number
WO2022048984A1
Authority
WO
WIPO (PCT)
Prior art keywords
interventional instrument
controller
medical
medical imaging
device type
Prior art date
Application number
PCT/EP2021/073592
Other languages
English (en)
Inventor
Torre Michelle BYDLON
Molly Lara FLEXMAN
Original Assignee
Koninklijke Philips N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2022048984A1


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/90 Identification means for patients or instruments, e.g. tags
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2051 Electromagnetic tracking systems
    • A61B2034/2061 Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings
    • A61B2034/2074 Interface software
    • A61B34/25 User interfaces for surgical systems
    • A61B2034/254 User interfaces for surgical systems being adapted depending on the stage of the surgical procedure
    • A61B34/30 Surgical robots
    • A61B2034/301 Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
    • A61B2576/00 Medical imaging apparatus involving image processing or analysis

Definitions

  • the present disclosure relates to systems, controllers and methods for medical interventions incorporating medical imaging systems (e.g., X-Ray imaging systems, magnetic resonance imaging systems, ultrasound imaging systems, etc.) and/or medical robot systems (e.g., operator-manipulated medical robots and computer-assisted medical robots for heart surgeries, thoracic surgeries, gastrointestinal surgeries, etc.).
  • one challenge is to establish a proper view of an interventional instrument (e.g., catheters, delivery devices, scopes, etc.) during a navigation and a deployment of the interventional instrument in the vasculature while also obtaining a good visualization of the anatomy.
  • Various ways of improving image quality to optimize visualization of interventional instruments and anatomy in an imaging environment while reducing radiation are known.
  • surgical robots have been incorporated as a means for easier and safer navigation of an interventional instrument within complex anatomy.
  • a challenge of a vascular intervention is to control a setting of navigation parameter(s) of a surgical robot to ensure a minimization of any force trauma applied by the interventional instrument to the anatomy as the interventional instrument is navigated within the anatomy.
  • a linear acceleration/deceleration, a rotational acceleration/deceleration and a maximum velocity of the surgical robot in navigating the interventional instrument within the anatomy may be set to ensure a minimization of any force trauma applied by the interventional instrument to the anatomy as the interventional instrument is navigated within the anatomy.
  • an interventional instrument (e.g., balloon catheters, stent catheters, ablation catheters, imaging catheters, infusion catheters, endograft deployment devices, sheaths, introducers, mitral clip delivery devices, mitral valve delivery devices, etc.)
  • a medical robot apparatus e.g., an operator-manipulated surgical robot, a computer-controlled surgical robot, etc.
  • Exemplary embodiments of the present disclosure for improving medical interventions include, but are not limited to, (1) medical intervention control systems, (2) adaptive intervention controllers, and (3) medical intervention control methods.
  • Various medical intervention control system embodiments of the present disclosure encompass a medical imaging controller and an adaptive intervention controller for improving a medical imaging of an interventional instrument by a medical imaging apparatus.
  • the medical imaging controller is configured to execute an operational control of the medical imaging apparatus for the medical imaging of the interventional instrument.
  • the adaptive intervention controller is configured to derive an identification of a device type of the interventional instrument responsive to receiving shape sensing data generated by a guidewire when the interventional instrument is positioned at least partly over the guidewire.
  • the adaptive intervention controller is further configured to adapt the operational control of the medical imaging apparatus by the medical imaging controller for the medical imaging of the interventional instrument in accordance with the identification of the device type of the interventional instrument.
  • Various medical intervention control system embodiments of the present disclosure encompass a medical robot controller and an adaptive intervention controller for improving a robotic navigation of the interventional instrument by a medical robot apparatus.
  • the medical robot controller is configured to execute an operational control of the medical robot apparatus for the robotic navigation of the interventional instrument.
  • the adaptive intervention controller is configured to derive an identification of a device type of the interventional instrument responsive to receiving shape sensing data generated by a guidewire when the interventional instrument is positioned at least partly over the guidewire.
  • the adaptive intervention controller is further configured to adapt the operational control of the medical robot apparatus by the medical robot controller for the robotic navigation of the interventional instrument in accordance with the identification of the device type of the interventional instrument.
  • Various medical intervention control system embodiments of the present disclosure encompass a medical imaging controller, a medical robot controller, and an adaptive intervention controller for improving both a medical imaging of an interventional instrument by a medical imaging apparatus and a robotic navigation of the interventional instrument by a medical robot apparatus.
  • the medical imaging controller is configured to execute an operational control of the medical imaging apparatus for the medical imaging of the interventional instrument
  • the medical robot controller is configured to execute an operational control of the medical robot apparatus for the robotic navigation of the interventional instrument.
  • the adaptive intervention controller is configured to derive an identification of a device type of the interventional instrument responsive to receiving shape sensing data generated by a guidewire when the interventional instrument is positioned at least partly over the guidewire.
  • the adaptive intervention controller is further configured to adapt the operational control of the medical imaging apparatus by the medical imaging controller for the medical imaging of the interventional instrument in accordance with the identification of the device type of the interventional instrument, and to adapt the operational control of the medical robot apparatus by the medical robot controller for the robotic navigation of the interventional instrument in accordance with the identification of the device type of the interventional instrument.
  • Various adaptive intervention controller embodiments of the present disclosure encompass a non-transitory machine-readable storage medium encoded with instructions for execution by one or more processors for improving a medical imaging of an interventional instrument by a medical imaging apparatus.
  • the non-transitory machine-readable storage medium includes instructions to identify a device type of the interventional instrument responsive to receiving shape sensing data generated by a guidewire when the interventional instrument is positioned at least partly over the guidewire.
  • the non-transitory machine-readable storage medium further includes instructions to adapt the operational control of the medical imaging apparatus by the medical imaging controller for the medical imaging of the interventional instrument in accordance with the identification of the device type of the interventional instrument.
  • Various adaptive intervention controller embodiments of the present disclosure encompass a non-transitory machine-readable storage medium encoded with instructions for execution by one or more processors for improving a robotic navigation of an interventional instrument by a medical robot apparatus.
  • the non-transitory machine-readable storage medium includes instructions to identify a device type of the interventional instrument responsive to receiving shape sensing data generated by a guidewire when the interventional instrument is positioned at least partly over the guidewire.
  • the non-transitory machine-readable storage medium further includes instructions to adapt the operational control of the medical robot apparatus by the medical robot controller for the robotic navigation of the interventional instrument in accordance with the identification of the device type of the interventional instrument.
  • Various adaptive intervention controller embodiments of the present disclosure encompass a non-transitory machine-readable storage medium encoded with instructions for execution by one or more processors for improving a medical imaging of an interventional instrument by a medical imaging apparatus and for improving a robotic navigation of an interventional instrument by a medical robot apparatus.
  • the non-transitory machine-readable storage medium includes instructions to identify a device type of the interventional instrument responsive to receiving shape sensing data generated by a guidewire when the interventional instrument is positioned at least partly over the guidewire.
  • the non-transitory machine-readable storage medium further includes instructions to adapt the operational control of the medical imaging apparatus by the medical imaging controller for the medical imaging of the interventional instrument in accordance with the identification of the device type of the interventional instrument and instructions to adapt the operational control of the medical robot apparatus by the medical robot controller for the robotic navigation of the interventional instrument in accordance with the identification of the device type of the interventional instrument.
  • Various medical intervention control method embodiments of the present disclosure for improving a medical imaging of an interventional instrument by a medical imaging apparatus encompass an adaptive intervention controller deriving an identification of a device type of the interventional instrument responsive to receiving shape sensing data generated by the guidewire when the interventional instrument is positioned at least partly over the guidewire, and further encompass the adaptive intervention controller adapting an operational control of the medical imaging apparatus by a medical imaging controller for the medical imaging of the interventional instrument in accordance with the identification of the device type of the interventional instrument.
  • Various medical intervention control method embodiments of the present disclosure for improving a robotic navigation of an interventional instrument by a medical robot apparatus encompass an adaptive intervention controller deriving an identification of a device type of the interventional instrument responsive to receiving shape sensing data generated by the guidewire when the interventional instrument is positioned at least partly over the guidewire, and further encompass the adaptive intervention controller adapting an operational control of the medical robot apparatus by the medical robot controller for the robotic navigation of the interventional instrument in accordance with the identification of the device type of the interventional instrument.
  • Various medical intervention control method embodiments of the present disclosure for improving a medical imaging of an interventional instrument by a medical imaging apparatus and for improving a robotic navigation of an interventional instrument by a medical robot apparatus encompass an adaptive intervention controller deriving an identification of a device type of the interventional instrument responsive to receiving shape sensing data generated by the guidewire when the interventional instrument is positioned at least partly over the guidewire.
  • These medical intervention control method embodiments of the present disclosure further encompass the adaptive intervention controller adapting an operational control of the medical imaging apparatus by a medical imaging controller for the medical imaging of the interventional instrument in accordance with the identification of the device type of the interventional instrument, and the adaptive intervention controller adapting an operational control of the medical robot apparatus by the medical robot controller for the robotic navigation of the interventional instrument in accordance with the identification of the device type of the interventional instrument.
  • an identification of a device type of the interventional instrument may be derived from (1) a pre-defined identity feature of an over-the-wire device of the interventional instrument (e.g., a shape profile, a curvature profile, a strain profile, a twist profile and/or derivative profiles thereof of a lumen of the over-the-wire device) and/or (2) a pre-defined identity template of a hub of the interventional instrument (e.g., a shape profile, a curvature profile, a strain profile, a twist profile and/or derivative profiles thereof of a channel within the hub).
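The profile-based identification described above can be sketched as a nearest-template search: a profile sensed by the guidewire (e.g., curvature samples along the lumen) is compared against a library of pre-defined templates, one per device type. The following Python sketch is purely illustrative; the device types, template values and matching threshold are hypothetical, not taken from the disclosure.

```python
import math

# Hypothetical pre-defined identity features: per device type, a baseline
# curvature profile of the OTW device's lumen (values are illustrative).
TEMPLATES = {
    "balloon_catheter": [0.0, 0.1, 0.4, 0.9, 0.4, 0.1, 0.0],
    "stent_catheter":   [0.0, 0.3, 0.3, 0.3, 0.3, 0.3, 0.0],
    "judkins_left":     [0.0, 0.0, 0.2, 0.8, 1.2, 0.5, 0.1],
}

def identify_device_type(sensed_profile, templates=TEMPLATES, threshold=0.5):
    """Return the device type whose template is closest to the sensed
    profile (root-mean-square error), or None if nothing is close enough."""
    best_type, best_rmse = None, math.inf
    for device_type, template in templates.items():
        rmse = math.sqrt(
            sum((s - t) ** 2 for s, t in zip(sensed_profile, template))
            / len(template))
        if rmse < best_rmse:
            best_type, best_rmse = device_type, rmse
    return best_type if best_rmse <= threshold else None
```

A rejection threshold matters here: a profile that matches no template should yield "unknown" rather than the least-bad guess, so downstream control can fall back to conservative settings.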
  • the operational control adaptation of a medical imaging apparatus by a medical imaging controller may include the adaptive intervention controller (1) commanding the medical imaging controller to set medical imaging parameter(s) of the medical imaging apparatus corresponding to the identification of the device type of the interventional instrument (e.g., a controller setting of a dosage, an acquisition time, a shuttering, a windowing, etc.), (2) commanding the medical imaging controller to activate an image enhancement mode of the medical imaging apparatus corresponding to the identification of the device type of the interventional instrument (e.g., activation of StentBoost and/or SmartPerfusion corresponding to an identified stent catheter), and/or (3) controlling a display of a protocol user interface for a user interaction with the medical imaging parameter(s) of the medical imaging apparatus corresponding to the identification of the device type of the interventional instrument.
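A minimal sketch of the imaging-side adaptation is a lookup from the identified device type to a parameter set and enhancement modes. All parameter names and values below are hypothetical placeholders, not actual imaging-controller settings; StentBoost and SmartPerfusion are named in the disclosure, but the activation interface shown here is invented for illustration.

```python
# Hypothetical mapping from an identified device type to imaging settings
# and enhancement modes; names and values are illustrative only.
IMAGING_PROFILES = {
    "stent_catheter": {
        "dose_level": "low",
        "acquisition_time_ms": 120,
        "enhancement_modes": ["StentBoost", "SmartPerfusion"],
    },
    "balloon_catheter": {
        "dose_level": "medium",
        "acquisition_time_ms": 80,
        "enhancement_modes": [],
    },
}

# Conservative fallback when the device type could not be identified.
DEFAULT_PROFILE = {
    "dose_level": "medium",
    "acquisition_time_ms": 100,
    "enhancement_modes": [],
}

def adapt_imaging_control(device_type):
    """Select imaging parameter(s) and enhancement modes for the
    identified device type, falling back to defaults when unknown."""
    return IMAGING_PROFILES.get(device_type, DEFAULT_PROFILE)
```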
  • the operational control adaptation of a medical robot apparatus by a medical robot controller may include the adaptive intervention controller (1) commanding the medical robot controller to set robot navigation parameter(s) of the medical robot apparatus corresponding to the identification of the device type of the interventional instrument (e.g., a controller setting of a linear acceleration/deceleration, a rotational acceleration/deceleration and a maximum velocity corresponding to an identified balloon catheter), (2) commanding the medical robot controller to activate a navigation enhancement mode of the medical robot apparatus corresponding to the identification of the device type of the interventional instrument (e.g., maintaining an orientation of the interventional instrument within an anatomy, maintaining positional/orientational relationships between components of the interventional instrument or updated preplanned paths through the anatomy), and/or (3) controlling a display of a protocol user interface for a user interaction with the medical robot navigation parameter(s) of the medical robot apparatus corresponding to the identification of the device type of the interventional instrument.
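The robot-side adaptation can likewise be sketched as a per-device-type limit table applied to motion commands; the limit values, units and device-type names below are hypothetical, chosen only to illustrate how a controller might bound velocity per identified device.

```python
from dataclasses import dataclass

@dataclass
class NavigationLimits:
    """Per-device-type robot motion limits; units and values illustrative."""
    max_linear_accel_mm_s2: float
    max_rotational_accel_deg_s2: float
    max_velocity_mm_s: float

# Hypothetical limits chosen to minimize force trauma for each device type.
NAVIGATION_LIMITS = {
    "balloon_catheter": NavigationLimits(2.0, 10.0, 5.0),
    "endograft_delivery_device": NavigationLimits(1.0, 5.0, 2.0),
}

def clamp_velocity(device_type, requested_mm_s):
    """Clamp a requested navigation velocity to the limit for the
    identified device type; unknown types get the most conservative limit."""
    limits = NAVIGATION_LIMITS.get(
        device_type,
        min(NAVIGATION_LIMITS.values(), key=lambda l: l.max_velocity_mm_s),
    )
    return min(requested_mm_s, limits.max_velocity_mm_s)
```

Falling back to the most conservative limit for an unidentified device is a design choice: when the controller cannot tell what is loaded over the guidewire, it should assume the most trauma-sensitive case.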
  • FIGS. 1A and 1B illustrate exemplary embodiments of a medical intervention control system in accordance with the present disclosure.
  • FIG. 2 illustrates exemplary shape profiles of catheters as known in the art of the present disclosure.
  • FIG. 3 illustrates a cross-section view of a hub as known in the art of the present disclosure.
  • FIG. 4 illustrates an example of a StentBoost image showing improved image quality of the stent as known in the art of the present disclosure.
  • FIG. 5 illustrates an optical shape sensed device overlaid upon a pre-operative CT image as known in the art of the present disclosure.
  • FIG. 6 illustrates an exemplary embodiment of a flowchart representative of a medical intervention control method in accordance with the present disclosure.
  • FIG. 7 illustrates an exemplary embodiment of a flowchart representative of an X-Ray medical intervention control method in accordance with the present disclosure.
  • FIGS. 8A-8F illustrate exemplary embodiments of adaptive intervention controllers as shown in FIGS. 1A and 1B in accordance with the present disclosure.
  • FIGS. 9A-9F illustrate exemplary embodiments of the medical intervention control systems as shown in FIGS. 1A and 1B in accordance with the present disclosure.
  • the present disclosure is applicable to all medical interventions as known in the art of the present disclosure for diagnosing and/or treating a medical condition involving a medical imaging and/or a robotic navigation of an interventional instrument within an anatomy including, but not limited to, (1) vascular procedures utilizing guidewires, catheters, stent sheaths, endograft deployment systems, etc., (2) endoluminal procedures utilizing guidewires, catheters, sheaths, endoscopes or bronchoscopes, and (3) orthopedic procedures utilizing k-wires, screwdrivers, etc.
  • the present disclosure improves upon the prior art of medical interventions by providing novel and unique interventional instrument based adaptive control of numerous and various medical imaging modalities including, but not limited to, X-ray medical imaging, ultrasound medical imaging, intravascular ultrasound imaging, magnetic resonance medical imaging, computed tomography medical imaging, optical coherence tomography medical imaging, and endoscopic medical imaging.
  • the present disclosure improves upon the prior art of medical interventions by providing novel and unique interventional instrument based adaptive control of numerous and various robotic navigation modalities including, but not limited to, an operator-manipulated robotic navigation of an interventional instrument (e.g., a hand manipulator for mimicking normal operator hand movements during a medical intervention) and a computer-controlled robotic navigation of an interventional instrument (e.g., an image based path planning of the interventional instrument within an anatomy).
  • optical shape sensing may be utilized by the medical intervention control of the present disclosure.
  • FIGS. 1A and 1B teach exemplary embodiments of a medical intervention control system in accordance with the present disclosure. From the description of FIGS. 1A and 1B, those having ordinary skill in the art of the present disclosure will appreciate how to apply the present disclosure to make and use additional embodiments of a medical intervention control system in accordance with the present disclosure.
  • a first exemplary medical intervention system of the present disclosure employs an interventional instrument 10, a medical imaging system including a medical imaging apparatus 41 and a medical imaging controller 42, a spatial tracking system including a spatial tracking apparatus 53 and a spatial tracking controller 54, and a medical display system including an image processor 60 and a monitor 61.
  • the term "interventional instrument” encompasses all medical instruments, as known in the art of the present disclosure and hereinafter conceived, including an over-the-wire (OTW) device 11 and an interventional device 12 for diagnosing and/or treating a medical condition
  • the term "OTW device” encompasses all medical devices, as known in the art of the present disclosure and hereinafter conceived, having a lumen extending over a segment or an entirety of the medical device for accommodating a guidewire 30 within the OTW device
  • the term “interventional device” encompasses all medical devices, as known in the art of the present disclosure and hereinafter conceived, deliverable by OTW device 11 to target position(s) within an anatomy and deployable within the anatomy for diagnosing and/or treating a medical condition.
  • interventional device 12 may be internal to, external to, or partially internal/partially external to OTW device 11.
  • interventional instrument 10 examples include, but are not limited to, (1) a balloon catheter including OTW device 11 in the form of a catheter and interventional device 12 in the form of a balloon, (2) a stent catheter including OTW device 11 in the form of a catheter and interventional device 12 in the form of a stent, (3) an endograft delivery system including OTW device 11 in the form of a delivery system and interventional device 12 in the form of a stent graft, and (4) an intervascular therapy system including OTW device 11 in the form of an introducer sheath and interventional device 12 in the form of a percutaneous heart pump.
  • OTW device 11 has one or more pre-defined identity features; for purposes of describing and claiming the present disclosure, the term "pre-defined identity feature" is a pre-defined profile of a lumen of OTW device 11 as known in the art of the present disclosure or hereinafter conceived (e.g., a shape profile, a curvature profile, a strain profile, a vibration profile, a twist profile and/or derivative profiles thereof of the lumen of OTW device 11) that serves as a basis for identifying a device type of interventional instrument 10 as will be further described in the present disclosure.
  • FIG. 2 illustrates known baseline shapes of a tiger catheter 11a, a jacky catheter 11b, an amplatz left catheter 11c, an LCB catheter 11d, an RCB catheter 11e, a Judkins left catheter 11f, a Judkins right catheter 11g, a multipurpose A2 catheter 11h, an IM catheter 11i, a 3D lima catheter 11j and an IM VB-1 catheter 11k, whereby a shape profile, a curvature profile, a strain profile and a twist profile of a lumen within one of these catheters will facilitate an identification of that particular catheter via a shape sensing by guidewire 30 as positioned or translated within the lumen.
  • interventional instrument 10 may include a hub 20 attached to OTW device 11.
  • the term "hub" broadly encompasses any object, as known in the art of the present disclosure and hereinafter conceived, having a pre-defined identity template serving as a basis for a spatial registration of OTW device 11 within a reference coordinate system.
  • the term "pre-defined identity template" broadly encompasses a pre-defined profile of a channel within the hub as known in the art of the present disclosure or hereinafter conceived (e.g., a shape profile, a curvature profile, a strain profile, a twist profile and/or derivative profiles thereof of the channel within hub 20) that serves as a basis for identifying a device type of interventional instrument 10 as will be further described in the present disclosure.
  • Examples of a hub include, but are not limited to, a unicath hub, a luer lock hub, an over-catheter hub, a hemostatic valve hub, a guidewire torque hub and an introducer hub.
  • hub 20 may be permanently affixed to OTW device 11 or detachably attached to OTW device 11.
  • hub 20 may be constructed in accordance with International Patent Application Publication No. WO 2017/055620 Al to Noonan et al. entitled “HUB FOR DEVICE NAVIGATION WITH OPTICAL SHAPE SENSED GUIDEWIRE", hereby incorporated by reference.
  • a hub 20a includes a hub body 21, which may have a solid design or split half design.
  • hub body 21 provides channel 21a having a pre-defined identity template that serves as a basis for identifying a device type of interventional instrument 10 as will be further described in the present disclosure.
  • the channel 21a forms a path through hub body 21 as shown whereby a shape profile of the channel 21a is the pre-defined identity template of hub 20a that may be sensed by guidewire 30 being positioned or translated within channel 21a.
  • hub body 21 may employ a mechanism 22 for displacing channel 21a from a linear shape profile to a deformable non-linear shape profile to thereby distinguish a portion of channel 21a as the pre-defined identity template of hub 20a that may be sensed by guidewire 30 being positioned or translated within channel 21a.
  • Mechanism 22 includes a spring return button 22a to induce the shape deformation when needed.
  • heat coil(s) 24 may be employed to induce a temperature based axial strain in channel 21a to form a deformable shape profile of channel 21a, thereby distinguishing a portion of channel 21a as the pre-defined identity template of hub 20a that may be sensed by guidewire 30 being positioned or translated within channel 21a.
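In code, locating a hub's pre-defined identity template within the guidewire's sensed data could look like a sliding-window comparison along the trace; a simplified sketch with one-dimensional curvature samples and a hypothetical tolerance, not the disclosed method:

```python
def detect_hub_template(trace, template, tolerance=0.2):
    """Slide the hub's identity template (a short curvature pattern)
    along the sensed trace; return the start index of the first window
    whose mean absolute error is within tolerance, or -1 if none match."""
    n, m = len(trace), len(template)
    for start in range(n - m + 1):
        window = trace[start:start + m]
        mae = sum(abs(w - t) for w, t in zip(window, template)) / m
        if mae <= tolerance:
            return start
    return -1
```

The returned index is also what a registration step would use: it locates the hub (and hence a known point of the OTW device) along the guidewire's coordinate frame.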
  • Hub body 21 may include a locking mechanism 25 to capture OSS embodiments of guidewire section 30a to thereby impede any translation of OSS guidewire section 30a within hub body 21.
  • hub body 21 may include a radio-opaque feature 24 to permit the registration of the hub 20a by a medical imaging modality (e.g., fluoroscopy/x-ray, MRI, CT, ultrasound, etc.).
  • hub body 21 may include an identifier 26 such as, for example, a code, a serial number, a radiofrequency identifier (RFID) tag and a microchip to identify the hub 20a via a database or other reference.
  • hub body 21 may include a registration feature 29 such as, for example, a divot or a channel as taught by Noonan. Additionally, hub body 21 includes a proximal Luer lock 27 that is free to rotate and pivot to allow improved usability. Lock 27 may include a feature 28 (e.g., a torque stop or a lock) to prevent removal when twisting in one direction but permit removal in the other direction.
  • OTW device 11 (and hub 20 if employed) will be loaded upon guidewire 30 as known in the art of the present disclosure whereby interventional device 12 is strategically positioned along guidewire 30 to thereby perform a diagnostic task or therapeutic task.
  • a section 30a of guidewire 30 will extend through OTW device 11 (and hub 20 if employed).
  • a section 30b of guidewire 30 may be proximal to OTW device 11 and/or a section 30c of guidewire 30 may be distal to OTW device 11.
  • the term guidewire broadly encompasses all elongated devices, as known in the art of the present disclosure and hereinafter conceived, constructed with shape sensors to assist in an insertion, a positioning and/or a tracking of an interventional instrument within an anatomy.
  • the term shape sensor broadly encompasses all sensors, as known in the art of the present disclosure and hereinafter conceived, that may be distributed along an entirety of a guidewire or one or more segments of the guidewire to collectively generate data informative of a shape profile of a guidewire and may be further informative of additional profiles of the guidewire (e.g., a curvature profile, a temperature profile, a vibration profile, a strain profile, a twist profile and/or an alpha profile).
  • guidewire 30 examples include, but are not limited to, a floppy guidewire, a stiff guidewire, a measurement wire (e.g., a flow wire), a k-wire and a microcatheter.
  • a first example of a shape sensor is an optical shape sensor with fiber optics embedded within guidewire 30 for optical shape sensing (OSS) of guidewire 30 as known in the art of the present disclosure.
  • the optical shape sensor is based on fiber optic Bragg grating sensors.
  • a fiber optic Bragg grating is a short segment of optical fiber that reflects particular wavelengths of light and transmits all others. This is achieved by adding a periodic variation of the refractive index in the fiber core, which generates a wavelength-specific dielectric mirror.
  • a fiber Bragg grating can therefore be used as an inline optical filter to block certain wavelengths, or as a wavelength-specific reflector.
  • an FBG sensor may use Fresnel reflection at each of the interfaces where the refractive index changes. For some wavelengths, the reflected light of the various periods is in phase so that constructive interference exists for reflection and, consequently, destructive interference for transmission.
  • the Bragg wavelength is sensitive to strain as well as to temperature. This means that Bragg gratings can be used as sensing elements in fiber optic sensors. In an FBG sensor, the measurand (e.g., strain) causes a shift in the Bragg wavelength.
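The first-order FBG sensitivity relation described above can be illustrated with a small calculation. The sketch below assumes typical textbook constants for silica fiber (effective index, photo-elastic coefficient, thermo-optic terms); these values are assumptions for illustration, not values from this disclosure.

```python
# Sketch of the FBG sensing relation: the Bragg wavelength is
# lambda_B = 2 * n_eff * Lambda, and a measurand shifts it by
#   d(lambda)/lambda = (1 - p_e) * strain + (alpha + xi) * dT
# All constants are typical silica-fiber values (assumptions).

N_EFF = 1.447          # effective refractive index of the fiber core
PERIOD_NM = 535.6      # grating period Lambda in nm (chosen for ~1550 nm)
P_E = 0.22             # effective photo-elastic coefficient
ALPHA = 0.55e-6        # thermal expansion coefficient (1/K)
XI = 8.6e-6            # thermo-optic coefficient (1/K)

def bragg_wavelength_nm(n_eff: float = N_EFF, period_nm: float = PERIOD_NM) -> float:
    """Unstrained Bragg wavelength lambda_B = 2 * n_eff * Lambda."""
    return 2.0 * n_eff * period_nm

def wavelength_shift_nm(strain: float, delta_t: float = 0.0) -> float:
    """Shift of the Bragg wavelength for a given axial strain and
    temperature change, per the first-order sensitivity relation."""
    lam = bragg_wavelength_nm()
    return lam * ((1.0 - P_E) * strain + (ALPHA + XI) * delta_t)

if __name__ == "__main__":
    # 100 microstrain at constant temperature
    print(f"lambda_B = {bragg_wavelength_nm():.1f} nm")
    print(f"shift at 100 ue = {wavelength_shift_nm(100e-6):.4f} nm")
```

The wavelength monitor(s) of the interrogator resolve such sub-nanometre shifts, which is what makes the distributed strain measurement underlying optical shape sensing possible.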
  • the optical shape sensor is based on inherent backscatter (e.g., Rayleigh scatter or other scattering).
  • Rayleigh scatter occurs as a result of random fluctuations of the index of refraction in the fiber core. These random fluctuations can be modeled as a Bragg grating with a random variation of amplitude and phase along the grating length.
  • optical shape sensors include the FORS sensors commercially offered by Philips Healthcare.
  • a second example of a shape sensor are electromagnetic (EM) sensors as known in the art of the present disclosure for detecting a magnetic field to facilitate a measurement of a position and/or an orientation of the EM sensor(s) within the magnetic field.
  • EM sensors may be embedded within guidewire 30 over a length or one or more segments of guidewire 30 to thereby permit a two-dimensional or a three-dimensional shape of guidewire 30 to be precisely determined. Additionally, a curvature, a strain and/or a twist of guidewire 30 can be inferred from the positions of the EM sensors within guidewire 30.
  • a non-limiting example of EM sensor(s) are the AURORA EM sensor(s) commercially offered by NDI, Inc.
  • the term “medical imaging apparatus” encompasses all apparatuses, as known in the art of the present disclosure and hereinafter conceived, for directing energy (e.g., X-ray beams, ultrasound, radio waves, magnetic fields, light, electrons, lasers, and radionuclides) into an anatomy for purposes of generating images of the anatomy.
  • the term “medical imaging controller” encompasses all controllers, as known in the art of the present disclosure and hereinafter conceived, for controlling an activation/deactivation of a medical imaging apparatus (e.g., an X-ray C-arm, an ultrasound probe, etc.) to systematically direct energy into an anatomy via operator-generated commands and/or image guided procedural-generated commands for the purposes of generating images of the anatomy as known in the art of the present disclosure.
  • Examples of a medical imaging system include, but are not limited to, X-ray imaging systems, ultrasound imaging systems, MRI systems and computed-tomography systems.
  • imaging controller 42 may set one or more imaging parameters of the activation/deactivation of medical imaging apparatus 41 for generating anatomical images.
  • the term imaging parameter broadly encompasses any factor that determines or limits a generation of anatomical images by medical imaging apparatus 41. Examples of an imaging parameter include, but are not limited to, dosage, acquisition time, collimation, shuttering, windowing of an X-ray imaging apparatus, acquisition setting (kV/mA), image processing parameters, frame rate and post-processing filters.
  • medical imaging controller 42 may set the imaging parameter(s) of the activation/deactivation of medical imaging apparatus 41 for generating anatomical images where the imaging parameter setting corresponds to a device type of interventional instrument 10, or medical imaging controller 42 may provide user interface(s) of user selectable protocols of imaging parameters that correspond to a device type of interventional instrument 10.
  • medical imaging controller 42 may activate/deactivate one or more image enhancement modes of medical imaging apparatus 41 including, but not limited to, StentBoost and SmartPerfusion or the like for X-ray imaging apparatuses.
  • the term image enhancement mode broadly encompasses any operational mode of medical imaging apparatus 41 that provides an image display of interventional instrument 10 customized to facilitate an optimal visualization of interventional instrument 10 within medical images.
  • StentBoost is an image enhancement tool that enhances stent visualization in relation to vessel walls by localizing marker bands of the stent in each medical image frame, compensating for any motion, and then averaging across the medical image frames to improve the contrast of the image.
  • a stent is enhanced in an X-ray image 62a by showing finer details of the stent struts while background noise and anatomical structures are faded out. This enables more precise positioning of the stent and the ability to correct for under-deployment immediately.
  • SmartPerfusion is an imaging enhancement technology that provides interventionalists with an objective understanding of the impact of their treatment to help determine the outcome of perfusion procedures.
  • Advanced guidance supports standardized comparisons, and automated functions simplify clinical adoption.
  • SmartPerfusion provides step-by-step guidance of a relative positioning of the X-ray apparatus and the interventional instrument during the procedure to aid standardization of pre- and post-comparison runs. For example, a C-arm and a table position can be easily matched with a pre-run position, and a catheter position is stored and visualized to standardize placement for injection.
  • Additional image enhancement modes include a generation of multi-dimensional roadmap imaging of the interventional instrument 10, and an implementation of multi-dimensional subtraction imaging of the interventional instrument.
  • in fluoroscopy roadmapping, an image with peak opacification is used as a mask for subsequent subtraction images.
  • an image with contrast filling the vessel tree can be used as a mask for subsequent fluoroscopy navigation of a catheter within the vessel tree.
  • a pigtail catheter is used to inject a contrast agent into the vasculature.
  • if a pigtail catheter is the type of device that has been identified, then the physician will most likely want to inject contrast and obtain a roadmap image; hence the imaging system should be set to acquire such an image and, as a next step, utilize this newly acquired roadmap as the mask for future subtraction images.
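The roadmapping decision above (identified pigtail catheter implies a contrast run, which is then reused as the subtraction mask) can be sketched as a small controller hook. The `ImagingSystem` stub and its method names are hypothetical stand-ins for the medical imaging controller, not an API from this disclosure.

```python
# Hypothetical sketch: choosing the next acquisition step from the
# identified device type, per the roadmapping workflow described above.

class ImagingSystem:
    """Stub standing in for the medical imaging controller (assumption)."""
    def __init__(self):
        self.mask_image = None
        self.log = []

    def acquire_contrast_run(self):
        # Acquire an image with contrast filling the vessel tree.
        self.log.append("acquire contrast-filled roadmap image")
        return "roadmap_image"

    def set_subtraction_mask(self, image):
        # Reuse the roadmap as the mask for subsequent subtraction images.
        self.mask_image = image
        self.log.append("use roadmap as mask for subtraction imaging")

def on_device_identified(system: ImagingSystem, device_type: str) -> None:
    # A pigtail catheter implies a contrast injection is likely next,
    # so pre-arm roadmap acquisition and subtraction masking.
    if device_type == "pigtail_catheter":
        roadmap = system.acquire_contrast_run()
        system.set_subtraction_mask(roadmap)
    else:
        system.log.append(f"no roadmap preset for {device_type}")

if __name__ == "__main__":
    system = ImagingSystem()
    on_device_identified(system, "pigtail_catheter")
    print(system.log)
```

The same hook pattern extends to other device types, each mapping to its own acquisition preset.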
  • Philips has multiple 3D roadmapping tools such as VesselNavigator that provide a 3D overlay (e.g. segmented vessels from pre-operative CT, segmented vessels from intra-operative cone beam CT, planned landmarks such as ring landmarks for target ostea, planned paths).
  • imaging systems typically provide optimized image acquisition protocols and functionality for certain organs, procedures, or devices. For example, ultrasound presets for MitraClip, X-ray presets for spine, XperCT presets for stroke.
  • the term spatial tracking apparatus encompasses all apparatuses, as known in the art of the present disclosure and hereinafter conceived, including one or more tracking sensor(s) for implementing a localized spatial tracking of interventional instrument 10 within a registered coordinate system.
  • the tracking sensor(s) are optical shape sensor(s) (OSS) as known in the art of the present disclosure that utilize light along a multicore optical fiber for device localization and navigation of interventional instrument 10 during an interventional procedure.
  • the principle involved makes use of distributed strain measurement in the optical fiber using characteristic Rayleigh backscatter or controlled grating patterns (e.g., Fiber Bragg Gratings).
  • the optical shape sensor(s) may be integrated into over-the-wire device 11 or guidewire 30 in order to provide live guidance of the device during the interventional procedure without the need for radiation.
  • the integrated fiber provides the position and orientation of the entire device.
  • non-limiting examples of optical shape sensors include the FORS sensors commercially offered by Philips Healthcare.
  • spatial tracking apparatus 53 includes a broadband optical source and wavelength monitor(s) as known in the art of the present disclosure, and spatial tracking controller 54 (e.g., programmed hardware and/or an application specific integrated circuit) controls a directing of light through the optical fiber via the broadband optical source and executes distributed strain measurements in the optical fiber via the wavelength monitor(s), as known in the art of the present disclosure.
  • an optical shape sensor may be utilized for shape sensing and spatially tracking interventional instrument 10, or two (2) different optical shape sensors may be independently utilized for shape sensing and spatially tracking interventional instrument 10.
  • FIG. 5 shows an exemplary overlay of an OSS based interventional instrument upon a pre-operative CT medical image 62b to delineate a position of the OSS based interventional instrument inside a vasculature.
  • the tracking sensor(s) are electromagnetic (EM) sensors as known in the art of the present disclosure for detecting a magnetic field to facilitate a measurement of a position and/or an orientation of the EM sensor(s) within the magnetic field.
  • a non-limiting example of an EM based spatial tracking apparatus is the AURORA electromagnetic tracking system commercially offered by NDI, Inc.
  • spatial tracking apparatus 53 is an EM field generator as known in the art of the present disclosure, and EM based spatial tracking controller 54 (e.g., programmed hardware and/or an application specific integrated circuit) controls a generation of the magnetic field via the EM field generator and measures a position and/or an orientation of the EM sensor(s) within the magnetic field, as known in the art of the present disclosure.
  • a set of EM sensors may be utilized for shape sensing and spatially tracking interventional instrument 10, or two (2) different sets of EM sensors may be independently utilized for shape sensing and spatially tracking interventional instrument 10.
  • the term image processor encompasses all digital signal processors, as known in the art of the present disclosure and hereinafter conceived, configurable for executing image processing to generate an image.
  • medical image processor 60 is an image processor configured to execute image processing to generate medical image(s) 62 for display on monitor 61.
  • the image processing performed by medical image processor 60 is dependent upon the type of medical imaging apparatus 41 being deployed and the type of spatial tracking apparatus 53 being deployed.
  • interventional instrument 10, the medical imaging system, the spatial tracking system and the medical display system represent exemplary medical intervention systems as known in the art of the present disclosure, such as, for example, image-guided medical interventions practicable via medical intervention systems commercially offered by Philips Healthcare.
  • the present disclosure improves upon such medical intervention systems as well as other intervention systems, as known in the art of the present disclosure and hereinafter conceived, by providing an adaptive intervention controller 70a for improving a medical image visualization of interventional instrument 10, such as, for example, setting the imaging parameters of medical imaging apparatus 41 in accordance with a shape sensed identification of a device type of interventional instrument 10 and/or activating/deactivating enhancement modes of medical imaging apparatus 41 in accordance with a device type of interventional instrument 10, as will be further detailed in the present disclosure with the description of FIGS. 6 and 7.
  • adaptive intervention controller 70a may be (1) installed within one of the medical imaging system, the spatial tracking system and the medical display system, (2) distributed among two or more of the medical imaging system, the spatial tracking system and the medical display system or (3) installed within a separate device, such as, for example, a tablet, a laptop, a workstation or a server.
  • an exemplary medical intervention system of the present disclosure employs interventional instrument 10, a spatial tracking system including spatial tracking apparatus 53 and spatial tracking controller 54, and a medical display system including image processor 60 and a monitor 61 as previously described in the present disclosure (FIG. 1A).
  • This exemplary medical intervention system further employs a medical robot system including a medical robot apparatus 91 and a medical robot controller 92.
  • the term “medical robot apparatus” encompasses all apparatuses, as known in the art of the present disclosure and hereinafter conceived, controllable for navigating an interventional instrument within an anatomy during a medical intervention
  • the term “medical robot controller” encompasses all controllers, as known in the art of the present disclosure and hereinafter conceived, for controlling a medical robot apparatus to navigate an interventional instrument to target position(s) within the anatomy during the medical intervention via operator-generated commands and/or image guided procedural-generated commands.
  • Examples of a medical robot system include, but are not limited to, the CorPath® GRX medical robot system, the Magellan™ medical robot system, the Monarch™ medical robot system and the Ion™ medical robot system.
  • medical robot controller 92 may automatically set one or more robot navigation parameters of medical robot apparatus 91 for navigating an interventional instrument to target position(s) within the anatomy, where the robot navigation parameter setting corresponds to a device type of interventional instrument 10, or medical robot controller 92 may provide user interface(s) of user selectable protocols of robot navigation parameters that correspond to a device type of interventional instrument 10.
  • medical robot controller 92 may set robot navigation parameters associated with various motions of interventional instrument 10 as being held by medical robot apparatus 91, such as for example, a setting of a linear velocity, a linear acceleration, a linear deceleration, a rotational velocity, a rotational acceleration, a rotational deceleration, a pivotal velocity, a pivotal acceleration and/or a pivotal deceleration of a distal tip of interventional instrument 10.
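A device-type-keyed lookup of the motion limits listed above can be sketched as a small parameter table. The device type names and all numeric limits below are illustrative placeholders, not values from this disclosure.

```python
from dataclasses import dataclass

# Hypothetical sketch of device-type-specific motion limits for the
# robot navigation parameters described above; all numbers are
# illustrative placeholders (assumptions).

@dataclass(frozen=True)
class NavigationParams:
    linear_velocity_mm_s: float
    linear_accel_mm_s2: float
    rotational_velocity_deg_s: float
    pivot_velocity_deg_s: float

# Lookup keyed by the identified device type (hypothetical entries).
NAV_PRESETS = {
    "floppy_guidewire": NavigationParams(5.0, 2.0, 90.0, 30.0),
    "stiff_guidewire":  NavigationParams(2.0, 1.0, 45.0, 15.0),
    "balloon_catheter": NavigationParams(3.0, 1.5, 60.0, 20.0),
}

def params_for(device_type: str) -> NavigationParams:
    """Return motion limits for the identified device type, falling
    back to the most conservative preset when the type is unknown."""
    if device_type in NAV_PRESETS:
        return NAV_PRESETS[device_type]
    return min(NAV_PRESETS.values(), key=lambda p: p.linear_velocity_mm_s)

if __name__ == "__main__":
    print(params_for("stiff_guidewire"))
    print(params_for("unknown_device"))  # falls back to the slowest preset
```

Falling back to the most conservative limits for an unidentified device is one defensible default; prompting the operator for confirmation would be another.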
  • medical robot controller 92 may set a navigation enhancement mode associated with maintaining an orientation of interventional instrument 10.
  • for example, medical robot controller 92 may maintain an orientation of interventional instrument 10 relative to a medical imaging apparatus (e.g., a distal tip orientation is maintained relative to a beam path of a registered X-ray system).
  • medical robot controller 92 may set a navigation enhancement mode associated with maintaining a positional/orientational relationship between components of OTW device 11.
  • medical robot controller 92 may set an optimal positioning of a sheath with respect to a catheter for stabilization dependent upon the mechanical properties of the catheter (e.g. how far to pull back a guidewire into the catheter to allow for the catheter to take on its pre-formed shape).
  • medical robot controller 92 may set a navigation enhancement mode associated with updating a planned sequence of motions of interventional instrument 10 within an anatomy.
  • medical robot controller 92 may modify a path to a target position within an anatomy based on a natural mechanical curvature of interventional instrument 10.
  • medical robot controller 92 may set robot navigation parameters associated with defining spatial limits on where the interventional instrument 10 may be navigated within an anatomy. For example, a certain size of sheath may not be advanced beyond a certain depth into renal arteries.
  • interventional instrument 10, the medical robot system, the spatial tracking system and the medical display system represent exemplary medical intervention systems as known in the art of the present disclosure, such as, for example, robot based medical interventions practicable via medical intervention systems commercially offered by Philips Healthcare.
  • the present disclosure improves upon such medical intervention systems as well as other intervention systems, as known in the art of the present disclosure and hereinafter conceived, by providing an adaptive intervention controller 70b for improving a robotic navigation of interventional instrument 10 within an anatomy, such as, for example, setting the robot navigation parameters of medical robot apparatus 91 and/or activating/deactivating navigation enhancement modes in accordance with a shape sensed identification of a device type of interventional instrument 10, as will be further detailed in the present disclosure with the description of FIGS. 6 and 7.
  • adaptive intervention controller 70b may be (1) installed within one of the medical robot system, the spatial tracking system and the medical display system, (2) distributed among two or more of the medical robot system, the spatial tracking system and the medical display system or (3) installed within a separate device, such as, for example, a tablet, a laptop, a workstation or a server.
  • these embodiments may have a segregation, a partial integration or a complete integration of adaptive intervention controller 70a and adaptive intervention controller 70b.
  • FIGS. 6 and 7 teach exemplary embodiments of a medical intervention control method in accordance with the present disclosure. From the description of FIGS. 6 and 7, those having ordinary skill in the art of the present disclosure will appreciate how to apply the present disclosure to devise and execute additional embodiments of a medical intervention control method in accordance with the present disclosure.
  • a medical intervention control method is executed by an adaptive intervention controller of the present disclosure, such as, for example, adaptive intervention controller 70a (FIG. 1A), adaptive intervention controller 70b (FIG. 1B), or a partial/complete integration of adaptive intervention controllers 70a and 70b.
  • the term adaptive intervention controller encompasses all structural configurations, as understood in the art of the present disclosure and as exemplary described in the present disclosure, of a main circuit board or an integrated circuit for controlling an application of various principles of the present disclosure for adapting a medical intervention to a shape sensed identification of a device type of interventional instrument 10 as will be further detailed in the description of FIGS. 6 and 7.
  • the structural configuration of the adaptive intervention controller may include, but is not limited to, processor(s), computer-usable/computer readable storage medium(s), an operating system, application module(s), peripheral device controller(s), slot(s) and port(s).
  • adaptive intervention controller 70a employs application modules for adapting a medical intervention to a shape sensed identification of a device type of interventional instrument 10 as will be further detailed in the description of FIGS. 6 and 7.
  • a flowchart 100 represents an exemplary embodiment of a medical intervention control method of the present disclosure as executable by an adaptive intervention controller of the present disclosure.
  • a device shape sensing stage S102 of flowchart 100 involves a translation or a positioning of the shape sensors of guidewire 30 within OTW device 11 and/or hub 20 (if employed in interventional instrument 10), whereby the shape sensor(s) of guidewire 30 generate shape sensing data 110 indicative of a shape profile of guidewire 30 within OTW device 11 and/or hub 20 (if employed in interventional instrument 10) and may be further informative of additional profiles of guidewire 30 (e.g., a curvature profile, a temperature profile, a vibration profile, a strain profile, a twist profile and/or an alpha profile).
  • device shape sensing stage S102 of flowchart 100 may involve a spatial registration of interventional instrument 10 (FIG. 1) and medical imaging apparatus 41 (FIG. 1) executed by medical imaging controller 42 or adaptive intervention controller 70a.
  • any type of registration technique, as known in the art of the present disclosure and hereinafter conceived, may be utilized to register interventional instrument 10 and medical imaging apparatus 41.
  • registration techniques based on spatial dimensions or time series include, but are not limited to, extrinsic image based registration techniques (e.g., invasive with screw markers or noninvasive with fiducial markers), intrinsic image based registration techniques (e.g., landmark based, segmentation based or voxel based), or non-image based registration techniques (e.g., OSS registration).
  • a device type identification stage S104 of flowchart 100 involves a device type identification of interventional instrument 10 (FIG. 1) executed by adaptive intervention controller 70a.
  • adaptive intervention controller 70a inputs shape sensing data 110 generated during device shape sensing stage S102 to identify (1) pre-defined identity feature(s) of OTW device 11 from the shape of the guidewire, and/or (2) a pre-defined identity template of hub 20 (if employed in interventional instrument 10), and/or (3) temporal changes of 3D shape sensing generated from the guidewire or OTW device 11.
  • profile(s) of guidewire 30 may be derived from a translation or a positioning of the shape sensors of guidewire 30 within OTW device 11 and/or hub 20 (if employed) to execute (1) an instantaneous three-dimensional (3D) shape sensing generated from multiple sensors of guidewire 30 positioned within OTW device 11 and/or hub 20 (if employed), or (2) a derived 3D shape sensing generated from one sensor of guidewire 30 temporally translating through OTW device 11 and/or hub 20 (if employed).
  • the term "device type" broadly encompasses a particular genus of interventional instrument 10 (e.g., a catheter, a deployment device, a delivery device, a therapy device, an imaging device, etc.), a particular species of a genus of interventional instrument 10 (e.g., a balloon catheter, a stent catheter, an ablation catheter, an imaging catheter, an infusion catheter, an endograft deployment device, a sheath, an introducer, a mitral clip delivery device, a mitral valve delivery device, an aortic valve delivery device, an IVUS catheter, etc.), or an exact model of interventional instrument 10 (e.g., a Cook 80cm Cobra catheter, a Medtronic Endurant II endograft system, etc.).
  • adaptive intervention controller 70a interprets shape sensing data 110 to identify the pre-defined identity feature(s) of OTW device 11 and/or the pre-defined identity template of hub 20 (if employed in interventional instrument 10) that correspond to the shape profile and any other additional profiles of guidewire 30 within OTW device 11 and/or hub 20 (if employed in interventional instrument 10) as indicated by shape sensing data 110.
  • adaptive intervention controller 70a executes a search in a database of an organized collection of genuses of interventional instruments and/or species of one or more genuses of interventional instruments to match the shape profile and any other additional profiles of guidewire 30 within OTW device 11 and/or hub 20 (if employed in interventional instrument 10) to pre-defined identity feature(s) of an OTW device 11 and/or the pre-defined identity template of a hub to thereby identify the device type of interventional instrument 10.
  • adaptive intervention controller 70a executes an indexing of a lookup table of an array of genuses of interventional instruments and/or species of one or more genuses of interventional instruments to match the shape profile and any other additional profiles of guidewire 30 within OTW device 11 and/or hub 20 (if employed in interventional instrument 10) to pre-defined identity feature(s) of an OTW device 11 and/or the pre-defined identity template of a hub to thereby identify the device type of interventional instrument 10.
  • adaptive intervention controller 70a executes a scanning of a device reference of genuses of interventional instruments and/or species of one or more genuses of interventional instruments to match the shape profile and any other additional profiles of guidewire 30 within OTW device 11 and/or hub 20 (if employed in interventional instrument 10) to pre-defined identity feature(s) of an OTW device 11 and/or the pre-defined identity template of a hub to thereby identify the device type of interventional instrument 10.
  • Temporal data may also be used in stage S104, where the adaptive intervention controller 70a executes a scanning of the shape of the guidewire and saves the shape of the guidewire and/or OTW device with respect to time.
  • Known information about the type of guidewire that is in use may also be saved, for example a floppy guidewire or a stiff guidewire.
  • the shape of the OTW device may deform differently during manipulation of the guidewire and OTW device. For example, a floppy guidewire will deform the OTW device minimally, while a stiff guidewire will cause larger changes to the shape of the OTW device.
  • the changes in shape of the OTW device over time may be used to determine the type of device it is.
  • the pre-defined identity features of the OTW device, and the type of guidewire can be used together to determine the type of OTW device. For example, if curvature is used to pre-define the OTW device, the curvature peak locations may be similar when used with a floppy or stiff wire but the magnitude of those peaks may be different. Hence, when the guidewire is known to be floppy and the pre-defined identity feature is from a floppy guidewire, the OTW device is easily determined. However, if the guidewire is stiff, the pre-defined identity feature, with a deformation applied to it for the stiff guidewire, will then determine the type of OTW device.
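The matching logic described above (shared curvature peak locations, with peak magnitudes deformed by guidewire stiffness) can be sketched as follows. The identity templates, device names, and stiffness scale factors are hypothetical values for illustration, not data from this disclosure.

```python
# Hypothetical sketch of matching a sensed curvature profile to
# pre-defined identity templates, with a stiffness-dependent scaling
# of expected peak magnitudes as described above.

# Identity templates: (peak position along the device in mm, peak
# curvature magnitude in 1/m) as measured with a floppy wire (assumption).
TEMPLATES = {
    "cobra_catheter":   [(40.0, 12.0), (55.0, 8.0)],
    "pigtail_catheter": [(60.0, 20.0), (70.0, 18.0)],
}

# Stiff wires straighten the device, lowering peak magnitudes
# (illustrative scale factors, not values from this disclosure).
STIFFNESS_SCALE = {"floppy": 1.0, "stiff": 0.6}

def match_device(peaks, wire_type, tol_pos=5.0, tol_mag=3.0):
    """Return the template whose peak locations match within tol_pos mm
    and whose stiffness-scaled magnitudes match within tol_mag 1/m."""
    scale = STIFFNESS_SCALE[wire_type]
    for device, template in TEMPLATES.items():
        if len(peaks) != len(template):
            continue
        ok = all(
            abs(p_pos - t_pos) <= tol_pos
            and abs(p_mag - t_mag * scale) <= tol_mag
            for (p_pos, p_mag), (t_pos, t_mag) in zip(peaks, template)
        )
        if ok:
            return device
    return None

if __name__ == "__main__":
    # Same peak locations as the cobra template but reduced magnitudes,
    # as expected for a stiff wire:
    sensed = [(41.0, 7.5), (54.0, 5.0)]
    print(match_device(sensed, "stiff"))
```

Note that the same sensed peaks fail to match under the floppy-wire assumption, which is exactly the behavior the passage describes: the wire type must be known (or inferred) before the identity feature can be resolved to a device type.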
  • adaptive intervention controller 70a may provide a user interface to enable an operator of the system to add any unidentifiable profile(s) of guidewire 30 within OTW device 11 and/or hub 20 (if employed) corresponding to particular device type(s) of interventional instrument(s) to the database, the look-up table, the device reference and/or other data storage means. Also, adaptive intervention controller 70a may be configured (e.g., via artificial intelligence) to add any unidentifiable profile(s) of guidewire 30 within OTW device 11 and/or hub 20 (if employed) corresponding to particular device type(s) of interventional instrument(s) to the database, the look-up table, the device reference and/or other data storage means.
  • an image/robotic adaptation stage S106 of flowchart 100 involves adaptive intervention controller 70a adapting an operational control of medical imaging apparatus 41 by the medical imaging controller 42 in accordance with the identification of the device type of the interventional instrument 10 of stage S104.
  • the adaptation of the operational control of medical imaging apparatus 41 is directed to improving the visualization quality of interventional instrument 10 within medical image(s).
  • adaptive intervention controller 70a searches a database of any imaging enhancement mode(s) of medical imaging apparatus 41 and/or any imaging parameter(s) of medical imaging apparatus 41 associated with the device type of the interventional instrument 10, and issues an adaptation command 111 to medical imaging controller 42 to activate any retrieved imaging enhancement mode(s) of medical imaging apparatus 41 and set any retrieved imaging parameter(s) of medical imaging apparatus 41.
  • adaptive intervention controller 70a indexes lookup tables of any imaging enhancement mode(s) of medical imaging apparatus 41 and/or any imaging parameter(s) of medical imaging apparatus 41 associated with the device type of the interventional instrument 10, and issues an adaptation command 111 to medical imaging controller 42 to activate any indexed imaging enhancement mode(s) of medical imaging apparatus 41 and set any indexed imaging parameter(s) of medical imaging apparatus 41.
  • adaptive intervention controller 70a informs medical imaging controller 42 of identification data 112, whereby medical imaging controller 42 searches a database of any imaging enhancement mode(s) of medical imaging apparatus 41 and/or any imaging parameter(s) of medical imaging apparatus 41 associated with the device type of the interventional instrument 10 per identification data 112, and activates any retrieved imaging enhancement mode(s) of medical imaging apparatus 41 and sets any retrieved imaging parameter(s) of medical imaging apparatus 41.
  • medical imaging controller 42 and spatial tracking controller 54 may be operated to generate medical images 62 as adapted to the shape sensed identification of a device type of interventional instrument 10 to thereby obtain an improved visualization quality of medical images, and/or medical robot controller 92 and spatial tracking controller 54 may be operated to navigate interventional instrument 10 within an anatomy adapted to the shape sensed identification of a device type of interventional instrument 10 to thereby obtain an optimal navigation of interventional instrument 10 at target position(s) within the anatomy.
  • adaptive intervention controller 70a may provide a user interface to enable an operator of the system to add image enhancement mode(s) of medical imaging apparatus 41 corresponding to particular device type(s) of interventional instrument(s) to the database, the look-up table, the device reference and/or other data storage means.
  • medical imaging controller 42 and/or adaptive intervention controller 70a may be configured (e.g., via Artificial Intelligence) to automatically add any new image enhancement mode(s) of medical imaging apparatus 41 corresponding to particular device type(s) of interventional instrument(s) to the database, the look-up table, the device reference and/or other data storage means.
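A minimal Python sketch of the database/look-up-table path described above: the controller retrieves any enhancement mode(s) and imaging parameter(s) keyed to the identified device type and emits an adaptation command for the medical imaging controller. The table entries, field names and device-type keys are hypothetical placeholders (StentBoost and SmartPerfusion are named in the disclosure as example enhancement modes, but their association with these particular device types is assumed here).

```python
# Hypothetical device-type -> imaging adaptation table; entries are illustrative.
IMAGING_TABLE = {
    "balloon_catheter": {"modes": ["StentBoost"], "params": {"dose": "low"}},
    "perfusion_catheter": {"modes": ["SmartPerfusion"], "params": {"acquisition_time_ms": 40}},
}


def build_adaptation_command(device_type, table=IMAGING_TABLE):
    """Return an adaptation command (modes to activate, parameters to set)
    for the identified device type, or None if the type is unknown, in
    which case an operator could add a new entry via the user interface."""
    entry = table.get(device_type)
    if entry is None:
        return None
    return {"activate_modes": entry["modes"], "set_params": entry["params"]}
```

The `None` branch corresponds to the unidentifiable-profile case, where the disclosure contemplates the operator (or an AI component) extending the database.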
  • a flowchart 200 represents an exemplary embodiment of an X-ray based medical intervention control method of the present disclosure as executable by an adaptive intervention controller of the present disclosure.
  • a device shape sensing stage S202 of flowchart 200 may include OTW device shape sensing embodiment 210 involving a translation or a positioning of the shape sensors of guidewire 30 within OTW device 11 whereby the shape sensor(s) of guidewire 30 generate shape sensing data 110 indicative of a shape profile of guidewire 30 within OTW device 11 and may be further informative of additional profiles of guidewire 30 (e.g., a curvature profile, a temperature profile, a vibration profile, a strain profile, a twist profile and/or an alpha profile).
  • device shape sensing stage S202 of flowchart 200 may include a hub device shape sensing embodiment 211 involving a translation or a positioning of the shape sensors of guidewire 30 within hub 20 whereby the shape sensor(s) of guidewire 30 generate shape sensing data 110 indicative of a shape profile of guidewire 30 within hub 20 and may be further informative of additional profiles of guidewire 30 (e.g., a curvature profile, a temperature profile, a vibration profile, a strain profile, a twist profile and/or an alpha profile).
  • a device type identification stage S204 of flowchart 200 involves either an execution of a controller recognition algorithm 212 by the adaptive intervention controller to identify a particular device type of interventional instrument 10 as previously described in the present disclosure, or a user identification 213 of a particular device type of interventional instrument 10 (e.g., via the user selecting the interventional instrument from a dropdown list of available interventional instruments).
  • an image/robotic adaptation stage S206 of flowchart 200 may involve the adaptive intervention controller adapting an operational control of medical imaging apparatus 41 by the medical imaging controller 42 in accordance with the identification of the device type of the interventional instrument 10 via X-ray parameter setting(s) 214 and/or an X-ray enhancement activation 215.
  • Examples of X-ray parameter setting(s) 214 include, but are not limited to, a setting of a dosage, an acquisition time, a shuttering and a windowing associated with the identified device type of the interventional instrument 10.
  • Examples of X-ray enhancement activations 215 include, but are not limited to, an activation of StentBoost and/or SmartPerfusion if associated with the identified device type of the interventional instrument 10.
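As one hedged illustration of the device-type-dependent shuttering/windowing named among the X-ray parameter settings, the sketch below centres a rectangular shutter window on the tracked instrument tip, with a margin keyed to the identified device type. The margin values, pixel units and device-type keys are assumptions for illustration only.

```python
def shutter_window(tip_xy, device_type, margins, default_margin=20):
    """Compute a rectangular X-ray shutter/collimation window centred on
    the tracked instrument tip. The margin (in pixels) is looked up per
    identified device type (e.g., wider for a long stent, tighter for a
    bare guidewire tip); unknown types fall back to a default margin."""
    m = margins.get(device_type, default_margin)
    x, y = tip_xy
    return (x - m, y - m, x + m, y + m)  # (left, top, right, bottom)
```

A windowing/greyscale setting could be keyed to the device type in the same table-driven way.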
  • medical imaging controller 42 and/or the adaptive intervention controller may control a display of one or more user interfaces allowing user interaction with the medical images.
  • a user interface may be displayed that provides a user interaction to change a color of interventional instrument 10 within the medical image as a means for delineating the identified device type of interventional instrument 10 or to tag the medical image with the identified device type of interventional instrument 10.
  • a user interface may be displayed that provides various protocols of imaging parameters/imaging enhancement modes corresponding to the identified device type of interventional instrument 10 whereby the user can select a desired protocol to optimize the generation and/or visualization of the medical images.
  • image/robotic adaptation stage S206 of flowchart 200 may involve the adaptive intervention controller adapting an operational control of medical robot apparatus 91 by the medical robot controller 92 in accordance with the identification of the device type of the interventional instrument 10 via robot parameter setting(s) 217 and/or a navigation enhancement mode 218.
  • robot parameter setting(s) 217 include, but are not limited to, a setting of motions of interventional instrument 10 within an anatomy in accordance with the identified device type of the interventional instrument 10 (e.g., an acceleration, a deceleration and a maximum velocity of interventional instrument 10 within the anatomy).
  • navigation enhancement activations 218 include, but are not limited to, an activation of maintaining an orientation of a distal tip of interventional instrument 10 relative to a beam path of the X-ray imaging apparatus.
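A sketch of the device-type-dependent robot parameter settings, in which an operator velocity command is clamped to limits retrieved for the identified device type. All numeric limits and type names below are hypothetical, not values from the disclosure.

```python
from dataclasses import dataclass


@dataclass
class MotionLimits:
    max_velocity: float      # mm/s
    max_acceleration: float  # mm/s^2


# Hypothetical per-device-type motion limits; values are illustrative only.
LIMITS = {
    "floppy_guidewire": MotionLimits(max_velocity=5.0, max_acceleration=2.0),
    "stiff_guidewire": MotionLimits(max_velocity=2.0, max_acceleration=1.0),
}


def clamp_velocity_command(v_cmd, device_type, limits=LIMITS):
    """Clamp an operator velocity command (mm/s) to the identified device
    type's limit, so robotic navigation adapts automatically to the
    instrument in use."""
    lim = limits[device_type]
    return max(-lim.max_velocity, min(v_cmd, lim.max_velocity))
```

Acceleration and deceleration bounds would be enforced the same way at the trajectory-generation step.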
  • medical robot controller 92 and/or the adaptive intervention controller may control a display of one or more user interfaces allowing user interaction with the navigation of interventional instrument 10 within an anatomy.
  • a user interface may be displayed that provides a user interaction with a simulated navigation of the interventional instrument 10 within X-ray image(s) of the anatomy via an operator-manipulated medical robot that corresponds to the identified device type of interventional instrument 10.
  • a user interface may be displayed that provides various protocols of robot parameters/robot enhancement modes corresponding to the identified device type of interventional instrument 10 whereby the user can select a desired protocol to optimize the navigation of interventional instrument 10 within an anatomy.
  • medical imaging controller 42 and spatial tracking controller 54 may be operated to generate medical images 62 as adapted to the shape sensed identification of a device type of interventional instrument 10 to thereby obtain an improved visualization quality of medical images, and/or medical robot controller 92 and spatial tracking controller 54 may be operated to navigate interventional instrument 10 within an anatomy adapted to the shape sensed identification of a device type of interventional instrument 10 to thereby obtain an optimal navigation of interventional instrument 10 at target position(s) within the anatomy.
  • FIGS. 8A-8F and 9A-9F respectively teach an exemplary embodiment of an adaptive intervention controller of the present disclosure. From this description, those having ordinary skill in the art will appreciate how to apply various aspects of the present disclosure for making and using additional embodiments of an adaptive intervention controller of the present disclosure.
  • FIGS. 8A-8C illustrate various embodiments 170a-170c of adaptive intervention controller 70a (FIG. 1A) and FIGS. 8D-8F illustrate various embodiments 170d-170f of adaptive intervention controller 70b (FIG. 1B).
  • each adaptive intervention controller 170a-170f includes one or more processor(s) 171, memory 172, a user interface 173, a network interface 174, and a storage 175 interconnected via one or more system buses 176.
  • Each processor 171 may be any hardware device, as known in the art of the present disclosure or hereinafter conceived, capable of executing instructions stored in memory 172 or storage 175, or otherwise processing data.
  • the processor(s) 171 may include a microprocessor, field programmable gate array (FPGA), application-specific integrated circuit (ASIC), or other similar devices.
  • the memory 172 may include various memories, as known in the art of the present disclosure or hereinafter conceived, including, but not limited to, LI, L2, or L3 cache or system memory.
  • the memory 172 may include static random access memory (SRAM), dynamic RAM (DRAM), flash memory, read only memory (ROM), or other similar memory devices.
  • the user interface 173 may include one or more devices, as known in the art of the present disclosure or hereinafter conceived, for enabling communication with a user such as an administrator.
  • the user interface may include a command line interface or graphical user interface that may be presented to a remote terminal via the network interface 174.
  • the network interface 174 may include one or more devices, as known in the art of the present disclosure or hereinafter conceived, for enabling communication with other hardware devices.
  • the network interface 174 may include a network interface card (NIC) configured to communicate according to the Ethernet protocol.
  • the network interface 174 may implement a TCP/IP stack for communication according to the TCP/IP protocols.
  • Various alternative or additional hardware or configurations for the network interface 174 will be apparent.
  • the storage 175 may include one or more machine-readable storage media, as known in the art of the present disclosure or hereinafter conceived, including, but not limited to, read-only memory (ROM), random-access memory (RAM), magnetic disk storage media, optical storage media, flash-memory devices, or similar storage media.
  • the storage 175 may store instructions for execution by the processor(s) 171 or data upon which the processor(s) 171 may operate.
  • the storage 175 stores a base operating system for controlling various basic operations of the hardware.
  • storage 175 of adaptive intervention controller 170a also stores application modules 177a in the form of executable software/firmware for implementing the various functions of adaptive intervention controller 170a including, but not limited to:
  • a device shape sensing module 178a for executing an embodiment of stage S102 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S202 of flowchart 200 (FIG. 7) as previously described in the present disclosure;
  • a device type identification module for executing an embodiment of stage S104 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S204 of flowchart 200 (FIG. 7) as previously described in the present disclosure;
  • an imaging adaptation module 178c for executing an embodiment of stage S106 of flowchart 100 (FIG. 6) related to adapting a medical imaging apparatus to a shape sensed identification of a device type of interventional instrument 10 (FIGS. 1A and 1B) as previously described in the present disclosure, such as, for example, stage S206 of flowchart 200 (FIG. 7) as previously described in the present disclosure.
  • adaptive intervention controller 170a may be installed in a remote device 80a (e.g., a tablet, a laptop, a workstation or a server) that is in communication with a medical imaging system 40a including medical imaging apparatus 41 and medical imaging controller 42 as previously described in the present disclosure, and in further communication with a spatial tracking system 50a including a spatial tracking apparatus 53 and a spatial tracking controller 54 (e.g., an OSS system) as previously described in the present disclosure.
  • adaptive intervention controller 170a may be installed within medical imaging system 40a or spatial tracking system 50a, or distributed between medical imaging system 40a and spatial tracking system 50a.
  • storage 175 of adaptive intervention controller 170b also stores application modules 177b in the form of executable software/firmware for implementing the various functions of adaptive intervention controller 170b including, but not limited to:
  • device shape sensing module 178a for executing an embodiment of stage S102 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S202 of flowchart 200 (FIG. 7) as previously described in the present disclosure;
  • a device type identification module for executing an embodiment of stage S104 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S204 of flowchart 200 (FIG. 7) as previously described in the present disclosure;
  • an imaging adaptation module 178c for executing an embodiment of stage S106 of flowchart 100 (FIG. 6) related to adapting a medical imaging apparatus to a shape sensed identification of a device type of interventional instrument 10 (FIGS. 1A and 1B) as previously described in the present disclosure, such as, for example, stage S206 of flowchart 200 (FIG. 7) as previously described in the present disclosure; and
  • one or more medical imaging module(s) 178d for controlling an operation of a medical imaging apparatus, such as, for example, setting the imaging parameters of medical imaging apparatus, activating/deactivating enhancement modes of medical imaging apparatus and actuating medical imaging apparatus to generate medical imaging data.
  • adaptive intervention controller 170b may be installed in a medical imaging system 40b as a substitute for medical imaging controller 42 (FIG. 1A) whereby adaptive intervention controller 170b controls an operation of a medical imaging apparatus 41 and whereby adaptive intervention controller 170b is in communication with a spatial tracking system 50a including a spatial tracking apparatus 53 and a spatial tracking controller 54 (e.g., an OSS system).
  • storage 175 of adaptive intervention controller 170c also stores application modules 177c in the form of executable software/firmware for implementing the various functions of adaptive intervention controller 170c including, but not limited to:
  • device shape sensing module 178a for executing an embodiment of stage S102 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S202 of flowchart 200 (FIG. 7) as previously described in the present disclosure;
  • a device type identification module for executing an embodiment of stage S104 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S204 of flowchart 200 (FIG. 7) as previously described in the present disclosure;
  • an imaging adaptation module 178c for executing an embodiment of stage S106 of flowchart 100 (FIG. 6) related to adapting a medical imaging apparatus to a shape sensed identification of a device type of interventional instrument 10 (FIGS. 1A and 1B) as previously described in the present disclosure, such as, for example, stage S206 of flowchart 200 (FIG. 7) as previously described in the present disclosure; and
  • one or more spatial tracking module(s) 178e for controlling an operation of spatial tracking apparatus such as, for example, actuating sensors of the spatial tracking apparatus to generate sensor shape data or position data (e.g., OSS interrogation of fibre optic Bragg gratings (FBG)).
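For context on the FBG interrogation mentioned above, the standard first-order relation between Bragg-wavelength shift and axial strain can be sketched as below. The photoelastic coefficient of ~0.22 is a typical value for silica fibre and is an assumption here, not a value from the disclosure; temperature cross-sensitivity is ignored in this sketch.

```python
def fbg_strain(lambda_measured_nm, lambda0_nm, photoelastic_coeff=0.22):
    """Convert an FBG Bragg-wavelength shift into axial strain using the
    first-order relation d_lambda / lambda0 = (1 - p_e) * strain, where
    p_e is the effective photoelastic coefficient of the fibre (~0.22 for
    silica). Thermal wavelength shifts are not compensated here."""
    d_lambda = lambda_measured_nm - lambda0_nm
    return d_lambda / (lambda0_nm * (1.0 - photoelastic_coeff))
```

An OSS reconstruction would combine many such strain readings from multiple cores along the fibre to recover curvature, twist and ultimately the shape profile used for device identification.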
  • adaptive intervention controller 170c may be installed in a spatial tracking system 50b as a substitute for spatial tracking controller 54 (FIGS. 1A and 1B) whereby adaptive intervention controller 170c controls an operation of a spatial tracking apparatus 53 (e.g., an OSS system) and whereby adaptive intervention controller 170c is in communication with a medical imaging system 40a including a medical imaging apparatus 41 and a medical imaging controller 42.
  • storage 175 of adaptive intervention controller 170d also stores application modules 177d in the form of executable software/firmware for implementing the various functions of adaptive intervention controller 170d including, but not limited to:
  • device shape sensing module 178a for executing an embodiment of stage S102 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S202 of flowchart 200 (FIG. 7) as previously described in the present disclosure;
  • a device type identification module for executing an embodiment of stage S104 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S204 of flowchart 200 (FIG. 7) as previously described in the present disclosure;
  • a robot adaption module 178f for executing an embodiment of stage S106 of flowchart 100 (FIG. 6) related to adapting a medical robot apparatus to a shape sensed identification of a device type of interventional instrument 10 (FIGS. 1A and 1B) as previously described in the present disclosure, such as, for example, stage S206 of flowchart 200 (FIG. 7) as previously described in the present disclosure.
  • adaptive intervention controller 170d may be installed in a remote device 80b (e.g., a tablet, a laptop, a workstation or a server) that is in communication with a medical robot system 90a including medical robot apparatus 91 and medical robot controller 92 as previously described in the present disclosure, and in further communication with a spatial tracking system 50a including a spatial tracking apparatus 53 and a spatial tracking controller 54 (e.g., an OSS system) as previously described in the present disclosure.
  • adaptive intervention controller 170d may be installed within medical robot system 90a or spatial tracking system 50a, or distributed between medical robot system 90a and spatial tracking system 50a.
  • storage 175 of adaptive intervention controller 170e also stores application modules 177e in the form of executable software/firmware for implementing the various functions of adaptive intervention controller 170e including, but not limited to:
  • device shape sensing module 178a for executing an embodiment of stage S102 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S202 of flowchart 200 (FIG. 7) as previously described in the present disclosure;
  • a device type identification module for executing an embodiment of stage S104 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S204 of flowchart 200 (FIG. 7) as previously described in the present disclosure;
  • robot adaption module 178f for executing an embodiment of stage S106 of flowchart 100 (FIG. 6) related to adapting a medical robot apparatus to a shape sensed identification of a device type of interventional instrument 10 (FIGS. 1A and 1B) as previously described in the present disclosure, such as, for example, stage S206 of flowchart 200 (FIG. 7) as previously described in the present disclosure; and
  • one or more medical imaging module(s) 178d for controlling an operation of a medical imaging apparatus, such as, for example, setting the imaging parameters of medical imaging apparatus, activating/deactivating enhancement modes of medical imaging apparatus and actuating medical imaging apparatus to generate medical imaging data.
  • adaptive intervention controller 170e may be installed in a medical robot system 90b as a substitute for medical robot controller 92 (FIG. 1A) whereby adaptive intervention controller 170e controls an operation of a medical robot apparatus 91 and whereby adaptive intervention controller 170e is in communication with a spatial tracking system 50a including a spatial tracking apparatus 53 and a spatial tracking controller 54 (e.g., an OSS system).
  • storage 175 of adaptive intervention controller 170f also stores application modules 177f in the form of executable software/firmware for implementing the various functions of adaptive intervention controller 170f including, but not limited to:
  • device shape sensing module 178a for executing an embodiment of stage S102 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S202 of flowchart 200 (FIG. 7) as previously described in the present disclosure;
  • a device type identification module for executing an embodiment of stage S104 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S204 of flowchart 200 (FIG. 7) as previously described in the present disclosure;
  • robot adaption module 178f for executing an embodiment of stage S106 of flowchart 100 (FIG. 6) related to adapting a medical robot apparatus to a shape sensed identification of a device type of interventional instrument 10 (FIGS. 1A and 1B) as previously described in the present disclosure, such as, for example, stage S206 of flowchart 200 (FIG. 7) as previously described in the present disclosure; and
  • one or more spatial tracking module(s) 178e for controlling an operation of spatial tracking apparatus such as, for example, actuating sensors of the spatial tracking apparatus to generate sensor shape data or position data (e.g., OSS interrogation of fibre optic Bragg gratings (FBG)).
  • adaptive intervention controller 170f may be installed in a spatial tracking system 50b as a substitute for spatial tracking controller 54 (FIGS. 1A and 1B) whereby adaptive intervention controller 170f controls an operation of spatial tracking apparatus 53 (e.g., an OSS system) and whereby adaptive intervention controller 170f is in communication with a medical robot system 90a including a medical robot apparatus 91 and a medical robot controller 92.
  • the respective application modules 177a-177c of adaptive intervention controllers 170a-170c may further include robot adaption module 178f (FIGS. 9D-9F) for executing an embodiment of stage S106 of flowchart 100 (FIG. 6) related to adapting a medical robot apparatus to a shape sensed identification of a device type of interventional instrument 10 (FIGS. 1A and 1B) as previously described in the present disclosure, such as, for example, stage S206 of flowchart 200 (FIG. 7) as previously described in the present disclosure, whereby adaptive intervention controllers 170a-170c are in communication with a medical robot system including a medical robot apparatus and a medical robot controller.
  • the respective application modules 177d-177f of adaptive intervention controllers 170d-170f may further include an imaging adaptation module 178c for executing an embodiment of stage S106 of flowchart 100 (FIG. 6) related to adapting a medical imaging apparatus to a shape sensed identification of a device type of interventional instrument 10 (FIGS. 1A and 1B) as previously described in the present disclosure, such as, for example, stage S206 of flowchart 200 (FIG. 7) as previously described in the present disclosure, whereby adaptive intervention controllers 170d-170f are in communication with a medical imaging system including a medical imaging apparatus and a medical imaging controller.
  • structures, elements, components, etc. described in the present disclosure/specification and/or depicted in the Figures may be implemented in various combinations of hardware and software, and provide functions which may be combined in a single element or multiple elements.
  • the functions of the various structures, elements, components, etc. shown/illustrated/depicted in the Figures can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software for added functionality.
  • the functions can be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which can be shared and/or multiplexed.
  • the terms “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and can implicitly include, without limitation, digital signal processor (“DSP”) hardware, memory (e.g., read only memory (“ROM”) for storing software, random access memory (“RAM”), non-volatile storage, etc.) and virtually any means and/or machine (including hardware, software, firmware, combinations thereof, etc.) which is capable of (and/or configurable to) perform and/or control a process.
  • any flow charts, flow diagrams and the like can represent various processes which can be substantially represented in computer readable storage media and so executed by a computer, processor or other device with processing capabilities, whether or not such computer or processor is explicitly shown.
  • the term “signal” broadly encompasses all forms of a detectable physical quantity or impulse (e.g., voltage, current, or magnetic field strength) as understood in the art of the present disclosure and as exemplarily described in the present disclosure for transmitting information and/or instructions in support of applying various inventive principles of the present disclosure as subsequently described in the present disclosure.
  • Signal/data/command communication between various components of the present disclosure may involve any communication method as known in the art of the present disclosure including, but not limited to, signal/data/command transmission/reception over any type of wired or wireless datalink and a reading of signal/data/commands uploaded to a computer-usable/computer readable storage medium.
  • corresponding and/or related systems incorporating and/or implementing the device/system or such as may be used/implemented in/with a device in accordance with the present disclosure are also contemplated and considered to be within the scope of the present disclosure.
  • corresponding and/or related method for manufacturing and/or using a device and/or system in accordance with the present disclosure are also contemplated and considered to be within the scope of the present disclosure.

Abstract

Embodiments of the present disclosure relate to a medical intervention control system for enhancing a medical intervention incorporating an interventional instrument (10) positionable over a guidewire (30). In operation, an adaptive intervention controller (70) obtains an identification of the device type of the interventional instrument (10) in response to receiving shape sensing data generated by the guidewire (30) when the interventional instrument (10) is partially or fully positioned over the guidewire (30). In accordance with the identification of the device type of the interventional instrument (10), the adaptive intervention controller (70) further adapts an operational control of a medical imaging apparatus (41) by a medical imaging controller (42) for medical imaging of the interventional instrument (10) and/or an operational control of a medical robot apparatus (91) by a medical robot controller (92) for robotic navigation of the interventional instrument (10).
PCT/EP2021/073592 2020-09-02 2021-08-26 Medical intervention control based on device type identification WO2022048984A1

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063073822P 2020-09-02 2020-09-02
US63/073,822 2020-09-02

Publications (1)

Publication Number Publication Date
WO2022048984A1 true WO2022048984A1 (fr) 2022-03-10

Family

ID=77774879

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/073592 WO2022048984A1 Medical intervention control based on device type identification

Country Status (1)

Country Link
WO (1) WO2022048984A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117530775A (zh) * 2024-01-09 2024-02-09 华中科技大学同济医学院附属协和医院 一种基于人工智能和ct的磁控介入控制方法及系统
CN117530775B (en) * 2024-01-09 2024-04-30 华中科技大学同济医学院附属协和医院 Magnetic control intervention control method and system based on artificial intelligence and CT

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7289652B2 (en) 2001-11-21 2007-10-30 Koninklijke Philips Electronics, N. V. Medical viewing system and method for detecting and enhancing structures in noisy images
US20130216025A1 (en) * 2010-10-27 2013-08-22 Koninklijke Philips Electronics N.V. Adaptive imaging and frame rate optimizing based on real-time shape sensing of medical instruments
WO2017055620A1 (fr) 2015-10-02 2017-04-06 Koninklijke Philips N.V. Raccord pour navigation de dispositif à fil-guide de détection de forme optique
US20180264227A1 (en) * 2015-10-02 2018-09-20 Koninklijke Philips N.V. Hub for device placement with optical shape sensed guidewire
US10687909B2 (en) * 2014-01-24 2020-06-23 Koninklijke Philips N.V. Robotic control of imaging devices with optical shape sensing
US20200253668A1 (en) * 2017-08-28 2020-08-13 Koninklijke Philips N.V. Automatic field of view updating of position tracked interventional device
WO2021074437A1 (fr) * 2019-10-17 2021-04-22 Koninklijke Philips N.V. Automatic detection of OSS-based features and device characterization



Similar Documents

Publication Publication Date Title
CN106714724B (zh) System and method for planning multiple interventional procedures
US20190231436A1 (en) Anatomical model for position planning and tool guidance of a medical tool
US9404734B2 (en) System and method for sensing shape of elongated instrument
EP2632384B1 (fr) Adaptive imaging and frame rate optimizing based on real-time shape sensing of medical instruments
US20100030063A1 (en) System and method for tracking an instrument
EP3565482B1 (fr) Système de navigation médicale utilisant un dispositif de détection de forme et son procédé de fonctionnement
EP3247259B1 (fr) Device for visualization of a guidewire by optical shape sensing
EP3057640A1 (fr) Interventional system
CN107684660A (zh) Balloon positioning in sinus dilation surgery
EP3082610A1 (fr) Shape sensed robotic ultrasound for minimally invasive interventions
CN114173694A (zh) Cranial surgery using optical shape sensing
EP3675763B1 (fr) Automatic field of view updating of a position tracked interventional device
WO2022048984A1 (fr) Medical intervention control based on device type identification
US20220301100A1 (en) Providing a corrected dataset
CN115317005A (zh) Method and system for providing a corrected data set

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21770155

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21770155

Country of ref document: EP

Kind code of ref document: A1