WO2022048979A1 - A flexible robotic interface for performing robotics assisted ultrasound guided interventions - Google Patents


Info

Publication number
WO2022048979A1
Authority
WO
WIPO (PCT)
Prior art keywords
ultrasound transducer
instrument
target
ultrasound
robotic interface
Prior art date
Application number
PCT/EP2021/073563
Other languages
French (fr)
Inventor
Alexandru Patriciu
Marcin Arkadiusz Balicki
Sean Joseph KYNE
Original Assignee
Koninklijke Philips N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2022048979A1 publication Critical patent/WO2022048979A1/en

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0833Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
    • A61B8/0841Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating instruments
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/42Details of probe positioning or probe attachment to the patient
    • A61B8/4209Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames
    • A61B8/4218Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames characterised by articulated arms
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/42Details of probe positioning or probe attachment to the patient
    • A61B8/4245Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A61B8/4263Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient using sensors not mounted on the probe, e.g. mounted on an external reference frame
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/42Details of probe positioning or probe attachment to the patient
    • A61B8/4272Details of probe positioning or probe attachment to the patient involving the acoustic interface between the transducer and the tissue
    • A61B8/4281Details of probe positioning or probe attachment to the patient involving the acoustic interface between the transducer and the tissue characterised by sound-transmitting media or devices for coupling the transducer to the tissue
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/42Details of probe positioning or probe attachment to the patient
    • A61B8/4272Details of probe positioning or probe attachment to the patient involving the acoustic interface between the transducer and the tissue
    • A61B8/429Details of probe positioning or probe attachment to the patient involving the acoustic interface between the transducer and the tissue characterised by determining or monitoring the contact between the transducer and the tissue
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/34Trocars; Puncturing needles
    • A61B17/3403Needle locating or guiding means
    • A61B2017/3413Needle locating or guiding means guided by ultrasound
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2061Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2063Acoustic tracking systems, e.g. using ultrasound
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/378Surgical systems with images on a monitor during operation using ultrasound
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0833Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
    • A61B8/085Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/40Positioning of patients, e.g. means for holding or immobilising parts of the patient's body
    • A61B8/403Positioning of patients, e.g. means for holding or immobilising parts of the patient's body using compression means
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/54Control of the diagnostic device

Definitions

  • Ultrasound guided interventional procedures may be automated or partially automated using robot manipulators to guide the interventional instrument into the target, where one robot holds and manipulates an ultrasound probe while another robot holds and manipulates an interventional instrument being inserted into the subject’s body.
  • DOFs degrees of freedom
  • high cost and workspace limitations often preclude implementation of automated conventional ultrasound guided interventional procedures in commercial systems. For example, the ultrasound probe has to be positioned above the target with good skin contact, which requires additional multiple-DOF manipulators and makes the hardware bulky and expensive. Accordingly, an inexpensive and reliable robot assisted ultrasound guided system is needed for accurately positioning and guiding an interventional instrument to a target in a subject, while maintaining acoustic coupling in order to preserve image quality.
  • a system for performing robotics assisted ultrasound guidance.
  • the system includes an instrument guide maneuverable by a robot, and configured to guide an interventional instrument toward a target within a subject; a flexible robotic interface attached to the instrument guide, and configured to apply a contact pressure to a surface of the subject; an ultrasound transducer attached to the flexible robotic interface, such that the ultrasound transducer is pressed against the surface of the subject in close proximity to the instrument guide by the contact pressure applied by the flexible robotic interface, the ultrasound transducer being configured to provide ultrasound images of at least one of the target or the interventional instrument when guided by the instrument guide toward the target, where the flexible robotic interface enables automatic manipulation of the ultrasound transducer in order to maintain acoustic coupling between the ultrasound transducer and the surface of the subject and to adjust a position of the ultrasound transducer relative to the instrument guide; and at least one sensor attached to the flexible robotic interface, and configured to provide sensor data indicating the position of the ultrasound transducer.
  • the system further includes a system controller programmed to receive image data from ultrasound images from the ultrasound transducer and sensor data from the at least one sensor, and to control the flexible robotic interface to adjust the contact pressure applied to the surface of the subject to optimize an image quality of the ultrasound images in response to the image data and/or the sensor data.
  • the system controller may be further programmed to control the robot to maneuver the interventional instrument so that a projected trajectory of the interventional instrument aligns with the target shown in the ultrasound images.
  • a method for performing robotics assisted ultrasound guidance of an interventional instrument using an instrument guide maneuverable by a robot to guide the interventional instrument toward a target within a subject.
  • the method includes applying a contact pressure to a surface of the subject through an ultrasound transducer using a flexible robotic interface, the contact pressure providing acoustic coupling between the ultrasound transducer and the surface of the subject; determining a position of the ultrasound transducer relative to the instrument guide by determining a shape of the flexible robotic interface, the flexible robotic interface being connected between the instrument guide and a transducer mounting to which the ultrasound transducer is connected; and determining a projected trajectory of the interventional instrument toward the target from the instrument guide.
  • the method further includes changing the position of the ultrasound transducer, determining another position of the ultrasound transducer relative to the instrument guide, and determining another projected trajectory of the interventional instrument from the instrument guide.
  • the method further includes advancing the interventional instrument through the instrument guide along the projected trajectory.
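  The trajectory-projection and target-alignment steps above can be sketched in 2D geometry. This is an illustrative model only, not the disclosed implementation; the function names, the fixed-angle guide assumption, and the millimeter units are assumptions for the sketch.

```python
import math

def project_trajectory(guide_pos, guide_angle_deg, depth):
    """Project the instrument tip position after advancing `depth` mm
    along a straight trajectory at a fixed guide angle (2D sketch)."""
    a = math.radians(guide_angle_deg)
    x = guide_pos[0] + depth * math.sin(a)
    z = guide_pos[1] + depth * math.cos(a)  # z grows into the tissue
    return (x, z)

def aligned_with_target(guide_pos, guide_angle_deg, target, tol=1.0):
    """Check whether the projected straight-line trajectory passes
    within `tol` mm of the target (perpendicular distance in 2D)."""
    a = math.radians(guide_angle_deg)
    dx, dz = target[0] - guide_pos[0], target[1] - guide_pos[1]
    # perpendicular distance from the target to the trajectory line
    dist = abs(dx * math.cos(a) - dz * math.sin(a))
    return dist <= tol

# A guide at the skin entry point (0, 0) angled 30 degrees lies on a
# trajectory through a target 40 mm deep and ~23.1 mm lateral:
print(aligned_with_target((0.0, 0.0), 30.0, (23.1, 40.0)))  # True
```

  In a system like the one described, the alignment check would run after each change of the transducer or guide position, with the robot re-oriented until the projected trajectory intersects the target.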
  • a flexible robotic interface includes a deformable bladder inflatable with gas, and having a proximal end connected to an instrument guide maneuverable by a robot for guiding an interventional instrument toward a target within a subject, and a distal end connected to an ultrasound transducer configured to press against the surface of the subject by a contact pressure applied by the deformable bladder and to provide ultrasound images of at least one of the target or the interventional instrument when guided by the instrument guide toward the target.
  • the flexible robotic interface further includes bending sensors attached to walls of the deformable bladder, and configured to provide sensor data indicating a shape of the deformable bladder, the shape of the deformable bladder defining a positional relationship between the ultrasound transducer and the instrument guide.
  • a gas pressure of the gas in the deformable bladder is adjustable in response to the sensor data in order to adjust at least one of the contact pressure applied to the surface of the subject through the ultrasound transducer for optimizing image quality of the ultrasound images or the positional relationship between the ultrasound transducer and the instrument guide for aligning a projected trajectory of the interventional instrument with the target.
  • FIG. 1 is a simplified schematic diagram of a system for performing robot assisted ultrasound guidance using a flexible robotic interface, according to a representative embodiment.
  • FIG. 2A is a simplified schematic diagram of a bladder having multiple bladder chambers configured to provide no lateral displacement of the transducer, according to a representative embodiment.
  • FIG. 2B is a simplified schematic diagram of the bladder having multiple bladder chambers configured to provide lateral displacement of the transducer, according to a representative embodiment.
  • FIG. 2C is a simplified schematic diagram of the bladder having multiple bladder chambers configured to provide lateral displacement of the instrument guide, according to a representative embodiment.
  • FIG. 3 is a simplified schematic diagram of a bladder having multiple shape sensors used as reference for computing a transformation between imaging and end-effector coordinate systems, according to a representative embodiment.
  • FIG. 4 is a simplified flow diagram showing performing robotics assisted ultrasound guidance using a flexible robotic interface, according to a representative embodiment.
  • a soft robotic interface attaches an ultrasound probe to a device guide for a therapy needle or other interventional instrument.
  • the device guide is used to guide the interventional instrument toward a target area in the subject, while the ultrasound probe provides image feedback.
  • the soft robotic interface which may be a pneumatic interface, for example, interacts with the environment in a fundamentally different way from the rigid robotic manipulator, better complying with that environment.
  • An ultrasound probe delivers the best image quality when it is flat against the tissue (e.g., skin) being imaged, there is a film of coupling gel between the probe and the skin, and the probe is pressed against the tissue with a certain contact pressure. All of these goals can be readily achieved with a pneumatic soft robotics implementation.
  • FIG. 1 is a simplified schematic diagram of a system for performing robot assisted ultrasound guidance using a flexible (soft) robotic interface, according to a representative embodiment.
  • a robot assisted ultrasound guidance system 100 includes a robot 110 configured to automatically guide an interventional instrument 120 into a target 108 within a subject 105 (e.g., patient).
  • the interventional instrument 120 may be a biopsy needle, and the target 108 may be a suspected tumor within the subject 105.
  • the interventional instrument 120 may be any of various other types of compatible instruments, such as a catheter or an endoscope, for example, insertable in the subject 105 for any type of target 108, without departing from the scope of the present teachings.
  • the robot 110 includes a robot interface 112 and a robot arm 114 that supports an instrument guide 116 configured to guide the interventional instrument 120 toward the target 108.
  • the interventional instrument 120 is slidably attached to the instrument guide 116, such that the interventional instrument 120 may be advanced into the subject 105 at a fixed angle to the instrument guide 116.
  • the instrument guide 116 may include a fixed channel at a predetermined fixed angle, a set of fixed channels at different predetermined fixed angles, or a pivoting channel at a selectable angle within a range of available angles.
  • the interventional instrument 120 may be advanced manually by a user, or may be advanced automatically by the robot 110.
  • the instrument guide 116 may be referred to as the robot end-effector.
  • the robot assisted ultrasound guidance system 100 further includes a system controller 150.
  • the robot interface 112 receives and translates commands from the system controller 150 to operate servo motors and flexible joints (not shown), for example, to maneuver the robot arm 114 and the instrument guide 116 integrated with or attached to the robot arm 114 in order to position the interventional instrument 120 on a trajectory 125 toward the target 108.
  • the robot arm 114 has at least three degrees of freedom with regard to motion, as would be apparent to one skilled in the art.
  • the system controller 150 generates the commands provided to the robot interface 112 in response to image data from an ultrasound imaging system 145 and image quality data from an image quality processor 147 based on ultrasound images showing the target 108 and/or the interventional instrument 120, as well as position data from a sensor processor 136, discussed below.
  • the system controller 150 may be connected to the other components of the system by a system bus 151.
  • the robot assisted ultrasound guidance system 100 further includes a flexible robotic interface 130 attached to the instrument guide 116, an ultrasound transducer 140 attached to the flexible robotic interface 130 via a transducer mounting 144, and representative sensors 131, 132, 133 and 134 attached to the flexible robotic interface 130.
  • the system controller 150 is programmed to receive image data from ultrasound images provided by the ultrasound transducer 140 and sensor data from the sensors 131, 132, 133 and 134, and to control the robot 110 to maneuver the interventional instrument 120 to align with the target 108 shown in the ultrasound images along the trajectory 125.
  • the flexible robotic interface 130 enables automatic manipulation of the ultrasound transducer 140 by the system controller 150 in order to adjust contact pressure to maintain proper acoustic coupling between the ultrasound transducer 140 and a surface 102 (e.g., skin) of the subject 105, and to adjust a position of the ultrasound transducer 140 relative to the instrument guide 116.
  • the flexible robotic interface 130 is configured to variably apply a contact pressure, indicated by arrow CP, to the surface 102 of the subject 105 through the ultrasound transducer 140 in close proximity to the instrument guide 116.
  • the flexible robotic interface 130 exerts a variable force against a top side of the ultrasound transducer 140, which in turn applies a commensurate contact pressure against surface 102 adjacent a bottom side of the ultrasound transducer 140, where the force against the top side of the ultrasound transducer 140 is substantially the same as the contact pressure against the surface 102.
  • the ultrasound transducer 140 is configured to provide the ultrasound images of the target 108 and/or the interventional instrument 120 within an imaging field of view 141 by emitting ultrasound waves into the subject 105 and receiving reflected ultrasound echo signals. Both the target 108 and the interventional instrument 120 are within the field of view 141 when the interventional instrument 120 has been guided to the target 108, e.g., along the trajectory 125.
  • the sensors 131, 132, 133 and 134 provide position data indicating the physical position of the ultrasound transducer 140, including a position and orientation (e.g., pitch, yaw, roll) of ultrasound transducer 140 in a three-dimensional coordinate system.
  • the position data is provided to a sensor processor 136, which translates the position data to indicate the position of the ultrasound transducer 140 relative to the instrument guide 116 and, if needed, to register the position data to a common three-dimensional coordinate system.
  • the physical position of the ultrasound transducer 140 may be determined in a two-dimensional or three-dimensional shape sensing coordinate system of the sensors 131, 132, 133 and 134, and then registered to a two-dimensional or three-dimensional robot coordinate system of the instrument guide 116 (end-effector) of the robot 110, or vice versa, as is known to one skilled in the art.
  • the translated position data may include relative distances between predefined portions of the ultrasound transducer 140 and a portion (e.g., channel) of the instrument guide 116 to which the interventional instrument 120 is slidably attached, as well as the orientation of the ultrasound transducer 140 within the common two-dimensional or three-dimensional coordinate system.
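  The registration between the shape-sensing frame and the robot (end-effector) frame described above amounts to composing rigid-body transforms. A minimal planar (2D) sketch using homogeneous matrices follows; the frame names and the example poses are assumptions for illustration, not values from the disclosure.

```python
import math

def se2(x, y, theta):
    """2D homogeneous transform (rotation theta, translation x, y)
    as a 3x3 row-major nested list."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]]

def compose(a, b):
    """Matrix product a @ b for 3x3 nested lists: chains transforms."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply(t, p):
    """Transform a point p = (x, y) by t."""
    x, y = p
    return (t[0][0] * x + t[0][1] * y + t[0][2],
            t[1][0] * x + t[1][1] * y + t[1][2])

# Hypothetical poses: the sensor frame sits 5 cm along the end-effector
# x-axis; the probe is offset and rotated 90 degrees in the sensor frame.
T_robot_sensor = se2(0.05, 0.0, 0.0)
T_sensor_probe = se2(0.02, 0.01, math.pi / 2)

# Probe pose expressed in the robot frame, and its origin as a point:
T_robot_probe = compose(T_robot_sensor, T_sensor_probe)
origin_in_robot = apply(T_robot_probe, (0.0, 0.0))
```

  The same composition pattern extends to 3D with 4x4 matrices; the full system would also include orientation (pitch, yaw, roll) as stated above.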
  • the flexible robotic interface 130 includes a deformable bladder 135 that is inflatable with gas (e.g., air) under control of a pressure regulator 138.
  • the magnitude of the contact pressure applied by the flexible robotic interface 130 is directly proportional to the extent to which the bladder 135 is inflated. That is, the more gas there is in the bladder 135, the greater the force exerted by the bladder 135 on the ultrasound transducer 140, and thus the greater the contact pressure against the surface of the subject 105.
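  The proportionality stated above can be captured in a simple static model: the force the bladder transmits is roughly the gauge pressure of the gas times the effective contact area. This is an idealized sketch under assumed units, not a calibration of the actual device.

```python
def contact_force(gauge_pressure_kpa, contact_area_cm2):
    """Estimate the force (N) a gas bladder transmits to the transducer:
    force = gauge pressure x effective contact area.
    1 kPa = 1000 N/m^2; 1 cm^2 = 1e-4 m^2."""
    return gauge_pressure_kpa * 1000.0 * contact_area_cm2 * 1e-4

# Doubling the gas pressure doubles the transmitted force:
f1 = contact_force(10.0, 6.0)   # 10 kPa over 6 cm^2 -> 6.0 N
f2 = contact_force(20.0, 6.0)   # 20 kPa over 6 cm^2 -> 12.0 N
```

  In practice the effective contact area changes as the bladder deforms, so a real regulator would rely on measured pressure feedback, as described for the pressure regulator 138, rather than on this open-loop estimate alone.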
  • the pressure regulator 138 is configured to measure the contact pressure(s) being applied by the flexible robotic interface 130 to the surface 102 via the ultrasound transducer 140.
  • the measured pressure is provided by the pressure regulator 138 to the system controller 150 through the system bus 151.
  • the pressure regulator 138 is further configured to adjust the contact pressure being applied by the flexible robotic interface 130 in response to pressure control signals received from the system controller 150 via the system bus 151.
  • the pressure regulator 138 may be an air regulator that increases and decreases air pressure within the bladder 135 of the flexible robotic interface 130 in order to increase and decrease the contact pressure being applied by the flexible robotic interface 130, respectively.
  • the quality of the ultrasound image varies in part as a function of the contact pressure between ultrasound transducer 140 and the surface 102 of the subject 105. Therefore, in order to optimize image quality, the magnitude of the contact pressure being asserted by the flexible robotic interface 130 may be varied under control of the system controller 150 in response to image quality data provided by an image quality processor 147 via the system bus 151, based on the image data received from the ultrasound imaging system 145.
  • the quality of the ultrasound image may be determined by the image quality processor 147 using image processing techniques, such as segmentation and identification of anatomy, for example. More advanced image quality techniques may be employed, such as theoretical information and ultrasound penetration methods, for example, as would be apparent to one skilled in the art.
  • the system controller 150 may determine whether to increase, decrease or maintain the current contact pressure. The system controller 150 then sends control signals to the pressure regulator 138 to increase, decrease or maintain the contact pressure accordingly.
  • the image quality processor 147 may determine based on the image data that the lines defining an apparent outer perimeter of the target 108 are fuzzy. Based on this determination, the system controller 150 may send a command to the pressure regulator 138 to increase the pressure applied by the flexible robotic interface 130 by incremental amounts over set time periods, until the image data indicates appropriately defined lines of the outer perimeter.
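  The incremental "increase until the image sharpens" behavior described above is essentially a hill climb on an image-quality score. A minimal sketch follows; `quality_fn` and `set_pressure` are hypothetical callables standing in for the image quality processor 147 and pressure regulator 138, and the step size and pressure cap are assumed values.

```python
def tune_pressure(quality_fn, set_pressure, p0, step=1.0, p_max=30.0):
    """Raise bladder pressure in increments until the image-quality
    score stops improving (simple hill climbing), then hold.
    quality_fn(p) returns a scalar quality score at pressure p."""
    p, q = p0, quality_fn(p0)
    while p + step <= p_max:
        q_next = quality_fn(p + step)
        if q_next <= q:          # no further improvement: stop
            break
        p, q = p + step, q_next
    set_pressure(p)
    return p

# Toy quality curve peaking at 12 kPa; the loop climbs from 8 to 12:
quality = lambda p: -(p - 12.0) ** 2
best = tune_pressure(quality, lambda p: None, p0=8.0)
```

  The cap `p_max` matters in a clinical context: too much contact pressure is uncomfortable and can deform the anatomy being imaged, so any such loop would be bounded by a safety limit.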
  • the direction of the contact pressure applied by the flexible robotic interface 130 may also be varied in order to adjust the field of view 141 of the ultrasound transducer 140 to better image the target 108 and/or the interventional instrument 120.
  • uneven magnitudes of contact pressure may be applied to the surface 102 by way of the flexible robotic interface 130 applying uneven forces to opposite ends of the ultrasound transducer 140, thereby angling or tilting the ultrasound transducer 140 and thus the field of view 141 in a desired direction.
  • the direction (and the magnitude) of the contact pressure may be varied under control of the system controller 150 in response to the image data from the ultrasound imaging system 145 and the image quality processor 147.
  • the image data may be analyzed by the image quality processor 147 to determine whether the target 108 and/or the interventional instrument 120 are substantially centered within the field of view 141.
  • the presence or absence of the target 108 and/or the interventional instrument 120 in the ultrasound images likewise may be determined by shape recognition and segmentation techniques, as would be apparent to one skilled in the art.
  • the system controller 150 determines an amount and direction by which to shift the field of view 141, and what forces to be applied by the flexible robotic interface 130 to different portions of the ultrasound transducer 140 in order to obtain the determined amount and direction of the contact pressures.
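  The uneven-force tilting described above can be sketched as splitting a total downward force between two bladder chambers at opposite ends of the transducer. This linearized small-angle model, including the maximum-tilt parameter, is an assumption for illustration.

```python
def chamber_forces(total_force, tilt_deg, max_tilt_deg=10.0):
    """Split a total downward force (N) between two bladder chambers
    at the ends of the transducer to produce a tilt. A positive tilt
    shifts force toward the far chamber; the imbalance grows linearly
    with the requested tilt (small-angle approximation)."""
    tilt = max(-max_tilt_deg, min(max_tilt_deg, tilt_deg))
    imbalance = 0.5 * (tilt / max_tilt_deg)   # -0.5 .. +0.5
    f_near = total_force * (0.5 - imbalance)
    f_far = total_force * (0.5 + imbalance)
    return f_near, f_far

# Zero tilt gives equal forces; full tilt loads one chamber only:
print(chamber_forces(8.0, 0.0))    # (4.0, 4.0)
print(chamber_forces(8.0, 10.0))   # (0.0, 8.0)
```

  Note that the total force, and hence the average contact pressure and acoustic coupling, is conserved while only its distribution changes, which is consistent with steering the field of view without losing skin contact.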
  • In addition to enabling the adjustment of magnitude and/or direction of the contact pressure, the flexible robotic interface 130 also enables determination of relative positions between the ultrasound transducer 140 and the instrument guide 116, which is used for locating and directing the interventional instrument 120.
  • the relative positions may be determined using the sensors 131, 132, 133 and 134, which provide position data that enable determination of the shape of the flexible robotic interface 130 by the system controller 150. Since the ultrasound transducer 140 and the instrument guide 116 are on opposite ends of the flexible robotic interface 130, the shape of the flexible robotic interface 130 defines the positional relationship between the ultrasound transducer 140 and the instrument guide 116.
  • the sensors 131, 132, 133 and 134 may be discrete bending sensors or linear displacement sensors attached to walls of the bladder 135, where “attached to” means that the discrete bending sensors or linear displacement sensors are embedded in the walls of the bladder 135 or connected to inner and/or outer surfaces of the walls of the bladder 135.
  • the bending sensors may be shape sensing optical fibers or piezoresistive materials, for example, although any compatible type of bending/shape sensing sensors may be incorporated without departing from the scope of the present teachings.
  • the shape sensing optical fibers may include Fiber Bragg gratings (FBGs), for example, for measuring strain resulting from the bending action.
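  The strain measurement an FBG provides follows a standard relation: the fractional shift of the Bragg wavelength is proportional to strain, scaled by the fiber's effective photo-elastic coefficient. The sketch below uses the commonly cited coefficient for silica fiber (p_e ≈ 0.22) and ignores temperature effects; both are assumptions, not values from the disclosure.

```python
def fbg_strain(wavelength_shift_nm, bragg_wavelength_nm, p_e=0.22):
    """Strain sensed by a Fiber Bragg grating from its Bragg
    wavelength shift (temperature cross-sensitivity ignored):
        d_lambda / lambda = (1 - p_e) * strain
    p_e ~ 0.22 is the effective photo-elastic coefficient of silica."""
    return wavelength_shift_nm / (bragg_wavelength_nm * (1.0 - p_e))

# A ~1.2 pm shift on a 1550 nm grating is about 1 microstrain:
eps = fbg_strain(0.0012, 1550.0)
```

  A shape-sensing fiber carries many such gratings along its length; converting the per-grating strains into local curvature, and then into a shape, is what the sensor processor 136 does with the resulting shape data.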
  • the piezoresistive materials may include any conductive or semiconductor material in which electrical resistance changes in response to applied stress, such as silicon, polysilicon or silicon carbide, for example.
  • the bending sensors thereby provide shape data as the position data to indicate the extent of bending of each of the sensors 131, 132, 133 and 134.
  • the sensor processor 136 is a shape sensing processor configured to determine the shapes of the sensors 131, 132, 133 and 134 based on the shape data, as is well known in the art.
  • the determined shapes of the sensors 131, 132, 133 and 134 correlate to the shape of the bladder 135, and provide data indicating distances between the transducer mounting 144 and the instrument guide 116, respectively. These distances are used to determine the positional relationship between the ultrasound transducer 140 at a distal end of the bladder 135 and the instrument guide 116 at a proximal end of the bladder 135.
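  Recovering the transducer position from per-sensor bending data is often modeled by treating each sensed segment as a constant-curvature arc and chaining the segments. The sketch below is a planar version of that common model, with assumed units; it is not the specific reconstruction used by the sensor processor 136.

```python
import math

def arc_tip(curvature_per_m, length_m):
    """Tip of a constant-curvature planar arc starting at the origin
    and initially pointing along +x. Returns (x, y) in metres."""
    k, L = curvature_per_m, length_m
    if abs(k) < 1e-12:            # straight segment
        return (L, 0.0)
    return (math.sin(k * L) / k, (1.0 - math.cos(k * L)) / k)

def chain_tip(segments):
    """Chain several (curvature, length) segments, as a multi-sensor
    bladder wall would be reconstructed, accumulating position and
    heading segment by segment."""
    x = y = theta = 0.0
    for k, L in segments:
        dx, dy = arc_tip(k, L)
        # rotate the local displacement by the accumulated heading
        x += dx * math.cos(theta) - dy * math.sin(theta)
        y += dx * math.sin(theta) + dy * math.cos(theta)
        theta += k * L            # heading change over the arc
    return (x, y)

# A straight 40 mm wall ends 40 mm away with no lateral offset:
tip = chain_tip([(0.0, 0.04)])
```

  Running this per sensed wall of the bladder 135 yields the distal-end position relative to the proximal end, i.e., the transducer-to-guide distances described above.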
  • the sensors 131, 132, 133 and 134 may be position sensors attached to the walls of the bladder 135, meaning embedded in the walls or connected to inner and/or outer surfaces of the walls of the bladder 135.
  • the sensors 131, 132, 133 and 134 may be attached to the transducer mounting 144.
  • the position sensors may be any sensor capable of providing a location in two-dimensional or three-dimensional space, such as electromagnetic (EM) sensors or optical sensors, for example, although any compatible type of position sensors may be incorporated without departing from the scope of the present teachings.
  • the robot assisted ultrasound guidance system 100 would further include a corresponding position sensor tracking system, e.g., providing a magnetic field, as is well known in the art.
  • Additional position sensors may be mounted to the instrument guide 116, such that the position sensor tracking system is able to determine the physical relationships between these position sensors and those attached to the bladder 135 and/or the transducer mounting 144.
  • a camera may be mounted on the instrument guide 116, and the sensors 131, 132, 133 and 134 may be visual markers attached to the bladder 135 and/or the transducer mounting 144. Then, the positions of the markers are determined in camera images provided by the camera.
  • the determined positions of the sensors 131, 132, 133 and 134 again correlate to the shape of the bladder 135, and provide data indicating distances between the transducer mounting 144 and the instrument guide 116. These distances are used to determine the positional relationship between the ultrasound transducer 140 and the instrument guide 116 at the distal and proximal ends of the bladder 135, respectively.
  • the system controller 150 includes a memory 152 that stores instructions and a processing unit 153 that executes the instructions. When executed, the instructions cause the system controller 150 (via the processing unit 153) to implement all or part of the process shown in FIGs. 3 and 4, for example.
  • each of the sensor processor 136 and the image quality processor 147 is associated with a memory (not shown) or portion of the memory 152 that stores instructions that, when executed, cause the sensor processor 136 and the image quality processor 147 to perform their respective functions, discussed herein.
  • the sensor processor 136 and the image quality processor 147 may be implemented separately or together as one or more processors, and/or the respective functionalities may be incorporated in whole or in part into the processing unit 153, without departing from the scope of the present teachings.
  • the processing unit 153, as well as each of the sensor processor 136 and the image quality processor 147, is representative of one or more processing devices, and is configured to execute software instructions to perform functions as described in the various embodiments herein.
  • the processing unit 153, the sensor processor 136 and the image quality processor 147 may be implemented by one or more of field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), a general purpose computer, a central processing unit, a computer processor, a microprocessor, a microcontroller, a state machine, programmable logic device, or combinations thereof, using any combination of hardware, software, firmware, hard- wired logic circuits, or combinations thereof. It is understood that reference to any processor herein (e.g., computer processor and microprocessor) may include multiple processors, parallel processors, or both. Multiple processors may be included in, or coupled to, a single device or multiple devices.
  • processor encompasses an electronic component able to execute a program or machine executable instruction.
  • references to a computing device comprising “a processor” should be interpreted to include more than one processor or processing core, as in a multi-core processor.
  • a processor may also refer to a collection of processors within a single computer system or distributed among multiple computer systems.
  • the term computing device should also be interpreted to include a collection or network of computing devices each including a processor or processors. Programs have software instructions performed by one or multiple processors that may be within the same computing device or which may be distributed across multiple computing devices.
  • the memory 152 may include a main memory and/or a static memory, where such memories may communicate with each other and the processing unit 153, the sensor processor 136 and the image quality processor 147, via one or more buses.
  • the memory 152 and the other memories store instructions used to implement some or all aspects of methods and processes described herein.
  • the memory 152 and the other memories may be implemented by any number, type and combination of random access memory (RAM) and read-only memory (ROM), for example, and may store various types of information, such as software algorithms, Al models including recurrent neural networks (RNNs) and other neural network based models, and computer programs, for example, all of which are executable by the processing unit 153, the sensor processor 136 and the image quality processor 147.
  • ROM and RAM may include any number, type and combination of computer readable storage media, such as a disk drive, flash memory, an electrically programmable read-only memory (EPROM), an electrically erasable and programmable read only memory (EEPROM), registers, a hard disk, a removable disk, tape, compact disk read only memory (CD-ROM), digital versatile disk (DVD), floppy disk, blu-ray disk, a universal serial bus (USB) drive, or any other form of storage medium known in the art.
  • Each of the memory 152 and the other memories is a tangible storage medium for storing data and executable software instructions, and is non-transitory during the time software instructions are stored therein.
  • As used herein, "non-transitory" is to be interpreted not as an eternal characteristic of a state, but as a characteristic of a state that will last for a period.
  • the term “non-transitory” specifically disavows fleeting characteristics such as characteristics of a carrier wave or signal or other forms that exist only transitorily in any place at any time.
  • the memory 152 and the other memories may store software instructions and/or computer readable code that enables performance of various functions.
  • the memory 152 and the other memories may be secure and/or encrypted, or unsecure and/or unencrypted.
  • the interface 156 may include one or more of ports, disk drives, wireless antennas, or other types of receiver circuitry.
  • the system controller 150 may retrieve or otherwise receive data and instructions via the interface 156 from a website, an email, a portable disk or other type of memory (not shown).
  • the interface 156 may include one or more user interfaces, such as a mouse, a keyboard, a microphone, a video camera, a touchscreen display, voice or gesture recognition captured by a microphone or video camera, for example.
  • the display 157 may be a monitor such as a computer monitor, a television, a liquid crystal display (LCD), an organic light emitting diode (OLED), a flat panel display, a solid-state display, or a cathode ray tube (CRT) display, or an electronic whiteboard, for example.
  • the display 157 may also include one or more interface(s), such as the interface 156, in which case the display 157 may provide a graphical user interface (GUI) for displaying and receiving information to and from the user.
  • the interface 156 and the display 157 are connected to one another and to the system controller 150 via the system bus 151.
  • FIG. 2A is a simplified schematic diagram of a bladder having multiple bladder chambers configured to provide no lateral displacement of the transducer, according to a representative embodiment
  • FIG. 2B is a simplified schematic diagram of the bladder having multiple bladder chambers configured to provide lateral displacement of the transducer, according to a representative embodiment
  • FIG. 2C is a simplified schematic diagram of the bladder having multiple bladder chambers configured to provide lateral displacement of the instrument guide, according to a representative embodiment.
  • a deformable bladder 235 has multiple bladder chambers, including a first bladder chamber 231, a second bladder chamber 232 and a third bladder chamber 233, that may be attached to walls of the bladder 235.
  • attached to the walls of the bladder 235 means embedded in the walls of the bladder 235 or connected to inner and/or outer surfaces of the walls of the bladder 235.
  • the bladder 235 may be substituted for the bladder 135 as the flexible robotic interface 130 in FIG. 1, which is effectively a single chamber bladder.
  • the bladder 235 is shown positioned between the instrument guide 116 and the ultrasound transducer 140, and is configured to adjust contact pressure applied to the surface 102 of the subject 105 through the ultrasound transducer 140 to maintain proper acoustic coupling between the ultrasound transducer 140 and the surface 102, and to adjust the position of the ultrasound transducer 140 and/or the instrument guide 116 relative to one another.
  • the bladder 235 also includes position sensors (not shown), such as the sensors 131, 132, 133 and 134, for providing shape data, for example, to enable determination of the position or shape of the bladder 235 and ultimately to determine the relative positions of the ultrasound transducer 140 and the instrument guide 116.
  • position data from position sensors (e.g., bending sensors) of the bladder 235 would indicate distance d between the active surface of the ultrasound transducer 140 on the surface 102 and exit position 117 of the interventional instrument 120 from the instrument guide 116.
  • the instrument guide 116 is controlled to guide the interventional instrument 120 to the target 108 along the trajectory 125, at least partially in response to the position data from the bladder 235.
  • the gas pressure in each of the first, second and third bladder chambers 231, 232 and 233 may be separately controllable, e.g., through the pressure regulator 138, in order to apply different forces to different locations on the top side of the ultrasound transducer 140.
  • the first bladder chamber 231 is substantially centered over the ultrasound transducer 140, so the first bladder chamber 231 may be used to adjust the contact pressure applied by the bladder 235 directly downward.
  • the second bladder chamber 232 is positioned over the left edge of the ultrasound transducer 140 and the third bladder chamber 233 is positioned over the right edge of the ultrasound transducer 140, so that the second and third bladder chambers 232 and 233 may be used to angle or tilt the ultrasound transducer 140 to the left and right with respect to the surface 102, as shown in FIG. 2B, and/or to angle or tilt the instrument guide 116 to the left and right with respect to the surface 102, as shown in FIG. 2C.
  • the first, second and third bladder chambers 231, 232 and 233 are used together to adjust the contact pressure applied by the bladder 235, as mentioned above.
  • the second and third bladder chambers 232 and 233 may be positioned over different edges of the ultrasound transducer 140, such as the front and back edges, for example, which would enable angling or tilting the ultrasound transducer 140 and the instrument guide 116 in different corresponding planes.
  • the first bladder chamber 231 has pressure P1, the second bladder chamber 232 has pressure P2, and the third bladder chamber 233 has pressure P3.
  • the pressure P2 equals the pressure P3, in which case the combined contact pressure applied by the bladder 235 is directed directly downward, as indicated by the arrow CP1.
  • the first pressure P1 may be the same as or different from the pressures P2 and P3, and the contact pressure would continue to be directed directly downward.
  • FIG. 2B shows a situation in which the pressure P2 is much greater than the pressure P3, in which case the combined contact pressure applied by the bladder 235 through the ultrasound transducer 140 is directed to the right (away from the second bladder chamber 232 applying the greater pressure), as indicated by arrow CP2. That is, the higher pressure P2 of the second bladder chamber 232 causes the bladder 235 to exert a greater force on the left end of the ultrasound transducer 140 than on the right end, rotating the ultrasound transducer 140 counterclockwise in the depicted orientation. This changes the angle of the field of view of the ultrasound transducer 140 (e.g., field of view 141).
  • rotating the ultrasound transducer 140 counterclockwise shifts the field of view to the right, enabling the ultrasound transducer 140 to capture images of the target 108 without necessarily having to adjust the position of the instrument guide 116.
  • the ultrasound transducer 140 may be tilted a greater or lesser amount counterclockwise by increasing or decreasing the pressure P2 relative to the pressure P3, respectively.
  • the ultrasound transducer 140 may be tilted clockwise when the pressure P3 is greater than the pressure P2, and the amount by which the ultrasound transducer 140 is tilted may be altered by increasing or decreasing the pressure P3 relative to the pressure P2, respectively.
  • FIG. 2C similarly shows a situation in which the pressure P2 is much greater than the pressure P3, although in this case, the combined contact pressure applied by the bladder 235 continues to be directed directly downward, as indicated by arrow CP3, while the instrument guide 116 is angled or tilted clockwise in the depicted orientation.
  • rotating the instrument guide 116 clockwise reduces the depth below the surface 102 at which the interventional instrument 120 intersects the target 108.
  • the instrument guide 116 may be tilted clockwise a greater or lesser amount by increasing or decreasing the pressure P2 relative to the pressure P3, respectively.
  • the instrument guide 116 may be tilted counterclockwise when the pressure P3 is greater than the pressure P2, and the amount by which the instrument guide 116 is tilted may be altered by increasing or decreasing the pressure P3 relative to the pressure P2, respectively.
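The differential-pressure behavior of FIGs. 2A–2C can be summarized as a simple control mapping. The sketch below is a hypothetical illustration only (the function name, gain and units are assumptions, not taken from the patent): a single tilt command sets the side-chamber pressures P2 and P3 symmetrically about the base pressure of the center chamber.

```python
def chamber_pressures(p_base, tilt, gain=0.5):
    """Map a tilt command in [-1, 1] to the three chamber pressures.

    tilt > 0 raises P2 and lowers P3 (tilting as in FIG. 2B);
    tilt < 0 does the opposite; tilt == 0 reproduces FIG. 2A,
    where P2 == P3 and the contact pressure is directed straight down.
    Pressures are clamped to be non-negative.
    """
    p1 = p_base                                   # center chamber 231
    p2 = max(0.0, p_base * (1.0 + gain * tilt))   # side chamber 232
    p3 = max(0.0, p_base * (1.0 - gain * tilt))   # side chamber 233
    return p1, p2, p3
```

With p_base = 10 and tilt = 1, for example, this yields P2 = 15 and P3 = 5, i.e., P2 greater than P3 as in the FIG. 2B scenario.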
  • Although FIGs. 2A, 2B and 2C show three bladder chambers, it is understood that more or fewer bladder chambers may be attached to (included in or on) a bladder of the flexible robotic interface 130 in order to provide unique benefits for any particular situation or to meet application specific design requirements of various implementations, as would be apparent to one skilled in the art.
  • FIGs. 2A, 2B and 2C show adjustments to the positions of the ultrasound transducer 140 and the instrument guide 116 in only two dimensions in response to application of the pressures P1, P2 and P3. It is further understood, however, that the pressures of the various bladder chambers may be applied to adjust the ultrasound transducer 140 in three dimensions, as would be apparent to one skilled in the art.
  • the bladder may further include bladder chambers attached to the front and back walls of the bladder, in addition to the bladder chambers attached to the left and right walls (e.g., second and third bladder chambers 232 and 233).
  • the ultrasound transducer 140 and/or the instrument guide 116 may be tilted to the front and back, in addition to being tilted to the left and right, by adjusting the relative pressures of the bladder chambers attached to the front and back walls.
  • FIG. 3 is a simplified schematic diagram of a bladder having multiple shape sensors, according to a representative embodiment.
  • FIG. 3 is used as reference for computing a transformation between ultrasound imaging and robot end-effector coordinate systems to determine the position of a flexible robotic interface, as discussed below.
  • a deformable bladder 335 includes multiple position sensors, indicated by first bending sensor 331, second bending sensor 332 and third bending sensor 333.
  • the first, second and third bending sensors 331, 332 and 333 may be shape sensing optical fibers or piezoresistive materials, for example, as discussed above with regard to the representative sensors 131, 132, 133 and 134.
  • the first, second and third bending sensors 331, 332 and 333 may be position sensors, as discussed above, without departing from the scope of the present teachings.
  • the bladder 335 may be substituted for the bladder 135 as the flexible robotic interface 130 in FIG. 1.
  • the bladder 335 is shown connected between the instrument guide 116 (robot end-effector) and the transducer mounting 144 for the ultrasound transducer 140.
  • At least two bending sensors are needed to indicate the position of the ultrasound transducer in two dimensions, and at least six bending sensors are needed to indicate the position of the ultrasound transducer in three dimensions.
  • Attachment points Q1, Q2 and Q3 are the physical positions at which the first, second and third bending sensors 331, 332 and 333 attach to the instrument guide 116.
  • the points Q1, Q2 and Q3 are in the robot end-effector coordinate system (ECS).
  • Attachment points P1, P2 and P3 are the physical positions at which the first, second and third bending sensors 331, 332 and 333 attach to the transducer mounting 144.
  • the points P1, P2 and P3 are in the ultrasound imaging coordinate system (ICS).
  • a mathematical transformation of the points Q1, Q2 and Q3 from the ECS to the ICS may be indicated by F^ICS_ECS. In the two-dimensional case, as shown in FIG. 3, the transformation requires three parameters, which are provided by the first, second and third bending sensors 331, 332 and 333. In a three-dimensional case (not shown), the transformation would require six parameters and thus six bending sensors, as would be apparent to one skilled in the art.
  • the position of the ultrasound transducer 140 in the ICS is computed with respect to the instrument guide 116.
  • the first, second and third bending sensors 331, 332 and 333 provide data that enable determination of the distances between the instrument guide 116 and the transducer mounting 144.
  • the bending sensor 331 indicates a distance d1 between the points Q1 and P1
  • the bending sensor 332 indicates a distance d2 between the points Q2 and P2
  • the bending sensor 333 indicates a distance d3 between the points Q3 and P3.
  • the goal is to compute the mathematical transformation between the ECS and the ICS, referred to as F^ICS_ECS.
  • Given these distances, the respective positions of the points P1, P2 and P3 in the ECS may be expressed as shown in Equation (1):
  • the distances d1, d2 and d3 may be used to determine the positional relationship between the instrument guide 116 and the ultrasound transducer 140, mounted to the transducer mounting 144.
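The geometry above — three measured distances between fixed attachment points on the end-effector and on the transducer mounting — determines the three parameters (two translations and one rotation) of the two-dimensional transformation. As a minimal pure-Python sketch (all function names are illustrative, not from the patent), the transform mapping the ICS attachment points into the ECS can be recovered by Gauss-Newton least squares on the distance constraints ||F(P_i) - Q_i|| = d_i:

```python
import math

def apply_rigid2d(params, p):
    """Apply a 2-D rigid transform (tx, ty, theta) to a point p."""
    tx, ty, th = params
    c, s = math.cos(th), math.sin(th)
    return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + ty)

def residuals(params, pts_ics, pts_ecs, dists):
    """Distance-constraint residuals ||F(P_i) - Q_i|| - d_i."""
    res = []
    for p, q, d in zip(pts_ics, pts_ecs, dists):
        x, y = apply_rigid2d(params, p)
        res.append(math.hypot(x - q[0], y - q[1]) - d)
    return res

def solve3(A, rhs):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    M = [row[:] + [b] for row, b in zip(A, rhs)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for k in range(col, 4):
                M[r][k] -= f * M[col][k]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][k] * x[k] for k in range(r + 1, 3))) / M[r][r]
    return x

def solve_transform(pts_ics, pts_ecs, dists, iters=50, eps=1e-6):
    """Gauss-Newton fit of (tx, ty, theta) to the three sensor distances."""
    # initial guess: transducer roughly the mean sensor length below the guide
    params = [0.0, -sum(dists) / len(dists), 0.0]
    for _ in range(iters):
        r = residuals(params, pts_ics, pts_ecs, dists)
        J = []                      # numerical Jacobian, one row per parameter
        for j in range(3):
            bumped = list(params)
            bumped[j] += eps
            rb = residuals(bumped, pts_ics, pts_ecs, dists)
            J.append([(rb[i] - r[i]) / eps for i in range(len(r))])
        # normal equations (J^T J) delta = -J^T r, lightly damped
        A = [[sum(J[a][i] * J[b][i] for i in range(len(r)))
              + (1e-9 if a == b else 0.0) for b in range(3)] for a in range(3)]
        g = [-sum(J[a][i] * r[i] for i in range(len(r))) for a in range(3)]
        delta = solve3(A, g)
        params = [params[k] + delta[k] for k in range(3)]
    return params
```

F^ICS_ECS is then the inverse of the recovered transform; in the three-dimensional case, six sensors and six parameters would be needed, as the text notes.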
  • position sensors such as EM sensors or optical sensors, for example, may be mounted to the instrument guide 116 and on the transducer mounting 144 and/or the bladder 335, and the corresponding tracking system may be employed to determine the physical relationships among the sets of position sensors.
  • a camera may be mounted on the instrument guide 116 and visual markers may be attached to the transducer mounting 144 and/or the bladder 335. Then, the positions of the markers in the camera image are used to compute the position of the ICS in relation to the ECS.
  • the flexible robotic interface (e.g., flexible robotic interface 130) is connected between an ultrasound transducer (e.g., ultrasound transducer 140) and an instrument guide (e.g., instrument guide 116) of a robot, used to guide an interventional instrument (e.g., interventional instrument 120) toward a target (e.g., target 108) in a subject (e.g., subject 105).
  • the method may be implemented by computer readable instructions or computer code, e.g., stored in the memory 152 and executable by the processing unit 153 of the system controller 150, discussed above.
  • the method includes initially positioning the ultrasound transducer on a surface of the subject such that the target is substantially within a field of view of the ultrasound transducer in block S411.
  • the ultrasound transducer may be initially positioned on the surface of the subject by hand or by the robot, using the ultrasound images provided by the ultrasound transducer to initially locate the target. In this manner, the target is placed within or near the field of view of the ultrasound transducer prior to beginning insertion of the interventional instrument.
  • the ultrasound transducer may be initially placed using another imaging modality showing the target, such as x-ray imaging, computerized tomography (CT) scan imaging or magnetic resonance imaging (MRI), for example.
  • a contact pressure is applied (or maintained) by a flexible robotic interface (e.g., flexible robotic interface 130) to a surface (e.g., surface 102) of the subject through the ultrasound transducer in block S412.
  • the ultrasound transducer is connected to a distal end of the flexible robotic interface, while the instrument guide of the robot is connected to a proximal end of the flexible robotic interface, such that operation of the flexible robotic interface alters the position of the ultrasound transducer relative to the instrument guide.
  • the flexible robotic interface includes a deformable bladder configured to contain gas (e.g., air) under pressure, such that the contact pressure is applied by forcibly injecting the gas into the bladder (or one or more bladder chambers in the bladder) to cause the bladder to exert force on the ultrasound transducer in a direction toward the surface of the subject.
  • the gas is injected into the bladder through a pressure regulator (e.g., pressure regulator 138) to obtain a desired contact pressure between the ultrasound transducer and the surface of the subject.
  • the image quality is analyzed by an image processor (e.g., image quality processor 147) based on ultrasound image data provided by the ultrasound transducer and the ultrasound imaging system. For example, determining the image quality may include shape recognition and segmentation techniques, as would be apparent to one skilled in the art.
  • the ultrasound images may be analyzed to determine whether features in the target, such as lines defining an apparent outer perimeter of the target, are fuzzy or clear.
  • When the image quality is determined to be unacceptable (block S413: No), the magnitude of the contact pressure applied by the flexible robotic interface is adjusted through the pressure regulator in block S414. That is, the contact pressure is increased or decreased in order to bring the image of the target into sharp relief.
  • the contact pressure may be increased when low ultrasound coupling is detected and decreased when large skin deformation is detected, as would be apparent to one skilled in the art.
  • the process then returns to block S413 to determine whether the quality of the ultrasound images provided by the ultrasound transducer is acceptable using the adjusted contact pressure.
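The S413–S414 loop described above — check image quality, then raise pressure on low coupling or lower it on excessive skin deformation — can be sketched as a simple iterative controller. This is an illustrative skeleton only; the function names and string verdicts are stand-ins for the image quality processor 147 and pressure regulator 138, not from the patent:

```python
def regulate_contact_pressure(measure_quality, set_pressure, p_start,
                              step=0.5, max_iters=20):
    """Iteratively adjust contact-pressure magnitude (blocks S413/S414).

    measure_quality(p) returns 'ok', 'low_coupling' or 'skin_deformation';
    set_pressure(p) commands the regulator.  Both are placeholders for the
    image quality processor and pressure regulator described in the text.
    """
    p = p_start
    for _ in range(max_iters):
        set_pressure(p)
        verdict = measure_quality(p)
        if verdict == 'ok':               # block S413: Yes -> proceed
            return p
        if verdict == 'low_coupling':     # poor acoustic coupling: press harder
            p += step
        else:                             # large skin deformation: ease off
            p -= step
    raise RuntimeError('image quality did not become acceptable')
```

A real implementation would derive the verdict from the ultrasound image data (e.g., segmentation sharpness) rather than from the pressure value itself.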
  • When the image quality is determined to be acceptable (block S413: Yes), the process proceeds to block S415.
  • In block S415, a position of the ultrasound transducer is determined relative to the instrument guide, where the position of the ultrasound transducer also provides the desired position of the instrument guide.
  • position data may be received from position sensors at a sensor processor (e.g., sensor processor 136), which calculates the position of the ultrasound transducer relative to the instrument guide using the received position data.
  • the position sensors may be bending sensors (e.g., sensors 131, 132, 133 and 134) attached to the flexible robotic interface, such as shape sensing optical fibers or piezoresistive materials, that provide shape data indicating the shape of the flexible robotic interface.
  • the more a bending sensor is bent, the more compact the flexible robotic interface, and thus the closer the ultrasound transducer is to the instrument guide; the less the bending sensor is bent, the more extended the flexible robotic interface, and thus the farther the ultrasound transducer is from the instrument guide.
  • Multiple bending sensors may have different degrees of bending, indicating that the flexible robotic interface is not being uniformly deformed. This occurs when the ultrasound transducer is positioned at an angle (tilted) with respect to the instrument guide.
  • Other types of position sensors may be incorporated, such as EM sensors, optical sensors or markers, attached to the flexible robotic interface, or attached to the instrument guide and/or the transducer (or transducer mounting), as discussed above.
  • In block S416, it is determined whether the relative position of the ultrasound transducer and the instrument guide is acceptable. For example, it may be determined pursuant to block S416 whether the target is substantially centered in the field of view of the ultrasound transducer and/or whether a projected trajectory of the interventional instrument at the angle currently set by the instrument guide intersects the position of the target in the subject, discussed below. When the relative position of the ultrasound transducer and the instrument guide is determined to be unacceptable (block S416: No), the angle (direction) of the contact pressure applied by the flexible robotic interface is adjusted through the pressure regulator in block S417.
  • the process then returns to block S415 to again determine the position of the ultrasound transducer relative to the instrument guide based on the adjusted angle of the contact pressure applied by the flexible robotic interface.
  • When the relative position of the ultrasound transducer and the instrument guide is determined to be acceptable (block S416: Yes), the process proceeds to block S418.
  • the position of the ultrasound transducer relative to the instrument guide may be altered by changing the shape of the flexible robotic interface.
  • the shape of the flexible robotic interface may be changed by increasing or decreasing the amount of gas contained in the bladder, which increases or decreases separation between the ultrasound transducer and the instrument guide, respectively.
  • When the bladder includes multiple bladder chambers (e.g., bladder chambers 231, 232 and 233) attached to walls of the bladder, the shape of the flexible robotic interface also may be changed by increasing or decreasing the amount of gas in each of the separate bladder chambers, thereby angling or tilting one or both of the ultrasound transducer and the instrument guide, as discussed above with reference to FIGs. 2B and 2C.
  • the determination in block S416 as to whether the relative position of the ultrasound transducer and the instrument guide is acceptable may include determining whether the target is substantially centered in the field of view of the ultrasound transducer and/or determining whether the projected trajectory of the interventional instrument intersects the position of the target.
  • a current ultrasound image is provided by ultrasound imaging system 145 and analyzed, for example, by system controller 150 and/or processing unit 153 using shape recognition techniques.
  • the position of the ultrasound transducer relative to the instrument guide may be altered incrementally until the target appears at or near the center of the ultrasound image.
  • Since the interventional instrument may have been partially or fully advanced into the subject at this point in the procedure, it may be determined whether the position of a distal end of the interventional instrument is at or near the center of the field of view using the ultrasound images, where the position of the ultrasound transducer relative to the instrument guide may then be altered to place the distal end at a desired location within the ultrasound image.
  • a projected trajectory (e.g., trajectory 125) of the interventional instrument toward the target is determined using the determined position of the ultrasound transducer and corresponding position of the instrument guide.
  • the interventional instrument is slidably attached to the instrument guide at a predetermined angle, previously set for the interventional procedure.
  • the instrument guide may include at least one fixed channel at a corresponding at least one predetermined fixed angle, or may include a pivoting channel that may be set at a selected angle within a predetermined range of angles.
  • the projected trajectory may be determined geometrically based on knowledge of the location of an exit position (e.g., exit position 117) of the interventional instrument from the instrument guide, the angle of the channel for the interventional instrument, and the location of the target in the subject.
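The geometric determination above can be sketched as a short check: given the exit position, the channel angle and the target location, compute the perpendicular miss distance of the projected trajectory from the target. The coordinate convention (x along the skin surface, y positive downward) and all names are illustrative assumptions, not taken from the patent:

```python
import math

def trajectory_hits_target(exit_pos, angle_deg, target, tol=1.0):
    """Check whether the projected trajectory from the instrument guide's
    exit position, at the channel angle, passes within tol of the target.

    Coordinates: x along the skin surface, y positive downward into the
    subject; angle_deg is measured from the surface.
    """
    th = math.radians(angle_deg)
    ux, uy = math.cos(th), math.sin(th)     # unit insertion direction
    vx = target[0] - exit_pos[0]            # exit-to-target vector
    vy = target[1] - exit_pos[1]
    miss = abs(ux * vy - uy * vx)           # perpendicular miss distance
    depth = ux * vx + uy * vy               # distance along the trajectory
    return miss <= tol and depth > 0.0      # must lie ahead of the exit point
```

If the check fails, the relative position of the ultrasound transducer and the instrument guide would be adjusted (block S417) and the trajectory recomputed.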
  • the position of the ultrasound transducer relative to the instrument guide may be altered by changing the shape of the flexible robotic interface.
  • When the flexible robotic interface includes a deformable bladder, in order to cause the projected trajectory to intersect the target closer to the surface of the subject, gas may be added to the bladder to further separate the instrument guide from the ultrasound transducer.
  • gas may be added to one bladder chamber and removed from an opposite bladder chamber such that the instrument guide tilts toward the ultrasound transducer, thereby decreasing the intersection angle of the interventional instrument relative to the field of view of the ultrasound transducer.
  • the interventional instrument is advanced through the instrument guide along the projected trajectory a remaining distance (if any) toward the target until it intersects (e.g., pierces or otherwise enters) the target.
  • the interventional instrument may be advanced through the instrument guide manually by a user or automatically by the robot itself.
  • the methods described herein may be implemented using a hardware computer system that executes software programs. Further, in an exemplary, non-limited embodiment, implementations can include distributed processing, component/object distributed processing, and parallel processing. Virtual computer system processing may implement one or more of the methods or functionalities as described herein, and a processor described herein may be used to support a virtual processing environment.
  • One or more embodiments of the disclosure may be referred to herein, individually and/or collectively, by the term "invention" merely for convenience and without intending to voluntarily limit the scope of this application to any particular invention or inventive concept.
  • Although specific embodiments have been illustrated and described herein, it should be appreciated that any subsequent arrangement designed to achieve the same or similar purpose may be substituted for the specific embodiments shown.
  • This disclosure is intended to cover any and all subsequent adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the description.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Acoustics & Sound (AREA)
  • Gynecology & Obstetrics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

A system for performing robotics assisted ultrasound guidance includes an instrument guide for guiding an interventional instrument toward a target within a subject; a flexible robotic interface attached to the instrument guide for applying contact pressure to the subject; an ultrasound transducer attached to the flexible robotic interface to be pressed against the subject in close proximity to the instrument guide by the contact pressure applied by the flexible robotic interface, the ultrasound transducer providing ultrasound images of at least one of the target or the interventional instrument when guided toward the target, where the flexible robotic interface enables automatic manipulation of the ultrasound transducer in order to maintain acoustic coupling and to adjust a position of the ultrasound transducer relative to the instrument guide; and at least one sensor attached to the flexible robotic interface to provide sensor data indicating the position of the ultrasound transducer.

Description

A FLEXIBLE ROBOTIC INTERFACE FOR PERFORMING ROBOTICS ASSISTED ULTRASOUND GUIDED INTERVENTIONS
BACKGROUND
[0001] Many medical procedures involve insertion of an interventional instrument, such as a needle or a catheter, into the body of a subject in order to penetrate a target with minimum discomfort for the subject. Accurate placement of the interventional instrument is necessary to the success of these medical procedures, so the insertion of the interventional instrument may be guided using diagnostic medical imaging, such as ultrasound imaging, x-ray imaging, computerized tomography (CT) scan imaging or magnetic resonance imaging (MRI). Ultrasound imaging is especially desirable because of its relatively low cost as compared to alternative medical imaging techniques, and because it does not expose the subject to radiation, as x-ray imaging and CT scan imaging do. Ultrasound guided interventional procedures are particularly useful for performing biopsies, for example, such as breast biopsies, liver biopsies, kidney biopsies, and the like.
[0002] Ultrasound guided interventional procedures may be automated or partially automated using robot manipulators to guide the interventional instrument into the target, where one robot holds and manipulates an ultrasound probe while another robot holds and manipulates an interventional instrument being inserted into the subject’s body. Generally, though, use of the two robot manipulators is technologically complex given the high number of degrees of freedom (DOFs) through which the interventional instrument must be manipulated. Also, high cost and workspace limitations often preclude implementation of automated conventional ultrasound guided interventional procedures in commercial systems. For example, positioning of the ultrasound probe has to be above the target, with good skin contact, requiring additional multiple DOF manipulators, making the hardware bulky and expensive. Accordingly, an inexpensive and reliable robot assisted ultrasound guided system is needed for accurately positioning and guiding an interventional instrument to a target in a subject, while maintaining acoustic coupling in order to preserve image quality.
SUMMARY
[0003] According to an aspect of the present disclosure, a system is provided for performing robotics assisted ultrasound guidance. The system includes an instrument guide maneuverable by a robot, and configured to guide an interventional instrument toward a target within a subject; a flexible robotic interface attached to the instrument guide, and configured to apply a contact pressure to a surface of the subject; an ultrasound transducer attached to the flexible robotic interface, such that the ultrasound transducer is pressed against the surface of the subject in close proximity to the instrument guide by the contact pressure applied by the flexible robotic interface, the ultrasound transducer being configured to provide ultrasound images of at least one of the target or the interventional instrument when guided by the instrument guide toward the target, where the flexible robotic interface enables automatic manipulation of the ultrasound transducer in order to maintain acoustic coupling between the ultrasound transducer and the surface of the subject and to adjust a position of the ultrasound transducer relative to the instrument guide; and at least one sensor attached to the flexible robotic interface, and configured to provide sensor data indicating the position of the ultrasound transducer. The system further includes a system controller programmed to receive image data from ultrasound images from the ultrasound transducer and sensor data from the at least one sensor, and to control the flexible robotic interface to adjust the contact pressure applied to the surface of the subject to optimize an image quality of the ultrasound images in response to the image data and/or the sensor data. The system controller may be further programmed to control the robot to maneuver the interventional instrument so that a projected trajectory of the interventional instrument aligns with the target shown in the ultrasound images.
[0004] According to another aspect of the present disclosure, a method is provided for performing robotics assisted ultrasound guidance of an interventional instrument using an instrument guide maneuverable by a robot to guide the interventional instrument toward a target within a subject. The method includes applying a contact pressure to a surface of the subject through an ultrasound transducer using a flexible robotic interface, the contact pressure providing acoustic coupling between the ultrasound transducer and the surface of the subject; determining a position of the ultrasound transducer relative to the instrument guide by determining a shape of the flexible robotic interface, the flexible robotic interface being connected between the instrument guide and a transducer mounting to which the ultrasound transducer is connected; and determining a projected trajectory of the interventional instrument toward the target from the instrument guide. When the projected trajectory of the interventional instrument does not intersect the target, the method further includes changing the position of the ultrasound transducer, determining another position of the ultrasound transducer relative to the instrument guide, and determining another projected trajectory of the interventional instrument from the instrument guide. When the projected trajectory or the another projected trajectory of the interventional instrument intersects the target, the method further includes advancing the interventional instrument through the instrument guide along the projected trajectory.
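By way of illustration only, the align-then-advance logic of this method can be sketched in Python. The function names, callback interfaces, and the 2 mm tolerance below are hypothetical assumptions for explanation and are not part of the disclosed embodiments:

```python
import math

def trajectory_intersects_target(origin, direction, target, tol=0.002):
    """Return True when the perpendicular distance from `target` to the
    line through `origin` along unit vector `direction` is within `tol`
    meters, i.e. the projected trajectory would intersect the target."""
    v = tuple(t - o for t, o in zip(target, origin))
    proj = sum(vi * di for vi, di in zip(v, direction))
    perp = tuple(vi - proj * di for vi, di in zip(v, direction))
    return math.sqrt(sum(p * p for p in perp)) <= tol

def guidance_loop(get_trajectory, reposition_transducer, advance_instrument,
                  target, max_attempts=20):
    """Repeat: check alignment; if misaligned, reposition the transducer
    and re-derive the trajectory; once aligned, advance the instrument."""
    for _ in range(max_attempts):
        origin, direction = get_trajectory()
        if trajectory_intersects_target(origin, direction, target):
            advance_instrument()
            return True
        reposition_transducer()
    return False
```

The alignment test reduces to a point-to-line distance check once the trajectory and target are expressed in a common coordinate frame, which is what the shape sensing and registration described later provide.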
[0005] According to another aspect of the present disclosure, a flexible robotic interface includes a deformable bladder inflatable with gas, and having a proximal end connected to an instrument guide maneuverable by a robot for guiding an interventional instrument toward a target within a subject, and a distal end connected to an ultrasound transducer configured to press against the surface of the subject by a contact pressure applied by the deformable bladder and to provide ultrasound images of at least one of the target or the interventional instrument when guided by the instrument guide toward the target. The flexible robotic interface further includes bending sensors attached to walls of the deformable bladder, and configured to provide sensor data indicating a shape of the deformable bladder, the shape of the deformable bladder defining a positional relationship between the ultrasound transducer and the instrument guide. A gas pressure of the gas in the deformable bladder is adjustable in response to the sensor data in order to adjust at least one of the contact pressure applied to the surface of the subject through the ultrasound transducer for optimizing image quality of the ultrasound images or the positional relationship between the ultrasound transducer and the instrument guide for aligning a projected trajectory of the interventional instrument with the target.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] The example embodiments are best understood from the following detailed description when read with the accompanying drawing figures. It is emphasized that the various features are not necessarily drawn to scale. In fact, the dimensions may be arbitrarily increased or decreased for clarity of discussion. Wherever applicable and practical, like reference numerals refer to like elements.
[0007] FIG. 1 is a simplified schematic diagram of a system for performing robot assisted ultrasound guidance using a flexible robotic interface, according to a representative embodiment.
[0008] FIG. 2A is a simplified schematic diagram of a bladder having multiple bladder chambers configured to provide no lateral displacement of the transducer, according to a representative embodiment.
[0009] FIG. 2B is a simplified schematic diagram of the bladder having multiple bladder chambers configured to provide lateral displacement of the transducer, according to a representative embodiment.
[0010] FIG. 2C is a simplified schematic diagram of the bladder having multiple bladder chambers configured to provide lateral displacement of the instrument guide, according to a representative embodiment.
[0011] FIG. 3 is a simplified schematic diagram of a bladder having multiple shape sensors used as reference for computing a transformation between imaging and end-effector coordinate systems, according to a representative embodiment.
[0012] FIG. 4 is a simplified flow diagram of a method of performing robotics assisted ultrasound guidance using a flexible robotic interface, according to a representative embodiment.
DETAILED DESCRIPTION
[0013] In the following detailed description, for the purposes of explanation and not limitation, representative embodiments disclosing specific details are set forth in order to provide a thorough understanding of an embodiment according to the present teachings. Descriptions of known systems, devices, materials, methods of operation and methods of manufacture may be omitted so as to avoid obscuring the description of the representative embodiments.
Nonetheless, systems, devices, materials and methods that are within the purview of one of ordinary skill in the art are within the scope of the present teachings and may be used in accordance with the representative embodiments. It is to be understood that the terminology used herein is for purposes of describing particular embodiments only and is not intended to be limiting. The defined terms are in addition to the technical and scientific meanings of the defined terms as commonly understood and accepted in the technical field of the present teachings.
[0014] It will be understood that, although the terms first, second, third, etc. may be used herein to describe various elements or components, these elements or components should not be limited by these terms. These terms are only used to distinguish one element or component from another element or component. Thus, a first element or component discussed below could be termed a second element or component without departing from the teachings of the inventive concept.
[0015] The terminology used herein is for purposes of describing particular embodiments only and is not intended to be limiting. As used in the specification and appended claims, the singular forms of terms “a,” “an” and “the” are intended to include both singular and plural forms, unless the context clearly dictates otherwise. Additionally, the terms “comprises,” “comprising,” and/or similar terms specify the presence of stated features, elements, and/or components, but do not preclude the presence or addition of one or more other features, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
[0016] Unless otherwise noted, when an element or component is said to be “connected to,” “coupled to,” or “adjacent to” another element or component, it will be understood that the element or component can be directly connected or coupled to the other element or component, or intervening elements or components may be present. That is, these and similar terms encompass cases where one or more intermediate elements or components may be employed to connect two elements or components. However, when an element or component is said to be “directly connected” to another element or component, this encompasses only cases where the two elements or components are connected to each other without any intermediate or intervening elements or components.
[0017] The present disclosure, through one or more of its various aspects, embodiments and/or specific features or sub-components, is thus intended to bring out one or more of the advantages as specifically noted below. For purposes of explanation and not limitation, example embodiments disclosing specific details are set forth in order to provide a thorough understanding of an embodiment according to the present teachings. However, other embodiments consistent with the present disclosure that depart from specific details disclosed herein remain within the scope of the appended claims. Moreover, descriptions of well-known apparatuses and methods may be omitted so as to not obscure the description of the example embodiments. Such methods and apparatuses are within the scope of the present disclosure.
[0018] According to various embodiments, a soft robotic interface is provided that attaches an ultrasound probe to a device guide for a therapy needle or other interventional instrument. The device guide is used to guide the interventional instrument toward a target area in the subject, while the ultrasound probe provides image feedback. Generally, the soft robotic interface, which may be a pneumatic interface, for example, interacts with the environment in a fundamentally different way from a rigid robotic manipulator, better complying with that environment. An ultrasound probe delivers the best image quality when it lies flat against the tissue (e.g., skin) being imaged, a film of coupling gel is present between the probe and the skin, and the probe is pressed against the tissue with a certain contact pressure. All of these goals can be easily achieved with a pneumatic soft robotics implementation.
[0019] FIG. 1 is a simplified schematic diagram of a system for performing robot assisted ultrasound guidance using a flexible (soft) robotic interface, according to a representative embodiment.
[0020] Referring to FIG. 1, a robot assisted ultrasound guidance system 100 includes a robot 110 configured to automatically guide an interventional instrument 120 into a target 108 within a subject 105 (e.g., patient). For example, the interventional instrument 120 may be a biopsy needle, and the target 108 may be a suspected tumor within the subject 105. However, the interventional instrument 120 may be any of various other types of compatible instruments, such as a catheter or an endoscope, for example, insertable in the subject 105 for any type of target 108, without departing from the scope of the present teachings.
[0021] In the depicted embodiment, the robot 110 includes a robot interface 112 and a robot arm 114 that supports an instrument guide 116 configured to guide the interventional instrument 120 toward the target 108. The interventional instrument 120 is slidably attached to the instrument guide 116, such that the interventional instrument 120 may be advanced into the subject 105 at a fixed angle to the instrument guide 116. For example, the instrument guide 116 may include a fixed channel at a predetermined fixed angle, a set of fixed channels at different predetermined fixed angles, or a pivoting channel at a selectable angle within a range of available angles. In various embodiments, the interventional instrument 120 may be advanced manually by a user, or may be advanced automatically by the robot 110. The instrument guide 116 may be referred to as the robot end-effector.
[0022] The robot assisted ultrasound guidance system 100 further includes a system controller 150 that is programmed, in part, to control operations of the robot 110, and to coordinate the operations of the robot 110 with ultrasound imaging, positioning and sensing operations, discussed below. The robot interface 112 receives and translates commands from the system controller 150 to operate servo motors and flexible joints (not shown), for example, to maneuver the robot arm 114 and the instrument guide 116 integrated with or attached to the robot arm 114 in order to position the interventional instrument 120 on a trajectory 125 toward the target 108. The robot arm 114 has at least three degrees of freedom with regard to motion, as would be apparent to one skilled in the art. The system controller 150 generates the commands provided to the robot interface 112 in response to image data from an ultrasound imaging system 145 and image quality data from an image quality processor 147 based on ultrasound images showing the target 108 and/or the interventional instrument 120, as well as position data from a sensor processor 136, discussed below. The system controller 150 may be connected by a system bus 151 to the ultrasound imaging system 145 and the image quality processor 147 to receive the image data and the image quality data, and to the sensor processor 136 to receive the position data.
[0023] The robot assisted ultrasound guidance system 100 further includes a flexible robotic interface 130 attached to the instrument guide 116, an ultrasound transducer 140 attached to the flexible robotic interface 130 via a transducer mounting 144, and representative sensors 131, 132, 133 and 134 attached to the flexible robotic interface 130. The system controller 150 is programmed to receive image data from ultrasound images provided by the ultrasound transducer 140 and sensor data from the sensors 131, 132, 133 and 134, and to control the robot 110 to maneuver the interventional instrument 120 to align with the target 108 shown in the ultrasound images along the trajectory 125. The flexible robotic interface 130 enables automatic manipulation of the ultrasound transducer 140 by the system controller 150 in order to adjust contact pressure to maintain proper acoustic coupling between the ultrasound transducer 140 and a surface 102 (e.g., skin) of the subject 105, and to adjust a position of the ultrasound transducer 140 relative to the instrument guide 116.
[0024] More particularly, the flexible robotic interface 130 is configured to variably apply a contact pressure, indicated by arrow CP, to the surface 102 of the subject 105 through the ultrasound transducer 140 in close proximity to the instrument guide 116. Specifically, the flexible robotic interface 130 exerts a variable force against a top side of the ultrasound transducer 140, which in turn applies a commensurate contact pressure against the surface 102 adjacent a bottom side of the ultrasound transducer 140, where the force against the top side of the ultrasound transducer 140 is substantially the same as the contact pressure against the surface 102.
The ultrasound transducer 140 is configured to provide the ultrasound images of the target 108 and/or the interventional instrument 120 within an imaging field of view 141 by emitting ultrasound waves into the subject 105 and receiving reflected ultrasound echo signals. Both the target 108 and the interventional instrument 120 are within the field of view 141 when the interventional instrument 120 has been guided to the target 108, e.g., along the trajectory 125.
[0025] The sensors 131, 132, 133 and 134 provide position data indicating the physical position of the ultrasound transducer 140, including a position and orientation (e.g., pitch, yaw, roll) of the ultrasound transducer 140 in a three-dimensional coordinate system. The position data is provided to a sensor processor 136, which translates the position data to indicate the position of the ultrasound transducer 140 relative to the instrument guide 116 and, if needed, to register the position data to a common three-dimensional coordinate system.
[0026] For example, the physical position of the ultrasound transducer 140 may be determined in a two-dimensional or three-dimensional shape sensing coordinate system of the sensors 131, 132, 133 and 134, and then registered to a two-dimensional or three-dimensional robot coordinate system of the instrument guide 116 (end-effector) of the robot 110, or vice versa, as is known to one skilled in the art. The translated position data may include relative distances between predefined portions of the ultrasound transducer 140 and a portion (e.g., channel) of the instrument guide 116 to which the interventional instrument 120 is slidably attached, as well as the orientation of the ultrasound transducer 140 within the common two-dimensional or three-dimensional coordinate system.
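Registering the shape sensing coordinate system to the robot coordinate system amounts to applying a rigid transformation to each measured point. A minimal sketch follows, assuming the rotation matrix R and translation t relating the two frames have already been obtained by a standard calibration step (not shown here); the function name is illustrative:

```python
def register_point(R, t, p):
    """Map a point p from the shape-sensing frame into the robot
    (end-effector) frame: p' = R p + t, where R is a 3x3 rotation
    matrix (nested tuples) and t a length-3 translation."""
    return tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i]
                 for i in range(3))
```

Once all transducer and guide positions are expressed in one frame, the relative distances and orientations described above reduce to ordinary vector arithmetic.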
[0027] In the depicted embodiment, the flexible robotic interface 130 includes a deformable bladder 135 that is inflatable with gas (e.g., air) under control of a pressure regulator 138. The magnitude of the contact pressure applied by the flexible robotic interface 130 is directly proportional to the extent to which the bladder 135 is inflated. That is, the more gas there is in the bladder 135, the greater the force exerted by the bladder 135 on the ultrasound transducer 140, and thus the greater the contact pressure against the surface of the subject 105. The pressure regulator 138 is configured to measure the contact pressure(s) being applied by the flexible robotic interface 130 to the surface 102 via the ultrasound transducer 140. This may be done, for example, by measuring a gas pressure of the gas inside the bladder 135 and converting the measured gas pressure into a force exerted by the flexible robotic interface 130 onto the ultrasound transducer 140. The measured pressure is provided by the pressure regulator 138 to the system controller 150 through the system bus 151. The pressure regulator 138 is further configured to adjust the contact pressure being applied by the flexible robotic interface 130 in response to pressure control signals received from the system controller 150 via the system bus 151. For example, the pressure regulator 138 may be an air regulator that increases and decreases air pressure within the bladder 135 of the flexible robotic interface 130 in order to increase and decrease the contact pressure being applied by the flexible robotic interface 130, respectively.
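The proportionality between bladder gas pressure and contact force described in this paragraph can be illustrated with a simple model. The effective contact area, the proportional gain, and both function names below are illustrative assumptions, not values from the disclosure:

```python
def contact_force(gauge_pressure_pa, effective_area_m2):
    """Approximate the force the bladder exerts on the transducer as
    gauge pressure times the bladder's effective contact area (F = P * A)."""
    return gauge_pressure_pa * effective_area_m2

def regulator_step(current_pa, target_force_n, effective_area_m2, gain=0.5):
    """One proportional update of bladder pressure toward the pressure
    that would produce the commanded contact force."""
    target_pa = target_force_n / effective_area_m2
    return current_pa + gain * (target_pa - current_pa)
```

In this model the regulator converts a commanded contact force into a target gas pressure and moves toward it incrementally, mirroring the increase/decrease behavior attributed to the pressure regulator 138.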
[0028] It is well known that the quality of the ultrasound image varies in part as a function of the contact pressure between the ultrasound transducer 140 and the surface 102 of the subject 105. Therefore, in order to optimize image quality, the magnitude of the contact pressure being asserted by the flexible robotic interface 130 may be varied under control of the system controller 150 in response to image quality data provided by an image quality processor 147 via the system bus 151, based on the image data received from the ultrasound imaging system 145. For example, the quality of the ultrasound image may be determined by the image quality processor 147 using image processing techniques, such as segmentation and identification of anatomy, for example. More advanced image quality techniques may be employed, such as information-theoretic measures and ultrasound penetration methods, for example, as would be apparent to one skilled in the art. An image has good quality when the information content is relatively constant through the image. Conversely, poor coupling leads to reduced ultrasound imaging power and low penetration. Other methods would be apparent to one skilled in the art. Based on the image quality data, the system controller 150 may determine whether to increase, decrease or maintain the current contact pressure. The system controller 150 then sends control signals to the pressure regulator 138 to increase, decrease or maintain the contact pressure accordingly. For example, the image quality processor 147 may determine based on the image data that the lines defining an apparent outer perimeter of the target 108 are fuzzy. Based on this determination, the system controller 150 may send a command to the pressure regulator 138 to increase the pressure applied by the flexible robotic interface 130 by incremental amounts over set time periods, until the image data indicates appropriately defined lines of the outer perimeter.
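The incremental pressure adjustment described in this paragraph may be sketched as a simple feedback loop. The quality score (0 to 1), the threshold, the step size, and the safety limit below are hypothetical placeholders for whatever measure the image quality processor 147 actually provides:

```python
def adjust_pressure_for_quality(read_quality, set_pressure, p0,
                                p_max, step=500.0, threshold=0.8):
    """Increase bladder pressure in fixed increments until the image
    quality score returned by `read_quality` reaches `threshold`, or
    the safety limit `p_max` would be exceeded; returns final pressure."""
    p = p0
    set_pressure(p)
    while read_quality() < threshold and p + step <= p_max:
        p += step
        set_pressure(p)
    return p
```

The loop mirrors the example in the text: pressure is raised by set increments until the image data indicates well-defined target boundaries, with a hard ceiling standing in for a patient-safety limit.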
[0029] In an embodiment, the direction of the contact pressure applied by the flexible robotic interface 130 may also be varied in order to adjust the field of view 141 of the ultrasound transducer 140 to better image the target 108 and/or the interventional instrument 120. For example, uneven magnitudes of contact pressure may be applied to the surface 102 by way of the flexible robotic interface 130 applying uneven forces to opposite ends of the ultrasound transducer 140, thereby angling or tilting the ultrasound transducer 140 and thus the field of view 141 in a desired direction. Again, the direction (and the magnitude) of the contact pressure may be varied under control of the system controller 150 in response to the image data from the ultrasound imaging system 145 and the image quality processor 147. For example, the image data may be analyzed by the image quality processor 147 to determine whether the target 108 and/or the interventional instrument 120 are substantially centered within the field of view 141. For example, the presence or absence of the target 108 and/or the interventional instrument 120 in the ultrasound images likewise may be determined by shape recognition and segmentation techniques, as would be apparent to one skilled in the art. In response, the system controller 150 determines an amount and direction by which to shift the field of view 141, and what forces to be applied by the flexible robotic interface 130 to different portions of the ultrasound transducer 140 in order to obtain the determined amount and direction of the contact pressures.
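The uneven forces used to tilt the transducer and shift the field of view 141 can be modeled, for a two-chamber bladder, as a base pressure plus a differential term. The mapping, the clamping range, and the limit value below are illustrative assumptions only:

```python
def chamber_pressures(base_pa, tilt_command, max_delta_pa=500.0):
    """Split a base bladder pressure into two chamber pressures whose
    difference tilts the transducer; tilt_command in [-1, 1] maps to
    +/- max_delta_pa of differential pressure between the chambers."""
    delta = max(-1.0, min(1.0, tilt_command)) * max_delta_pa
    return base_pa + delta / 2.0, base_pa - delta / 2.0
```

A zero tilt command leaves both chambers at the base pressure (no tilt), while a nonzero command raises one chamber and lowers the other symmetrically, so the mean contact pressure is preserved while the field of view is angled.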
[0030] In addition to enabling the adjustment of magnitude and/or direction of the contact pressure, the flexible robotic interface 130 also enables determination of relative positions between the ultrasound transducer 140 and the instrument guide 116, which are used for locating and directing the interventional instrument 120. The relative positions may be determined using the sensors 131, 132, 133 and 134, which provide position data that enable determination of the shape of the flexible robotic interface 130 by the system controller 150. Since the ultrasound transducer 140 and the instrument guide 116 are on opposite ends of the flexible robotic interface 130, the shape of the flexible robotic interface 130 defines the positional relationship between the ultrasound transducer 140 and the instrument guide 116.
[0031] When the flexible robotic interface 130 includes the bladder 135, the sensors 131, 132, 133 and 134 may be discrete bending sensors or linear displacement sensors attached to walls of the bladder 135, where “attached to” means that the discrete bending sensors or linear displacement sensors are embedded in the walls of the bladder 135 or connected to inner and/or outer surfaces of the walls of the bladder 135. The bending sensors may be shape sensing optical fibers or piezoresistive materials, for example, although any compatible type of bending/shape sensing sensors may be incorporated without departing from the scope of the present teachings. The shape sensing optical fibers may include Fiber Bragg gratings (FBGs), for example, for measuring strain resulting from the bending action. The piezoresistive materials may include any conductive or semiconductor material in which electrical resistance changes in response to applied stress, such as silicon, polysilicon or silicon carbide, for example. The bending sensors thereby provide shape data as the position data to indicate the extent of bending of each of the sensors 131, 132, 133 and 134. When the sensors 131, 132, 133 and 134 are bending sensors, the sensor processor 136 is a shape sensing processor configured to determine the shapes of the sensors 131, 132, 133 and 134 based on the shape data, as is well known in the art. The determined shapes of the sensors 131, 132, 133 and 134 correlate to the shape of the bladder 135, and provide data indicating distances between the transducer mounting 144 and the instrument guide 116, respectively. These distances are used to determine the positional relationship between the ultrasound transducer 140 at a distal end of the bladder 135 and the instrument guide 116 at a proximal end of the bladder 135.
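One common way to turn a discrete bending-sensor reading into a position estimate is a constant-curvature segment model. The planar, single-segment sketch below is an illustrative simplification of such shape reconstruction, not the disclosed shape sensing method; the function name and frame convention are assumptions:

```python
import math

def arc_tip_pose(curvature, length):
    """Tip position (x, y) and heading angle of a constant-curvature
    segment of given arc length, starting at the origin and pointing
    along +y, as a simple model for one bending-sensor segment."""
    theta = curvature * length            # total bend angle of segment
    if abs(curvature) < 1e-9:             # straight-segment limit
        return 0.0, length, 0.0
    x = (1.0 - math.cos(theta)) / curvature
    y = math.sin(theta) / curvature
    return x, y, theta
```

Chaining such segments (each rotated and translated by the pose of the previous one) yields the overall shape of the bladder wall, and hence the distance and orientation between the transducer mounting 144 at the distal end and the instrument guide 116 at the proximal end.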
[0032] Alternatively, the sensors 131, 132, 133 and 134 may be position sensors attached to the walls of the bladder 135, meaning embedded in the walls or connected to inner and/or outer surfaces of the bladder 135. In an embodiment, the sensors 131, 132, 133 and 134 may be attached to the transducer mounting 144. The position sensors may be any sensor capable of providing a location in two-dimensional or three-dimensional space, such as electromagnetic (EM) sensors or optical sensors, for example, although any compatible type of position sensors may be incorporated without departing from the scope of the present teachings. In this case, the robot assisted ultrasound guidance system 100 would further include a corresponding position sensor tracking system, e.g., providing a magnetic field, as is well known in the art. Additional position sensors may be mounted to the instrument guide 116, such that the position sensor tracking system is able to determine the physical relationships between these position sensors and those attached to the bladder 135 and/or the transducer mounting 144. Alternatively, a camera may be mounted on the instrument guide 116, and the sensors 131, 132, 133 and 134 may be visual markers attached to the bladder 135 and/or the transducer mounting 144. Then, the positions of the markers are determined in camera images provided by the camera. The determined positions of the sensors 131, 132, 133 and 134 again correlate to the shape of the bladder 135, and provide data indicating distances between the transducer mounting 144 and the instrument guide 116. These distances are used to determine the positional relationship between the ultrasound transducer 140 and the instrument guide 116 at the distal and proximal ends of the bladder 135, respectively.
[0033] The system controller 150 includes a memory 152 that stores instructions and a processing unit 153 that executes the instructions. When executed, the instructions cause the system controller 150 (via the processing unit 153) to implement all or part of the process shown in FIGs. 3 and 4, for example. Similarly, each of the sensor processor 136 and the image quality processor 147 is associated with a memory (not shown) or portion of the memory 152 that stores instructions that, when executed, cause the sensor processor 136 and the image quality processor 147 to perform their respective functions, discussed herein. In various embodiments, the sensor processor 136 and the image quality processor 147 may be implemented separately or together as one or more processors, and/or the respective functionalities may be incorporated in whole or in part into the processing unit 153, without departing from the scope of the present teachings.
[0034] The processing unit 153, as well as each of the sensor processor 136 and the image quality processor 147, is representative of one or more processing devices, and is configured to execute software instructions to perform functions as described in the various embodiments herein. The processing unit 153, the sensor processor 136 and the image quality processor 147 may be implemented by one or more of field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), a general purpose computer, a central processing unit, a computer processor, a microprocessor, a microcontroller, a state machine, programmable logic device, or combinations thereof, using any combination of hardware, software, firmware, hard-wired logic circuits, or combinations thereof. It is understood that reference to any processor herein (e.g., computer processor and microprocessor) may include multiple processors, parallel processors, or both. Multiple processors may be included in, or coupled to, a single device or multiple devices.
[0035] The term “processor” as used herein encompasses an electronic component able to execute a program or machine executable instruction. References to a computing device comprising “a processor” should be interpreted to include more than one processor or processing core, as in a multi-core processor. A processor may also refer to a collection of processors within a single computer system or distributed among multiple computer systems. The term computing device should also be interpreted to include a collection or network of computing devices each including a processor or processors. Programs have software instructions performed by one or multiple processors that may be within the same computing device or which may be distributed across multiple computing devices.
[0036] The memory 152, as well as any memories associated with the sensor processor 136 and the image quality processor 147, may include a main memory and/or a static memory, where such memories may communicate with each other and the processing unit 153, the sensor processor 136 and the image quality processor 147, via one or more buses. The memory 152 and the other memories store instructions used to implement some or all aspects of methods and processes described herein. The memory 152 and the other memories may be implemented by any number, type and combination of random access memory (RAM) and read-only memory (ROM), for example, and may store various types of information, such as software algorithms, AI models including recurrent neural networks (RNNs) and other neural network based models, and computer programs, for example, all of which are executable by the processing unit 153, the sensor processor 136 and the image quality processor 147. The various types of ROM and RAM may include any number, type and combination of computer readable storage media, such as a disk drive, flash memory, an electrically programmable read-only memory (EPROM), an electrically erasable and programmable read only memory (EEPROM), registers, a hard disk, a removable disk, tape, compact disk read only memory (CD-ROM), digital versatile disk (DVD), floppy disk, Blu-ray disk, a universal serial bus (USB) drive, or any other form of storage medium known in the art. Each of the memory 152 and the other memories is a tangible storage medium for storing data and executable software instructions, and is non-transitory during the time software instructions are stored therein. As used herein, the term “non-transitory” is to be interpreted not as an eternal characteristic of a state, but as a characteristic of a state that will last for a period.
The term “non-transitory” specifically disavows fleeting characteristics such as characteristics of a carrier wave or signal or other forms that exist only transitorily in any place at any time. The memory 152 and the other memories may store software instructions and/or computer readable code that enables performance of various functions. The memory 152 and the other memories may be secure and/or encrypted, or unsecure and/or unencrypted.
[0037] The interface 156 may include one or more of ports, disk drives, wireless antennas, or other types of receiver circuitry. For example, the system controller 150 may retrieve or otherwise receive data and instructions via the interface 156 from a website, an email, a portable disk or other type of memory (not shown). The interface 156 may include one or more user interfaces, such as a mouse, a keyboard, a microphone, a video camera, a touchscreen display, or voice or gesture recognition captured by a microphone or video camera, for example. The display 157 may be a monitor such as a computer monitor, a television, a liquid crystal display (LCD), an organic light emitting diode (OLED) display, a flat panel display, a solid-state display, a cathode ray tube (CRT) display, or an electronic whiteboard, for example. The display 157 may also include one or more interface(s), such as the interface 156, in which case the display 157 may provide a graphical user interface (GUI) for displaying and receiving information to and from the user. The interface 156 and the display 157 are connected to one another and to the system controller 150 via the system bus 151.
[0038] FIG. 2A is a simplified schematic diagram of a bladder having multiple bladder chambers configured to provide no lateral displacement of the transducer, according to a representative embodiment, FIG. 2B is a simplified schematic diagram of the bladder having multiple bladder chambers configured to provide lateral displacement of the transducer, according to a representative embodiment, and FIG. 2C is a simplified schematic diagram of the bladder having multiple bladder chambers configured to provide lateral displacement of the instrument guide, according to a representative embodiment.
[0039] Referring to FIGs. 2A, 2B and 2C, a deformable bladder 235 has multiple bladder chambers, including a first bladder chamber 231, a second bladder chamber 232 and a third bladder chamber 233, that may be attached to walls of the bladder 235. As used herein, attached to the walls of the bladder 235 means embedded in the walls of the bladder 235 or connected to inner and/or outer surfaces of the walls of the bladder 235. The bladder 235 may be substituted for the bladder 135 as the flexible robotic interface 130 in FIG. 1, which is effectively a single chamber bladder. Accordingly, the bladder 235 is shown positioned between the instrument guide 116 and the ultrasound transducer 140, and is configured to adjust the contact pressure applied to the surface 102 of the subject 105 through the ultrasound transducer 140 to maintain proper acoustic coupling between the ultrasound transducer 140 and the surface 102, and to adjust the positions of the ultrasound transducer 140 and/or the instrument guide 116 relative to one another.
[0040] The bladder 235 also includes position sensors (not shown), such as the sensors 131, 132, 133 and 134, for providing shape data, for example, to enable determination of the position or shape of the bladder 235 and ultimately to determine the relative positions of the ultrasound transducer 140 and the instrument guide 116. For example, position data from position sensors (e.g., bending sensors) of the bladder 235 would indicate the distance d between the active surface of the ultrasound transducer 140 on the surface 102 and the exit position 117 of the interventional instrument 120 from the instrument guide 116. As discussed above, the instrument guide 116 is controlled to guide the interventional instrument 120 to the target 108 along the trajectory 125, at least partially in response to the position data from the bladder 235.
[0041] The gas pressure in each of the first, second and third bladder chambers 231, 232 and 233 may be separately controllable, e.g., through the pressure regulator 138, in order to apply different forces to different locations on the top side of the ultrasound transducer 140. In the depicted embodiment, the first bladder chamber 231 is substantially centered over the ultrasound transducer 140, so the first bladder chamber 231 may be used to adjust the contact pressure applied by the bladder 235 directly downward. The second bladder chamber 232 is positioned over the left edge of the ultrasound transducer 140 and the third bladder chamber 233 is positioned over the right edge of the ultrasound transducer 140, so that the second and third bladder chambers 232 and 233 may be used to angle or tilt the ultrasound transducer 140 to the left and right with respect to the surface 102, as shown in FIG. 2B, and/or to angle or tilt the instrument guide 116 to the left and right with respect to the surface 102, as shown in FIG. 2C. In addition, the first, second and third bladder chambers 231, 232 and 233 are used together to adjust the contact pressure applied by the bladder 235, as mentioned above. In alternative configurations, the second and third bladder chambers 232 and 233 may be positioned over different edges of the ultrasound transducer 140, such as the front and back edges, for example, which would enable angling or tilting the ultrasound transducer 140 and the instrument guide 116 in different corresponding planes.
[0042] For purposes of explanation, it is assumed that the first bladder chamber 231 has pressure P1, that the second bladder chamber 232 has pressure P2, and that the third bladder chamber 233 has pressure P3. In FIG. 2A, the pressure P2 equals the pressure P3, in which case the combined contact pressure applied by the bladder 235 is directed directly downward, as indicated by the arrow CP1. The first pressure P1 may be the same as or different from the pressures P2 and P3, and the contact pressure would continue to be directed directly downward.
[0043] In comparison, FIG. 2B shows a situation in which the pressure P2 is much greater than the pressure P3, in which case the combined contact pressure applied by the bladder 235 through the ultrasound transducer 140 is directed to the right (away from the second bladder chamber 232 applying the greater pressure), as indicated by arrow CP2. That is, the higher pressure P2 of the second bladder chamber 232 causes the bladder 235 to exert a greater force on the left end of the ultrasound transducer 140 than on the right end, rotating the ultrasound transducer 140 counterclockwise in the depicted orientation. This changes the angle of the field of view of the ultrasound transducer 140 (e.g., field of view 141). In the depicted example, rotating the ultrasound transducer 140 counterclockwise shifts the field of view to the right, enabling the ultrasound transducer 140 to capture images of the target 108 without necessarily having to adjust the position of the instrument guide 116. Of course, the ultrasound transducer 140 may be tilted a greater or lesser amount counterclockwise by increasing or decreasing the pressure P2 relative to the pressure P3, respectively. Likewise, the ultrasound transducer 140 may be tilted clockwise when the pressure P3 is greater than the pressure P2, and the amount by which the ultrasound transducer 140 is tilted may be altered by increasing or decreasing the pressure P3 relative to the pressure P2, respectively.
[0044] FIG. 2C similarly shows a situation in which the pressure P2 is much greater than the pressure P3, although in this case, the combined contact pressure applied by the bladder 235 continues to be directed directly downward, as indicated by arrow CP3, while the instrument guide 116 is angled or tilted clockwise in the depicted orientation. This changes the angle at which the interventional instrument 120 penetrates the surface 102 to adjust for varying depths of the target 108 below the surface 102. In the depicted example, rotating the instrument guide 116 clockwise reduces the depth below the surface 102 at which the interventional instrument 120 intersects the target 108. Of course, the instrument guide 116 may be tilted clockwise a greater or lesser amount by increasing or decreasing the pressure P2 relative to the pressure P3, respectively. Likewise, the instrument guide 116 may be tilted counterclockwise when the pressure P3 is greater than the pressure P2, and the amount by which the instrument guide 116 is tilted may be altered by increasing or decreasing the pressure P3 relative to the pressure P2, respectively.
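The differential-pressure tilt mechanism of FIGs. 2B and 2C can be illustrated with a short sketch. This is a hypothetical linear model, not part of the disclosure: the gain K_TILT, the pressure limits, and the function name are all assumed for illustration only.

```python
# Illustrative sketch only: a linear model relating the differential pressure
# (P2 - P3) between the side bladder chambers to the resulting tilt angle.
# K_TILT, P_MIN and P_MAX are hypothetical calibration values, not values
# taken from the disclosure.

K_TILT = 0.05              # degrees of tilt per kPa of (P2 - P3), assumed
P_MIN, P_MAX = 0.0, 80.0   # regulator's allowable chamber pressures (kPa), assumed

def chamber_pressures_for_tilt(p_center, tilt_deg):
    """Return (P1, P2, P3): the center chamber holds the nominal contact
    pressure, while the side chambers split the differential pressure
    needed to produce the requested tilt, clamped to the regulator range."""
    dp = tilt_deg / K_TILT                        # required P2 - P3
    p2 = min(max(p_center + dp / 2.0, P_MIN), P_MAX)
    p3 = min(max(p_center - dp / 2.0, P_MIN), P_MAX)
    return p_center, p2, p3

# A 1-degree tilt about a 40 kPa nominal contact pressure
p1, p2, p3 = chamber_pressures_for_tilt(40.0, 1.0)
```

With a zero tilt request the side pressures are equal, reproducing the purely downward contact pressure of FIG. 2A; a nonzero request reproduces the pressure asymmetry of FIGs. 2B and 2C.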
[0045] Although FIGs. 2A, 2B and 2C show three bladder chambers, it is understood that more or fewer bladder chambers may be attached to (included in or on) a bladder of the flexible robotic interface 130 in order to provide unique benefits for any particular situation or to meet application specific design requirements of various implementations, as would be apparent to one skilled in the art. In addition, for ease of illustration, FIGs. 2A, 2B and 2C show adjustments to the positions of the ultrasound transducer 140 and the instrument guide 116 in only two dimensions in response to application of the pressures P1, P2 and P3. It is further understood, however, that the pressures of the various bladder chambers may be applied to adjust the ultrasound transducer 140 in three dimensions, as would be apparent to one skilled in the art. For example, the bladder may further include bladder chambers attached to the front and back walls of the bladder, in addition to the bladder chambers attached to the left and right walls (e.g., the second and third bladder chambers 232 and 233). In such a configuration, the ultrasound transducer 140 and/or the instrument guide 116 may be tilted to the front and back, in addition to being tilted to the left and right, by adjusting the relative pressures of the bladder chambers attached to the front and back walls. Also, although FIGs. 2B and 2C show angling or tilting of either the ultrasound transducer 140 or the instrument guide 116, respectively, it is understood that both the ultrasound transducer 140 and the instrument guide 116 may be adjusted simultaneously to shift the field of view of the ultrasound transducer 140 and the penetration angle of the interventional instrument 120 by desired amounts. [0046] FIG. 3 is a simplified schematic diagram of a bladder having multiple shape sensors, according to a representative embodiment. FIG. 3 is used as a reference for computing a transformation between the ultrasound imaging and robot end-effector coordinate systems to determine the position of a flexible robotic interface, as discussed below.
[0047] Referring to FIG. 3, a deformable bladder 335 includes multiple position sensors, indicated by first bending sensor 331, second bending sensor 332 and third bending sensor 333. The first, second and third bending sensors 331, 332 and 333 may be shape sensing optical fibers or piezoresistive materials, for example, as discussed above with regard to the representative sensors 131, 132, 133 and 134. Of course, the first, second and third bending sensors 331, 332 and 333 may be other types of position sensors, as discussed above, without departing from the scope of the present teachings. The bladder 335 may be substituted for the bladder 135 as the flexible robotic interface 130 in FIG. 1. Accordingly, the bladder 335 is shown connected between the instrument guide 116 (robot end-effector) and the transducer mounting 144 for the ultrasound transducer 140. Notably, at least three bending sensors are needed to indicate the position of the ultrasound transducer in two dimensions, and at least six bending sensors are needed to indicate the position of the ultrasound transducer in three dimensions.
[0048] Attachment points Q1, Q2 and Q3 are the physical positions at which the first, second and third bending sensors 331, 332 and 333 attach to the instrument guide 116. The points Q1, Q2 and Q3 are in the robot end-effector coordinate system (ECS). Attachment points P1, P2 and P3 are the physical positions at which the first, second and third bending sensors 331, 332 and 333 attach to the transducer mounting 144. The points P1, P2 and P3 are in the ultrasound imaging coordinate system (ICS). A mathematical transformation mapping points from the ICS to the ECS may be indicated by F_ICS→ECS. In the two-dimensional case, as shown in FIG. 3, the transformation requires three parameters, which are provided by the first, second and third bending sensors 331, 332 and 333. In a three-dimensional case (not shown), the transformation would require six parameters and thus six bending sensors, as would be apparent to one skilled in the art.
[0049] In order to use the flexible robotic interface 130 for image guided interventions or image analysis in the ECS, the position of the ultrasound transducer 140 in the ICS is computed with respect to the instrument guide 116. The first, second and third bending sensors 331, 332 and 333 provide data that enable determination of the distances between the instrument guide 116 and the transducer mounting 144. In particular, the bending sensor 331 indicates a distance d1 between the points Q1 and P1, the bending sensor 332 indicates a distance d2 between the points Q2 and P2, and the bending sensor 333 indicates a distance d3 between the points Q3 and P3. Then, the goal is to compute the mathematical transformation between the ECS and the ICS, referred to as F_ICS→ECS. Since the instrument guide 116 and the transducer mounting 144 are rigid bodies, the positions of the points Q1, Q2 and Q3 are known in the ECS (Q1^ECS, Q2^ECS, Q3^ECS) and the positions of the points P1, P2 and P3 are known in the ICS (P1^ICS, P2^ICS, P3^ICS). Therefore, the respective positions of the points P1, P2 and P3 in the ECS may be expressed as shown in Equation (1):
Pi^ECS = F_ICS→ECS · Pi^ICS; where i = 1, 2, 3 (1)
[0050] Accordingly, the respective distances di between the points P1, P2 and P3 and the points Q1, Q2 and Q3 in the ECS may be expressed as shown in Equation (2): di = distance(Pi^ECS, Qi^ECS) = distance(F_ICS→ECS · Pi^ICS, Qi^ECS); i = 1, 2, 3 (2)
[0051] Since the distances di may be physically determined (e.g., measured), the only unknowns are the parameters of the transformation F_ICS→ECS, which can be solved for since there are three equations (i = 1, 2, 3). The distances di may be used to determine the positional relationship between the instrument guide 116 and the ultrasound transducer 140, mounted to the transducer mounting 144.
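Equations (1) and (2) can be illustrated with a short numerical sketch of the forward model, i.e., the distances the bending sensors should report for candidate transformation parameters. The attachment coordinates below are hypothetical, not taken from the disclosure.

```python
import math

def transform(theta, t, p):
    """Equation (1): apply the 2-D rigid transformation (rotation theta,
    translation t), mapping a point p expressed in the ICS into the ECS."""
    x, y = p
    tx, ty = t
    return (math.cos(theta) * x - math.sin(theta) * y + tx,
            math.sin(theta) * x + math.cos(theta) * y + ty)

def predicted_distances(theta, t, P_ics, Q_ecs):
    """Equation (2): the distance each bending sensor should report for
    candidate transformation parameters (theta, t)."""
    return [math.dist(transform(theta, t, p), q)
            for p, q in zip(P_ics, Q_ecs)]

# Hypothetical attachment points (millimeters), not from the disclosure:
P_ics = [(0.0, 0.0), (30.0, 0.0), (15.0, 10.0)]    # on the transducer mounting
Q_ecs = [(0.0, 60.0), (30.0, 60.0), (15.0, 70.0)]  # on the instrument guide

# With no rotation and a pure 40 mm offset, each sensor spans a 20 mm gap.
d = predicted_distances(0.0, (0.0, 40.0), P_ics, Q_ecs)
```

Recovering the three unknown parameters from the measured d1, d2 and d3 then amounts to driving the three residuals between predicted and measured distances to zero, a three-equation, three-unknown nonlinear system amenable to any standard least-squares iteration.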
[0052] As discussed above, there are a number of alternative techniques that can be used to compute the unknown transformation F_ICS→ECS. For example, position sensors such as EM sensors or optical sensors may be mounted to the instrument guide 116 and to the transducer mounting 144 and/or the bladder 335, and the corresponding tracking system may be employed to determine the physical relationships among the sets of position sensors. Alternatively, a camera may be mounted on the instrument guide 116 and visual markers may be attached to the transducer mounting 144 and/or the bladder 335. Then, the positions of the markers in the camera image are used to compute the position of the ICS in relation to the ECS. [0053] FIG. 4 is a simplified flow diagram showing a method of performing robotics assisted ultrasound guidance using a flexible robotic interface, according to a representative embodiment. The flexible robotic interface (e.g., flexible robotic interface 130) is connected between an ultrasound transducer (e.g., ultrasound transducer 140) and an instrument guide (e.g., instrument guide 116) of a robot, used to guide an interventional instrument (e.g., interventional instrument 120) toward a target (e.g., target 108) in a subject (e.g., subject 105). The method may be implemented by computer readable instructions or computer code, e.g., stored in the memory 152 and executable by the processing unit 153 of the system controller 150, discussed above.
[0054] Referring to FIG. 4, the method includes initially positioning the ultrasound transducer on a surface of the subject such that the target is substantially within a field of view of the ultrasound transducer in block S411. The ultrasound transducer may be initially positioned on the surface of the subject by hand or by the robot using the ultrasound images provided by the ultrasound transducer to initially locate the target. In this manner, the target is placed within or near the field of view of the ultrasound transducer prior to beginning insertion of the interventional instrument. Alternatively, the ultrasound transducer may be initially placed using another imaging modality showing the target, such as x-ray imaging, computerized tomography (CT) scan imaging or magnetic resonance imaging (MRI), for example.
[0055] Once the ultrasound transducer is in the initial position, a contact pressure is applied (or maintained) by a flexible robotic interface (e.g., flexible robotic interface 130) to a surface (e.g., surface 102) of the subject through the ultrasound transducer in block S412. As discussed above, the ultrasound transducer is connected to a distal end of the flexible robotic interface, while the instrument guide of the robot is connected to a proximate end of the flexible robotic interface, such that operation of the flexible robotic interface alters the position of the ultrasound transducer relative to the instrument guide. In an embodiment, the flexible robotic interface includes a deformable bladder configured to contain gas (e.g., air) under pressure, such that the contact pressure is applied by forcibly injecting the gas into the bladder (or one or more bladder chambers in the bladder) to cause the bladder to exert force on the ultrasound transducer in a direction toward the surface of the subject. The gas is injected into the bladder through a pressure regulator (e.g., pressure regulator 138) to obtain a desired contact pressure between the ultrasound transducer and the surface of the subject. [0056] In block S413, it is determined whether the quality of the ultrasound images of the target provided by the ultrasound transducer is acceptable (assuming that the target is within the field of view of the ultrasound transducer). The image quality is analyzed by an image processor (e.g., image quality processor 147) based on ultrasound image data provided by the ultrasound transducer and the ultrasound imaging system. For example, determining the image quality may include shape recognition and segmentation techniques, as would be apparent to one skilled in the art. The ultrasound images may be analyzed to determine whether features in the target, such as lines defining an apparent outer perimeter of the target, are fuzzy or clear. 
When the image quality is determined to be unacceptable (block S413: No), the magnitude of the contact pressure applied by the flexible robotic interface is adjusted through the pressure regulator in block S414. That is, the contact pressure is increased or decreased in order to bring the image of the target into sharp relief. For example, the contact pressure may be increased when low ultrasound coupling is detected and decreased when large skin deformation is detected, as would be apparent to one skilled in the art. The process then returns to block S413 to determine whether the quality of the ultrasound images provided by the ultrasound transducer is acceptable using the adjusted contact pressure. When the image quality is determined to be acceptable (block S413: Yes), the process proceeds to block S415.
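A minimal sketch of this image-quality feedback loop (blocks S413 and S414) might look as follows. The quality_of callback, the step size and the acceptance threshold are assumptions standing in for the image quality processor, not details from the disclosure.

```python
# Hypothetical sketch (not from the disclosure) of the S413/S414 loop:
# the contact pressure is nudged through the regulator until an
# image-quality score from the image quality processor is acceptable.

def regulate_pressure(pressure, quality_of, acceptable=0.8,
                      step=2.0, max_steps=25):
    """Hill-climb the contact pressure: probe a small increase and a small
    decrease, move in whichever direction improves the quality score
    (block S414), and return once the score is acceptable (block S413: Yes)."""
    for _ in range(max_steps):
        quality = quality_of(pressure)
        if quality >= acceptable:
            return pressure, quality
        up = quality_of(pressure + step)      # firmer acoustic coupling
        down = quality_of(pressure - step)    # less skin deformation
        pressure += step if up >= down else -step
    return pressure, quality_of(pressure)

# Toy quality model with a best-coupling pressure of 30 (assumed units)
quality_model = lambda p: 1.0 - abs(p - 30.0) / 30.0
final_p, final_q = regulate_pressure(10.0, quality_model)
```

This mirrors the loop in the flow diagram: too little pressure gives poor coupling and a low score, so the pressure climbs until the score clears the threshold.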
[0057] In block S415, a position of the ultrasound transducer is determined relative to the instrument guide, where the position of the ultrasound transducer also provides the desired position of the instrument guide. For example, position data may be received from position sensors at a sensor processor (e.g., sensor processor 136), which calculates the position of the ultrasound transducer relative to the instrument guide using the received position data. In an embodiment, the position sensors may be bending sensors (e.g., sensors 131, 132, 133 and 134) attached to the flexible robotic interface, such as shape sensing optical fibers or piezoresistive materials, that provide shape data indicating the shape of the flexible robotic interface. Generally, the more a bending sensor is bent, the more compact the flexible robotic interface, and thus the closer the ultrasound transducer is to the instrument guide; the less the bending sensor is bent, the more extended the flexible robotic interface, and thus the farther the ultrasound transducer is from the instrument guide. Multiple bending sensors may have different degrees of bending, indicating that the flexible robotic interface is not being uniformly deformed. This occurs when the ultrasound transducer is positioned at an angle (tilted) with respect to the instrument guide. Other types of position sensors may be incorporated, such as EM sensors, optical sensors or markers, attached to the flexible robotic interface, or attached to the instrument guide and/or the transducer (or transducer mounting), as discussed above.
[0058] In block S416, it is determined whether the relative position of the ultrasound transducer and the instrument guide is acceptable. For example, it may be determined pursuant to block S416 whether the target is substantially centered in the field of view of the ultrasound transducer and/or whether a projected trajectory of the interventional instrument at the angle currently set by the instrument guide intersects the position of the target in the subject, discussed below. When the relative position of the ultrasound transducer and the instrument guide is determined to be unacceptable (block S416: No), the angle (direction) of the contact pressure applied by the flexible robotic interface is adjusted through the pressure regulator in block S417. The process then returns to block S415 to again determine the position of the ultrasound transducer relative to the instrument guide based on the adjusted angle of the contact pressure applied by the flexible robotic interface. When the relative position of the ultrasound transducer and the instrument guide is determined to be acceptable (block S416: Yes), the process proceeds to block S418.
[0059] Generally, in block S417, the position of the ultrasound transducer relative to the instrument guide may be altered by changing the shape of the flexible robotic interface. For example, when the flexible robotic interface includes a deformable bladder, the shape of the flexible robotic interface may be changed by increasing or decreasing the amount of gas contained in the bladder, which increases or decreases separation between the ultrasound transducer and the instrument guide, respectively. Also, to the extent the bladder includes multiple bladder chambers (e.g., bladder chambers 231, 232 and 233) attached to walls of the bladder, the shape of the flexible robotic interface also may be changed by increasing or decreasing the amount of gas in each of the separate bladder chambers, thereby angling or tilting one or both of the ultrasound transducer and the instrument guide, as discussed above with reference to FIGs. 2B and 2C.
[0060] As mentioned above, the determination in block S416 as to whether the relative position of the ultrasound transducer and the instrument guide is acceptable may include determining whether the target is substantially centered in the field of view of the ultrasound transducer and/or determining whether the projected trajectory of the interventional instrument intersects the position of the target. In order to determine whether the target is substantially centered within the field of view, a current ultrasound image is provided by the ultrasound imaging system 145 and analyzed, for example, by the system controller 150 and/or the processing unit 153 using shape recognition techniques. When the target is not centered within the field of view, the position of the ultrasound transducer relative to the instrument guide may be altered incrementally until the target appears at or near the center of the ultrasound image. Similarly, to the extent the interventional instrument may have been partially or fully advanced into the subject at this point in the procedure, it may be determined whether the position of a distal end of the interventional instrument is at or near the center of the field of view using the ultrasound images, where the position of the ultrasound transducer relative to the instrument guide may then be altered to place the distal end at a desired location within the ultrasound image.
[0061] In order to determine whether the projected trajectory of the interventional instrument intersects the position of the target, a projected trajectory (e.g., trajectory 125) of the interventional instrument toward the target is determined using the determined position of the ultrasound transducer and the corresponding position of the instrument guide. As discussed above, the interventional instrument is slidably attached to the instrument guide at a predetermined angle, previously set for the interventional procedure. For example, the instrument guide may include at least one fixed channel at a corresponding at least one predetermined fixed angle, or may include a pivoting channel that may be set at a selected angle within a predetermined range of angles. Accordingly, the projected trajectory may be determined geometrically based on knowledge of the location of an exit position (e.g., exit position 117) of the interventional instrument from the instrument guide, the angle of the channel for the interventional instrument, and the location of the target in the subject. When the projected trajectory of the interventional instrument does not intersect the position of the target, the position of the ultrasound transducer relative to the instrument guide may be altered by changing the shape of the flexible robotic interface. For example, when the flexible robotic interface includes a deformable bladder, in order to get the projected trajectory to intersect the target closer to the surface of the subject, gas may be added to the bladder to further separate the instrument guide from the ultrasound transducer. Alternatively, when there are separate bladder chambers, gas may be added to one bladder chamber and removed from an opposite bladder chamber such that the instrument guide tilts toward the ultrasound transducer, thereby decreasing the intersection angle of the interventional instrument relative to the field of view of the ultrasound transducer.
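The geometric check described above can be sketched as follows, under assumed two-dimensional coordinates (lateral position x, depth z below the surface) and a hypothetical miss tolerance; none of these values or names come from the disclosure.

```python
import math

# Illustrative sketch only: does the instrument's projected trajectory,
# leaving the guide at the exit position with a fixed channel angle,
# pass within a tolerance of the target? Coordinates are (x, z) with
# z the depth below the surface; the angle is measured from vertical.

def trajectory_hits_target(exit_pos, angle_deg, target, tol=2.0):
    """Return True if the projected trajectory reaches the target's depth
    within a lateral miss distance of tol (assumed units, e.g., mm)."""
    ex, ez = exit_pos
    tx, tz = target
    # Unit direction of travel: downward, tilted by the channel angle
    dx = math.sin(math.radians(angle_deg))
    dz = math.cos(math.radians(angle_deg))
    s = (tz - ez) / dz          # ray parameter at the target's depth
    if s < 0:
        return False            # target lies behind the exit point
    miss = abs(ex + s * dx - tx)  # lateral miss distance at that depth
    return miss <= tol
```

When the check fails, the relative position of the transducer and the guide would be adjusted through the flexible robotic interface, as described above, and the check repeated.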
[0062] In block S418, the interventional instrument is advanced through the instrument guide along the projected trajectory a remaining distance (if any) toward the target until it intersects (e.g., pierces or otherwise enters) the target. The interventional instrument may be advanced through the instrument guide manually by a user or automatically by the robot itself.
[0063] In accordance with various embodiments of the present disclosure, the methods described herein may be implemented using a hardware computer system that executes software programs. Further, in an exemplary, non-limited embodiment, implementations can include distributed processing, component/object distributed processing, and parallel processing. Virtual computer system processing may implement one or more of the methods or functionalities as described herein, and a processor described herein may be used to support a virtual processing environment.
[0064] Although a flexible robotic interface for performing robotics assisted ultrasound guided interventions has been described with reference to several exemplary embodiments, it is understood that the words that have been used are words of description and illustration, rather than words of limitation. Changes may be made within the purview of the appended claims, as presently stated and as amended, without departing from the scope and spirit of the present teachings in their aspects. Although the flexible robotic interface has been described with reference to particular means, materials and embodiments, it is not intended to be limited to the particulars disclosed; rather, it extends to all functionally equivalent structures, methods, and uses such as are within the scope of the appended claims. [0065] The illustrations of the embodiments described herein are intended to provide a general understanding of the structure of the various embodiments. The illustrations are not intended to serve as a complete description of all of the elements and features of the disclosure described herein. Many other embodiments may be apparent to those of skill in the art upon reviewing the disclosure. Other embodiments may be utilized and derived from the disclosure, such that structural and logical substitutions and changes may be made without departing from the scope of the disclosure. Additionally, the illustrations are merely representational and may not be drawn to scale. Certain proportions within the illustrations may be exaggerated, while other proportions may be minimized. Accordingly, the disclosure and the figures are to be regarded as illustrative rather than restrictive.
[0066] One or more embodiments of the disclosure may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any particular invention or inventive concept. Moreover, although specific embodiments have been illustrated and described herein, it should be appreciated that any subsequent arrangement designed to achieve the same or similar purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all subsequent adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the description.
[0067] The Abstract of the Disclosure is provided to comply with 37 C.F.R. §1.72(b) and is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, various features may be grouped together or described in a single embodiment for the purpose of streamlining the disclosure. This disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter may be directed to less than all of the features of any of the disclosed embodiments. Thus, the following claims are incorporated into the Detailed Description, with each claim standing on its own as defining separately claimed subject matter.
[0068] The preceding description of the disclosed embodiments is provided to enable any person skilled in the art to practice the concepts described in the present disclosure. As such, the above disclosed subject matter is to be considered illustrative, and not restrictive, and the appended claims are intended to cover all such modifications, enhancements, and other embodiments which fall within the true spirit and scope of the present disclosure. Thus, to the maximum extent allowed by law, the scope of the present disclosure is to be determined by the broadest permissible interpretation of the following claims and their equivalents and shall not be restricted or limited by the foregoing detailed description.

Claims

CLAIMS:
1. A system for performing robotics assisted ultrasound guidance, the system comprising: an instrument guide maneuverable by a robot, and configured to guide an interventional instrument toward a target within a subject; a flexible robotic interface attached to the instrument guide, and configured to apply a contact pressure to a surface of the subject; an ultrasound transducer attached to the flexible robotic interface, such that the ultrasound transducer is pressed against the surface of the subject in close proximity to the instrument guide by the contact pressure applied by the flexible robotic interface, the ultrasound transducer being configured to provide ultrasound images of at least one of the target or the interventional instrument when guided by the instrument guide toward the target, wherein the flexible robotic interface enables automatic manipulation of the ultrasound transducer in order to maintain acoustic coupling between the ultrasound transducer and the surface of the subject and to adjust a position of the ultrasound transducer relative to the instrument guide; at least one sensor attached to the flexible robotic interface, and configured to provide sensor data indicating the position of the ultrasound transducer; and a controller programmed to receive image data from ultrasound images from the ultrasound transducer and the sensor data from the at least one sensor, and to control the flexible robotic interface to adjust the contact pressure applied to the surface of the subject to optimize an image quality of the ultrasound images in response to the image data and/or the sensor data.
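The controller of claim 1 fuses ultrasound image data with bending-sensor data to regulate contact pressure. A minimal sketch of such a control update is given below; the class name, thresholds, proportional gain, and the 15 mm offset bound are illustrative assumptions, not values recited in the claims.

```python
from dataclasses import dataclass

@dataclass
class SensorData:
    transducer_offset_mm: float  # transducer position relative to the instrument guide

def pressure_command(image_quality: float, sensors: SensorData,
                     current_kpa: float, target_quality: float = 0.8,
                     gain: float = 5.0, max_kpa: float = 20.0) -> float:
    """Proportional pressure update: raise pressure while image quality is
    below target (poor acoustic coupling), capped to limit compression."""
    if abs(sensors.transducer_offset_mm) > 15.0:
        return current_kpa  # reposition the transducer before changing pressure
    error = target_quality - image_quality
    return min(max(current_kpa + gain * error, 0.0), max_kpa)
```

A real controller would of course derive the image-quality score from the ultrasound data itself (e.g. contrast or coupling metrics), which the claim leaves open.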
2. The system of claim 1, wherein the controller is further programmed to control the robot to maneuver the interventional instrument so that a projected trajectory of the interventional instrument aligns with the target shown in the ultrasound images in response to the image data and the sensor data.
3. The system of claim 2, further comprising a display, wherein the controller is further programmed to cause display of the ultrasound images, including the interventional instrument and the target, together with the projected trajectory of the interventional instrument.
4. The system of claim 1, wherein the interventional instrument comprises a needle.
5. The system of claim 1, wherein the flexible robotic interface comprises a deformable bladder.
6. The system of claim 5, further comprising: a pressure regulator configured to regulate an internal pressure of the deformable bladder, wherein the controller is further programmed to adjust the internal pressure of the deformable bladder using pressure signals provided to the pressure regulator in order to adjust a magnitude of the contact pressure applied to the surface of the subject.
7. The system of claim 6, wherein the at least one sensor is embedded in at least one wall of the deformable bladder.
8. The system of claim 6, wherein the at least one sensor is connected to at least one of an inner surface or an outer surface of at least one wall of the deformable bladder.
9. The system of claim 2, wherein the flexible robotic interface comprises a deformable bladder having a plurality of bladder chambers attached to walls of the deformable bladder.
10. The system of claim 9, further comprising: a pressure regulator configured to regulate internal pressures of the plurality of bladder chambers, respectively, wherein the controller is further programmed to separately adjust the internal pressures of the plurality of bladder chambers using pressure signals provided to the pressure regulator in order to adjust a magnitude and a direction of the contact pressure applied to the surface of the subject.
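Separately pressurizing several bladder chambers, as in claim 10, lets the controller set both the magnitude and the direction of the contact force. One way to sketch this is a least-squares split of a desired force vector over the chambers; the three-chamber layout, direction vectors, and effective piston area below are hypothetical placeholders.

```python
import numpy as np

# Unit vectors along which each chamber pushes (hypothetical 3-chamber layout).
CHAMBER_DIRECTIONS = np.array([
    [0.0,    0.5,   0.866],
    [0.433, -0.25,  0.866],
    [-0.433, -0.25, 0.866],
])
CHAMBER_AREA_M2 = 4e-4  # assumed effective piston area per chamber

def chamber_pressures(desired_force_n: np.ndarray) -> np.ndarray:
    """Least-squares split of a desired contact-force vector (N) into
    per-chamber pressures (Pa), clipped to non-negative values."""
    forces, *_ = np.linalg.lstsq(CHAMBER_DIRECTIONS.T, desired_force_n, rcond=None)
    return np.clip(forces / CHAMBER_AREA_M2, 0.0, None)
```

For a purely normal force the symmetric layout yields equal pressures in all chambers; an off-axis force tilts the distribution, steering the transducer as the claim describes.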
11. The system of claim 1, wherein the at least one sensor comprises a plurality of bending sensors attached to the flexible robotic interface.
12. The system of claim 11, wherein the plurality of bending sensors comprise optical fibers or piezoresistive materials.
13. The system of claim 12, wherein the plurality of bending sensors comprise two bending sensors for indicating the position of the ultrasound transducer in two dimensions, and wherein the plurality of bending sensors comprise six bending sensors for indicating the position of the ultrasound transducer in three dimensions.
14. The system of claim 1, wherein the at least one sensor comprises a position sensor attached externally to the flexible robotic interface.
15. The system of claim 1, wherein the at least one sensor comprises a resistive position sensor, an electromagnetic (EM) position sensor or an optical position sensor.
16. A method of performing robotics assisted ultrasound guidance of an interventional instrument using an instrument guide maneuverable by a robot to guide the interventional instrument toward a target within a subject, the method comprising: applying a contact pressure to a surface of the subject through an ultrasound transducer using a flexible robotic interface, the contact pressure providing acoustic coupling between the ultrasound transducer and the surface of the subject; determining a position of the ultrasound transducer relative to the instrument guide by determining a shape of the flexible robotic interface, the flexible robotic interface being connected between the instrument guide and a transducer mounting to which the ultrasound transducer is connected; determining a projected trajectory of the interventional instrument toward the target from the instrument guide; when the projected trajectory of the interventional instrument does not intersect the target, changing the position of the ultrasound transducer, determining another position of the ultrasound transducer relative to the instrument guide, and determining another projected trajectory of the interventional instrument from the instrument guide; and when the projected trajectory or the another projected trajectory of the interventional instrument intersects the target, advancing the interventional instrument through the instrument guide along the projected trajectory.
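The method of claim 16 is essentially an iterate-until-aligned loop: reposition the transducer, re-derive the projected trajectory, and advance only once it intersects the target. A simplified 2-D sketch follows; the straight-line trajectory model, tolerance, and `reposition` callback are illustrative stand-ins for the claimed steps.

```python
import math

def trajectory_intersects_target(guide_pos, guide_dir, target, tol=1.0):
    """True when the target lies within `tol` of the straight line through
    the instrument guide along its (unit) pointing direction."""
    dx, dy = target[0] - guide_pos[0], target[1] - guide_pos[1]
    dist = abs(dx * guide_dir[1] - dy * guide_dir[0])  # perpendicular distance
    return dist <= tol

def align_and_advance(guide_pos, guide_dir, target, reposition, max_steps=20):
    """Claim 16 loop: change the transducer/guide pose until the projected
    trajectory intersects the target, then advance the instrument."""
    for _ in range(max_steps):
        if trajectory_intersects_target(guide_pos, guide_dir, target):
            return "advance"  # advance along the projected trajectory
        guide_pos, guide_dir = reposition(guide_pos, guide_dir, target)
    return "abort"
```

In the claimed system the pose update would come from the flexible robotic interface and the trajectory from the tracked instrument guide, rather than from a caller-supplied function.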
17. The method of claim 16, wherein determining the position of the ultrasound transducer relative to the instrument guide by determining the shape of the flexible robotic interface comprises: receiving shape data from sensors attached to the flexible robotic interface; and determining at least one distance between the instrument guide and the transducer mounting using the received shape data.
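Claim 17 recovers the transducer position from shape data. A common way to turn bending-sensor readings into a position, sketched below under a piecewise-constant-curvature assumption, is to chain short rigid segments whose headings accumulate the measured bend angles; the segment length is an illustrative choice.

```python
import math

def mounting_offset(bend_angles_rad, segment_len=0.01):
    """Chain unit segments whose headings accumulate the measured bends,
    returning the (x, y) offset of the transducer mounting relative to the
    instrument guide and the straight-line distance between them."""
    x = y = heading = 0.0
    for angle in bend_angles_rad:
        heading += angle
        x += segment_len * math.cos(heading)
        y += segment_len * math.sin(heading)
    return (x, y), math.hypot(x, y)
```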
18. The method of claim 16, further comprising: determining a quality of an image of the target using corresponding image data provided by the ultrasound transducer; and adjusting the contact pressure applied to the surface of the subject in response to the determined quality of the image in order to optimize imaging of the target.
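Claim 18 adjusts contact pressure in response to a measured image quality. One simple realization is a hill-climbing search over pressure, sketched below; the quality callback, step schedule, and iteration count are hypothetical, as the claim leaves the optimization strategy open.

```python
def optimize_pressure(pressure, quality_of, step=1.0, iters=30):
    """Nudge the contact pressure up or down, keeping each change only if
    the ultrasound image-quality score improves; the step shrinks as the
    search settles around the best-coupling pressure."""
    best = quality_of(pressure)
    for _ in range(iters):
        candidates = (pressure + step, pressure - step)
        for candidate in candidates:
            q = quality_of(candidate)
            if q > best:
                pressure, best = candidate, q
        step *= 0.9
    return pressure
```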
19. A flexible robotic interface comprising: a deformable bladder inflatable with gas, and having a proximal end connected to an instrument guide maneuverable by a robot for guiding an interventional instrument toward a target within a subject, and a distal end connected to an ultrasound transducer configured to press against a surface of the subject by a contact pressure applied by the deformable bladder and to provide ultrasound images of at least one of the target or the interventional instrument when guided by the instrument guide toward the target; and a plurality of bending sensors attached to walls of the deformable bladder, and configured to provide sensor data indicating a shape of the deformable bladder, the shape of the deformable bladder defining a positional relationship between the ultrasound transducer and the instrument guide,
wherein a gas pressure of the gas in the deformable bladder is adjustable in response to the sensor data in order to adjust at least one of the contact pressure applied to the surface of the subject through the ultrasound transducer for optimizing image quality of the ultrasound images or the positional relationship between the ultrasound transducer and the instrument guide for aligning a projected trajectory of the interventional instrument with the target.
20. The flexible robotic interface of claim 19, wherein the plurality of bending sensors comprise shape sensing optical fibers or piezoresistive materials.
PCT/EP2021/073563 2020-09-03 2021-08-26 A flexible robotic interface for performing robotics assisted ultrasound guided interventions WO2022048979A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063073998P 2020-09-03 2020-09-03
US63/073,998 2020-09-03

Publications (1)

Publication Number Publication Date
WO2022048979A1 true WO2022048979A1 (en) 2022-03-10

Family

ID=77774872

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/073563 WO2022048979A1 (en) 2020-09-03 2021-08-26 A flexible robotic interface for performing robotics assisted ultrasound guided interventions

Country Status (1)

Country Link
WO (1) WO2022048979A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015099849A1 (en) * 2013-12-23 2015-07-02 U-Systems, Inc. Medical ultrasound scanning with control over pressure/force exerted by an ultrasound probe and/or a compression/scanning assembly
US20160324585A1 (en) * 2014-01-24 2016-11-10 Koninklijke Philips N.V. Robotic control of imaging devices with optical shape sensing
US20180168544A1 (en) * 2015-06-30 2018-06-21 Koninklijke Philips N.V. Methods, apparatuses, and systems for coupling a flexible transducer to a surface


Similar Documents

Publication Publication Date Title
US10368850B2 (en) System and method for real-time ultrasound guided prostate needle biopsies using a compliant robotic arm
Wei et al. Robot‐assisted 3D‐TRUS guided prostate brachytherapy: System integration and validation
Abayazid et al. Integrating deflection models and image feedback for real-time flexible needle steering
US20160374644A1 (en) Ultrasonic Guidance of a Probe with Respect to Anatomical Features
US7085400B1 (en) System and method for image based sensor calibration
CA2574675C (en) Calibrating imaging devices
US6311540B1 (en) Calibration method and apparatus for calibrating position sensors on scanning transducers
CA2565520C (en) Targets and methods for ultrasound catheter calibration
EP3801278B1 (en) Ultrasound probe positioning system
US11338445B2 (en) End effector force sensor and manual actuation assistance
US20110153254A1 (en) System And Method For Calibration For Image-Guided Surgery
CN110868937B (en) Integration with robotic instrument guide of acoustic probe
US20100172559A1 (en) System and method for prostate biopsy
EP2945560B1 (en) Method of adjusting focal zone in ultrasound-guided medical procedure and system employing the method
US11096745B2 (en) System and workflow for grid-less transperineal prostate interventions
WO2022166182A1 (en) Two-dimensional image guided intramedullary needle distal locking robot system and locking method therefor
US20140343425A1 (en) Enhanced ultrasound imaging interpretation and navigation
WO2022048979A1 (en) A flexible robotic interface for performing robotics assisted ultrasound guided interventions
JP7309850B2 (en) Ultrasound system and method for guided shear wave elastography of anisotropic tissue
WO2024039608A1 (en) Spatially aware medical device configured for performance of insertion pathway approximation
US20130079636A1 (en) Ultrasound diagnostic apparatus and method thereof
CN216135922U (en) Dynamically adjusting an ultrasound imaging system
US20230240757A1 (en) System for guiding interventional instrument to internal target
Cheng et al. Active phantoms: a paradigm for ultrasound calibration using phantom feedback
Lagomarsino et al. Image-guided breast biopsy of MRI-visible lesions with a hand-mounted motorised needle steering tool

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21770148

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21770148

Country of ref document: EP

Kind code of ref document: A1