WO2017167754A1 - Image guided robot for catheter placement - Google Patents


Info

Publication number
WO2017167754A1
WO2017167754A1 (PCT/EP2017/057316)
Authority
WO
WIPO (PCT)
Prior art keywords
control system
recited
steerable device
steerable
robotically controlled
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2017/057316
Other languages
English (en)
French (fr)
Inventor
Aleksandra Popovic
David Paul NOONAN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV filed Critical Koninklijke Philips NV
Priority to US16/086,805 priority Critical patent/US20190105112A1/en
Priority to CN201780022136.4A priority patent/CN108882967A/zh
Priority to EP17714419.3A priority patent/EP3435904A1/en
Priority to CN202511144129.4A priority patent/CN120732543A/zh
Priority to JP2018551379A priority patent/JP7232051B2/ja
Publication of WO2017167754A1 publication Critical patent/WO2017167754A1/en
Anticipated expiration legal-status Critical
Priority to US19/019,675 priority patent/US20250152267A1/en
Ceased legal-status Critical Current


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30: Surgical robots
    • A61B2034/301: Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
    • A61B2034/302: Surgical robots specifically adapted for manipulations within body cavities, e.g. within abdominal or thoracic cavities
    • A61B2034/304: Surgical robots including a freely orientable platform, e.g. so-called "Stewart platforms"
    • A61B34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046: Tracking techniques
    • A61B2034/2051: Electromagnetic tracking systems
    • A61B2034/2065: Tracking using image or pattern recognition
    • A61B90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37: Surgical systems with images on a monitor during operation
    • A61B2090/364: Correlation of different images or relation of image positions in respect to the body
    • A61B2090/373: Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners
    • A61B2090/376: Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B90/39: Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3966: Radiopaque markers visible in an X-ray image
    • A61M: DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M25/00: Catheters; Hollow probes
    • A61M25/01: Introducing, guiding, advancing, emplacing or holding catheters
    • A61M25/0105: Steering means as part of the catheter or advancing means; Markers for positioning
    • A61M25/0116: Steering means as part of the catheter or advancing means; Markers for positioning self-propelled, e.g. autonomous robots

Definitions

  • This disclosure relates to medical instruments, and more particularly to systems and methods for robotically steering a device using controlled joints in medical applications.
  • Balloon sinuplasty is a procedure in which a balloon catheter is inserted into a blocked sinus to relieve patients from symptoms of a sinus infection.
  • a guide catheter is inserted through the nose into the sinus.
  • the guide catheter can have curved tips to facilitate entry into an appropriate sinus.
  • a guidewire is placed inside the catheter, and the guide catheter is retracted once the guidewire is in the right place.
  • a balloon catheter is placed over the guidewire, and a balloon is inflated to open up air passageways. This procedure is done under the guidance of a flexible endoscope and X-rays. The X-rays are typically employed to verify that the guidewire is placed into an appropriate sinus opening.
  • the anatomy of sinuses is very complex and can include multiple sharp turns to reach a sinus cavity from the nose.
  • finding an appropriate location for deploying the balloon is needed for the success of the therapy.
  • the navigation is further hindered by the issues described below.
  • control of the guide catheter is complex.
  • a surgeon needs to choose an appropriate angle for the curved tip, which is determined from a patient's computed tomography (CT) scan.
  • the guide catheter is then pivoted and rotated to position the curve at the sinus entry point.
  • the procedure is performed under image guidance, which may include a fiber optic endoscope inserted through the guide catheter and/or a C-arm X-ray system taking two dimensional images of the anatomy and the device.
  • the X-ray guidance can be challenging since the 2D images cannot capture complex 3D anatomy.
  • the endoscope guidance can show the sinus opening only if it is in front of the catheter.
  • a robot in accordance with the present principles, includes a steerable device having one or more robotically controlled joints configured to steer the steerable device.
  • a device control system is configured to adjust positioning of the steerable device in accordance with one of image feedback from an image control system or a plan in a volume such that control commands are issued to the one or more robotically controlled joints to steer the steerable device in a direction consistent with navigation of the steerable device toward a target.
  • a guidance system includes a steerable device having an adjustable tip portion, the tip portion being coupled to a robotically controlled joint.
  • An image control system is configured to combine intraoperative images with preoperative images to evaluate a position of the steerable device within a volume.
  • a device control system is configured to receive position information from the image control system and to evaluate positioning of the steerable device in the volume using a kinematic model. The device control system issues control commands to the robotically controlled joint to steer the steerable device in a direction consistent with navigation of the steerable device toward a target.
  • a guidance method includes inserting a steerable device having an adjustable robotically controlled joint configured to be steered into a volume; providing position or image feedback of the steerable device within the volume; and automatically navigating the steerable device toward a target in accordance with a plan using a device control system configured to receive the feedback, to evaluate positioning of the steerable device in the volume and to issue control commands to the robotically controlled joint to steer the steerable device.
  • FIG. 1 is a block/flow diagram showing a guidance system which employs a steerable device having a robotically controlled joint to form a steerable tip portion on a medical device in accordance with one embodiment
  • FIG. 2 is a flow diagram showing methods for guiding a steerable device (e.g., robot controlled) in accordance with illustrative embodiments
  • FIG. 3 is a diagram showing an illustrative joint with three degrees of rotational freedom and translation in accordance with one embodiment
  • FIG. 4A is a diagram showing a steerable device approaching a branching structure in accordance with one embodiment
  • FIG. 4B is a diagram showing the steerable device of FIG. 4A after being adjusted to select a desired pathway in accordance with the one embodiment.
  • FIG. 5 is a block/flow diagram showing a robot which employs a steerable device and a device control system in accordance with another embodiment.
  • a steerable device which may include an actuated robotically controlled joint that is guided using an image guidance system to place a guidewire in a sinus or other complex cavity or lumen network.
  • the steerable device may include one or more joints and may be referred to as a robot.
  • the joints are configured to alter the shape of the steerable device to guide the device into a correct passageway.
  • a guidewire can be placed through the lumen of the steerable device.
  • the image control system performs integration of preoperative and intraoperative images and determines, from the images, the location in an anatomy where a steerable tip has to be guided and an angle of steering.
  • the present invention will be described in terms of medical instruments; however, the teachings of the present invention are much broader and are applicable to any steerable instruments for use in any portions of the body.
  • the present principles are employed in tracking or analyzing complex biological or mechanical systems.
  • the present principles are applicable to internal tracking and operating procedures of biological systems and procedures in all areas of the body such as the lungs, brain, heart, gastro-intestinal tract, excretory organs, blood vessels, etc.
  • the elements depicted in the FIGS. may be implemented in various combinations of hardware and software.
  • processors can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software.
  • the functions can be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which can be shared.
  • explicit use of the term "processor" or "controller" should not be construed to refer exclusively to hardware capable of executing software, and can implicitly include, without limitation, digital signal processor ("DSP") hardware, read-only memory ("ROM") for storing software, random access memory ("RAM"), non-volatile storage, etc.
  • embodiments of the present invention can take the form of a computer program product accessible from a computer-usable or computer-readable storage medium providing program code for use by or in connection with a computer or any instruction execution system.
  • a computer-usable or computer readable storage medium can be any apparatus that may include, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium.
  • Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk.
  • Current examples of optical disks include compact disk - read only memory (CD-ROM), compact disk - read/write (CD-R/W), Blu-Ray™ and DVD.
  • such phrasing is intended to encompass the selection of the first listed option (A) only, or the selection of the second listed option (B) only, or the selection of the third listed option (C) only, or the selection of the first and the second listed options (A and B) only, or the selection of the first and third listed options (A and C) only, or the selection of the second and third listed options (B and C) only, or the selection of all three options (A and B and C).
  • This may be extended, as readily apparent by one of ordinary skill in this and related arts, for as many items listed.
  • System 100 may include a workstation or console 112 from which a procedure is supervised and/or managed.
  • Workstation 112 preferably includes one or more processors 114 and memory 116 for storing programs and applications.
  • Memory 116 may store a device control system 130 configured to control movement and programming of an actuated robot joint or joints 108 and other possible robotically controlled features in accordance with user input and/or feedback provided from one or more inputs.
  • the system 100 includes a steerable device 102 and an image guidance or control system 106 to permit placement of a guidewire in a complex or branching network of tubes or cavities, e.g., sinus cavities, etc.
  • the actuated device 102 may include one or more joints 108.
  • the joints 108 are configured to steer a tip of the steerable device 102.
  • the image control system or image guidance system 106 performs integration of preoperative images 142 and intraoperative images 144 and determines, from the images (142, 144), the location in anatomy where a steerable tip 124 of the device 102 (e.g., a catheter or catheter-like device) has to be steered and an angle of steering.
  • the steerable device 102 may be fixed in space at a proximal end (for example using a medical positioning arm).
  • a coordinate frame for each joint 108 can be defined in a coordinate system at the proximal end (fixed coordinate system). Since the position of each motor (not shown) for each joint 108 is known from motor encoders, the position and three angles of orientation of each joint 108 are known in the fixed coordinate system as well.
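The bullet above implies a forward-kinematics computation: with each motor angle known from its encoder, the pose of every joint 108 can be accumulated outward from the fixed proximal frame. A minimal planar sketch (the function name and the two-link example are illustrative assumptions, not from the patent):

```python
import math

def forward_kinematics(joint_angles, link_lengths):
    """Accumulate joint frames of a planar serial chain in the fixed
    (proximal) coordinate system, given encoder-derived joint angles
    and the lengths of the rigid segments between joints.
    Returns one (x, y, heading) tuple per joint frame."""
    x = y = heading = 0.0
    frames = []
    for angle, length in zip(joint_angles, link_lengths):
        heading += angle                  # bend contributed by this joint
        x += length * math.cos(heading)   # advance along the rigid segment
        y += length * math.sin(heading)
        frames.append((x, y, heading))
    return frames

# A straight device lies along the x-axis; a 90-degree bend at the first
# joint points the remaining segments along +y.
straight = forward_kinematics([0.0, 0.0], [10.0, 10.0])
bent = forward_kinematics([math.pi / 2, 0.0], [10.0, 10.0])
```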
  • each rigid segment can be detected using image processing methods known in art, such as thresholding segmentation and shape fitting.
  • a radiopaque marker can be attached to each joint 108.
  • the joints 108 may be ordered in a simple tree where a parent and a child of a node are direct neighbors of any given joint.
  • the registration process assumes m points in 2D X-ray space and m points in 3D robot space (fixed coordinate system). The registration process also assumes that the focal length of the X-ray system is known.
  • the pose of an X-ray detector of system 111 in the coordinate frame of the device 102 can thus be detected using any method known in art, such as iterative closest point, RANSAC (Random sample consensus) based iterative methods, etc.
  • a solution with the best residual error can be shown to the user as the position of X-ray system 111 with respect to the device 102.
  • the user can select the right solution by observing rendering of both solutions or answering a simple question (e.g., "Is image detector above or below the patient?").
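The residual-based selection between candidate poses can be sketched as: project the known 3D joint positions under each candidate pose and compare against the observed 2D marker positions. A toy pinhole-projection sketch (the point sets, focal length, and the two candidate poses are invented for illustration; a real system would obtain them from pose estimation):

```python
import math

def project(point3d, focal_length):
    """Pinhole projection of a detector-frame 3D point onto the 2D image."""
    x, y, z = point3d
    return (focal_length * x / z, focal_length * y / z)

def mean_residual(points3d, points2d, pose, focal_length):
    """Mean 2D distance between observed marker positions and the
    projections of the 3D joint positions under a candidate pose."""
    total = 0.0
    for p3, p2 in zip(points3d, points2d):
        u, v = project(pose(p3), focal_length)
        total += math.hypot(u - p2[0], v - p2[1])
    return total / len(points3d)

def pose_a(p):  # candidate solution A (e.g., detector above the patient)
    return (p[0], p[1], p[2] + 100.0)

def pose_b(p):  # candidate solution B (the ambiguous alternative)
    return (p[0], p[1], p[2] + 120.0)

points3d = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (0.0, 10.0, 0.0)]
focal = 1000.0
points2d = [project(pose_a(p), focal) for p in points3d]  # observed markers
# The candidate with the best residual error is shown to the user.
best = min((pose_a, pose_b),
           key=lambda t: mean_residual(points3d, points2d, t, focal))
```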
  • Other registration methods may also be employed to register intraoperative images 144 and preoperative images 142 and the steerable device 102.
  • the system 100 employs the steerable device 102 with the steerable tip 124 inside a passageway or anatomical lumen (e.g., sinus passage).
  • the device 102 further includes an insertion stage 128 that translates the device 102 along a main axis inside the body.
  • the device 102 can be configured to implement steering in one plane using one joint.
  • the device 102 can be configured to implement yaw and pitch motion using two joints.
  • two or more parallel motors may be employed to implement the steering angle.
  • a tendon driven system with two or more tendons embedded in the device 102 and coupled to actuators/motors at a distal end of the tendons can provide steering.
  • additional rotational degrees of freedom can rotate the device 102 around a primary axis (longitudinal axis) of the device.
  • One or more of these actuation and/or rotation schemes may be combined with any one or more other actuation and/or rotation schemes, as needed.
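As a rough illustration of the tendon-driven option, under a constant-curvature assumption the differential displacement of two antagonistic tendons maps linearly to the tip bend angle: pulling one tendon shortens its path by r*theta while the opposing tendon lengthens by the same amount, where r is the tendons' radial offset from the centerline. A hedged sketch (the constant-curvature model and names are assumptions, not the patent's stated control law):

```python
def tendon_bend_angle(pull_a, pull_b, tendon_offset):
    """Planar bend angle (radians) of the tip from the displacements of
    two antagonistic tendons at radial offset tendon_offset from the
    device centerline, under a constant-curvature assumption."""
    return (pull_a - pull_b) / (2.0 * tendon_offset)

def tendon_pulls_for_angle(angle, tendon_offset):
    """Inverse map: antagonistic tendon displacements for a commanded bend."""
    dl = angle * tendon_offset
    return dl, -dl
```

A round trip through both maps recovers the commanded angle, which is the property a controller would rely on.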
  • the device control system 130 may be stored in memory 116 and be configured to translate the angle of joints 108 into actuator commands of the device or generate actuator commands to change the angle of the joints in accordance with image feedback.
  • the device control system 130 includes a kinematic model 132 of the device and control schemes that are known in art.
  • the kinematic model 132 computes a configuration needed for guiding the device 102 through a passageway. Parameters such as speed, position, and other spatial considerations (e.g., angles due to internal volume structures) are considered by the model 132.
  • the device control system 130 controls an amount of rotation of the joint 108 based upon a position and speed of the medical device 102 as the device approaches a branching structure, bifurcation, etc.
  • actuator commands are generated by the device control system 130 to steer the device 102 by adjusting the steerable tip 124. Navigation of the device 102 in accordance with the present principles can proceed at an increased rate, which results in reduced operation times.
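The position-and-speed-dependent control of joint rotation described above can be sketched as a simple trigger: commit to the planned branch angle once the estimated time-to-branch drops below an actuation lead time, so the tip is already biased toward the chosen pathway when it reaches the bifurcation. This is only one plausible scheme; the lead-time heuristic is an assumption, not the patent's stated method:

```python
def steering_command(distance_to_branch, speed, planned_angle, lead_time=1.0):
    """Tip angle to command as the device approaches a branching structure.

    distance_to_branch : remaining distance to the bifurcation
    speed              : current advance speed of the device
    planned_angle      : branch angle from the plan
    lead_time          : assumed tuning parameter (seconds of advance
                         warning needed to actuate the joint)"""
    if speed <= 0.0:
        return 0.0                       # not advancing: keep tip straight
    if distance_to_branch / speed <= lead_time:
        return planned_angle             # close enough: commit to the turn
    return 0.0                           # still far away: keep tip straight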
  • Workstation 112 includes a display 118 for viewing the internal images 144 and 142 of the subject (patient) or volume 134 and may include the images 142, 144 with overlays or other renderings. Display 118 may also permit a user to interact with the workstation 112 and its components and functions, or any other element within the system 100. This is further facilitated by a user interface 120 which may include a keyboard, mouse, a joystick, a haptic device, or any other peripheral or control to permit user feedback from and interaction with the workstation 112.
  • an imaging system 110 may be present for obtaining preoperative images 142 (e.g., MRI, CT, etc.). In other embodiments, the imaging system 110 may be located separately, and images may be collected remotely from other described operations.
  • the intra-operative imaging system 111 may include a fiber optic scope, a camera system, an X-ray imaging system, a mobile X-ray imaging system, etc. for obtaining intraoperative images 144.
  • the device control system 130 translates the angle of joint(s) 108 into actuator commands for the device 102 using the kinematic model 132 to select pathways and steer the device 102.
  • the images (142, 144) differentiate between open pathways and tissues.
  • the device control system 130 selects open pathways that lead to a target location using both preoperative images 142 and intraoperative images 144.
  • the intraoperative imaging system 111 may include a mobile X-ray system for imaging of the anatomy and the device 102, a fiber optic endoscope inserted through the device lumen or integrated into the device, or other imaging configurations and technologies.
  • the image control system 106 is configured to integrate preoperative 3D images 142 (CT, MRI, etc.) and intraoperative images 144 (X-ray, endoscope, etc.) and register those into a single coordinate system of the robot device 102.
  • the image control system 106 is further configured to permit the user to plan a path to an affected sinus or other target or to identify a target.
  • a path is planned and locations and angles identified for tip steering based on position within the anatomy.
  • an instruction set of commands for steering control can be generated.
  • these commands are communicated to the device control system 130.
  • the commands are associated with position in the anatomy or other signposts to enable the issuance of a command at the correct time to select a pathway using the commands to control the steerable tip.
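One plausible encoding of such a position-keyed instruction set is a list of (insertion depth, tip angle) waypoints, with the applicable command looked up from the current depth each cycle. The format and numbers below are hypothetical, for illustration only:

```python
def command_for_depth(plan, depth):
    """Return the steering angle for the current insertion depth.

    plan  : (trigger_depth, tip_angle) pairs sorted by depth, a
            hypothetical encoding of the planned instruction set
    depth : current insertion depth reported by the translation stage
    The angle of the deepest waypoint already passed applies; before the
    first waypoint the tip is kept straight (0.0)."""
    angle = 0.0
    for trigger_depth, tip_angle in plan:
        if depth >= trigger_depth:
            angle = tip_angle            # this waypoint has been passed
        else:
            break                        # remaining waypoints lie ahead
    return angle

# Example plan: bend at 20 mm, counter-bend at 45 mm, nearly straight at 70 mm.
plan = [(20.0, 0.3), (45.0, -0.6), (70.0, 0.1)]
```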
  • Steering may be in accordance with a plan 150 stored in memory 116.
  • the plan 150 may be selected in virtual space (e.g., using preoperative images 142).
  • the steering control may be performed in real-time using the device control system 130 to make path selections.
  • a method for steering a robot is provided in accordance with illustrative embodiments. This method may be executed using the system 100 of FIG. 1.
  • a preoperative 3D image is taken and an affected sinus or other target is identified.
  • a steerable device (e.g., a robot) with a guidewire placed in its lumen is inserted into the anatomy (e.g., the nose). This may be performed manually.
  • position or image feedback is collected for the steerable device within the volume. For example, an X-ray image of the steerable device is acquired and registration is performed (e.g., registration of preoperative images to intraoperative images and the steerable device).
  • the registration between the steerable device and X-ray system can be performed using methods known in art.
  • an endoscope image is acquired and registration is performed.
  • the registration between the device and endoscope images can be performed using methods known in art.
  • a position of the steerable device may be determined (e.g., using fiber optic positioning, electromagnetic positioning, image positioning, etc.).
  • the position of the steerable device may be employed for navigating the steerable device in the volume (with or without images).
  • a user/surgeon identifies a location of the affected sinus or target in one of the images (e.g., CT). Path planning is performed to determine an interactive path. The path planning may include using the image control system to compute all possible paths from the nose (or other orifice) to the sinus (or other target).
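The "compute all possible paths" step can be sketched as simple-path enumeration over a graph of the lumen network, from the entry orifice to the target. The toy network and node names below are invented for illustration:

```python
from collections import deque

def all_paths(graph, start, target):
    """Enumerate all simple paths from an entry node (e.g. the nose) to a
    target node (e.g. the affected sinus) in a lumen-network graph.
    graph is an adjacency dict: node -> list of neighboring nodes."""
    paths = []
    queue = deque([[start]])
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == target:
            paths.append(path)           # reached the target: record path
            continue
        for nxt in graph[node]:
            if nxt not in path:          # simple paths only: no revisits
                queue.append(path + [nxt])
    return paths

# Toy lumen network with two routes from the nose to the target sinus.
lumens = {
    "nose": ["A", "B"],
    "A": ["nose", "sinus"],
    "B": ["nose", "sinus"],
    "sinus": ["A", "B"],
}
```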
  • the user/surgeon follows the planned path in the volume (e.g., nasal cavity) by steering and employing a translation stage of the device (102) to advance the device tip.
  • the translation stage can be manual (handheld, sliding stage, etc.) or motorized (with a motion trigger or speed regulation).
  • the steerable device is automatically navigated and the steering is controlled by the control system in accordance with a plan or in real-time using position or image feedback.
  • the image control system receives the device position from the device control system and computes the tip position in the coordinate system of the path. With each computation cycle, the device control system computes whether the steerable tip needs to be actuated. If the tip position is not actuated, the device will continue to proceed along the previous path direction. If the device control system determines a change in direction is needed, the angle and direction for a given position is changed to steer the steerable tip. The device control system automatically steers the tip to comply with the desired or planned path.
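The per-cycle decision described above can be sketched as: find the planned-path point closest to the measured tip position and actuate only when the deviation exceeds a tolerance. A minimal sketch (the tolerance threshold and the return convention are assumptions, not the patent's specified logic):

```python
import math

def control_cycle(tip_position, path_points, tolerance=1.0):
    """One computation cycle of the device control loop.

    Returns None when the tip is within tolerance of the planned path
    (continue along the previous direction), otherwise a unit vector
    pointing from the tip back toward the nearest planned-path point."""
    closest = min(path_points, key=lambda p: math.dist(tip_position, p))
    deviation = math.dist(tip_position, closest)
    if deviation <= tolerance:
        return None                      # on track: no actuation needed
    return tuple((c - t) / deviation     # normalized corrective direction
                 for c, t in zip(closest, tip_position))
```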
  • treatment or other activities are conducted on the target area.
  • the steerable device is withdrawn and a balloon is guided using a guidewire placed through the steerable device. With the balloon placed, the balloon may be expanded to open up the sinus or other anatomical feature.
  • the device is withdrawn. The device withdrawal may also employ the steering capability of the device. While described in terms of a nasal procedure, it should be understood that the present principles are applicable to any procedure and are especially useful for any navigation in constrained spaces.
  • a robotic feature 300 is illustratively shown in accordance with one example.
  • the feature 300 is included in the device 102 and provides translation and rotation motions for a tip of the device 102.
  • the feature 300 includes a shaft 310, which may include an internal lumen 308 to receive a guidewire (or catheter) or other elongated instruments.
  • the feature 300 is employed to steer a distal end portion of the steerable device (102).
  • the feature 300 is covered by a sheath or the like.
  • the feature 300 is part of a catheter and receives a guidewire within the internal lumen. Once the guidewire and the steerable device are in place, the steerable device (and feature 300) is/are withdrawn. The guidewire is then employed to guide a balloon catheter to the target location where the balloon is employed to expand the cavity for treatment.
  • the feature 300 includes an end effector 312 that may include a ring or other shape that encircles a catheter or other device passing through the internal lumen 308.
  • the end effector 312 may be employed to direct the catheter or other instrument passing through the internal lumen 308.
  • the end effector 312 is coupled to translatable rods 306 (tendons) by joints 302.
  • the translatable rods 306 can advance or retract into the shaft 310 to provide a translation motion in the direction of arrow "C". For example, when all three of the rods 306 are advanced (or retracted) concurrently, translation is realized. If the rods 306 are advanced or retracted at different rates or for different amounts, the relative motion will provide a rotation of the end effector 312 in the direction or directions of arrows "A" and/or "B".
  • a rotary platform 304 may be employed to cause the entire end effector 312 to rotate about a longitudinal axis of the shaft 310 (e.g., in the direction of arrow "D").
  • the feature 300 provides a plurality of degrees of freedom at a localized position. In this way, accurate and well-controlled steering of the device 102 can be achieved.
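As an illustration of how the three translatable rods produce both motions: under a small-angle assumption, equal rod displacements translate the end effector along the shaft axis while differential displacements tilt the plane of the ring, and the decomposition is linear. The 120-degree rod placement (at 90, 210, and 330 degrees around the shaft) is an assumed layout, not specified in the patent:

```python
import math

def end_effector_motion(d1, d2, d3, rod_radius):
    """Decompose displacements of three rods (120 degrees apart at
    rod_radius from the shaft axis, at 90, 210 and 330 degrees) into
    the ring's axial translation and the small-angle tilt slopes of
    its plane. Equal displacements give pure translation; unequal
    displacements tilt the end effector."""
    translation = (d1 + d2 + d3) / 3.0                    # mean advance
    tilt_x = (d3 - d2) / (rod_radius * math.sqrt(3.0))    # slope along x
    tilt_y = (d1 - (d2 + d3) / 2.0) / (1.5 * rod_radius)  # slope along y
    return translation, tilt_x, tilt_y
```

Advancing all three rods by the same amount yields translation with zero tilt; advancing only the first rod tilts the ring without changing the mean position as much.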
  • FIG. 3 shows an illustrative joint, it should be understood that more complex or simpler joints may be employed. These other joint types may include simple hinge joints, rotary joints, translational mechanisms, etc.
  • FIG. 4A an illustrative example of a steerable device 102 is shown in a first configuration.
  • the first configuration shows the steerable device 102 after insertion in a nasal cavity 320.
  • the device control system automatically senses that a steering action is needed to steer the tip 124 to comply with a desired or planned path, or the device control system senses that a particular pathway needs to be navigated in accordance with the plan.
  • the device control mechanism employs signal control to adjust the feature 300 to provide appropriate navigation of the device 102 by controlling the angles of the tip 124.
  • the steerable device 102 is shown in a second configuration.
  • the second configuration shows the steerable device 102 after a command is issued by the device control system to rotate the tip 124 using the feature 300 to control the insertion in a particular direction in the nasal cavity 320.
  • the device control system automatically steers the tip 124 to comply with the planned path or senses that pathway is the better path to achieve the present goal or target.
  • the robot 400 includes a steerable device 402 (see also, device 102) having one or more robotically controlled joints 408 configured to steer the device 402.
  • the device 402 includes a lumen 404 for storing other instruments, such as a guidewire or the like.
  • Each joint 408 may include a motor or motors 410 associated with it.
  • the motors 410 receive signals generated in accordance with control commands to control the joints 408.
  • a device control system 430 (see also, system 130) is configured to receive feedback from an image control system 406 (see also, system 106) to evaluate positioning of the steerable device 402 in a volume such that control commands are issued to the one or more robotically controlled joints 408 to steer the steerable device 402 in a direction consistent with navigation of the medical device toward a target or in accordance with a plan.
  • the image control system 406 registers preoperative and intraoperative images to locate the position of the steerable device in a single coordinate system.
  • the intraoperative images may include a camera image (endoscopy), an X-ray image or other imaging modality images.
  • the device control system 430 controls translation and/or rotation of the one or more robotically controlled joints 408 to bias the medical device toward a pathway.
  • the device control system 430 can also control an amount of translation and/or rotation based upon a position, direction and speed of the steerable device 402 as the steerable device 402 approaches a branching structure.
  • the device control system 430 includes a kinematic model 432 to evaluate dynamics of the steerable device 402 to control the one or more robotically controlled joints 408.
  • the kinematic model 432 is employed to anticipate a next turn or configuration to be taken by the steerable device 402.
  • the one or more robotically controlled joints 408 may include one, two or more degrees of rotation.
  • the steerable device 402 may also include a translation stage 414 to support advancing and/or retracting of the steerable device 402.
  • the one or more robotically controlled joints 408 may include a steerable tip or end effector 411 or other distally mounted structure on the robot 400.
  • the end effector 411 may include a plurality of translatable rods such that positions of the rods provide a rotation of the end effector 411 relative to a longitudinal axis of a shaft that supports the rods (FIG. 3).
  • the steerable tip 411 can be configured to implement yaw and pitch motion using two motors 410' (for one or more motors 410) and a universal joint 408' (for one or more joints 408). Two or more parallel motors 410' may be employed to implement the steering angle.
  • a tendon driven system (300) with two or more tendons embedded in the device 102 and coupled to actuators/motors at a distal end of the tendons can provide steering.
  • additional rotational degrees of freedom can rotate the device 402 around a primary axis (longitudinal axis) of the device 402.
  • One or more of these actuation and/or rotation schemes may be combined with any one or more other actuation and/or rotation schemes, as needed.
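The closed-loop behavior described in the bullets above — image-derived feedback on the device position compared against a planned path, producing steering commands for the joints — can be sketched minimally as follows. Everything here (the name `steering_correction`, the proportional gain, the yaw/pitch decomposition) is an illustrative assumption, not part of the disclosure, which derives its error signal from registered images and a kinematic model rather than a bare P-controller.

```python
import math

def steering_correction(tip_pos, tip_dir, target_pos, gain=0.5):
    """Proportional steering correction (yaw, pitch in radians) that
    turns the device tip toward a target point.

    Illustrative toy controller only.  `tip_pos`/`target_pos` are
    (x, y, z) points; `tip_dir` is the unit heading vector of the tip.
    """
    # Desired heading: unit vector from the tip to the target.
    dx = target_pos[0] - tip_pos[0]
    dy = target_pos[1] - tip_pos[1]
    dz = target_pos[2] - tip_pos[2]
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / norm, dy / norm, dz / norm
    # Heading errors: yaw in the x-y plane, pitch out of it.
    yaw_err = math.atan2(dy, dx) - math.atan2(tip_dir[1], tip_dir[0])
    pitch_err = math.asin(dz) - math.asin(tip_dir[2])
    return gain * yaw_err, gain * pitch_err

# Tip heading along +x, target offset in +y: expect a positive yaw command.
yaw_cmd, pitch_cmd = steering_correction((0, 0, 0), (1.0, 0.0, 0.0), (10, 10, 0))
```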
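Registering preoperative and intraoperative images into a single coordinate system, as the image control system 406 does, commonly reduces to estimating a rigid transform from corresponding landmarks. A minimal 2-D least-squares version is sketched below as a toy model; a clinical system would solve the 3-D problem with many correspondences and robust outlier rejection.

```python
import math

def rigid_register_2d(pre_pts, intra_pts):
    """Least-squares rigid transform (theta, tx, ty) mapping preoperative
    landmark points onto intraoperative ones.  2-D closed form via the
    cross-covariance of the centered point sets; illustrative only."""
    n = len(pre_pts)
    cx_a = sum(p[0] for p in pre_pts) / n
    cy_a = sum(p[1] for p in pre_pts) / n
    cx_b = sum(q[0] for q in intra_pts) / n
    cy_b = sum(q[1] for q in intra_pts) / n
    s_cos = s_sin = 0.0
    for (ax, ay), (bx, by) in zip(pre_pts, intra_pts):
        ax, ay = ax - cx_a, ay - cy_a
        bx, by = bx - cx_b, by - cy_b
        s_cos += ax * bx + ay * by
        s_sin += ax * by - ay * bx
    theta = math.atan2(s_sin, s_cos)
    # Translation carries the rotated preoperative centroid onto the
    # intraoperative one.
    tx = cx_b - (cx_a * math.cos(theta) - cy_a * math.sin(theta))
    ty = cy_b - (cx_a * math.sin(theta) + cy_a * math.cos(theta))
    return theta, tx, ty

# Intraoperative frame = preoperative frame rotated 90 degrees, shifted (1, 2).
pre = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
intra = [(1.0, 2.0), (1.0, 3.0), (0.0, 2.0)]
theta, tx, ty = rigid_register_2d(pre, intra)
```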
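The kinematic model 432 evaluates how joint commands move the device. As a toy instance of such a model, the forward kinematics of a hypothetical two-motor yaw/pitch tip (in the spirit of the universal-joint arrangement of motors 410' and joint 408') might map joint angles to a tip heading; the function name and angle convention are assumptions for illustration.

```python
import math

def tip_direction(yaw_rad, pitch_rad):
    """Forward kinematics of a hypothetical 2-DOF yaw/pitch tip: the
    unit heading vector of the distal segment, with (0, 0) pointing
    along +x.  A full kinematic model would also track segment lengths,
    the translation stage 414 and axial roll; this covers direction only."""
    return (
        math.cos(pitch_rad) * math.cos(yaw_rad),
        math.cos(pitch_rad) * math.sin(yaw_rad),
        math.sin(pitch_rad),
    )

dx0, dy0, dz0 = tip_direction(0.0, 0.0)          # straight ahead: (1, 0, 0)
dx1, dy1, dz1 = tip_direction(math.pi / 2, 0.0)  # 90 degree yaw: toward +y
```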
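For the tendon-driven variant (feature 300), steering reduces to coordinated tendon length changes. Under the common constant-curvature assumption (an assumption of this sketch, not stated in the disclosure), the mapping for one antagonistic tendon pair is:

```python
def tendon_displacements(bend_angle_rad, routing_radius_mm):
    """Antagonistic tendon pair in a constant-curvature bending section:
    a tip bend of theta shortens one tendon and lengthens the other by
    roughly r * theta.  Real tendon-driven devices also need friction
    and backlash compensation.  Returns the (pull_side_mm,
    release_side_mm) length changes."""
    delta = routing_radius_mm * bend_angle_rad
    return -delta, +delta

# 0.5 rad bend with tendons routed 2 mm off the device axis.
pull_mm, release_mm = tendon_displacements(0.5, 2.0)
```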

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pathology (AREA)
  • Robotics (AREA)
  • Gynecology & Obstetrics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Manipulator (AREA)
PCT/EP2017/057316 2016-03-31 2017-03-28 Image guided robot for catheter placement Ceased WO2017167754A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
US16/086,805 US20190105112A1 (en) 2016-03-31 2017-03-28 Image guided robot for catheter placement
CN201780022136.4A CN108882967A (zh) 2016-03-31 2017-03-28 Image guided robot for catheter placement
EP17714419.3A EP3435904A1 (en) 2016-03-31 2017-03-28 Image guided robot for catheter placement
CN202511144129.4A CN120732543A (zh) 2016-03-31 2017-03-28 Image guided robot for catheter placement
JP2018551379A JP7232051B2 (ja) 2016-03-31 2017-03-28 Image guided robot for catheter placement
US19/019,675 US20250152267A1 (en) 2016-03-31 2025-01-14 Image guided robot for catheter placement

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662315785P 2016-03-31 2016-03-31
US62/315,785 2016-03-31

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US16/086,805 A-371-Of-International US20190105112A1 (en) 2016-03-31 2017-03-28 Image guided robot for catheter placement
US19/019,675 Continuation US20250152267A1 (en) 2016-03-31 2025-01-14 Image guided robot for catheter placement

Publications (1)

Publication Number Publication Date
WO2017167754A1 true WO2017167754A1 (en) 2017-10-05

Family

ID=58455031

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2017/057316 Ceased WO2017167754A1 (en) 2016-03-31 2017-03-28 Image guided robot for catheter placement

Country Status (5)

Country Link
US (2) US20190105112A1 (en)
EP (1) EP3435904A1 (en)
JP (1) JP7232051B2 (ja)
CN (2) CN120732543A (zh)
WO (1) WO2017167754A1 (en)

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019074682A1 (en) * 2017-10-13 2019-04-18 Auris Health, Inc. ROBOTIC SYSTEM DESIGNED TO FOLLOW A NAVIGATION PATH
US10482599B2 (en) 2015-09-18 2019-11-19 Auris Health, Inc. Navigation of tubular networks
US10492741B2 (en) 2013-03-13 2019-12-03 Auris Health, Inc. Reducing incremental measurement sensor error
US10524866B2 (en) 2018-03-28 2020-01-07 Auris Health, Inc. Systems and methods for registration of location sensors
US10531864B2 (en) 2013-03-15 2020-01-14 Auris Health, Inc. System and methods for tracking robotically controlled medical instruments
US10555778B2 (en) 2017-10-13 2020-02-11 Auris Health, Inc. Image-based branch detection and mapping for navigation
US10806535B2 (en) 2015-11-30 2020-10-20 Auris Health, Inc. Robot-assisted driving systems and methods
US10827913B2 (en) 2018-03-28 2020-11-10 Auris Health, Inc. Systems and methods for displaying estimated location of instrument
US10898286B2 (en) 2018-05-31 2021-01-26 Auris Health, Inc. Path-based navigation of tubular networks
US10898275B2 (en) 2018-05-31 2021-01-26 Auris Health, Inc. Image-based airway analysis and mapping
US10905499B2 (en) 2018-05-30 2021-02-02 Auris Health, Inc. Systems and methods for location sensor-based branch prediction
US11020016B2 (en) 2013-05-30 2021-06-01 Auris Health, Inc. System and method for displaying anatomy and devices on a movable display
US11051681B2 (en) 2010-06-24 2021-07-06 Auris Health, Inc. Methods and devices for controlling a shapeable medical device
US11147633B2 (en) 2019-08-30 2021-10-19 Auris Health, Inc. Instrument image reliability systems and methods
US11160615B2 (en) 2017-12-18 2021-11-02 Auris Health, Inc. Methods and systems for instrument tracking and navigation within luminal networks
US11207141B2 (en) 2019-08-30 2021-12-28 Auris Health, Inc. Systems and methods for weight-based registration of location sensors
US11278357B2 (en) 2017-06-23 2022-03-22 Auris Health, Inc. Robotic systems for determining an angular degree of freedom of a medical device in luminal networks
US11298195B2 (en) 2019-12-31 2022-04-12 Auris Health, Inc. Anatomical feature identification and targeting
US11324558B2 (en) 2019-09-03 2022-05-10 Auris Health, Inc. Electromagnetic distortion detection and compensation
US11395703B2 (en) 2017-06-28 2022-07-26 Auris Health, Inc. Electromagnetic distortion detection
US11426095B2 (en) 2013-03-15 2022-08-30 Auris Health, Inc. Flexible instrument localization from both remote and elongation sensors
US11490782B2 (en) 2017-03-31 2022-11-08 Auris Health, Inc. Robotic systems for navigation of luminal networks that compensate for physiological noise
US11504187B2 (en) 2013-03-15 2022-11-22 Auris Health, Inc. Systems and methods for localizing, tracking and/or controlling medical instruments
US11503986B2 (en) 2018-05-31 2022-11-22 Auris Health, Inc. Robotic systems and methods for navigation of luminal network that detect physiological noise
US11510736B2 (en) 2017-12-14 2022-11-29 Auris Health, Inc. System and method for estimating instrument location
US11602372B2 (en) 2019-12-31 2023-03-14 Auris Health, Inc. Alignment interfaces for percutaneous access
US11660147B2 (en) 2019-12-31 2023-05-30 Auris Health, Inc. Alignment techniques for percutaneous access
US11771309B2 (en) 2016-12-28 2023-10-03 Auris Health, Inc. Detecting endolumenal buckling of flexible instruments
US11832889B2 (en) 2017-06-28 2023-12-05 Auris Health, Inc. Electromagnetic field generator alignment
US12076100B2 (en) 2018-09-28 2024-09-03 Auris Health, Inc. Robotic systems and methods for concomitant endoscopic and percutaneous medical procedures
US12414686B2 (en) 2020-03-30 2025-09-16 Auris Health, Inc. Endoscopic anatomical feature tracking
US12478444B2 (en) 2019-03-21 2025-11-25 The Board Of Trustees Of The Leland Stanford Junior University Systems and methods for localization based on machine learning
US12491042B2 (en) 2013-10-24 2025-12-09 Auris Health, Inc. Endoscopic device with helical lumen design

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4648061A2 (en) 2019-12-23 2025-11-12 Mazor Robotics Ltd. Multi-arm robotic system for spine surgery with imaging guidance
US20220392065A1 (en) 2020-01-07 2022-12-08 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
US11969280B2 (en) 2020-01-07 2024-04-30 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
WO2021141921A1 (en) 2020-01-07 2021-07-15 Cleerly, Inc. Systems, methods, and devices for medical image analysis, diagnosis, risk stratification, decision making and/or disease tracking
KR20230154256A (ko) 2021-10-05 2023-11-07 코린더스 인코포레이티드 Robotic actuation of elongated medical devices
US12440180B2 (en) 2022-03-10 2025-10-14 Cleerly, Inc. Systems, devices, and methods for non-invasive image-based plaque analysis and risk determination
US20250217981A1 (en) 2022-03-10 2025-07-03 Cleerly, Inc. Systems, methods, and devices for image-based plaque analysis and risk determination
US12406365B2 (en) 2022-03-10 2025-09-02 Cleerly, Inc. Systems, devices, and methods for non-invasive image-based plaque analysis and risk determination
US20250143657A1 (en) 2022-03-10 2025-05-08 Cleerly, Inc. Systems, devices, and methods for non-invasive image-based plaque analysis and risk determination
CN119632677B (zh) * 2023-09-18 2025-11-18 深圳市精锋医疗科技股份有限公司 Catheter robot system, control method therefor, and computer-readable storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120289783A1 (en) * 2011-05-13 2012-11-15 Intuitive Surgical Operations, Inc. Medical system with multiple operating modes for steering a medical instrument through linked body passages
US20140343416A1 (en) * 2013-05-16 2014-11-20 Intuitive Surgical Operations, Inc. Systems and methods for robotic medical system integration with external imaging

Family Cites Families (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6517477B1 (en) * 2000-01-27 2003-02-11 Scimed Life Systems, Inc. Catheter introducer system for exploration of body cavities
US6610007B2 (en) * 2000-04-03 2003-08-26 Neoguide Systems, Inc. Steerable segmented endoscope and method of insertion
CA2451824C (en) * 2001-06-29 2015-02-24 Intuitive Surgical, Inc. Platform link wrist mechanism
US8295577B2 (en) * 2005-03-31 2012-10-23 Michael Zarkh Method and apparatus for guiding a device in a totally occluded or partly occluded tubular organ
US8398541B2 (en) * 2006-06-06 2013-03-19 Intuitive Surgical Operations, Inc. Interactive user interfaces for robotic minimally invasive surgical systems
US20070167702A1 (en) * 2005-12-30 2007-07-19 Intuitive Surgical Inc. Medical robotic system providing three-dimensional telestration
US8348861B2 (en) * 2006-06-05 2013-01-08 Technion Research & Development Foundation Ltd. Controlled steering of a flexible needle
US8161838B2 (en) * 2008-12-22 2012-04-24 Intuitive Surgical Operations, Inc. Method and apparatus for reducing at least one friction force opposing an axial force exerted through an actuator element
CN105342705A (zh) * 2009-03-24 2016-02-24 伊顿株式会社 Surgical robot system using augmented reality technology and control method thereof
KR101132659B1 (ko) * 2009-04-02 2012-04-02 한국과학기술원 Laparoscopic surgical device having four degrees of freedom
US20120265051A1 (en) * 2009-11-09 2012-10-18 Worcester Polytechnic Institute Apparatus and methods for mri-compatible haptic interface
US8746252B2 (en) * 2010-05-14 2014-06-10 Intuitive Surgical Operations, Inc. Surgical system sterile drape
US20120226145A1 (en) * 2011-03-03 2012-09-06 National University Of Singapore Transcutaneous robot-assisted ablation-device insertion navigation system
GB201115586D0 (en) * 2011-09-09 2011-10-26 Univ Bristol A system for anatomical reduction of bone fractures
JP6785656B2 (ja) * 2013-08-15 2020-11-18 インテュイティブ サージカル オペレーションズ, インコーポレイテッド Graphical user interface for catheter positioning and insertion
KR102354675B1 (ko) * 2013-08-15 2022-01-24 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 Systems and methods for medical procedure confirmation
EP3057515B1 (en) * 2013-10-18 2018-07-11 Intuitive Surgical Operations, Inc. Wrist mechanism for surgical instrument
WO2015061692A1 (en) * 2013-10-25 2015-04-30 Intuitive Surgical Operations, Inc. Flexible instrument with embedded actuation conduits
CN105682729B (zh) * 2013-10-25 2019-06-18 直观外科手术操作公司 Flexible instrument with grooved steerable tube
EP4233768A3 (en) * 2014-03-17 2023-12-27 Intuitive Surgical Operations, Inc. Device and machine readable medium executing a method of recentering end effectors and input controls
US10912523B2 (en) * 2014-03-24 2021-02-09 Intuitive Surgical Operations, Inc. Systems and methods for anatomic motion compensation
DE102014009893B4 (de) * 2014-07-04 2016-04-28 gomtec GmbH End effector for an instrument
CN107148250B (zh) * 2014-08-15 2020-07-24 直观外科手术操作公司 Surgical system with variable entry guide configurations
US11273290B2 (en) * 2014-09-10 2022-03-15 Intuitive Surgical Operations, Inc. Flexible instrument with nested conduits
WO2016069998A1 (en) * 2014-10-30 2016-05-06 Intuitive Surgical Operations, Inc. System and method for articulated arm stabilization
US10603135B2 (en) * 2014-10-30 2020-03-31 Intuitive Surgical Operations, Inc. System and method for an articulated arm based tool guide
US11033716B2 (en) * 2015-01-12 2021-06-15 Intuitive Surgical Operations, Inc. Devices, systems, and methods for anchoring actuation wires to a steerable instrument
US11389268B2 (en) * 2015-02-05 2022-07-19 Intuitive Surgical Operations, Inc. System and method for anatomical markers
US11285314B2 (en) * 2016-08-19 2022-03-29 Cochlear Limited Advanced electrode array insertion


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3435904A1 *

Cited By (66)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11857156B2 (en) 2010-06-24 2024-01-02 Auris Health, Inc. Methods and devices for controlling a shapeable medical device
US11051681B2 (en) 2010-06-24 2021-07-06 Auris Health, Inc. Methods and devices for controlling a shapeable medical device
US10492741B2 (en) 2013-03-13 2019-12-03 Auris Health, Inc. Reducing incremental measurement sensor error
US11241203B2 (en) 2013-03-13 2022-02-08 Auris Health, Inc. Reducing measurement sensor error
US12156755B2 (en) 2013-03-13 2024-12-03 Auris Health, Inc. Reducing measurement sensor error
US11969157B2 (en) 2013-03-15 2024-04-30 Auris Health, Inc. Systems and methods for tracking robotically controlled medical instruments
US11426095B2 (en) 2013-03-15 2022-08-30 Auris Health, Inc. Flexible instrument localization from both remote and elongation sensors
US11504187B2 (en) 2013-03-15 2022-11-22 Auris Health, Inc. Systems and methods for localizing, tracking and/or controlling medical instruments
US10531864B2 (en) 2013-03-15 2020-01-14 Auris Health, Inc. System and methods for tracking robotically controlled medical instruments
US11129602B2 (en) 2013-03-15 2021-09-28 Auris Health, Inc. Systems and methods for tracking robotically controlled medical instruments
US12232711B2 (en) 2013-03-15 2025-02-25 Auris Health, Inc. Systems and methods for tracking robotically controlled medical instruments
US11020016B2 (en) 2013-05-30 2021-06-01 Auris Health, Inc. System and method for displaying anatomy and devices on a movable display
US12491042B2 (en) 2013-10-24 2025-12-09 Auris Health, Inc. Endoscopic device with helical lumen design
US12089804B2 (en) 2015-09-18 2024-09-17 Auris Health, Inc. Navigation of tubular networks
US10796432B2 (en) 2015-09-18 2020-10-06 Auris Health, Inc. Navigation of tubular networks
US10482599B2 (en) 2015-09-18 2019-11-19 Auris Health, Inc. Navigation of tubular networks
US11403759B2 (en) 2015-09-18 2022-08-02 Auris Health, Inc. Navigation of tubular networks
US11464591B2 (en) 2015-11-30 2022-10-11 Auris Health, Inc. Robot-assisted driving systems and methods
US10813711B2 (en) 2015-11-30 2020-10-27 Auris Health, Inc. Robot-assisted driving systems and methods
US10806535B2 (en) 2015-11-30 2020-10-20 Auris Health, Inc. Robot-assisted driving systems and methods
US11771309B2 (en) 2016-12-28 2023-10-03 Auris Health, Inc. Detecting endolumenal buckling of flexible instruments
US11490782B2 (en) 2017-03-31 2022-11-08 Auris Health, Inc. Robotic systems for navigation of luminal networks that compensate for physiological noise
US12053144B2 (en) 2017-03-31 2024-08-06 Auris Health, Inc. Robotic systems for navigation of luminal networks that compensate for physiological noise
US11278357B2 (en) 2017-06-23 2022-03-22 Auris Health, Inc. Robotic systems for determining an angular degree of freedom of a medical device in luminal networks
US12295672B2 (en) 2017-06-23 2025-05-13 Auris Health, Inc. Robotic systems for determining a roll of a medical device in luminal networks
US11759266B2 (en) 2017-06-23 2023-09-19 Auris Health, Inc. Robotic systems for determining a roll of a medical device in luminal networks
US11395703B2 (en) 2017-06-28 2022-07-26 Auris Health, Inc. Electromagnetic distortion detection
US11832889B2 (en) 2017-06-28 2023-12-05 Auris Health, Inc. Electromagnetic field generator alignment
US11969217B2 (en) 2017-10-13 2024-04-30 Auris Health, Inc. Robotic system configured for navigation path tracing
US10555778B2 (en) 2017-10-13 2020-02-11 Auris Health, Inc. Image-based branch detection and mapping for navigation
WO2019074682A1 (en) * 2017-10-13 2019-04-18 Auris Health, Inc. ROBOTIC SYSTEM DESIGNED TO FOLLOW A NAVIGATION PATH
US11058493B2 (en) 2017-10-13 2021-07-13 Auris Health, Inc. Robotic system configured for navigation path tracing
US11850008B2 (en) 2017-10-13 2023-12-26 Auris Health, Inc. Image-based branch detection and mapping for navigation
US11510736B2 (en) 2017-12-14 2022-11-29 Auris Health, Inc. System and method for estimating instrument location
US11160615B2 (en) 2017-12-18 2021-11-02 Auris Health, Inc. Methods and systems for instrument tracking and navigation within luminal networks
US12226168B2 (en) 2018-03-28 2025-02-18 Auris Health, Inc. Systems and methods for registration of location sensors
US10827913B2 (en) 2018-03-28 2020-11-10 Auris Health, Inc. Systems and methods for displaying estimated location of instrument
US10524866B2 (en) 2018-03-28 2020-01-07 Auris Health, Inc. Systems and methods for registration of location sensors
US11712173B2 (en) 2018-03-28 2023-08-01 Auris Health, Inc. Systems and methods for displaying estimated location of instrument
US10898277B2 (en) 2018-03-28 2021-01-26 Auris Health, Inc. Systems and methods for registration of location sensors
US11950898B2 (en) 2018-03-28 2024-04-09 Auris Health, Inc. Systems and methods for displaying estimated location of instrument
US11576730B2 (en) 2018-03-28 2023-02-14 Auris Health, Inc. Systems and methods for registration of location sensors
US12171504B2 (en) 2018-05-30 2024-12-24 Auris Health, Inc. Systems and methods for location sensor-based branch prediction
US10905499B2 (en) 2018-05-30 2021-02-02 Auris Health, Inc. Systems and methods for location sensor-based branch prediction
US11793580B2 (en) 2018-05-30 2023-10-24 Auris Health, Inc. Systems and methods for location sensor-based branch prediction
US12364552B2 (en) 2018-05-31 2025-07-22 Auris Health, Inc. Path-based navigation of tubular networks
US11759090B2 (en) 2018-05-31 2023-09-19 Auris Health, Inc. Image-based airway analysis and mapping
US11864850B2 (en) 2018-05-31 2024-01-09 Auris Health, Inc. Path-based navigation of tubular networks
US10898286B2 (en) 2018-05-31 2021-01-26 Auris Health, Inc. Path-based navigation of tubular networks
US11503986B2 (en) 2018-05-31 2022-11-22 Auris Health, Inc. Robotic systems and methods for navigation of luminal network that detect physiological noise
US10898275B2 (en) 2018-05-31 2021-01-26 Auris Health, Inc. Image-based airway analysis and mapping
US12076100B2 (en) 2018-09-28 2024-09-03 Auris Health, Inc. Robotic systems and methods for concomitant endoscopic and percutaneous medical procedures
US12478444B2 (en) 2019-03-21 2025-11-25 The Board Of Trustees Of The Leland Stanford Junior University Systems and methods for localization based on machine learning
US11147633B2 (en) 2019-08-30 2021-10-19 Auris Health, Inc. Instrument image reliability systems and methods
US11944422B2 (en) 2019-08-30 2024-04-02 Auris Health, Inc. Image reliability determination for instrument localization
US11207141B2 (en) 2019-08-30 2021-12-28 Auris Health, Inc. Systems and methods for weight-based registration of location sensors
US12257006B2 (en) 2019-09-03 2025-03-25 Auris Health, Inc. Electromagnetic distortion detection and compensation
US11324558B2 (en) 2019-09-03 2022-05-10 Auris Health, Inc. Electromagnetic distortion detection and compensation
US11864848B2 (en) 2019-09-03 2024-01-09 Auris Health, Inc. Electromagnetic distortion detection and compensation
US11660147B2 (en) 2019-12-31 2023-05-30 Auris Health, Inc. Alignment techniques for percutaneous access
US11298195B2 (en) 2019-12-31 2022-04-12 Auris Health, Inc. Anatomical feature identification and targeting
US12414823B2 (en) 2019-12-31 2025-09-16 Auris Health, Inc. Anatomical feature tracking
US12465431B2 (en) 2019-12-31 2025-11-11 Auris Health, Inc. Alignment techniques for percutaneous access
US11602372B2 (en) 2019-12-31 2023-03-14 Auris Health, Inc. Alignment interfaces for percutaneous access
US12220150B2 (en) 2019-12-31 2025-02-11 Auris Health, Inc. Aligning medical instruments to access anatomy
US12414686B2 (en) 2020-03-30 2025-09-16 Auris Health, Inc. Endoscopic anatomical feature tracking

Also Published As

Publication number Publication date
CN108882967A (zh) 2018-11-23
JP7232051B2 (ja) 2023-03-02
CN120732543A (zh) 2025-10-03
EP3435904A1 (en) 2019-02-06
US20250152267A1 (en) 2025-05-15
US20190105112A1 (en) 2019-04-11
JP2019512354A (ja) 2019-05-16

Similar Documents

Publication Publication Date Title
US20250152267A1 (en) Image guided robot for catheter placement
US12226168B2 (en) Systems and methods for registration of location sensors
US20240215856A1 (en) Skeleton model instrument localization
US12433696B2 (en) Tool positioning for medical instruments with working channels
JP7167030B2 (ja) Medical navigation system using a shape-sensing device and method of operation thereof
US20230181204A1 (en) Basket apparatus
US11534249B2 (en) Process for percutaneous operations
AU2018347893B2 (en) Robotic system configured for navigation path tracing
KR102683476B1 (ko) Image-based branch detection and mapping for navigation
US20210393338A1 (en) Medical instrument driving
JP2022502179A (ja) Systems and methods for endoscope-assisted percutaneous medical procedures
WO2021038469A1 (en) Systems and methods for weight-based registration of location sensors
JP2023515420A (ja) Methods and systems for catheter target locking

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2018551379

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2017714419

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2017714419

Country of ref document: EP

Effective date: 20181031

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17714419

Country of ref document: EP

Kind code of ref document: A1

WWW Wipo information: withdrawn in national office

Ref document number: 2017714419

Country of ref document: EP