IL297557A - A training system for a neural network to guide a robotic arm to operate a catheter - Google Patents
A training system for a neural network to guide a robotic arm to operate a catheter
- Publication number: IL297557A
- Authority: IL (Israel)
- Prior art keywords
- probe
- training
- sensors
- handpiece
- distal end
- Prior art date
Classifications
- G16H40/20 — ICT for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
- A61B34/20 — Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B34/32 — Surgical robots operating autonomously
- A61B34/74 — Manipulators with manual electric input means
- A61B5/05 — Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; measuring using microwaves or radio waves
- A61B5/062 — Determining position of a probe within the body using magnetic field
- A61B5/063 — Determining position of a probe within the body using impedance measurements
- A61B5/11 — Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1124 — Determining motor skills
- A61B5/7264 — Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267 — Classification of physiological signals or data involving training the classification device
- A61B5/742 — Details of notification to user or communication with user or patient using visual displays
- B25J9/161 — Programme controls characterised by the control system: hardware, e.g. neural networks, fuzzy logic, interfaces, processor
- B25J9/163 — Programme controls characterised by the control loop: learning, adaptive, model based, rule based expert control
- G06N3/08 — Neural networks: learning methods
- G16H40/63 — ICT for the operation of medical equipment or devices for local operation
- A61B2017/00243 — Type of minimally invasive operation: cardiac
- A61B2034/2051 — Tracking techniques: electromagnetic tracking systems
- A61B2034/2059 — Tracking techniques: mechanical position encoders
- A61B2034/301 — Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
- G05B2219/33027 — Artificial neural network controller
- G05B2219/39001 — Robot, manipulator control
Description
BIO6578USNP

A TRAINING SYSTEM FOR A NEURAL NETWORK TO GUIDE A ROBOTIC ARM TO OPERATE A CATHETER

CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. Provisional Patent Application 63/271,270, filed in October 2021, whose disclosure is incorporated herein by reference.
FIELD OF THE INVENTION

This invention relates generally to robotic surgery, and specifically to automatic guiding and maneuvering of a catheter coupled to a robotic arm by a neural network (NN).
BACKGROUND OF THE INVENTION

Controlling medical robotic systems was previously suggested in the patent literature. For example, U.S. Patent Application Publication 2014/0046128 describes a control method that may be applied to a surgical robot system including a slave robot having a robot arm to which a main surgical tool and an auxiliary surgical tool are coupled, and a master robot having a master manipulator to manipulate the robot arm. The control method includes acquiring data regarding a motion of the master manipulator, predicting a basic motion to be performed by an operator based on the acquired motion data and results of learning a plurality of motions constituting a surgical task, and adjusting the auxiliary surgical tool so as to correspond to the operator basic motion based on the predicted basic motion. The control method allows an operator to perform surgery more comfortably and to move or fix all required surgical tools to or at an optimized surgical position.
As another example, U.S. Patent Application Publication 2021/0121251 describes an apparatus for robotic surgery that comprises a processor configured with instructions to receive patient data from a plurality of treated patients, receive surgical robotics data for each of the plurality of treated patients, and output a treatment plan of a patient to be treated in response to the patient data and the surgical robotics data. This approach has the advantage of accommodating individual variability among patients and surgical system parameters so as to provide improved treatment outcomes.

U.S. Patent Application Publication 2020/02974 describes certain aspects related to systems and techniques for localizing and/or navigating a medical instrument within a luminal network. A medical system can include an elongate body configured to be inserted into the luminal network, as well as an imaging device positioned on a distal portion of the elongate body. The system may include memory and processors configured to receive from the imaging device image data that includes an image captured when the elongate body is within the luminal network. The image can depict one or more branchings of the luminal network. The processor can be configured to access a machine learning model of one or more luminal networks and determine, based on the machine learning model and information regarding the one or more branchings, a location of the distal portion of the elongate body within the luminal network.

The present disclosure will be more fully understood from the following detailed description of the embodiments thereof, taken together with the drawings, in which:

BRIEF DESCRIPTION OF THE DRAWINGS

Fig. 1 is a schematic, pictorial illustration of a training system for a neural network (NN) to guide a robotic arm to operate a cardiac catheter, in accordance with an embodiment of the present invention; and

Fig. 2 is a combined block diagram and flow chart that together schematically illustrate a method and algorithm for training and using an NN to guide a robotic arm to operate a catheter, according to an embodiment of the present invention.
DETAILED DESCRIPTION OF EMBODIMENTS

OVERVIEW

A probe, e.g., a catheter, such as a cardiac catheter, is typically manipulated manually by a physician operating a handpiece (e.g., a handle equipped with controls such as knobs and actuators) in order to position the probe correctly inside an organ such as a heart. For example, to diagnose and treat an arrhythmia, a physician may need to insert a multi-electrode catheter into the heart of the patient, measure intracardiac signals via the catheter, and apply ablation according to the findings. To this end, the physician has to be highly skilled in operating the catheter, while at the same time working under heavy workload conditions that limit physician availability.

To increase the availability of catheterization procedures in general, and cardiac catheter-based procedures in particular, some embodiments of the disclosed technique envision catheter manipulation performed by a robotic arm operated by a neural network (NN) that translates high-level requests from a physician for a particular catheter operation (e.g., specifying a target anatomical structure to be approached) into actions by the robotic arm operating the handpiece. Such hands-free robotic control requires a set of NN training data, which is currently absent.

Embodiments of the present invention that are described hereafter provide methods and systems for training machine learning (e.g., NN) or other artificial intelligence (AI) models to control a robotic arm that manipulates a catheter, so as to robotically perform an invasive clinical catheter-based procedure. Some embodiments of the invention provide one or more training setups, each training setup comprising a set of sensors attached to the handpiece of a probe (e.g., catheter) used by the physician performing an invasive procedure.
The sensors record the motions of the handpiece, i.e., as its operation causes a distal end of the catheter to be translated, directed, rotated and locked in place. The sensors also track the activation and operation of elements of the handpiece by the physician, e.g., a knob used to deflect the distal end of the catheter. The term "handpiece" generally refers hereinafter to a proximal section of a probe handle provided with controls, but the "handpiece" may also include a proximal section of a shaft and/or a sheath of the probe in the vicinity of the handle, which the physician may manipulate as well. As the handpiece is manipulated by the physician, the readouts of the sensors are recorded, as are the corresponding locations of the catheter from the catheter sensors.
The one or more training setups of the disclosed training system are used by one or more (typically many) physicians, and the training data acquired from each physician's actions is recorded and assembled into a set of data used to train (e.g., teach) an NN. The sensors may be numerous and of different types, such as magnetic sensors, electrical sensors, "encoding" sensors (i.e., those that use an encoder), and more. A detailed embodiment of such a training system with its particular sensors is described in Fig. 1.

Training targets are clinical actions, such as arriving at a target location, and NN optimization (e.g., as defined by a loss function) can be achieved by reaching a target using a minimal number of catheter actions performed via the handpiece, or using a minimal amount of spatial movement of the distal end in each operational step. As an example, a catheter task is assumed to be anatomically mapping an atrium volume of 8×8×8 cm with a resolution of 2 mm, i.e., comprising 64,000 voxels. The operating physician wants the cardiac catheter to move, and provides two voxel numbers to the net: a start position (e.g., voxel 15) and an end position (e.g., voxel 2050). In this embodiment, the NN has to be trained to find the closest movement, or series of movements, from a given start location to a given end location (for example, it may find a double movement from a location (e.g., position, direction and roll angle) at voxel 15 to voxel 350, and then from 350 to 2050, or a single movement from voxel 15 to 2050). Once the NN chooses the closest (i.e., minimal) movement(s), it outputs commands to the robot corresponding to motions of the handpiece, i.e., translation (e.g., advancement/retraction), deflection, and rotation of the distal end of the catheter's shaft using controls on the handpiece.
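The minimal-movement criterion described above can be sketched as follows. This is an illustrative sketch only, not the patent's actual optimization; the candidate-sequence representation and all function names are assumptions made for illustration.

```python
# Among candidate action sequences that reach the target voxel, prefer the one
# with the fewest handpiece operations, per the training target described above.

def count_actions(sequence):
    """Number of discrete handpiece operations in a candidate sequence."""
    return len(sequence)

def choose_minimal(candidates, target_voxel):
    """Pick the target-reaching sequence with the fewest actions (illustrative)."""
    reaching = [seq for seq in candidates
                if seq and seq[-1]["end_voxel"] == target_voxel]
    if not reaching:
        return None
    return min(reaching, key=count_actions)

# The example from the text: a single movement from voxel 15 to 2050 is
# preferred over the double movement 15 -> 350 -> 2050.
single = [{"start_voxel": 15, "end_voxel": 2050}]
double = [{"start_voxel": 15, "end_voxel": 350},
          {"start_voxel": 350, "end_voxel": 2050}]
best = choose_minimal([single, double], target_voxel=2050)
```

In this sketch, `best` is the single-movement sequence, mirroring the text's preference for the closest (minimal) movement.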
To have the above motions commanded by the NN accurately and consistently enough to fit varying patient anatomies, at least several tens of NN training data sets are envisaged to be collected. Since hundreds of such procedures may be performed over a year, the training systems can readily accumulate a very extensive and robust training data cohort.
SYSTEM DESCRIPTION

Fig. 1 is a schematic, pictorial illustration of a training system 20 for a neural network (NN) to guide a robotic arm to operate a cardiac catheter, in accordance with an embodiment of the present invention. Fig. 1 depicts a physician 30 using a multi-arm Pentaray® catheter 21 to perform mapping and/or ablation of a heart 26 of a patient on a surgical table 29. As inset 25 shows, catheter 21 comprises, at its distal end, a multi-arm distal end assembly 40 coupled with electrodes 42. During the catheterization procedure, electrodes 42 acquire and/or inject electrical signals from and/or to the tissue of heart 26.

Distal end assembly 40 is further coupled with a magnetic location sensor 50, which is configured to output signals indicative of distal end assembly 40's position, direction and roll angle inside heart 26. Using sensor 50, a processor 34 in a console 24 can therefore determine the position, direction and roll angle of the distal end assembly. For this purpose, console 24 further comprises a driver circuit 39, which drives magnetic field generators 36 placed at known positions external to patient 28, e.g., below the patient's torso. Physician 30 can view the location of distal end assembly 40 in an image 33 of heart 26 on a user display 32. The method of magnetic location sensing is implemented, for example, in the CARTO™ system, produced by Biosense Webster, and is described in detail in U.S. Pat. Nos. 5,391,199, 6,690,963, 6,484,118, 6,239,724, 6,618,612 and 6,332,089, in PCT Patent Publication WO 96/05768, and in U.S. Patent Application Publications 2002/0065455 A1, 2003/0120150 A1 and 2004/0068178 A1, whose disclosures are all incorporated herein by reference.

Further to magnetic location tracking, during the procedure an electrical location tracking system may be used to track the respective locations of electrodes 42, by associating each electrode-acquired location signal with the cardiac location at which the signal was acquired.
For example, the Active Current Location (ACL) system, made by Biosense Webster (Irvine, California), which is described in U.S. Patent 8,456,182, whose disclosure is incorporated herein by reference, may be used. In the ACL system, a processor estimates the respective locations of the electrodes based on impedances measured between each of the catheter electrodes 42 and a plurality of surface electrodes (not shown) that are coupled to the skin of patient 28.

Physician 30 navigates distal end assembly 40, which is mounted on a shaft 22, to a target location in heart 26 by holding catheter handle 31 with a right hand and manipulating shaft 22 by advancing the shaft with a left hand, or by using a thumb control 43 near the proximal end of handle 31 and/or deflection from sheath 23. As inset 45 shows, handle 31 further comprises a knob having a lock button 37, which physician 30 also uses for manipulating shaft 22.

A set of sensors 52-60, described below, is coupled to catheter 21 in various locations to track the physician's operations. In particular, multiple sensors 52-60 record the motions of handle 31, of a proximal section of sheath 23, and of shaft 22, i.e., as physician 30 translates and rotates the catheter during the catheterization procedure. The above-described operation of catheter 21 by physician 30 is collectively named hereinafter "manipulating the handpiece," and the purpose of system 20 is to generate location data that enables training of an NN to be used by a processor to guide a catheter using a robotic arm, as described in Fig. 2. By way of example, the physician activates elements of the handpiece, e.g., knob 27, used to deflect (44) (i.e., redirect by variably curving) a distal end of shaft 22 of the catheter, with a button to lock it in place, and thumb control 43, used to advance or retract (62) shaft 22 as well as rotate (64) shaft 22.
As the handpiece is manipulated by the physician, the readouts of sensors 52-60 are recorded, as are the corresponding locations of the catheter (from sensor 50 on the catheter and/or from the Active Current Location (ACL) readouts). In the shown embodiment, sensor 52 is a magnetic sensor configured to indicate the position, direction and roll angle of handle 31. Sensor 54 is an encoding sensor configured to indicate the amount and direction of knob 27 rotation. Sensor 56 is configured to indicate the timings and durations of use of push button 37. Sensor 58 is a magnetic sensor configured to indicate the position, direction and roll angle of a proximal section of sheath 23. Sensor 60 is an encoding sensor configured to indicate the amount and direction of shaft 22 distal or proximal motion 62. Sensor 59 is an encoding sensor configured to indicate the amount and direction of shaft 22 rotation 64 about its longitudinal axis. The entire recorded catheter location information is stored in a memory 38 in console 24, to be used during training of an NN, as described in Fig. 2.

The example illustration shown in Fig. 1 is chosen purely for the sake of conceptual clarity. Additional or alternative locations of sensors 52-60 may be used. Other types of sensors and catheters may equivalently be employed during training. The sensors can sense one or more actions of actuators located at the handpiece. Tissue-contact sensors may be fitted at distal end assembly 40. In an embodiment, processor 34 is further configured to indicate the quality of physical contact between each of the electrodes 42 and an inner surface of the cardiac tissue during measurement.

Processor 34 typically comprises a general-purpose computer with software programmed to carry out the functions described herein. In particular, processor 34 runs a dedicated algorithm as disclosed herein, including in Fig. 2, that enables processor 34 to perform the disclosed steps, as further described below.
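One time step of the recorded training data, pairing the handpiece-sensor readouts (sensors 52-60) with the catheter-location readout (sensor 50), might be organized as sketched below. The field names, types, and sample values are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass, asdict

@dataclass
class HandpieceSample:
    """One illustrative record of handpiece operation and resulting location."""
    handle_pose: tuple      # sensor 52: position, direction, roll angle of handle 31
    knob_rotation: float    # sensor 54: amount/direction of knob 27 rotation
    lock_pressed: bool      # sensor 56: whether push button 37 is engaged
    sheath_pose: tuple      # sensor 58: pose of the proximal sheath section
    shaft_advance: float    # sensor 60: distal/proximal shaft motion 62
    shaft_rotation: float   # sensor 59: shaft rotation 64 about its axis
    distal_location: tuple  # sensor 50: pose of distal end assembly 40 in the heart

# Hypothetical sample values for one recorded time step.
sample = HandpieceSample(
    handle_pose=(0.0, 0.0, 0.0),
    knob_rotation=12.5,
    lock_pressed=False,
    sheath_pose=(0.1, 0.0, 0.0),
    shaft_advance=3.0,
    shaft_rotation=-5.0,
    distal_location=(42.0, 13.0, 7.0),
)
record = asdict(sample)  # flat dict, e.g., as it might be stored for training
```

A sequence of such records, accumulated over many procedures, would correspond to the training data stored in memory 38.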
The software may be downloaded to the computer in electronic form, over a network, for example, or it may, alternatively or additionally, be provided and/or stored on non-transitory tangible media, such as magnetic, optical, or electronic memory.
TRAINING AN NN TO GUIDE A ROBOTIC ARM TO OPERATE A CATHETER

Fig. 2 is a combined block diagram and flow chart that together schematically illustrate a method and algorithm for training and using an NN 250 to guide a robotic arm to operate a catheter, according to an embodiment of the present invention. The combined block diagram and flow chart is divided into two phases: a training phase 201 and an operational phase 202.

Training phase 201 involves collection of training data from multiple training systems, such as system 20 of Fig. 1, and/or from multiple procedures performed on a system such as training system 20. The recorded training data comprises two corresponding data sets:

- a first data set 236, from the probe handpiece sensors 233 (e.g., the handpiece of catheter 21 of Fig. 1 with sensors 52-60); and
- a second data set 242, from catheter location readout 240 of the distal end of the probe inside an organ (e.g., the respective position, direction and roll angle of distal end assembly 40 of catheter 21 inside heart 26).

Multiple data sets 236 and 242 are stored in a memory, such as memory 38 of system 20, to be used for training NN 250 to become a trained NN 255.

Operational phase 202 involves using a trained NN in a system that comprises a probe coupled to a robotic arm. Such a system is described, for example, in U.S. Patent 8,046,049, which describes an apparatus for use with a steerable catheter that includes a thumb control which can be adapted to control, for example, movement of a shaft or a deflection of a distal tip of the catheter. The apparatus includes a robot with an end-effector adapted to be coupled to the thumb control, such as thumb control 43 or knob 27, to advance/retract and/or deflect the distal tip, such as distal end assembly 40. U.S. Patent 8,046,049 is assigned to the assignee of the current patent application, and its disclosure is incorporated herein by reference.

As seen in Fig. 2, during operational phase 202, the physician communicates a desired catheter motion 260. For example, a physician, such as physician 30, may command mapping of a cardiac chamber and/or ablation therein (e.g., of an ostium of a pulmonary vein). NN 250, which was trained (i.e., taught 248) using a collection 238 of data sets from numerous cardiac chamber mapping and/or ablation sessions, converts the high-level instruction 260 into actual commands 270 to the robotic arm, to manipulate the catheter as learned.

The example combined block diagram and flow chart shown in Fig. 2 is chosen purely for the sake of conceptual clarity. The present embodiment may also comprise additional algorithm steps, such as confirmation-request steps. These and other possible steps are purposely omitted from the disclosure herein in order to provide a more simplified figure. Although the embodiments described herein mainly address cardiac catheter-based applications, the methods and systems described herein can also be used in training any medical robotic arm to operate a probe coupled to the robotic arm.
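The flow of operational phase 202, in which a high-level instruction is converted into low-level robot commands, can be sketched as below. The stand-in model, the command fields, and all names are assumptions for illustration; the patent's trained NN 255 would replace the stand-in.

```python
# Illustrative sketch of operational phase 202: a trained model maps a
# high-level motion request (start and end voxel) to handpiece-level commands
# for the robotic arm. The model here is a placeholder, not a trained NN.

def placeholder_model(start_voxel, end_voxel):
    """Stand-in for trained NN 255: returns one set of handpiece motions."""
    return [{"advance_mm": 4.0, "deflect_deg": 15.0, "rotate_deg": -10.0}]

def command_robot(start_voxel, end_voxel, model=placeholder_model):
    """Translate a physician's high-level request into robot commands.

    Each returned command corresponds to one handpiece motion (translation,
    deflection, rotation), cf. commands 270 of Fig. 2.
    """
    return model(start_voxel, end_voxel)

# High-level request, using the start/end voxels from the text's example.
cmds = command_robot(start_voxel=15, end_voxel=2050)
```

In a full system, each command would be executed by the robot's end-effector acting on the handpiece controls, with confirmation steps interposed as needed.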
EXAMPLES OF EMBODIMENTS

Example 1

An embodiment of the present invention that is described hereinafter provides a training system (20) including one or more training setups and a processor (34). Each of the one or more training setups includes a probe (21) for insertion into an organ, the probe including (i) multiple sensors (52-60) located at a handpiece (31) of the probe, the sensors configured to sense probe operation by a physician performing an invasive procedure using the probe, and (ii) a distal sensor (50) located at a distal end of the probe (21) and configured to indicate a position of the distal end inside the organ corresponding to the probe operation. The processor (34) is configured to (a) receive, from the one or more training setups, probe operation data acquired using the multiple sensors (52-60) and the respective distal sensor (50) of each training setup, and (b) using the probe operation data, train a machine learning (ML) model (250) to guide a robotic arm to operate the probe to perform the invasive procedure.

Example 2

The training system according to example 1, wherein the sensors (52-60) at the handpiece (31) are configured to sense at least one probe operation type selected from a group of types consisting of a position adjustment, a direction adjustment and a roll angle adjustment of the distal end of the probe (21).

Example 3

The training system according to any of examples 1 and 2, wherein the sensors (52-60) at the handpiece (31) are configured to sense at least one probe operation type selected from a group of types consisting of advancing, retracting, deflecting and rotating of the distal end of a shaft of the probe (21).

Example 4

The training system according to any of examples 1 through 3, wherein the sensors (52-60) at the handpiece (31) are configured to sense the probe operation by sensing one or more actions of actuators located at the handpiece.
Example 5

The training system according to example 4, wherein the actuators comprise at least one of a distal section of a sheath of the probe, a thumb control, a knob, and a lock button.
Example 6

The training system according to any of examples 1 through 5, wherein the sensors at the handpiece and the distal sensor are one of magnetic, electric and encoding sensors.

Example 7

The training system according to any of examples 1 through 6, wherein the processor (34) is configured to train the ML model (250) to find a minimal number of movements from a given start location in the organ to a given end location of the distal end of the probe in the organ.

Example 8

The training system according to any of examples 1 through 7, wherein the ML model (250) is a neural network (NN).

Example 9

A training method includes, in each of one or more training setups, inserting a probe (21) into an organ, the probe including multiple sensors (52-60) located at a handpiece (31) of the probe, the sensors configured to sense probe operation by a physician performing an invasive procedure using the probe. The probe (21) further includes a distal sensor (50) located at a distal end of the probe and configured to indicate a position of the distal end inside the organ corresponding to the probe operation. Probe operation data is received from the one or more training setups, the data acquired using the multiple sensors (52-60) and the respective distal sensor (50) of each training setup. Using the probe operation data, a machine learning (ML) model (250) is trained to guide a robotic arm to operate the probe (21) to perform the invasive procedure.

It will thus be appreciated that the embodiments described above are cited by way of example, and that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and sub-combinations of the various features described hereinabove, as well as variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description and which are not disclosed in the prior art.
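The "minimal number of movements" objective described above can be illustrated with a toy search. This is a hypothetical sketch assuming a discretized one-dimensional movement space; the function name `minimal_moves`, the move set, and the bounds are invented for illustration and do not appear in the disclosure. In such a setting, the shortest command sequence between two locations could serve as a supervision target for the ML model.

```python
from collections import deque

# Toy discretization of probe motion along one axis (hypothetical).
MOVES = {"advance": 1, "retract": -1}

def minimal_moves(start: int, goal: int) -> list:
    """Breadth-first search for the fewest moves from start to goal.

    BFS explores positions in order of path length, so the first time the
    goal is dequeued, the accumulated path is a minimal-length sequence.
    """
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        pos, path = queue.popleft()
        if pos == goal:
            return path
        for name, delta in MOVES.items():
            nxt = pos + delta
            if nxt not in seen and -10 <= nxt <= 10:  # stay within toy workspace
                seen.add(nxt)
                queue.append((nxt, path + [name]))
    return []  # goal unreachable within bounds

print(minimal_moves(0, 3))  # ['advance', 'advance', 'advance']
```

A real system would search over a 3-D anatomy-constrained space with richer actions (deflect, rotate), but the principle of labeling each start/end pair with a shortest action sequence carries over.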
Claims (16)
1. A training system, comprising: one or more training setups, each training setup comprising: a probe for insertion into an organ, the probe comprising: multiple sensors located at a handpiece of the probe, the sensors configured to sense probe operation by a physician performing an invasive procedure using the probe; and a distal sensor located at a distal end of the probe and configured to indicate a position of the distal end inside the organ corresponding to the probe operation; and a processor, configured to: receive, from the one or more training setups, probe operation data acquired using the multiple sensors and the respective distal sensor of each training setup; and using the probe operation data, train a machine learning (ML) model to guide a robotic arm to operate the probe to perform the invasive procedure.
2. The training system according to claim 1, wherein the sensors at the handpiece are configured to sense at least one probe operation type selected from a group of types consisting of a position adjustment, a direction adjustment and a roll angle adjustment of the distal end of the probe.
3. The training system according to claim 1, wherein the sensors at the handpiece are configured to sense at least one probe operation type selected from a group of types consisting of advancing, retracting, deflecting and rotating of the distal end of a shaft of the probe.
4. The training system according to claim 1, wherein the sensors at the handpiece are configured to sense the probe operation by sensing one or more actions of actuators located at the handpiece.
5. The training system according to claim 4, wherein the actuators comprise at least one of a distal section of a sheath of the probe, a thumb control, a knob, and a lock button.
6. The training system according to claim 1, wherein the sensors at the handpiece and the distal sensor are one of magnetic, electric and encoding sensors.
7. The training system according to claim 1, wherein the processor is configured to train the ML model to find a minimal number of movements from a given start location in the organ to a given end location of the distal end of the probe in the organ.
8. The training system according to claim 1, wherein the ML model is a neural network (NN).
9. A training method, comprising: in each of one or more training setups, inserting a probe into an organ, the probe comprising: multiple sensors located at a handpiece of the probe, the sensors configured to sense probe operation by a physician performing an invasive procedure using the probe; and a distal sensor located at a distal end of the probe and configured to indicate a position of the distal end inside the organ corresponding to the probe operation; receiving, from the one or more training setups, probe operation data acquired using the multiple sensors and the respective distal sensor of each training setup; and using the probe operation data, training a machine learning (ML) model to guide a robotic arm to operate the probe to perform the invasive procedure.
10. The training method according to claim 9, wherein the sensors at the handpiece are configured to sense at least one probe operation type selected from a group of types consisting of a position adjustment, a direction adjustment and a roll angle adjustment of the distal end of the probe.
11. The training method according to claim 9, wherein the sensors at the handpiece are configured to sense at least one probe operation type selected from a group of types consisting of advancing, retracting, deflecting and rotating of the distal end of a shaft of the probe.
12. The training method according to claim 9, wherein the sensors at the handpiece are configured to sense the probe operation by sensing one or more actions of actuators located at the handpiece.
13. The training method according to claim 12, wherein the actuators comprise at least one of a distal section of a sheath of the probe, a thumb control, a knob, and a lock button.
14. The training method according to claim 9, wherein the sensors at the handpiece and the distal sensor are one of magnetic, electric and encoding sensors.
15. The training method according to claim 9, wherein training the ML model comprises training the ML model to find a minimal number of movements from a given start location in the organ to a given end location of the distal end of the probe in the organ.
16. The training method according to claim 9, wherein the ML model is a neural network (NN).
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202163271270P | 2021-10-25 | 2021-10-25 | |
US17/965,061 US20230128764A1 (en) | 2021-10-25 | 2022-10-13 | Training system for a neural network to guide a robotic arm to operate a catheter |
Publications (1)
Publication Number | Publication Date |
---|---|
IL297557A true IL297557A (en) | 2023-05-01 |
Family
ID=83994986
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
IL297557A IL297557A (en) | 2021-10-25 | 2022-10-23 | A training system for a neural network to guide a robotic arm to operate a catheter |
Country Status (5)
Country | Link |
---|---|
US (1) | US20230128764A1 (en) |
EP (1) | EP4169444A1 (en) |
JP (1) | JP2023064087A (en) |
CN (1) | CN116030126A (en) |
IL (1) | IL297557A (en) |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5391199A (en) | 1993-07-20 | 1995-02-21 | Biosense, Inc. | Apparatus and method for treating cardiac arrhythmias |
CA2607769C (en) | 1994-08-19 | 2012-04-24 | Biosense, Inc. | Medical diagnosis, treatment and imaging systems |
US6690963B2 (en) | 1995-01-24 | 2004-02-10 | Biosense, Inc. | System for determining the location and orientation of an invasive medical instrument |
ES2210498T3 (en) | 1996-02-15 | 2004-07-01 | Biosense, Inc. | POSITIONABLE TRANSDUCERS INDEPENDENTLY FOR LOCATION SYSTEM. |
CA2246287C (en) | 1996-02-15 | 2006-10-24 | Biosense, Inc. | Medical procedures and apparatus using intrabody probes |
US6239724B1 (en) | 1997-12-30 | 2001-05-29 | Remon Medical Technologies, Ltd. | System and method for telemetrically providing intrabody spatial position |
US6484118B1 (en) | 2000-07-20 | 2002-11-19 | Biosense, Inc. | Electromagnetic position single axis system |
US7729742B2 (en) | 2001-12-21 | 2010-06-01 | Biosense, Inc. | Wireless position sensor |
US20040068178A1 (en) | 2002-09-17 | 2004-04-08 | Assaf Govari | High-gradient recursive locating system |
US8046049B2 (en) | 2004-02-23 | 2011-10-25 | Biosense Webster, Inc. | Robotically guided catheter |
US8456182B2 (en) | 2008-09-30 | 2013-06-04 | Biosense Webster, Inc. | Current localization tracker |
KR101997566B1 (en) | 2012-08-07 | 2019-07-08 | 삼성전자주식회사 | Surgical robot system and control method thereof |
EP3810019A4 (en) | 2018-06-21 | 2022-03-23 | PROCEPT BioRobotics Corporation | Artificial intelligence for robotic surgery |
US20200297444A1 (en) | 2019-03-21 | 2020-09-24 | The Board Of Trustees Of The Leland Stanford Junior University | Systems and methods for localization based on machine learning |
-
2022
- 2022-10-13 US US17/965,061 patent/US20230128764A1/en active Pending
- 2022-10-23 IL IL297557A patent/IL297557A/en unknown
- 2022-10-24 JP JP2022169712A patent/JP2023064087A/en active Pending
- 2022-10-24 EP EP22203167.6A patent/EP4169444A1/en not_active Withdrawn
- 2022-10-25 CN CN202211309498.0A patent/CN116030126A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
EP4169444A1 (en) | 2023-04-26 |
CN116030126A (en) | 2023-04-28 |
US20230128764A1 (en) | 2023-04-27 |
JP2023064087A (en) | 2023-05-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108720921B (en) | Automatic tracking and adjustment of viewing angle during catheter ablation process | |
US9204935B2 (en) | Robotic surgical system and method for diagnostic data mapping | |
US7974674B2 (en) | Robotic surgical system and method for surface modeling | |
KR102425170B1 (en) | Systems and methods for filtering localization data | |
KR102354675B1 (en) | Systems and methods for medical procedure confirmation | |
CA2497204C (en) | Robotically guided catheter | |
US6298257B1 (en) | Cardiac methods and system | |
KR20210005901A (en) | Systems and methods related to elongated devices | |
Elek et al. | Robotic platforms for ultrasound diagnostics and treatment | |
EP4216860A1 (en) | Haptic feedback for aligning robotic arms | |
WO2022064369A1 (en) | Haptic feedback for aligning robotic arms | |
US20230128764A1 (en) | Training system for a neural network to guide a robotic arm to operate a catheter | |
US20220265230A1 (en) | Device for moving a medical object and method for providing a correction preset |