WO2022254436A1 - Closed-loop steering of a medical instrument toward a moving target

Closed-loop steering of a medical instrument toward a moving target

Info

Publication number
WO2022254436A1
Authority
WO
WIPO (PCT)
Prior art keywords
trajectory
target
model
medical instrument
data
Application number
PCT/IL2022/050581
Other languages
French (fr)
Inventor
Nir Shachar
Oz MOSKOVICH
Danna Perlman
Alon OHEV-ZION
Moran Shochat
Ido ROTH
Original Assignee
Xact Robotics Ltd.
Application filed by Xact Robotics Ltd.
Publication of WO2022254436A1

Classifications

    • A61B34/20 Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B34/25 User interfaces for surgical systems
    • A61B34/35 Surgical robots for telesurgery
    • A61B90/361 Image-producing devices, e.g. surgical cameras
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2017/00017 Electrical control of surgical instruments
    • A61B2017/00115 Electrical control of surgical instruments with audible or visual output
    • A61B2017/00203 Electrical control of surgical instruments with speech control or speech recognition
    • A61B2017/003 Steerable instruments for minimally invasive surgery, mounted on or guided by flexible, e.g. catheter-like, means
    • A61B2034/101 Computer-aided simulation of surgical operations
    • A61B2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A61B2034/107 Visualisation of planned trajectories or target regions
    • A61B2034/2046 Tracking techniques
    • A61B2034/2065 Tracking using image or pattern recognition
    • A61B2034/301 Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
    • A61B2090/064 Measuring instruments for measuring force, pressure or mechanical tension
    • A61B2090/374 Surgical systems with images on a monitor during operation using NMR or MRI
    • A61B2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B2090/3762 Surgical systems with images on a monitor during operation using computed tomography systems [CT]
    • A61B2090/3764 Surgical systems with images on a monitor during operation using CT with a rotating C-arm having a cone beam emitting source
    • A61B2090/378 Surgical systems with images on a monitor during operation using ultrasound

Definitions

  • the present invention relates to methods, devices and systems for closed-loop steering of a medical instrument toward a moving target. More specifically, the present invention relates to real-time target tracking and steering of a medical instrument to facilitate the medical instrument reaching the target at a predicted target location within the body of a subject. Even more specifically, the present invention relates to tracking the target and predicting the end-point location of the target within the subject’s body, to facilitate the medical instrument reaching the target at its predicted end-point location, by steering the medical instrument according to a corresponding trajectory updated in real-time.
  • Various diagnostic and therapeutic procedures used in clinical practice involve the insertion of medical tools, such as needles and catheters, percutaneously into a subject’s body, and in many cases further involve the steering of the medical tools within the body to reach a target region.
  • the target region can be any internal body region, including, a lesion, tumor, organ or vessel.
  • procedures requiring insertion and steering of such medical tools include vaccinations, blood/fluid sampling, regional anesthesia, tissue biopsy, catheter insertion, cryogenic ablation, electrolytic ablation, brachytherapy, neurosurgery, deep brain stimulation, various minimally invasive surgeries, and the like.
  • Some automated insertion systems are based on manipulating robotic arms and some utilize a robotic device which can be attached to the patient’s body or positioned in close proximity thereto. These automated systems typically assist the physician in aligning the medical instrument with a selected insertion point at a desired location and the insertion itself is carried out manually by the physician.
  • Some automated systems further include an insertion mechanism that can insert the medical instrument toward the target, typically in a linear manner. More advanced automated systems further include non-linear steering capabilities, as described, for example, in U.S. Patents Nos. 8,348,861, 8,663,130 and 10,507,067, and in co-owned U.S. Patent No. US 10,245,110, co-owned U.S. Patent Application Publication No. 2019/290,372, and co-owned International Patent Application No. PCT/IL2020/051219, all of which are incorporated herein by reference in their entireties.
  • inserting a medical instrument through soft tissue typically involves displacement and/or deformation of the tissue, including the target (e.g., lesion) to be reached by the instrument, due to the penetration of the instrument through the tissue, as well as due to the patient’s respiratory motion and other patient movements.
  • the present disclosure is directed to systems, devices and methods for automated insertion and steering of medical instruments/tools (for example, needles) in a subject’s body for diagnostic and/or therapeutic purposes, wherein the steering of the medical instrument within the body of a subject is according to a trajectory (for example, a 2D trajectory or a 3D trajectory) for the medical instrument (for example, for the end or tip thereof) within the body of the subject, wherein the trajectory is determined/calculated, inter alia, according to a predicted spatial-temporal end-point location of the target, to thereby allow safely and accurately reaching the target by the most efficient and safe route.
  • the systems, devices and methods disclosed herein allow detecting, tracking and predicting the location of the target during and/or at the end of the procedure ("end-point"), such that at the end of the steering of the medical instrument according to a corresponding trajectory, the actual location of the medical instrument (in particular, the end thereof) coincides with the location of the target within the body, to increase effectiveness, safety and accuracy of the medical procedure.
  • Automatic insertion and steering of medical instruments within the body, and in particular utilizing a trajectory which is determined, inter alia, based on the predicted end-point location of the target, is advantageous over manual insertion of such instrument within the body.
  • In this manner, the most effective and safe spatio-temporal route of the medical instrument to the end-point location of the target within the body is achieved.
  • a closed-loop steering according to a trajectory which takes into account the predicted location of the target (which is determined and/or updated as disclosed herein) increases safety, as it reduces the risk of harming non-target regions and tissues within the subject's body: the trajectory may take into account obstacles or any other regions along the route, and moreover, it may take into account changes in the real-time location of such obstacles, as well as tissue movements, during the procedure.
  • such automatic steering improves the accuracy of the procedure, which enables reaching small targets and/or targets located in areas of the body which are difficult to reach. This can be of particular importance in early detection of malignant neoplasms, for example. In addition, it provides increased safety for the patient, as there is a significantly lower risk of human error.
  • a procedure can be remotely controlled (e.g., from an adjacent control room or even from outside the medical facility), which is safer for the medical personnel, as it minimizes their radiation exposure during the procedure, as well as their exposure to any infectious diseases the patient may carry. Additionally, visualization of the planned and the executed trajectory toward the predicted location of the target vastly improves the user’s ability to supervise and control the medical procedure. Since the automated device can be controlled from a remote site, even from outside of the hospital, there is no longer a need for the physician to be present in the procedure room, according to some embodiments.
  • the automated medical devices are devices for insertion and steering of medical instruments (for example, needles, introducers or probes) in a subject’s body for various diagnostic and/or therapeutic purposes.
  • the automated insertion device may utilize real-time instrument and target position determination and real-time trajectory updating, as disclosed, for example, in abovementioned co-owned International Patent Application No. PCT/IL2020/051219.
  • the automated medical devices are configured to insert and steer/navigate a medical instrument (in particular, a tip of the medical instrument) in the body of the subject, to safely and accurately reach a target region within the subject’s body, to perform various medical procedures.
  • the operation of the medical devices may be controlled by at least one processor configured to provide instructions, in real-time, to steer the medical instrument toward the target, or more particularly, toward a predicted end-point location of the target, according to a planned and/or updated trajectory.
  • the steering may be controlled by the processor, via a suitable controller.
  • the steering may be controlled in a closed-loop manner, whereby the processor generates motion commands to the steering device via a suitable controller and receives feedback regarding the real-time location of the medical instrument and the target.
  • the processor(s) may be able to predict future locations and/or the movement pattern/profile of the target.
  • AI-based algorithm(s) may be used to predict the location and/or movement pattern of the target, of a tissue, and the like. As detailed herein, in some embodiments, AI-based algorithm(s) may be used to determine the trajectory to a predicted target location. In some embodiments, as detailed herein, AI-based algorithm(s) may be used to determine an optimized trajectory to a predicted target location. In some embodiments, the automated medical device may be configured to operate in conjunction with an imaging system. In some embodiments, the imaging system may include any type of imaging system, including, but not limited to: X-ray fluoroscopy, CT, cone beam CT, CT fluoroscopy, MRI, ultrasound, or any other suitable imaging modality.
  • the processor is configured to calculate a trajectory for the medical instrument based on a target, entry point and, optionally, “no-fly” zones, which include obstacles en route (such as bones or blood vessels), which may be manually marked by the user, or automatically identified by the processor, on one or more obtained images.
  • the determination of the target movement profile and/or predicted location of the target may utilize various algorithms (including artificial intelligence (AI) models, such as machine learning (ML) models, deep learning (DL) models, and the like) which take into account various parameters and variables that can affect or influence the movement profile of the target and/or the end-point target location, including, for example, but not limited to: medical procedure (for example, ablation, biopsy, etc.); medical instrument (for example, type, size, gauge, etc.); tissue characteristics (for example, elasticity, location, dimensions, structure, shape, etc.); target characteristics (for example, type, dimensions, shape, location, accessibility); patient specific parameters (for example, age, gender, weight, body structure, etc.); patient related parameters (position, respiration, etc.); trajectory related parameters (“no-fly” zones, checkpoints, length, etc.); and the like, or any combinations thereof.
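For illustration only, the sketch below shows how parameters of the kinds listed above might be assembled into a numeric feature vector for such a movement-profile model; the field names and encodings are hypothetical and are not taken from this disclosure.

```python
# Hypothetical sketch: structuring procedure/patient/target parameters into
# a feature vector that a movement-profile model could consume.
from dataclasses import dataclass, asdict

@dataclass
class MovementModelInputs:
    procedure_type: str          # e.g., "biopsy" or "ablation"
    instrument_gauge: int        # needle gauge
    tissue_elasticity_kpa: float
    target_diameter_mm: float
    target_depth_mm: float
    patient_age: int
    respiration_rate_bpm: float
    trajectory_length_mm: float

def to_feature_vector(inputs: MovementModelInputs) -> list:
    """Flatten the structured inputs into a numeric vector."""
    values = asdict(inputs)
    # Encode the categorical procedure type as a number (toy encoding).
    procedure_code = {"biopsy": 0.0, "ablation": 1.0}.get(values.pop("procedure_type"), 2.0)
    return [procedure_code] + [float(v) for v in values.values()]

features = to_feature_vector(MovementModelInputs("biopsy", 18, 4.5, 12.0, 85.0, 63, 14.0, 110.0))
```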
  • the systems and methods disclosed herein allow the determination of an optimal interception point of the target by the medical instrument (i.e., the optimal spatio-temporal location in which the medical instrument (for example, the tip thereof) reaches the target).
  • the term “location” can relate to spatial location, temporal location, or both spatial-temporal location.
  • the systems and methods disclosed herein may be operated automatically and/or semi-automatically (for example, with user confirmation and/or correction, if needed).
  • the systems and computer-implemented methods for estimating target movement and/or predicting target position, and the subsequent utilization of the prediction data to determine a suitable trajectory and steer the medical tool according to the determined trajectory toward the predicted target location may utilize specific algorithms which may be generated using machine learning tools, deep learning tools, data wrangling tools, and, more generally, AI and data analysis tools.
  • the specific algorithms may be implemented using artificial neural network(s) (ANN), such as convolutional neural network (CNN), recurrent neural network (RNN), long-short term memory (LSTM), auto-encoder (AE), generative adversarial network (GAN), Reinforcement-Learning (RL) and the like, as further detailed below.
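As a non-authoritative illustration of one such architecture, the following sketch (assuming PyTorch; the layer sizes and observation window are invented) uses an LSTM to map a window of recently observed 3D target positions to a predicted target displacement:

```python
# Minimal sketch of an LSTM target-motion predictor; all sizes illustrative.
import torch
import torch.nn as nn

class TargetMotionLSTM(nn.Module):
    def __init__(self, hidden_size: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=3, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 3)  # predicted (x, y, z) displacement

    def forward(self, past_positions: torch.Tensor) -> torch.Tensor:
        # past_positions: (batch, window, 3), e.g., target centroids from recent images
        _, (h_n, _) = self.lstm(past_positions)
        return self.head(h_n[-1])              # (batch, 3)

# Usage: predict the next target position from the last 10 observed positions.
model = TargetMotionLSTM()
window = torch.randn(1, 10, 3)                 # placeholder observations
predicted_next = window[:, -1, :] + model(window)
```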
  • the specific algorithms may be implemented using machine learning methods, such as support vector machine (SVM), decision tree (DT), random forest (RF), boosting algorithms, linear regression, logistic regression, clustering algorithms, Bayesian methods, and the like, or any combination thereof.
  • supervised, semi-supervised and/or unsupervised methods may be implemented.
  • systems for inserting and steering a medical instrument/tool within the body of a subject according to a trajectory updated in real-time based on detected and/or estimated target movement, wherein the system includes an automated insertion and steering device (for example, a robot), a processor and, optionally, a controller.
  • the insertion and steering device is configured to insert and steer/navigate a medical instrument in the body of the subject, to reach a predicted target location within the subject’s body, according to a planned/determined and/or updated trajectory of the medical instrument, wherein the trajectory may be updated in real time, based on the real-time location of the medical instrument and/or of the target and/or of the tissue and/or various other parameters, and wherein the updating of the determined trajectory is facilitated utilizing the processor, which is further configured to convey real-time steering instructions to the insertion and steering device.
  • the steering system may be configured to operate in conjunction with an imaging system.
  • the imaging system may include any type of imaging system (modality), including, but not limited to: X-ray fluoroscopy, CT, cone beam CT, CT fluoroscopy, MRI, ultrasound, or any other suitable imaging modality.
  • the processor of the system may be further configured to process and show on a display/monitor images, or image-views created from sets of images (or slices), from an imaging system (e.g., CT, MRI).
  • a computer-implemented method of generating a trajectory model for determining a trajectory for steering a medical instrument toward a moving target in a body of a subject in an image-guided procedure includes: collecting one or more datasets, at least one of the one or more datasets being related to an automated medical device configured to steer a medical instrument toward a target in the body of a patient and/or to operation thereof; creating a training set comprising at least a portion of the one or more datasets and one or more target parameters relating to planned and/or updated and/or executed trajectories in one or more previous image-guided procedures for steering a medical instrument toward a moving target in a body of a patient; training the trajectory model to output a trajectory that will reach a moving target at a predicted location of the target using the training set; calculating a trajectory prediction error; and optimizing the trajectory model using the calculated trajectory prediction error.
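A schematic sketch of this training flow is given below, assuming a PyTorch-style model and a data loader yielding (features, executed trajectory) pairs from previous procedures; `model` and `trajectory_loader` are hypothetical placeholders, not components disclosed herein.

```python
# Schematic training loop: train, compute the trajectory prediction error,
# and optimize the model using that error.
import torch

def train_trajectory_model(model, trajectory_loader, epochs=10):
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(epochs):
        for features, executed_trajectory in trajectory_loader:
            predicted_trajectory = model(features)
            # Trajectory prediction error: mean distance between predicted and
            # planned/updated/executed trajectory waypoints.
            prediction_error = torch.mean(
                torch.norm(predicted_trajectory - executed_trajectory, dim=-1))
            optimizer.zero_grad()
            prediction_error.backward()
            optimizer.step()
    return model
```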
  • the one or more datasets may further include one or more of: a clinical procedure-related dataset and/or a patient-related dataset.
  • the trajectory model may be generated utilizing artificial intelligence tools comprising one or more of, but not limited to: machine learning tools, data wrangling tools, deep learning tools, artificial neural network (ANN), deep neural network (DNN), convolutional neural network (CNN), recurrent neural network (RNN), long short term memory network (LSTM), decision trees or graphs, association rule learning, support vector machines, inductive logic programming, Bayesian networks, instance-based learning, manifold learning, sub-space learning, dictionary learning, reinforcement learning (RL), generative adversarial network (GAN), clustering algorithms, or any combination thereof.
  • training the trajectory model may include using one or more of: a loss function, ensemble learning methods, multi-task learning, multi-output regression and multi-output classification.
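As one concrete (and purely illustrative) reading of multi-output regression in this setting, a model can be trained to predict several trajectory waypoint coordinates at once; the sketch below uses scikit-learn with placeholder data, and the waypoint encoding is an assumption:

```python
# Toy multi-output regression: a random forest wrapped in MultiOutputRegressor,
# predicting 5 trajectory waypoints (x, y, z each) from procedure features.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.multioutput import MultiOutputRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))    # placeholder procedure/patient features
Y = rng.normal(size=(200, 15))   # placeholder: 5 waypoints x (x, y, z)

model = MultiOutputRegressor(RandomForestRegressor(n_estimators=100, random_state=0))
model.fit(X, Y)
predicted_waypoints = model.predict(X[:1]).reshape(5, 3)
```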
  • the method may further include executing one or more of a tissue movement model, a target movement model and a “no-fly” zones model using at least a portion of the one or more datasets.
  • the method may further include executing one or more individual models using at least a portion of the one or more datasets and the trajectory generated by the trajectory model; and obtaining one or more predictions from the one or more individual models.
  • the method may further include calculating a loss function using the trajectory prediction error and the one or more predictions generated by the one or more individual models; and optimizing the trajectory model using the loss function.
  • the method may further include training the one or more individual models.
  • the one or more individual models may be selected from: a model for predicting an accuracy of an image-guided insertion procedure; an interaction model for predicting target and/or tissue movement resulting from an interaction between the medical instrument and the tissue and/or the target; a model for predicting a duration of an image-guided insertion procedure or part thereof; a model for predicting a risk level of an image-guided insertion procedure; or any combination thereof.
  • calculating the loss function may include minimizing one or more of the trajectory prediction error, the predicted interaction movement, the predicted duration and the predicted risk.
  • calculating the loss function further includes maximizing the predicted accuracy of the image-guided insertion procedure.
  • the method may further include adjusting one or more coefficients of one or more terms used in the calculation of the loss function, the one or more terms being associated with at least one of the trajectory prediction error and the one or more predictions generated by the one or more individual models.
  • the adjusting of the one or more coefficients is executed during training of the trajectory model. According to some embodiments, the adjusting of the one or more coefficients is executed during execution of the trajectory model.
  • the adjusting of the one or more coefficients may be related to one or more of: a specific procedure type, a specific target type, a specific user and/or a specific population.
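Putting the preceding bullets together, a hedged sketch of such a multi-term loss with adjustable per-term coefficients might look as follows (term names and default weights are illustrative only):

```python
# Weighted sum of the trajectory prediction error and the individual models'
# predictions, with adjustable per-term coefficients.
def composite_loss(prediction_error, predicted_accuracy, predicted_movement,
                   predicted_duration, predicted_risk,
                   w_err=1.0, w_acc=1.0, w_move=0.5, w_dur=0.1, w_risk=1.0):
    """Terms to be minimized enter with a positive sign; predicted accuracy,
    which is to be maximized, enters with a negative sign."""
    return (w_err * prediction_error
            + w_move * predicted_movement
            + w_dur * predicted_duration
            + w_risk * predicted_risk
            - w_acc * predicted_accuracy)

# Coefficients may be adjusted for a specific procedure type, target type,
# user or population, e.g., weighting predicted risk more heavily:
loss = composite_loss(0.8, 0.9, 0.2, 0.4, 0.3, w_risk=2.0)
```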
  • generating the trajectory model is executed by a training module comprising a memory and one or more processors.
  • the automated medical device is configured to allow real-time updating of the trajectory in accordance with predicted target movement and steer the medical instrument toward the target according to the updated trajectory.
  • a system for generating a trajectory model for determining a trajectory for steering a medical instrument toward a moving target in image-guided procedures includes: a training module comprising: a memory configured to store one or more datasets; and one or more processors configured to execute the method for generating a trajectory model as disclosed herein.
  • the training module may be located on a remote server, an “on premise” (local) server or a computer associated with the automated medical device.
  • the remote server is a cloud server.
  • a method of closed-loop steering a medical instrument toward a moving target within a body of a subject includes: calculating a planned trajectory for the medical instrument from an entry point to an initial target location in the body of the subject; steering the medical instrument toward the initial target location according to the planned trajectory; determining the real-time location of the target and the medical instrument; predicting movement of the target; updating the trajectory based on the predicted movement of the target, such that the medical instrument will reach the target at a predicted location of the target; and steering the medical instrument toward the predicted location of the target according to the updated trajectory.
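The toy simulation below illustrates the structure of this closed-loop cycle, with linear extrapolation standing in for the learned target-movement prediction and straight-line stepping standing in for the steering kinematics; it is a sketch of the loop, not of the disclosed device:

```python
# Toy closed-loop cycle: the target drifts every step, a predictor
# extrapolates its motion, and the "instrument" is re-aimed accordingly.
import numpy as np

def predict_target(history):
    """Linear extrapolation from the last two observations; a stand-in for
    the learned target-movement models described in this disclosure."""
    if len(history) < 2:
        return history[-1]
    return history[-1] + (history[-1] - history[-2])

rng = np.random.default_rng(1)
instrument = np.zeros(3)                  # tip starts at the entry point (mm)
target = np.array([0.0, 0.0, 100.0])      # initial target location (mm)
history = [target.copy()]

for step in range(60):
    target = target + rng.normal(scale=0.3, size=3) + np.array([0.0, 0.05, 0.0])
    history.append(target.copy())
    aim = predict_target(history)         # predicted target location
    direction = aim - instrument
    instrument = instrument + 2.5 * direction / np.linalg.norm(direction)  # 2.5 mm step
    if np.linalg.norm(instrument - target) < 1.5:
        print(f"reached target at step {step}")
        break
```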
  • the predicting movement of the moving target may be executed by using a dynamic trajectory model.
  • the dynamic trajectory model may further include predicting a movement of a tissue of the body and/or predicting a movement of a tip of the medical instrument.
  • updating the trajectory is executed using a dynamic trajectory model.
  • calculating the planned trajectory is executed using a trajectory model.
  • the steering of the medical instrument toward the target is executed utilizing an automated medical device.
  • the planned trajectory and/or the updated trajectory are a 2D trajectory or a 3D trajectory.
  • the method for steering a medical instrument may further include obtaining one or more images of a region of interest within the body of the subject by means of an imaging system selected from: a CT system, an X-ray fluoroscopy system, an MRI system, an ultrasonic system, a cone-beam CT system, a CT fluoroscopy system, an optical imaging system and an electromagnetic imaging system.
  • a system for steering a medical instrument toward a moving target in a body of a subject comprising: an automated device configured for steering the medical instrument toward a moving target, the automated device comprising one or more actuators and a control head configured for coupling the medical instrument thereto; and a processor configured for executing the method of steering a medical instrument, as disclosed herein.
  • the system may further include a controller configured to control the operation of the device.
  • a control device configured for steering a medical instrument toward a moving target in a body of a subject, said control device being configured to receive input from a processor (processing unit) configured for executing a method of steering a medical instrument as disclosed herein, and to generate control data in response thereto, for controlling operation of the automated medical device.
  • FIGS. 1A-1B show perspective views of an exemplary device (FIG. 1A) and an exemplary console (FIG. 1B) of a system for inserting a medical instrument toward an internal target, according to some embodiments;
  • FIG. 2 shows an exemplary trajectory for a medical instrument to reach an internal target within the body of the subject, according to some embodiments;
  • FIGS. 3A-3D show planning of an exemplary trajectory for inserting and steering a medical instrument toward a target, on CT images, according to some embodiments;
  • FIG. 4 shows a flow chart of steps in a method for planning and real-time updating of a trajectory of a medical instrument, according to some embodiments;
  • FIG. 5 shows a diagram of a method of generating, deploying and using a data-analysis algorithm, according to some embodiments;
  • FIGS. 6A-6B show an exemplary training module (FIG. 6A) and an exemplary training process (FIG. 6B) for training a data-analysis algorithm, according to some embodiments;
  • FIGS. 7A-7B show an exemplary inference module (FIG. 7A) and an exemplary inference process (FIG. 7B) for utilizing a data-analysis algorithm, according to some embodiments;
  • FIG. 8 shows a flowchart illustrating steps of a method of closed-loop steering of a medical instrument toward a moving target, according to some embodiments;
  • FIG. 9 shows a flowchart illustrating steps of a method of closed-loop steering of a medical instrument toward a moving target utilizing a dynamic trajectory model, according to some embodiments;
  • FIG. 10 shows a block diagram illustrating an exemplary method of generating (training) a tissue movement model for prediction of tissue movement during a procedure;
  • FIG. 11 shows a block diagram illustrating an exemplary method of generating (training) a target movement model for prediction of target movement during a procedure;
  • FIG. 12 shows a block diagram illustrating an exemplary method of generating (training) a trajectory model for determining a trajectory for steering a medical instrument toward a moving target during a medical procedure;
  • FIG. 13 shows a block diagram illustrating another exemplary method of generating (training) a trajectory model for determining a trajectory for steering a medical instrument toward a moving target, according to some embodiments; and
  • FIGS. 14A-14D demonstrate real-time updating of a planned trajectory and steering of a medical instrument according thereto, based on predicted movement of the target, according to some embodiments.
  • systems, devices and methods for insertion and steering of a medical instrument in a subject’s body, wherein the steering of the medical instrument within the body of a subject is based on a trajectory for the medical instrument (in particular, the end or tip thereof) within the body of the subject, wherein the trajectory is determined according to a predicted location of the target, to facilitate the tip safely and accurately reaching the internal target region within the subject’s body by the most efficient and safe route.
  • systems, devices and methods allowing the prediction of the end-point location of the target, to allow the safe reaching of the tip of the medical instrument to the moving target, to increase effectiveness, safety and accuracy of various related medical procedures.
  • a medical device for inserting and steering a medical instrument into (and within) a body of a subject may include any suitable automated device.
  • the automated steering device may include any type of suitable steering mechanism controlling the movement of an end effector (control head) at any one of desired movement angles or axes.
  • the automated inserting and steering device may have at least three degrees of freedom (DOF), at least four degrees of freedom or at least five degrees of freedom.
  • the device 20 may include a housing (also referred to as “cover”) 21 accommodating therein at least a portion of a steering mechanism.
  • the steering mechanism may include at least one moveable platform (not shown) and at least two moveable arms 26A and 26B, configured to allow or control movement of an end effector (also referred to as “control head”) 24 at any one of desired movement angles or axes, as disclosed, for example, in abovementioned U.S. Patent Application Publication No. 2019/290,372.
  • the moveable arms 26A and 26B may be configured as piston mechanisms.
  • a suitable medical instrument may be connected, either directly or by means of a suitable insertion module, such as the insertion module disclosed in co-owned U.S. Patent Application Publication No. 2017/258,489, which is incorporated herein by reference in its entirety.
  • the medical instrument may be any suitable instrument capable of being inserted and steered within the body of the subject, to reach a designated target, wherein the control of the operation and movement of the medical instrument is effected by the control head 24.
  • the control head 24 may include a driving mechanism (also referred to as “insertion mechanism”), or at least a portion thereof, which is configured to advance the medical instrument toward the target in the patient’s body.
  • the control head 24 may be controlled by a suitable control system, as detailed herein.
  • the medical instrument may be selected from, but not limited to: a needle, probe (e.g., an ablation probe), port, introducer, catheter (such as a drainage needle catheter), cannula, surgical tool, fluid delivery tool, or any other suitable insertable tool configured to be inserted into a subject’s body for diagnostic and/or therapeutic purposes.
  • the medical tool includes a tip at the distal end thereof (i.e., the end which is inserted into the subject’s body).
  • the tool tip may be a diamond tip, a bevel tip, a conical tip, etc.
  • the device 20 may have a plurality of degrees of freedom (DOF) in operating and controlling the movement of the medical instrument along one or more axes.
  • the device may have up to six degrees of freedom.
  • the device may have at least five degrees of freedom.
  • the device may have five degrees of freedom, including two linear translation DOF (in a first axis), a longitudinal linear translation DOF (in a second axis substantially perpendicular to the first axis) and two rotational DOF.
  • the device may have forward-backward and left-right linear translations facilitated by two moveable platforms, front-back and left-right rotations facilitated by two moveable arms (e.g., piston mechanism), and longitudinal translation toward the subject’s body facilitated by the insertion mechanism.
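Purely as an illustration of this five-DOF command space, such a motion command might be represented as follows; the field names are hypothetical and do not reflect the device's actual interface:

```python
# Hypothetical 5-DOF motion command mirroring the description above.
from dataclasses import dataclass

@dataclass
class MotionCommand:
    translate_fb_mm: float   # forward-backward translation (moveable platform)
    translate_lr_mm: float   # left-right translation (moveable platform)
    rotate_fb_deg: float     # front-back rotation (moveable arm)
    rotate_lr_deg: float     # left-right rotation (moveable arm)
    insert_mm: float         # longitudinal translation toward the body (insertion mechanism)

# Commanding all five DOF simultaneously is what enables non-linear steering.
cmd = MotionCommand(0.4, -0.2, 1.5, -0.8, 2.0)
```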
  • the control system (i.e., processor and/or controller) may be configured to operate the steering mechanism (including the moveable platforms and the moveable arms) and the insertion mechanism simultaneously, thus enabling non-linear steering of the medical instrument, i.e., enabling the medical instrument to reach the target by following a non-linear trajectory.
  • the device may have six degrees of freedom, including the five degrees of freedom described above and, in addition, rotation of the medical instrument about its longitudinal axis.
  • rotation of the medical instrument about its longitudinal axis may be facilitated by a designated rotation mechanism.
  • the control system (i.e., processor and/or controller) may be configured to operate the steering mechanism, the insertion mechanism and the rotation mechanism simultaneously.
  • the device 20 may further include a base 23, which allows positioning of the device 20 on or in close proximity to the subject’s body.
  • the device 20 may be configured for attachment to the subject’s body either directly or via a suitable mounting surface, such as the mounting base disclosed in co-owned U.S. Patent Application Publication No. 2019/125,397, or the attachment apparatus disclosed in co-owned International Patent Application Publication No. WO 2019/234,748, both of which are incorporated herein by reference in their entireties. Attachment of the device 20 to the mounting surface may be carried out using dedicated latches, such as latches 27A and 27B.
  • the device may be couplable to a dedicated arm or base which is secured to the patient’s bed, to a cart positioned adjacent the patient’s bed or to an imaging device (if used), and held on the subject’s body or in close proximity thereto, as described, for example, in abovementioned U.S. Patent No. 10,507,067 and in U.S. Patent No. 10,639,107, which is incorporated herein by reference in its entirety.
  • the device may include electronic components and motors (not shown) allowing the controlled operation of the device 20 in inserting and steering the medical instrument.
  • the device may include one or more printed circuit boards (PCBs) (not shown) and electrical cables/wires (not shown) to provide electrical connection between a controller (not shown) and the motors of the device and other electronic components thereof.
  • the controller may be embedded, at least in part, within device 20.
  • the controller may be a separate component.
  • the device 20 may include a power supply (e.g., one or more batteries) (not shown).
  • the device 20 may be configured to communicate wirelessly with the controller and/or processor.
  • device 20 may include one or more sensors, such as a force sensor and/or an acceleration sensor (not shown).
  • the housing 21 is configured to cover and protect, at least partially, the mechanical and/or electronic components of device 20 from being damaged or otherwise compromised.
  • the housing 21 may include at least one adjustable cover, and it may be configured to protect the device 20 from being soiled by dirt, as well as by blood and/or other bodily fluids, thus preventing/minimizing the risk of cross contamination between patients, as disclosed, for example, in co-owned International Patent Application No. PCT/IL2020/051220, which is incorporated herein by reference in its entirety.
  • the device may further include registration elements disposed at specific locations on the device 20, such as registration elements 29A and 29B, for registration of the device 20 to an image space, in image-guided procedures.
  • registration elements may be disposed on the mounting surface to which device 20 may be coupled, either instead or in addition to registration elements 29A-B disposed on device 20.
  • registration of the device 20 to the image space may be carried out via image processing of one or more components of the device 20, such as the control head 24, and/or of the mounting surface (or at least a portion thereof), which are visible in generated images.
  • the device may include a CCD/CMOS camera mounted on the device (e.g., the device’s frame), the mounting surface and/or as a separate apparatus, allowing the collection of visual images and/or videos of the patient’s body during a medical procedure.
  • the medical instrument is configured to be removably coupleable to the device 20, such that the device can be used repeatedly with new medical instruments.
  • the medical instruments are disposable. In some embodiments, the medical instruments are reusable.
  • device 20 is part of a system for inserting and steering a medical instrument in a subject’s body based on a preplanned and, optionally, real-time updated trajectory, as disclosed, for example, in abovementioned co-owned International Application No. PCT/IL2020/051219.
  • the system may include the steering and insertion device 20, as disclosed herein, and a control unit (or - “workstation” or “console”) configured to allow control of the operating parameters of device 20.
  • the user may operate the device 20 using a pedal or an activation button.
  • the system may include a remote control unit, which may enable the user to activate the device 20 from a remote location, such as the control room adjacent the procedure room (e.g., CT suite), a different location at the medical facility or even a location outside the medical facility.
  • the user may operate the device using voice commands.
  • the workstation 25 may include a display 10 and a user interface (not shown).
  • the user interface may be in the form of buttons, switches, keys, keyboard, computer mouse, joystick, touch-sensitive screen, and the like.
  • the monitor and user interface may be two separate components, or they may form together a single component (e.g., in the form of a touch screen).
  • the workstation 25 may include one or more suitable processors (for example, in the form of a PC) and one or more suitable controllers, configured to physically and/or functionally interact with device 20, to determine and control the operation thereof.
  • the one or more processors may be implemented in the form of a computer (such as a workstation, a server, a PC, a laptop, a tablet, a smartphone or any other processor-based device).
  • the workstation 25 may be portable (e.g., by having wheels 12 or being placed on a movable platform).
  • the one or more processors may be configured to perform, for example, one or more of: determine the location of the target; determine the predicted location of the target during and/or at the end of the procedure (end-point); determine (plan) a trajectory for the medical instrument to reach the target (for example, at the predicted location of the target); update the trajectory in real-time, for example due to movement of the target from its initial identified position as a result of the advancement of the medical instrument within the patient’s body, respiration motion and/or patient movements; present the planned and/or updated trajectory on the monitor 10; control the movement (insertion/steering) of the medical instrument based on the planned and/or updated trajectory by providing executable instructions (directly or via the one or more controllers) to the device; determine the actual location of the medical instrument (e.g., the tip thereof) using image processing and/or by performing required compensation calculations; and receive, process and visualize on the monitor images or image-views created from a set of images (between which the user may be able to scroll), operating parameters, and the like.
  • the planned trajectory of the medical instrument may be calculated based on a predicted location of the target within the subject body and optionally, inter alia, based on one or more inputs from the user, such as the entry point, areas to avoid en route (obstacles or “no-fly” zones), which the user marks on at least one of the obtained images.
  • the processor may be further configured to identify the target, actual location of the target, predicted location of the target, the obstacles and/or the insertion/entry point.
  • data-analysis algorithms, e.g., AI-based models (machine-learning and/or deep-learning based models), may be used by the processor to perform such identifications/calculations.
  • Generating such models typically involves a “training” stage, in which collected data is used to create (train) the models.
  • the generated (trained) models may later be used for “inference” to obtain specific insights, predictions and/or recommendations when applied to new data during the clinical procedure or at any later time.
  • the insertion system and the system creating (training) the algorithms/models may be separate systems (i.e., each of the systems includes a different set of processors, memory modules, etc.). In some embodiments, the insertion system and the system creating the algorithms/models may be the same system. In some embodiments, the insertion system and the system creating the algorithms/models may share one or more resources (such as, processors, memory modules, GUI, and the like). In some embodiments, the insertion system and the system creating the algorithms/models may be physically and/or functionally associated. Each possibility is a separate embodiment.
  • the insertion system and the system utilizing the algorithms/models for inference may be separate systems (i.e., each of the systems includes a different set of processors, memory modules, etc.). In some embodiments, the insertion system and the system utilizing the algorithms/models for inference may be the same system. In some embodiments, the insertion system and the system utilizing the algorithms/models for inference may share one or more resources (such as, processors, memory modules, GUI, and the like). In some embodiments, the insertion system and the system utilizing the algorithms/models for inference may be physically and/or functionally associated. Each possibility is a separate embodiment.
  • the device may be configured to operate in conjunction with an imaging system, including, but not limited to: X-Ray, CT, cone beam CT, CT fluoroscopy, MRI, ultrasound, or any other suitable imaging modality.
  • the steering of the medical instrument based on a planned and, optionally, real-time updated 2D or 3D trajectory of the tip of the medical instrument may be image-guided.
  • various types of data may be generated, accumulated and/or collected, for further use and/or manipulation, as detailed below.
  • the data may be divided into various types/sets of data, including, for example, data related to operating parameters of the device, data related to clinical procedures, data related to the treated patient, data related to administrative information, and the like, or any combination thereof.
  • such collected datasets may be collected from one or more (i.e., a plurality) of automated medical devices operating under various circumstances (for example, different procedures, different medical instruments, different patients, different locations and operating staff, etc.), to thereby generate a large database (“big data”) that can be used, utilizing suitable data analysis tools and/or AI-based tools, to ultimately generate models or algorithms that allow performance enhancements, automatic control or affecting control (i.e., by providing recommendations) of the medical devices.
  • FIG. 2 schematically shows a trajectory planned using one or more processors, such as the processor(s) of the insertion system described in FIG. 1B, for delivering a medical instrument to an internal target within the body of the subject, using an automated medical device, such as the automated device of FIG. 1A.
  • the planned trajectory may be linear or substantially linear.
  • the trajectory may be non-linear trajectory having any suitable/acceptable degree of curvature.
  • the one or more processors may calculate a planned trajectory for the medical instrument to reach the target.
  • the planning of the trajectory and the controlled steering of the instrument according to the planned trajectory may be based on a model of the medical instrument as a flexible beam having a plurality of virtual springs connected laterally thereto to simulate the lateral forces exerted by the tissue on the instrument. The trajectory through the tissue is thereby calculated on the basis of the influence of the plurality of virtual springs on the instrument, and an inverse kinematics solution applied to the virtual-springs model is utilized to calculate the required motion to be imparted to the instrument to follow the planned trajectory.
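The cited patents give the actual formulation; the crude sketch below merely conveys the flavor of the virtual-springs idea, solving a discretized static beam equation with lateral springs added along the inserted segment (all stiffness values and boundary-condition approximations are invented for illustration):

```python
# Crude sketch: the inserted needle segment is discretized into nodes, each
# restrained by a lateral virtual spring standing in for tissue reaction, and
# the static lateral deflection w solves (K_beam + K_springs) w = f.
import numpy as np

def lateral_deflection(n_nodes=20, length_mm=100.0, EI=2.0e4, k_spring=0.5, f_tip=1.0):
    """Lateral deflection (mm) at each node for flexural rigidity EI (N*mm^2),
    per-node tissue spring stiffness k_spring (N/mm) and tip force f_tip (N)."""
    h = length_mm / (n_nodes - 1)
    K = np.zeros((n_nodes, n_nodes))
    stencil = EI / h**4 * np.array([1.0, -4.0, 6.0, -4.0, 1.0])
    for i in range(2, n_nodes - 2):          # finite-difference EI * w'''' rows
        K[i, i - 2:i + 3] = stencil
    K[0, 0] = K[1, 1] = 1e9                  # clamp the base (entry point)
    K[-2, -4:] = EI / h**4 * np.array([1.0, -4.0, 5.0, -2.0])  # free-end approx.
    K[-1, -3:] = EI / h**4 * np.array([1.0, -2.0, 1.0])
    K += np.eye(n_nodes) * k_spring          # add the lateral virtual springs
    f = np.zeros(n_nodes)
    f[-1] = f_tip                            # lateral force applied at the tip
    return np.linalg.solve(K, f)

print(f"tip deflection: {lateral_deflection()[-1]:.2f} mm")
```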
  • the processor may then provide motion commands to the automated device, for example via a controller.
  • the one or more processors generate motion commands to the automated device and receive feedback regarding the real-time location of the medical instrument (e.g., the tip thereof), which is then used for real-time trajectory corrections, as disclosed, for example, in abovementioned U.S. Patent No. 8,348,861.
  • the one or more processors may calculate the motion to be applied to the robot to reduce the deviation.
  • the real-time location of the medical instrument and/or the corrections may be calculated and/or applied using data-analysis models/algorithms.
  • certain deviations of the medical instrument from the planned trajectory, for example deviations which exceed a predetermined threshold, may require recalculation of the trajectory for the remainder of the procedure, as described in further detail hereinbelow.
  • a trajectory 32 is planned between an entry point 36 and an internal target 38.
  • the planning of the trajectory 32 may take into account various variables, including, but not limited to: the type of the medical instrument to be used and its characteristics, the dimensions of the medical instrument (e.g., length, gauge), the type of imaging modality (such as, CT, CBCT, MRI, X-Ray, CT fluoroscopy, ultrasound and the like), the tissues through which the medical instrument is to be inserted, the location of the target, the size of the target, the insertion point, the angle of insertion (relative to one or more axis), milestone points (“secondary targets” through which the medical instrument should pass) and the like, or any combination thereof.
  • At least one of the milestone points may be a pivot point, i.e., a predefined point along the trajectory in which the deflection of the medical instrument is prevented or minimized, to maintain minimal pressure on the tissue (even if this results in a larger deflection of the instrument in other parts of the trajectory).
  • the planned trajectory is an optimal trajectory based on one or more of these parameters. Further taken into account in determining the trajectory may be various obstacles 39A-39C, which may be identified along the path and which should be avoided, to prevent damage to the neighboring tissues and/or to the medical instrument.
  • safety margins 34 may be marked along the planned trajectory 32, to ensure a minimal distance between the trajectory 32 and potential obstacles en route.
  • the width of the safety margins may be symmetrical in relation to the trajectory 32.
  • the width of the safety margins may be asymmetrical in relation to the trajectory 32.
  • the width of the safety margins 34 may be preprogrammed.
  • the width of the safety margins may be automatically set, or recommended to the user, by the processor, based on data obtained from previous procedures using a data analysis algorithm.
  • the width of the safety margins 34 may be determined and/or adjusted by the user.
  • Also shown in FIG. 2 is an end of a control head 30 of the exemplary automated insertion device, to which the medical instrument (not shown in FIG. 2) is coupled, virtually displayed on the monitor to indicate its position and orientation.
  • the trajectory 32 shown in FIG. 2 is a planar trajectory (i.e., two dimensional).
  • steering of the instrument is carried out according to a planar trajectory, for example trajectory 32.
  • the calculated planar trajectory may be superpositioned with one or more additional planar trajectories, to form a three-dimensional (3D) trajectory.
  • additional planar trajectories may be planned on one or more different planes, which may be perpendicular to the plane of the first planar trajectory (e.g., trajectory 32) or otherwise angled relative thereto.
  • the 3D trajectory may include any type of trajectory, including a linear trajectory or a non-linear trajectory.
  • the steering of the medical instrument is carried out in a 3D space, wherein the steering instructions are determined on each of the planes of the superpositioned planar trajectories, and are then superpositioned to form the steering in the three-dimensional space.
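A toy numeric sketch of this superposition is shown below: two planar deflection profiles sharing the insertion axis z are combined pointwise into a single 3D trajectory (the curve shapes are invented):

```python
# Superpositioning two planar trajectories into a 3D trajectory.
import numpy as np

z = np.linspace(0.0, 100.0, 50)             # insertion depth (mm)
x = 0.002 * z**2                            # lateral deflection in the x-z plane
y = 0.5 * np.sin(np.pi * z / 100.0)         # lateral deflection in the y-z plane

trajectory_3d = np.column_stack([x, y, z])  # superposed 3D trajectory (50 x 3)
# Steering instructions would likewise be determined per plane and then
# superpositioned to steer in the three-dimensional space.
```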
  • the data/parameters/values thus obtained during the steering of the medical instrument using the automated device can be used as data/parameters/values for the generation/training and/or utilization/inference of the data-analysis model(s)/algorithm(s).
  • FIGS. 3A-3D show planning of an exemplary trajectory for inserting and steering a medical instrument toward a target, according to some embodiments.
• the exemplary trajectory may be planned using a processor, such as the processor(s) of the insertion system described in FIG. 1B, and the insertion and steering of the medical instrument toward the target according to the planned trajectory may be executed using an automated insertion device, such as the automated device of FIG. 1A.
• The planning in FIGS. 3A-3D is shown on CT image-views; however, it can be appreciated that the planning can be carried out similarly on images obtained from other imaging systems, such as ultrasound, MRI and the like.
• Shown in FIG. 3A are CT image-views of a subject, depicting in the left-hand panel an axial plane view and in the right-hand panel a sagittal plane view. Also indicated in the figure are an internal target 44 and an automated insertion device 40. Further indicated is a vertebra 46.
• In FIG. 3B, which shows the CT image-views of FIG. 3A, the insertion point 42 is indicated.
  • a linear trajectory 48 between the insertion point 42 and the internal target 44 may be calculated and displayed on each of the two views (for example, axial plane view and sagittal plane view).
  • a linear trajectory is preferred, thus, if the displayed linear trajectory does not pass in close proximity to any potential obstacles, then the linear trajectory is determined as the planned trajectory for the insertion procedure.
• In FIG. 3C, a transverse process 462 of vertebra 46 is detected in close proximity to the calculated linear trajectory, and is identified and marked, in this example on the axial plane view, to allow considering the obstacle when planning the trajectory for the procedure.
  • the trajectory is re-calculated, so as to allow the instrument to avoid contacting the obstacle 462, resulting in a non-linear trajectory 48’.
  • the planned trajectory may not be calculated until potential obstacles are marked on the image-view/s, either manually or automatically, until the user confirms that there are no potential obstacles and/or until the user manually initiates trajectory calculation.
  • an interim linear trajectory similar to linear trajectory 48 of FIG. 3B, may not be calculated and/or displayed.
  • a maximal allowable curvature level may be pre-set for the calculation of the non-linear trajectory.
  • the maximal curvature threshold may depend, for example, on the trajectory parameters (e.g., distance between the entry point and the target) and on the type of instrument intended to be used in the procedure and its characteristics (for example, type, diameter (gauge), material, and the like).
  • a maximal allowable lateral movement of the instrument at the entry point may be pre-set for the calculation of the non-linear trajectory.
  • a maximal allowable proximity to obstacle(s) may be pre-set for the calculation of the non-linear trajectory.
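• By way of illustration only, the pre-set limits above could be verified on a candidate trajectory discretized as a polyline, as in the following Python sketch; the function names, the spherical obstacle representation and the threshold values are assumptions for the example, not the disclosed implementation:

    import numpy as np

    def discrete_curvature(points):
        # Turn angle per unit length at each interior vertex of a polyline.
        a, b, c = points[:-2], points[1:-1], points[2:]
        u, v = a - b, c - b
        cos_t = np.sum(u * v, axis=1) / (
            np.linalg.norm(u, axis=1) * np.linalg.norm(v, axis=1))
        turn = np.pi - np.arccos(np.clip(cos_t, -1.0, 1.0))
        seg = 0.5 * (np.linalg.norm(u, axis=1) + np.linalg.norm(v, axis=1))
        return turn / seg

    def trajectory_admissible(points, obstacles,
                              max_curvature=0.02, min_clearance=5.0):
        # points: N x 3 polyline in mm; obstacles: list of (center, radius).
        if np.any(discrete_curvature(points) > max_curvature):
            return False          # exceeds maximal allowable curvature
        for center, radius in obstacles:
            clearance = np.linalg.norm(points - center, axis=1) - radius
            if np.any(clearance < min_clearance):
                return False      # violates minimal proximity to an obstacle
        return True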
  • the planned trajectory may be updated in real-time based on the real-time position of the medical instrument (for example, the tip thereof) and/or the real-time position of the target and/or the real-time position(s) of obstacle(s). In some embodiments, the planned trajectory may be updated in real-time based on a predicted/estimated position of the target and/or a predicted/estimated position(s) of obstacle(s).
  • the target 44, insertion point 42 and, optionally, obstacle/s, such as transverse process 462 are marked manually by the user.
  • the processor of the insertion system (or of a separate system) may be configured to identify and mark at least one of the target, the insertion point and the obstacle/s, and the user may, optionally, be prompted to confirm or adjust the processor’s proposed markings.
  • the target and/or obstacle/s may be identified using known image processing techniques and/or data-analysis models/algorithms, optionally based also on data obtained from previous procedures.
  • the insertion point may be suggested based solely on the obtained images, or, alternatively or additionally, on data obtained from previous procedures using data-analysis models/algorithms.
  • the trajectory may be calculated based solely on the obtained images and the marked locations of the entry point, target (and, optionally, obstacle/s). According to other embodiments, the calculation of the trajectory may be based also on data obtained from previous procedures, using data-analysis models/algorithms.
  • checkpoints along the trajectory may be set.
  • Checkpoints (not shown in FIGS. 3A-3D) may be used so that upon the medical instrument reaching a checkpoint, its insertion is paused and imaging of the region of interest is initiated (either manually by the user or automatically by the processor), to verify the position of the instrument (specifically, to verify that the instrument (e.g., the tip thereof) follows the planned trajectory), to monitor the location of the marked obstacles and/or identify previously unmarked obstacles along the trajectory, and to verify the target’s position, such that recalculation of the trajectory may be initiated, if the user chooses to do so, before advancing the instrument to the next checkpoint/the target.
  • the checkpoints may be manually set by the user, or they may be automatically set or recommended by the processor, as described in further detail hereinbelow.
• In addition to the views shown in FIGS. 3A-3D, views pertaining to different planes or orientations (e.g., coronal, pseudo axial, pseudo sagittal, pseudo coronal, etc.) or additionally generated views (e.g., trajectory view, tool view, 3D view, etc.) may be used in order to perform and/or display the trajectory planning.
  • recalculation of the trajectory may also be required if the instrument deviated from the planned trajectory above a predetermined deviation threshold.
  • determining the actual real-time location of the instrument may require applying a correction to the determined location of the tip of the medical instrument, to compensate for deviations due to imaging artifacts.
  • the actual location of the tip may be determined based on an instrument position compensation “look-up” table, which corresponds to the imaging modality and the medical instrument used, as disclosed, for example, in abovementioned co-owned International Patent Application No. PCT/IL2020/051219.
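• As a purely hypothetical sketch of how such a compensation look-up table might be applied (the table keys, offset values and function below are illustrative assumptions; the actual table is modality- and instrument-specific, as described in the referenced application):

    import numpy as np

    # Illustrative table only: (imaging modality, instrument) -> tip offset
    # along the instrument axis, in mm (hypothetical values).
    TIP_OFFSET_MM = {
        ("CT", "19G-needle"): 1.0,
        ("CBCT", "19G-needle"): 1.5,
    }

    def corrected_tip(detected_tip, unit_axis, modality, instrument):
        # Shift the image-detected tip along the instrument axis to
        # compensate for modality/instrument-specific imaging artifacts.
        offset = TIP_OFFSET_MM.get((modality, instrument), 0.0)
        return np.asarray(detected_tip, float) + offset * np.asarray(unit_axis, float)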
  • one or more checkpoints may be added and/or repositioned along the planned trajectory, either manually by the user or automatically by the processor, to direct the instrument back to the planned trajectory.
  • the processor may prompt the user to add and/or reposition checkpoint/s.
  • the processor may recommend to the user specific position/s for the new and/or repositioned checkpoints. Such a recommendation may be generated using data-analysis algorithm(s).
  • recalculation of the trajectory may also be required if, for example, an obstacle is identified along the trajectory.
  • an obstacle may be an obstacle which was marked (manually or automatically) prior to the calculation of the planned trajectory but tissue movement, e.g., tissue movement resulting from the advancement of the instrument within the tissue, caused the obstacle to move such that it entered the planned path.
  • the obstacle may be a new obstacle, i.e., an obstacle which was not visible in the image (or set of images) based upon which the planned trajectory was calculated, and became visible during the insertion procedure.
• If the instrument deviated from the planned trajectory (e.g., above a predetermined deviation threshold), if a new or repositioned obstacle is identified along the planned trajectory and/or if the target has moved (e.g., above a predetermined threshold), such that recalculation of the trajectory is required, the user may be prompted to initiate an update (recalculation) of the trajectory.
• recalculation of the trajectory, if required, is executed automatically by the processor, and the insertion of the instrument is automatically resumed based on the updated trajectory.
• recalculation of the trajectory, if required, is executed automatically by the processor; however, the user is prompted to confirm the recalculated trajectory before advancement of the instrument (e.g., to the next checkpoint) according to the updated trajectory can be resumed.
  • the trajectory may be updated during the procedure, as detailed herein below.
  • the trajectory may be updated according to various parameters and variables, including, for example, the actual (real-time) position of the target.
  • the trajectory may be updated according to predicted location of the target, as determined/calculated as detailed herein (for example, using suitable machine learning (or deep learning) algorithms and/or image processing techniques).
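• As a minimal, non-ML stand-in for such a predictor, the target location could be extrapolated from its recent observed positions assuming locally constant velocity; the sketch below is an illustrative assumption rather than the disclosed algorithm:

    import numpy as np

    def predict_target_position(positions, times, t_future):
        # positions: sequence of observed target positions (each x, y, z);
        # times: matching timestamps. Assumes locally constant velocity.
        p0 = np.asarray(positions[-2], float)
        p1 = np.asarray(positions[-1], float)
        velocity = (p1 - p0) / (times[-1] - times[-2])
        return p1 + velocity * (t_future - times[-1])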
  • FIG. 4 illustrates steps in an exemplary method for planning and updating a trajectory of a medical instrument toward an internal target in a body of a subject, according to some embodiments.
  • the trajectory may be updated according to the actual (real-time) location of the target.
  • the trajectory of the medical instrument is planned from an insertion point on the body of the subject to an internal target.
  • the target may be identified and marked on obtained image(s) manually by the user.
  • the target may be identified automatically by the processor, using image processing techniques and/or data- analysis algorithms.
  • the insertion point may be selected and marked on obtained image(s) manually by the user.
  • one or more optional (e.g., optimal) insertion points(s) may be identified by the processor, using image processing techniques and/or data-analysis algorithms.
  • the recommended insertion points may be displayed on the obtained image(s) and the user may be prompted to select one of the entry points and/or adjust the location of a recommended entry point.
• the planned trajectory may be any type of trajectory, such as a 2D trajectory or a 3D trajectory.
  • a planned 3D trajectory may be obtained by planning a route on each of two planes and superpositioning the two 2D routes on said planes, at their intersection line, to form the planned 3D trajectory.
  • the two planes are perpendicular.
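• A minimal geometric sketch of this superpositioning, assuming for illustration that the two perpendicular planes share the depth (z) axis as their intersection line and that both routes are sampled along it (all names below are assumptions):

    import numpy as np

    def superpose_routes(route_xz, route_yz):
        # route_xz: (x, z) points on one plane; route_yz: (y, z) points on a
        # perpendicular plane; z (depth) is the shared intersection axis and
        # is assumed to increase monotonically along both routes.
        route_xz = np.asarray(route_xz, float)
        route_yz = np.asarray(route_yz, float)
        z = route_xz[:, 1]
        y = np.interp(z, route_yz[:, 1], route_yz[:, 0])  # resample onto z
        return np.column_stack([route_xz[:, 0], y, z])    # 3D (x, y, z)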
• the planning/calculating of the trajectory may take into account various parameters, including but not limited to: type of medical instrument, characteristics of the medical instrument (material, length, gauge, etc.), type of imaging modality (such as, CT, CBCT, MRI, X-Ray, CT fluoroscopy, ultrasound and the like), insertion point, insertion angle, type of tissue(s), location of the internal target, size of the target, shape of the target, obstacles along the route, milestone points (“secondary targets” through which the medical instrument should pass) and the like, or any combination thereof.
  • At least one of the milestone points may be a pivot point, i.e., a predefined point along the trajectory in which the deflection of the medical instrument is prevented or minimized, to maintain minimal pressure on the tissue (even if this results in a larger deflection of the instrument in other parts of the trajectory).
  • a maximal allowable curvature level may be pre-set for the planning of the trajectory.
  • a maximal allowable lateral movement of the instrument at the entry point may be pre-set for the planning of the trajectory.
  • a maximal allowable proximity to obstacle(s) may be pre-set for the calculation of the non-linear trajectory.
  • the planned trajectory is an optimal trajectory based on one or more of these parameters.
  • the medical instrument is inserted into the body of the subject at the designated (selected) entry point and steered (in a suitable space) towards the predetermined target, according to the planned trajectory.
• the insertion and steering of the medical instrument is facilitated by an automated device for inserting and steering, such as, for example, device 2 of FIG. 1A.
• At step 54, the real-time location/position (and optionally the orientation) of the medical instrument (e.g., the tip thereof) and/or the real-time location of one or more obstacles and/or the location of newly identified one or more obstacles along the trajectory and/or the real-time location of one or more of the milestone points (“secondary targets”) and/or the real-time location of the target are determined.
  • the determination of any of the above may be performed manually by the user. In some embodiments, the determination of any of the above may be performed automatically by one or more processors.
  • the determination may be performed by any suitable methods known in the art, including, for example, using suitable image processing techniques and/or machine learning (or deep learning) algorithms, using data collected in previous procedures (procedures previously performed), as further described hereinbelow.
  • Step 54 may optionally further include correcting the determined location of the tip of the medical instrument, to compensate for deviations due to imaging artifacts, in order to determine the actual location of the tip. Determining the actual location of the tip prior to updating the trajectory can in some embodiments vastly increase the accuracy of the procedure.
• the determination of the real-time locations may be performed at any spatial and/or temporal distribution/pattern, and may be continuous or performed at any time (temporal) or space (spatial) intervals.
• the procedure may be paused, either automatically or selectively by the user, at spatial and/or temporal intervals, to allow processing, determining, changing and/or approving continuation of the procedure.
  • the determination of the real-time locations indicated above may be performed at one or more checkpoints.
  • the checkpoints may be predetermined and/or determined during the steering procedure.
  • the checkpoints may include spatial checkpoints (for example, regions or locations along the trajectory, including, for example, specific tissues, specific regions, length or location along the trajectory (for example, every 20-50 mm), and the like).
  • the checkpoints may be temporal checkpoints, i.e., a checkpoint performed at designated time points during the procedure (for example, every 2-5 seconds).
  • the checkpoints may include both spatial and temporal check points.
• the checkpoints may be spaced apart at an essentially similar distance along the planned trajectory, including the distance of the first checkpoint from the entry point and of the last checkpoint from the target.
  • the checkpoints may be manually set by the user.
  • the checkpoints may be automatically set by the processor, using image processing or computer vision algorithms, based on the obtained images and the planned trajectory and/or also on data obtained from previous procedures using machine learning capabilities, as disclosed, for example, in co-owned International Patent Application No.
  • the user may be required to confirm the checkpoints recommended by the processor or choose to adjust their location/timing.
  • Upper and/or lower interval thresholds between checkpoints may be predetermined.
• the checkpoints may be automatically set by the processor at, for example, about 20 mm intervals, and the user may be permitted to adjust the distance between each two checkpoints (or between the entry point and the first checkpoint and/or between the last checkpoint and the target) such that the maximal distance between them is, for example, about 30 mm and/or the minimal distance between them is about 3 mm.
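• A minimal sketch of such interval-constrained checkpoint placement, using the illustrative 20 mm default and 3-30 mm limits mentioned above (the helper name and the clamping strategy are assumptions):

    import numpy as np

    def place_checkpoints(trajectory_length_mm, default_gap_mm=20.0,
                          min_gap_mm=3.0, max_gap_mm=30.0):
        # Distribute checkpoints at roughly the default interval, clamped to
        # the allowed minimal/maximal spacing; positions are distances (mm)
        # along the planned trajectory, between entry point and target.
        n_gaps = max(1, round(trajectory_length_mm / default_gap_mm))
        gap = trajectory_length_mm / n_gaps
        gap = float(np.clip(gap, min_gap_mm, max_gap_mm))
        return np.arange(gap, trajectory_length_mm, gap)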
• At step 56, if a deviation is detected, the trajectory is updated.
  • the deviation may be determined compared to a previous time point or spatial point, as detailed above.
  • the deviation is compared with a respective threshold, to determine if the deviation exceeds the threshold.
  • the threshold may be, for example, a set value or a percentage reflecting a change in a value.
  • the threshold may be determined by the user.
  • the threshold may be determined by the processor, for example based on data collected in previous procedures and using machine learning algorithms. If deviation is detected, or if the detected deviation exceeds the set threshold, the trajectory may be updated according to the required change. In some embodiments, updating the trajectory may be executed by calculating the trajectory required to reach the target at its new location. The calculation of the updated trajectory may take into account the various parameters taken into account during the planning of the initial trajectory, as described above. In some embodiments, updating the trajectory may be executed using data-analysis algorithm(s), e.g., AI-based model(s).
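• A minimal sketch of the deviation test described above, supporting either a set-value threshold or a percentage threshold (the function name and the 2 mm example are assumptions):

    def deviation_exceeds(previous, current, threshold, percentage=False):
        # threshold is either a set value (same units as the inputs) or a
        # percentage reflecting a relative change, per the embodiments above.
        deviation = abs(current - previous)
        if percentage:
            return deviation > abs(previous) * threshold / 100.0
        return deviation > threshold

    # e.g., flag a trajectory update if the target depth moved by > 2 mm:
    # update_needed = deviation_exceeds(prev_depth_mm, curr_depth_mm, 2.0)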
• the trajectory may be updated by updating the route according to the required change in each of two planes (for example, two perpendicular planes), and thereafter superpositioning the two updated 2D routes on the two (optionally perpendicular) planes to form the updated 3D trajectory.
• the updating of the route on each of the two planes may be performed by any suitable method, including, for example, utilizing a kinematics model.
  • the user may add and/or reposition one or more checkpoints along the planned trajectory, to direct the instrument back to the planned trajectory.
  • the processor may prompt the user to add and/or reposition checkpoint/s. In some embodiments, the processor may recommend to the user specific position/s for the new and/or repositioned checkpoints. Such a recommendation may be generated using image processing techniques and/or machine learning algorithms.
• At step 58, the steering of the medical instrument is then continued, according to the updated trajectory, to facilitate the tip of the instrument reaching the internal target (and secondary targets along the trajectory, if such are required). It can be appreciated that if no deviation in the abovementioned parameters was detected, the steering of the medical instrument can continue according to the planned trajectory.
• steps 54-58 may be repeated for any number of times, until the tip of the medical instrument reaches the internal target, or until a user terminates the procedure. In some embodiments, the number of repetitions of steps 54-58 may be predetermined or determined in real-time, during the procedure. According to some embodiments, at least some of the steps (or sub-steps) are performed automatically. In some embodiments, at least some of the steps (or sub-steps) may be performed manually, by a user. According to some embodiments, one or more of the steps are supervised manually and may proceed after being approved by the user.
• the planning (and/or updating) of the trajectory is a dynamic planning (and/or updating), allowing automatic prediction of changes (for example, predicted target movement), obstacles (for example, bones and/or blood vessels which are to be avoided), milestones along the trajectory, and the like, and adjustment of the steering of the medical instrument accordingly, in a fully-automated or at least semi-automated manner.
  • the dynamic planning proposes a planned and/or updated trajectory to a user for confirmation prior to proceeding with any of the steps.
  • the trajectory planning is a dynamic planning, taking into consideration expected cyclic changes in the position of the target, obstacles, etc., resulting from the body motion during the breathing cycle, as described, for example, in co-owned U.S. Patent No.
  • Such dynamic planning may be based on sets of images obtained during at least one breathing cycle of the subject (e.g., using a CT system), or based on a video generated during at least one breathing cycle of the subject (e.g., using a CT fluoroscopy system or any other imaging system capable of continuous imaging).
  • the steering of the medical instrument to the target is achieved by directing the medical instrument (for example, the tip of the medical instrument), to follow, in real-time, the planned trajectory, which may be updated in real time, during the procedure, as needed.
• the directing is effected by a control device/controller unit configured to receive input and generate control data in response thereto, for controlling operation of the automatic medical device.
• the term "real-time trajectory" of a medical instrument relates to the actual path the medical instrument traverses in the body of the subject, i.e., its actual position at each point in time during the steering procedure.
  • the trajectory planning and updating using the systems disclosed herein is facilitated using any suitable imaging device.
  • the imaging device is a CT imaging device.
  • the planning and/or real-time updating of the trajectory is performed based on CT images of the subject obtained before and/or during the procedure.
  • inherent difficulties may arise in identifying the actual location of the tip of the medical instrument.
• the accurate orientation and position of the tool are important for high-accuracy steering. Further, determining the actual position of the tip increases safety, as the medical instrument is not inserted beyond the target or beyond what is defined by the user. Depending on the imaging modality, the tissue and the type of medical instrument, artifacts which obscure the actual location of the tip can occur.
• the tip position may not be easily visually detected, and in some cases, the determined position may deviate substantially, for example by over 2-3 mm.
• the actual and relatively exact location of the tip may be determined at a level below the visualized pixel size.
  • the determination of the actual position of the tip may depend on the desired/required accuracy level, which may depend on several parameters, including, for example, but not limited to: the clinical indication (for example, biopsy vs. fluid drainage); the target size, target location and/or movement; the lesion size (for a biopsy procedure, for example); the anatomical location (for example, lungs/brain vs. liver/kidneys); the trajectory (for example, if it passes near delicate organs, blood vessels, etc.); and the like, or any combination thereof.
  • the determination/correction of the actual location of the tip may be performed in real-time.
  • the determination/correction of the actual location of the tip may be performed continuously and/or in time lapses on suitable images obtained from various imaging modalities. According to some embodiments, such artifacts and inaccuracies are compensated in real time in order to determine the actual location of the tip and ensure it meets the target end-point in an accurate and effective spatio-temporal manner.
  • FIG. 5 is a diagram 60 of a method of generating, deploying and using a data- analysis algorithm, according to some embodiments.
• At step 61, automated medical procedure(s) are executed using automated medical device(s).
  • Automated medical procedure(s) involve a plurality of datasets related thereto (as further detailed below). For example, some of the datasets directly relate to the operation of the medical device (such as operating parameters), some of the datasets relate to the clinical procedure, some of the datasets relate to the treated patient and some of the datasets relate to administrative related information.
  • datasets may be generated during training sessions performed by users on a dedicated simulator system.
  • Such a simulator system may be configured to at least partially simulate a medical procedure, including enabling users to plan the procedure on existing images and then simulating the execution of the procedure according to the procedure plan via a virtual automated medical device and a virtual medical instrument.
• At step 62, at least some of the generated datasets, values thereof and/or parameters related thereto are collected from the medical procedures and/or simulation sessions and stored in a centralized database.
  • the collected datasets may be split/divided for use as training sets, validation sets and/or testing sets.
  • the collected data is annotated, to thereby generate and train the data-analysis algorithm, at stage 64.
  • the data-analysis algorithm is validated and deployed.
  • the results from the algorithm are obtained, at step 66, and the results are then used to provide, at stage 67, recommendations/operating instructions/predictions/alerts.
  • Subsequent medical procedures executed by automated medical devices may implement at least some of the recommendations/operating instructions/predictions/alerts, thereby returning to step 61 and repeating the method.
  • the performance of the validated algorithm is monitored, at stage 68, and is further enhanced/improved, based on data stored in the centralized database and/or on newly acquired data.
  • the various obtained datasets may be used for the training, construction and/or validation of the algorithm.
  • the datasets may be selected from, but not limited to: medical device related dataset, clinical procedures related dataset, patient related dataset, administrative-related dataset, and the like, or any combination thereof.
• the medical device related dataset may include such data parameters or values as, but not limited to: procedure steps timing, overall procedure time, overall steering time (of the medical instrument), entry point of the medical instrument, target point/region, target updates (for example, updating real-time depth and/or lateral position of the target), planned trajectory of the medical instrument, real-time trajectory of the medical instrument, (real-time) trajectory updates, number of checkpoints (CPs) along the planned or real-time-updated trajectory of the medical instrument, CPs positions/locations, CPs updates during the procedure, CPs errors (in 2D and/or in 3D), position of the medical device, insertion angles of the medical instrument (for example, insertion angle in the axial plane and off-axial angle), indication whether the planned (indicated) target has been reached during the procedure, target error (for example, lateral and depth, in 2D and/or in 3D), scans/images, parameters per scan, radiation dose per scan, total radiation dose in the steering phase of the medical instrument, total
• one or more of the values may be configured to be collected automatically by the system. For example, values such as procedure steps timing, overall steering time, entry, target, target updates (depth and lateral), trajectory, trajectory updates, number of CPs, CP positions, CP updates, CP errors (2 planes and/or 3D), robot position, scans/images, parameters per scan, radiation dose, errors/warnings, software logs, motion control traces, medical device registration logs, medical instrument detection logs, homing and BIT results may be collected automatically.
• the clinical procedures related dataset may include such data parameters or values as, but not limited to: procedure type (e.g., blood/fluid sampling, regional anesthesia, tissue biopsy, catheter insertion, cryogenic ablation, electrolytic ablation, brachytherapy, neurosurgery, deep brain stimulation, various minimally invasive surgeries, and the like), target organ, target dimensions, target type (tumor, abscess, and the like), type of medical instrument, parameters of medical instrument (e.g., gauge, length, material, tip type, etc.), complications before/during/after the procedure, adverse events before/during/after the procedure, respiration signals of the patient, movement of the patient, and the like, or any combination thereof.
• one or more of the values may be configured to be collected automatically. For example, values such as the type of medical instrument (for example, the type of needle), parameters of the medical instrument, respiration signal(s) of the patient, movement of the patient, movement traces of the automated medical device and system logs may be collected automatically.
  • one or more of the values may be configured to be collected manually by requesting the user to insert the data, information and/or visual marking using a graphic-user-interface (GUI), for example.
  • the patient related dataset may include such data parameters or values as, but not limited to: age, gender, race, relevant medical history, vital signs before/after/during the procedure, body dimensions (height, weight, BMI, circumference, etc.), current medical condition, pregnancy, smoking habits, demographic data, and the like, or any combination thereof.
  • the administrative related dataset may include such data parameters or values as, but not limited to: institution (healthcare facility) in which the procedure is performed, physician, staff, system serial numbers, disposables used, software/operating systems versions, configuration parameters, and the like, or any combination thereof. Each possibility is a separate embodiment.
  • various predictions, recommendations and/or implementations may be generated that can enhance further medical procedures.
  • the generated algorithm/s may be customized to a specific procedure, specific patient (or cohort of patients), or any other set of specific parameters.
  • the algorithm/s may be used for enhancing medical procedures, predicting clinical outcome and/or clinical complications and overall increasing safety and accuracy.
• the data-analysis algorithms generated by the systems and methods disclosed herein may be used for, but not limited to: Tissue segmentation; Tissue reconstruction; Target detection; Target tracking; Predicting target movement and/or target location during and/or at the end of the procedure; Predicting tissue/organs movement and/or tissue/organs location during and/or at the end of the procedure; Predicting obstacles location and/or movement during and/or at the end of the procedure; Predicting changes in the anatomical structure (e.g., deformation) of tissues/target/obstacles during and/or at the end of the procedure; Determining and/or recommending entry point location; Determining and/or recommending a trajectory for the insertion procedure; Updating a trajectory during the procedure; Optimizing checkpoint positioning along a trajectory (planned and/or updated trajectory), e.g., by recommending the best tradeoff between accuracy and radiation exposure/procedure time, as disclosed, for example, in abovementioned co-owned International Patent Application No. PCT/IL2021/050441; Determining or recommending “no-fly” zones, i.e., areas (obstacles and/or vital anatomical structures) to avoid during instrument insertion, as disclosed, for example, in co-owned International Patent Application No. PCT/IL2020/051219; Evaluating procedure success (estimated success and/or estimated risk level) based on the current planning and similar past procedures; Utilizing force sensor measurements for evaluation of tissue compliance, early detection of clinical complications and/or optimizing instrument steering; Utilization of additional sensor measurements (e.g., accelerometer, radiation sensor, respiration sensor, etc.); and the like, or any combination thereof.
  • generated algorithms may be used for providing recommendations regarding various device functions and operations, including providing optimized routes or modes of operation. According to some embodiments, generated algorithms may be used for providing improved/optimized procedures, while taking into account various variables that may change during the procedure, such as, for example, predicting target movement, correlating body movement (breathing -related) and device operation, etc.
  • a training module (also referred to as “learning module”) may be used to train an AI model (e.g., ML or DL-based model) to be used in an inference module, based on the datasets and/or the features extracted therefrom and/or additional metadata, in the form of annotations (e.g., labels, bounding-boxes, segmentation maps, visual locations markings, etc.).
  • the training module may constitute part of the inference module or it may be a separate module.
  • a training process (step) may precede the inference process (step).
  • the training process may be on-going and may be used to update/validate/enhance the inference step (see “active-learning” approach described herein).
  • the inference module and/or the training module may be located on a local server (“on premise”), a remote server (such as, a server farm or a cloud-based server) or on a computer associated with the automated medical device.
  • the training module and the inference module may be implemented using separate computational resources.
  • the training module may be located on a server (local or remote) and the inference module may be located on a local computational resource (computer), or vice versa.
  • both the training module and the inference module may be implemented using common computational resources, i.e., processors and memory components shared therebetween.
  • the inference module and/or the training module may be located or associated with a controller (or steering system) of an automated medical device.
  • a plurality of inference modules and/or learning modules (each associated with a medical device or a group of medical devices), may interact to share information therebetween, for example, utilizing a communication network.
  • the model(s) may be updated periodically (for example, every 1-36 weeks, every 1-12 months, etc.).
  • the model(s) may be updated based on other business logic.
• the inference module and/or the training module may be located on or implemented by the processor(s) of the automated medical device (e.g., the processor of the insertion system).
• the learning module may be used to construct a suitable algorithm (such as, a classification algorithm), by establishing relations/connections/patterns/correspondences/correlations between one or more variables of the primary datasets and/or between parameters derived therefrom.
  • the learning may be supervised learning (e.g., classification, object detection, segmentation and the like).
  • the learning may be unsupervised learning (e.g., clustering, anomaly detection, dimensionality reduction and the like).
  • the learning may be reinforcement learning.
  • the learning may use a self-learning approach.
  • the learning process is automatic. In some embodiments, the learning process is semi-automatic. In some embodiments, the learning is manually supervised. In some embodiments, at least some variables of the learning process may be manually supervised/confirmed, for example, by a user (such as a physician).
  • the training stage may be an offline process, during which a database of annotated training data is assembled and used for the creation of data-analysis model(s)/algorithm(s), which may then be used in the inference stage. In some embodiments, the training stage may be performed "online", as detailed herein.
• the generated algorithm may essentially constitute any suitable specialized software (including, for example, but not limited to: image recognition and analysis software, statistical analysis software, regression algorithms (linear, non-linear, logistic, etc.), and the like).
  • the generated algorithm may be implemented using an artificial neural network (ANN), such as a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN) and the like, decision trees or graphs, association rule learning, support vector machines, inductive logic programming, Bayesian networks, instance-based learning, manifold learning, sub-space learning, and the like, or any combination thereof.
  • FIGS. 6A-6B show an exemplary training module (FIG. 6A) and an exemplary training process (FIG. 6B), according to some embodiments.
  • a training module 70 may include two main hardware components/units: at least one memory 72 and at least one processing unit 74, which are functionally and/or physically associated. Training module 70 may be configured to train a model based on data.
  • Memory 72 may include any type of accessible memory (volatile and/or non-volatile), configured to receive, store and/or provide various types of data, to be processed by processing unit 74, which may include any type of at least one suitable processor, as detailed below.
  • the memory and the processing units may be functionally or physically integrated, for example, in the form of a Static Random Access Memory (SRAM) array.
  • the memory is a non-volatile memory having stored therein executable instructions (for example, in the form of a code, service, executable program and/or a model file).
  • the memory unit 72 may be configured to receive, store and/or provide various types of data values or parameters related to the data.
  • Memory 72 may store or accept raw (primary) data 722 that has been collected, as detailed herein. Additionally, metadata 724, related to the raw data 722 may also be collected/stored in memory 72.
  • Such metadata may include a variety of parameters/values related to the raw data, such as, but not limited to: the specific device which was used to provide the data, the time the data was obtained, the place the data was obtained (such as a specific procedure/operating room, specific institution, etc.), and the like.
  • Memory 72 may further be configured to store/collect data annotations (e.g., labels) 726.
• the collected data may require additional steps for the generation of data annotations that will be used for the generation of the machine-learning or deep-learning models or other statistical or predictive algorithms as disclosed herein.
  • such data annotations may include labels describing the clinical procedure’s characteristics, the automated device’s operation and computer-vision related annotations, such as segmentation masks, target marking, organs and tissues marking, and the like.
  • the different annotations may be generated in an “online” manner, which is performed while the data is being collected, or in an “offline” manner, which is performed at a later time after sufficient data has been collected.
  • the memory 72 may further include features database 728.
• the features database 728 may include a database (“store”) of previously known or generated features that may be used in the training/generation of the models.
  • the memory 72 of training module 70 may further, optionally, include pre-trained models 729.
  • the pre-trained models 729 include existing pre-trained algorithms which may be used to automatically annotate a portion of the data and/or to ease training of new models using “transfer-learning” methods and/or to shorten training time by using the pre-trained models as starting points for the training process on new data and/or to evaluate and compare performance metrics of existing versus newly developed models before deployment of new model to production, as detailed hereinbelow.
  • processing unit 74 of training module 70 may include at least one processor, configured to process the data and allow/provide model training by various processing steps (detailed in FIG. 6B). Thus, as shown in FIG. 6A, processing unit 74 may be configured at least to perform pre-processing of the data 742.
  • Pre-processing of the data may include actions for preparing the data stored in memory 72 for downstream processing, such as, but not limited to, checking for and handling null values, imputation, standardization, handling categorical variables, one-hot encoding, resampling, scaling, filtering, outlier removal etc.
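• As an illustrative sketch of several of these pre-processing actions (assuming, for the example, tabular data held in a pandas DataFrame; the exact steps would depend on the collected datasets):

    import pandas as pd
    from sklearn.preprocessing import StandardScaler

    def preprocess(df: pd.DataFrame) -> pd.DataFrame:
        df = df.drop_duplicates()
        numeric = df.select_dtypes(include="number").columns
        categorical = df.select_dtypes(exclude="number").columns
        df[numeric] = df[numeric].fillna(df[numeric].median())     # imputation
        df = pd.get_dummies(df, columns=list(categorical))         # one-hot
        df[numeric] = StandardScaler().fit_transform(df[numeric])  # scaling
        return df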
  • Processing unit 74 may further, optionally, be configured to perform feature extraction 744, in order to reduce the raw data dimension and/or add informative domain-knowledge into the training process and allow the use of additional machine-learning algorithms not suitable for training on raw data and/or optimization of existing or new models by training them on both the raw data and the extracted features.
  • Feature extraction may be executed using dimensionality reduction methods, for example, Principal Components Analysis (PCA), Independent Component Analysis (ICA), Linear Discriminant Analysis (LDA), Locally Linear Embedding (LLE), t-distributed Stochastic Neighbor Embedding (t-SNE), Unified Manifold Approximation and Projection (UMAP) and/or Autoencoders, etc.
  • Feature extraction may be executed using feature engineering methods in which mathematical tools are used to extract domain-knowledge features from the raw data, for example - statistical features, such as mean, variance, ratio, frequency etc. and/or visual features, such as dimension or shape of certain objects in an image.
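• For example, a sketch combining PCA-based dimensionality reduction with a few engineered statistical features might look as follows (the random data matrix and the choice of features are placeholders):

    import numpy as np
    from sklearn.decomposition import PCA

    X = np.random.rand(200, 64)                    # placeholder raw data
    X_pca = PCA(n_components=8).fit_transform(X)   # dimensionality reduction

    # Engineered domain-knowledge features: per-sample statistics.
    stats = np.column_stack([X.mean(axis=1), X.var(axis=1),
                             X.max(axis=1) - X.min(axis=1)])
    features = np.hstack([X_pca, stats])           # combined feature matrix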
  • processing unit 74 may further be configured to execute model training 746.
  • FIG. 6B shows steps in an exemplary training process 76, executed by a suitable training module (such as training module 70 of FIG. 6A).
• collected datasets may first require an Extract-Transform-Load (ETL) or ELT process, which may be used to (1) Extract the data from a single or multiple data sources (including, but not limited to, the automated medical device itself, Picture Archiving and Communication System (PACS), Radiology Information System (RIS), imaging device, healthcare facility’s Electronic Health Record (EHR) system, etc.), (2) Transform the data by applying one or more of the following steps: handling missing values, checking for duplicates, converting data types as needed, encoding values, joining data from multiple sources, aggregating data, translating coded values etc., and (3) Load the transformed data into a target data store, such as the centralized database.
  • the ETL process may be automatic and triggered with every new data collected. In other embodiments, the ETL process may be triggered at a predefined schedule, such as once a day or once a week, for example. In some embodiments, another business logic may be used to decide when to trigger the ETL process.
• the data may be cleaned to ensure high quality data by, for example, removal of duplicates, removal or modification of incorrect and/or incomplete and/or irrelevant data samples, etc.
  • the data is annotated.
  • the data annotations may include, for example, labels describing the clinical procedure’s characteristics, the automated device’s operation and computer- vision related annotations, such as segmentation masks, target marking, organs and tissues marking, existence of medical conditions/complications, existence of certain pathologies, etc.
  • the different annotations may be generated in an “online” manner, which is performed while the data is being collected, or in an “offline” manner, which is performed at a later time after sufficient data has been collected.
  • the data annotations may be generated automatically using an “active learning” approach, in which existing pre-trained algorithms are used to automatically annotate a portion of the data.
  • the data annotations may be generated using a partially automated approach with “human in the loop”, i.e., human approval or human annotations will be required in cases where the annotation confidence is low, or per other business logic decision or metric.
  • the data annotations may be generated in a manual approach, i.e., using human annotators to generate the required annotations using convenient annotation tools.
• the annotated data is pre-processed, for example, by one or more of checking for and handling null values, imputation, standardization, handling categorical variables, one-hot encoding, resampling, scaling, filtering, outlier removal and other data manipulations, to prepare the data for further processing.
  • extraction (or selection) of various features of the data may be performed, as explained hereinabove.
• the data and/or features extracted therefrom is divided into training data (“training set”), which will be used to train the model, and testing data (“testing set”), which will not be introduced into the model during model training, so it can be used as “hold-out” data to test the final trained model before deployment.
  • the training data may be further divided into a “train set” and a “validation set”, where the train set is used to train the model and the validation set is used to validate the model’s performance on unseen data, to allow optimization/fine-tuning of the training process’ configuration/hyperparameters during the training process.
  • hyperparameters may be the learning-rate, weights regularization, model architecture, optimizer selection, etc.
• the training process may include the use of a Cross-Validation (CV) method, in which the training data is divided into a “train set” and a “validation set”; upon training completion, the training process may be repeated multiple times with different selections of the “train set” and the “validation set” out of the original training data.
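• A minimal sketch of the hold-out split and cross-validation described above (assuming numpy arrays `features` and `labels` produced by the preceding steps):

    from sklearn.model_selection import KFold, train_test_split

    # Hold out a test set never used for training or hyperparameter tuning.
    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.2, random_state=0)

    # Cross-validation: rotate the train/validation split over the training data.
    for train_idx, val_idx in KFold(n_splits=5, shuffle=True,
                                    random_state=0).split(X_train):
        X_tr, X_val = X_train[train_idx], X_train[val_idx]
        y_tr, y_val = y_train[train_idx], y_train[val_idx]
        # ... fit a candidate model on (X_tr, y_tr), score it on (X_val, y_val)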
  • Data augmentation may include, for example, generation of additional data from/based on the collected or annotated data. Possible augmentations that may be used for image data are: rotation, flip, noise addition, color distribution change, crop, stretch, etc. Augmentations may also be generated using other types of data, for example by adding noise or applying a variety of mathematical operations.
  • augmentation may be used to generate synthetic data samples using synthetic data generation approaches, such as distribution based, Monte-Carlo, Variational Autoencoder (VAE), Generative-Adversarial-Network (GAN), etc.
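• An illustrative augmentation sketch for 2D image data, covering the flip, rotation and noise-addition examples above (parameter values are arbitrary):

    import numpy as np

    def augment_image(img, rng):
        # Illustrative image augmentations: flip, 90-degree rotation, noise.
        if rng.random() < 0.5:
            img = np.fliplr(img)
        img = np.rot90(img, k=int(rng.integers(0, 4)))
        return img + rng.normal(0.0, 0.01, img.shape)

    # rng = np.random.default_rng(0); augmented = augment_image(image, rng)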
  • the model is trained, wherein the training may be performed “from scratch” (i.e., an initial/primary model with initialized weights is trained based on all relevant data) and/or utilizing existing pre-trained models as starting points and training them only on new data.
  • the generated model is validated.
• Model validation may include evaluation of different model performance metrics, such as accuracy, precision, recall, F1 score, AUC-ROC, etc., and comparison of the trained model against other existing models, to allow deployment of the model which best fits the desired solution.
  • the evaluation of the model at this step is performed using the testing data (“test set”) which was not used for model training nor for hyperparameters optimization and best represents the real-world (unseen) data.
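• A sketch of such an evaluation on the held-out test set (assuming a trained binary classifier `model` with scikit-learn-style predict/predict_proba methods, and the `X_test`/`y_test` split from the earlier step):

    from sklearn.metrics import (accuracy_score, f1_score, precision_score,
                                 recall_score, roc_auc_score)

    y_pred = model.predict(X_test)                 # hard class predictions
    y_score = model.predict_proba(X_test)[:, 1]    # positive-class scores
    metrics = {
        "accuracy":  accuracy_score(y_test, y_pred),
        "precision": precision_score(y_test, y_pred),
        "recall":    recall_score(y_test, y_pred),
        "f1":        f1_score(y_test, y_pred),
        "auc_roc":   roc_auc_score(y_test, y_score),
    }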
  • the trained model is deployed and integrated or utilized with the inference module to generate output based on newly collected data, as detailed herein.
  • the training database may grow in size and may be updated. The updated database may then be used to re-train the model, thereby updating/enhancing/improving the model’s output.
  • the new instances in the training database may be obtained from new clinical cases or procedures or from previous (existing) procedures that have not been previously used for training.
  • an identified shift in the collected data’s distribution may serve as a trigger for the re-training of the model.
  • an identified shift in the deployed model’s performance may serve as a trigger for the re-training of the model.
  • the training database may be a centralized database (for example, a cloud-based database), or it may be a local database (for example, for a specific healthcare facility).
  • learning and updating may be performed continuously or periodically on a remote location (for example, a cloud server), which may be shared among various users (for example, between various institutions, such as hospitals).
  • learning and updating may be performed continuously or periodically on a single or on a cohort of medical devices, which may constitute an internal network (for example, of an institution, such as a hospital).
  • a validated model may be executed locally on processors of one or more medical systems operating in a defined environment (for example, a designated institution, such as a hospital), or on local online servers of the designated institution.
  • the model may be continuously updated based on data obtained from the specific institution ("local data"), or periodically updated based on the local data and/or on additional external data, obtained from other resources.
  • federated learning may be used to update a local model with a model that has been trained on data from multiple facilities/tenants without requiring the local data to leave the facility or the institution.
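• A minimal federated-averaging sketch, in which only model weights (never raw data) leave each facility and the global model is a sample-count-weighted average; the structure assumed for `site_weights` is an illustration, not the disclosed scheme:

    def federated_average(site_weights, site_counts):
        # site_weights: one list of per-layer weight arrays per facility;
        # site_counts: number of local training samples per facility.
        total = sum(site_counts)
        n_layers = len(site_weights[0])
        return [
            sum(w[layer] * (n / total)
                for w, n in zip(site_weights, site_counts))
            for layer in range(n_layers)
        ]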
  • FIGS. 7A-7B show an exemplary inference module (FIG. 7A) and an exemplary inference process (FIG. 7B), according to some embodiments.
• inference module 80 may include two main hardware components/units: at least one memory unit 82 and at least one processing unit 84, which are functionally and/or physically associated. Inference module 80 is essentially configured to run collected data through the trained model to calculate/process an output/prediction.
  • Memory 82 may include any type of accessible memory (volatile and/or non-volatile), configured to receive, store and/or provide various types of data and executable instructions, to be processed by processing unit 84, which may include any type of at least one suitable processor.
  • the memory 82 and the processing unit 84 may be functionally or physically integrated, for example, in the form of a Static Random Access Memory (SRAM) array.
• the memory is a non-volatile memory having stored therein executable instructions (for example, in the form of a code, service, executable program and/or a model file containing the model architecture and/or weights) that can be used to perform a variety of tasks, such as data cleaning, required pre-processing steps and inference operation (as detailed below) on new data to obtain the model’s prediction or result.
  • memory 82 may be configured to accept/receive, store and/or provide various types of data values or parameters related to the data as well as executable algorithms (in the case of machine learning based algorithms, these may be referred to as “trained models”).
• Memory unit 82 may store or accept newly acquired data 822, which may be raw (primary) data that has been collected, as detailed herein.
  • Memory module 82 may further store metadata 824 related to the raw data.
  • metadata may include a variety of parameters/values related to the raw data, such as, but not limited to: the specific device which was used to provide the data, the time the data was obtained, the place the data was obtained (such as specific operation room, specific institution, etc.), and the like.
  • Memory 82 may further store the trained model(s) 826.
  • the trained models may be the models generated and deployed by a training module, such as training module 70 of FIG. 6A.
  • the trained model(s) may be stored, for example in the form of executable instructions and/or model file containing the model’s weights, capable of being executed by processing unit 84.
• Processing unit 84 of inference module 80 may include at least one processor, configured to process the newly obtained data and execute a trained model to provide corresponding results (detailed in FIG. 7B).
  • processing unit 84 is configured at least to perform pre-processing of the data 842, which may include actions for preparing the data stored in memory 82 for downstream processing, such as, but not limited to, checking for and handling null values, imputation, standardization, handling categorical variables, one-hot encoding, resampling, scaling, filtering, outlier removal etc.
• processing unit 84 may further be configured to extract features 844 from the acquired data, using techniques such as, but not limited to, Principal Components Analysis (PCA), Independent Component Analysis (ICA), Linear Discriminant Analysis (LDA), Locally Linear Embedding (LLE), t-distributed Stochastic Neighbor Embedding (t-SNE), Unified Manifold Approximation and Projection (UMAP) and/or Autoencoders, etc.
  • Feature extraction may be executed using feature engineering methods in which mathematical tools are used to extract domain- knowledge features from the raw data, for example: statistical features such as mean, variance, ratio, frequency etc. and/or visual features such as dimension or shape of certain objects in an image.
  • processing unit 84 may be configured to perform feature selection.
  • Processing unit 84 may further be configured to execute the model on the collected data and/or features extracted therefrom, to obtain model results 846.
• the processing unit 84 may further be configured to execute a business logic 848, which can provide further fine-tuning of the model results and/or utilization of the model’s results for a variety of automated decisions, guidelines or recommendations supplied to the user.
• FIG. 7B shows steps in an exemplary inference process 86, executed by a suitable inference module (such as inference module 80 of FIG. 7A).
  • new data is acquired/collected from or related to newly executed medical procedures.
  • the new data may include any type of raw (primary) data, as detailed herein.
• suitable trained model(s) (generated, for example, by a suitable training module in a corresponding training process) may be loaded, per task(s). This step may be required in instances in which computational resources are limited and only a subset of the required models or algorithms can be loaded into RAM memory to be used for inference.
• the inference process may require an additional management step responsible for loading the required models from storage memory for a specific subset of inference tasks/jobs; once inference is completed, the loaded models are replaced with other models that are loaded to allow an additional subset of inference tasks/jobs.
  • the raw data collected in step 861 is pre-processed.
  • the pre-processing steps may be similar or identical to the pre-processing step performed in the training process (by the training module), to thereby allow the data to be processed similarly by the two modules (i.e., the training module and the inference module).
  • this step may include actions such as, but not limited to, checking for and handling null values, imputation, standardization, handling categorical variables, one-hot encoding, etc., to prepare the input data for analysis by the model(s).
  • extraction of features from the data may be performed using, for example, Principal Components Analysis (PCA), Independent Component Analysis (ICA), Linear Discriminant Analysis (LDA), Locally Linear Embedding (LLE), t-distributed Stochastic Neighbor Embedding (t-SNE), Unified Manifold Approximation and Projection (UMAP) and/or Autoencoders, etc.
  • the results of the model are obtained, i.e., the model is executed on the processed data to provide corresponding results.
  • fine-tuning of the model results may be performed, whereby post-inference business logic is executed.
  • Execution of post-inference business logic refers to the utilization of the model’s results for a variety of automated decisions, guidelines or recommendations supplied to the user.
  • Post inference business logic may be configured to accommodate specific business and/or clinical needs or metrics, and can vary between different scenarios or institutions based on users’ or institutions’ requests or needs.
  • the model results may be utilized in various means, including, for example, enhancing the operation of the automated medical device (e.g., enabling automatic target tracking and closed-loop steering based on the tracked real-time position of the target, etc.), providing recommendations regarding various device operations (including recommending one or more optimal entry points, recommending optimized trajectories or modes of operation, etc.), providing prediction, prevention and/or early detection of various clinical conditions (e.g., pneumothorax, breathing anomalies, bleeding, etc.), as disclosed, for example, in co-owned International Patent Application No. PCT/IL2021/050438, which is incorporated herein by reference in its entirety, and the like, as further detailed hereinabove.
  • inference operation may be performed on a single data instance. In other embodiments, inference operation may be performed using a batch of multiple data instances to receive multiple predictions or results for all data instances in the batch. In some embodiments, an ensemble of models or algorithms can be used for inference, where the same input data is processed by a group of different models and the results are aggregated using averaging, majority voting or the like. In some embodiments, the model can be designed in a hierarchical manner, where input data is processed by a primary model and, based on the prediction or result of the primary model’s inference, the data is processed by a secondary model. In some embodiments, multiple secondary models may be used, and the hierarchy may have more than two levels.
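The ensemble and hierarchical arrangements described above can be sketched as follows (illustrative only; `predict` is a placeholder for whatever interface the deployed models expose, and majority voting assumes hashable class labels):

```python
# Illustrative ensemble and hierarchical inference arrangements.
from collections import Counter
import numpy as np

def ensemble_average(models, x):
    """Same input processed by a group of models; results aggregated by averaging."""
    return np.mean([m.predict(x) for m in models], axis=0)

def ensemble_majority_vote(models, x):
    """Aggregation by majority voting (assumes hashable class labels)."""
    votes = [m.predict(x) for m in models]
    return Counter(votes).most_common(1)[0][0]

def hierarchical_predict(primary, secondaries, x):
    """The primary model's prediction routes the data to a secondary model;
    deeper hierarchies repeat this routing step at each level."""
    branch = primary.predict(x)
    return secondaries[branch].predict(x)
```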
  • the methods and systems disclosed herein utilize data-driven methods to create algorithms based on various datasets, including functional, anatomical, clinical, diagnostic, demographic and/or administrative datasets.
  • artificial intelligence (e.g., machine-learning) algorithms are used to learn the complex mapping/correlation/correspondence between the multimodal input dataset parameters (e.g., data obtained from different modalities, such as images, logs, sensory data, etc.) and procedure, clinical, operational, patient-related and/or administrative information, to optimize the clinical procedure’s outcome or any other desired functionalities.
  • the systems and methods disclosed herein determine such optimal mapping using various approaches, such as, for example, a statistical approach, and utilizing machine-learning algorithms to learn the mapping/correlation/correspondence from the training datasets.
  • the algorithm may be a generic algorithm, which is agnostic to specific procedure characteristics, such as type of procedure, user, service provider or patient.
  • the algorithm may be customized to a specific user (for example, preferences of a specific healthcare provider), a specific service provider (for example, preferences of a specific hospital), a specific population (for example, preferences of different age groups), a specific patient (for example, preferences of a specific patient), and the like.
  • the algorithm may combine a generic portion and a customized portion.
  • FIG. 8 shows a flowchart 90 illustrating the steps of an exemplary method of closed-loop steering of a medical instrument toward a predicted location of a target, according to some embodiments.
  • parameters related to the medical procedure, and in particular to the steering of a medical instrument, are obtained or identified, based on image(s) of region(s) of interest in the subject’s body.
  • Such parameters include the target to be reached, the entry point for the medical instrument and optionally, "no-fly" zones (which are regions to be avoided during the procedure).
  • the parameters may be identified/determined automatically (for example, by image analysis and/or ML/DL algorithms) and/or obtained from a user (for example, a healthcare provider), who may mark one or more of the above parameters on the image(s).
  • creating AI models to detect/identify/recommend the above parameters may include a preliminary phase, in which one or more individual models are trained.
  • generating a “no-fly” zone model may involve an accuracy estimation model, a procedure duration estimation model and/or a risk estimation model, as disclosed, for example, in abovementioned co-owned International Patent Application No. PCT/IL2021/050437.
  • a planned trajectory for the medical instrument from the entry point to the target is calculated/determined.
  • the calculation of the planned trajectory may take into account various parameters, including but not limited to: type and characteristics of the medical instrument, type of imaging modality, selected insertion point, determined “no-fly” zones, type and characteristics of the tissue(s) through which the instrument is intended to advance, the characteristics of the target (type, dimensions, shape, etc.) and its location within the subject’s body, milestone points (i.e., “secondary targets” through which the medical instrument should pass) and the like, or any combination thereof.
  • a maximal allowable curvature level may be pre-set for the calculation of the planned trajectory.
  • a maximal allowable lateral movement of the instrument at the entry point may be pre-set for the calculation of the planned trajectory.
  • a maximal allowable proximity to obstacle(s) may be pre-set for the calculation of the non-linear trajectory.
  • checkpoints may be set along the trajectory, either manually or automatically.
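A minimal sketch, under an assumed discrete-curvature formulation not taken from this disclosure, of how the pre-set limits above (maximal curvature, maximal lateral movement at the entry point, minimal proximity to obstacles) might be checked on a candidate trajectory of 3D points (at least three points assumed):

```python
# Illustrative admissibility checks on a candidate planned trajectory.
import numpy as np

def discrete_curvature(p0, p1, p2):
    """Curvature at p1 via the circumscribed-circle formula for three
    consecutive points: k = 4 * area / (a * b * c)."""
    a = np.linalg.norm(p1 - p0)
    b = np.linalg.norm(p2 - p1)
    c = np.linalg.norm(p2 - p0)
    area = 0.5 * np.linalg.norm(np.cross(p1 - p0, p2 - p0))
    return 4.0 * area / max(a * b * c, 1e-9)

def trajectory_is_admissible(points, obstacles, max_curvature,
                             max_entry_lateral, min_obstacle_dist):
    points = np.asarray(points, dtype=float)
    # 1. Curvature limit along the trajectory.
    for i in range(1, len(points) - 1):
        if discrete_curvature(points[i - 1], points[i], points[i + 1]) > max_curvature:
            return False
    # 2. Lateral movement limit near the entry point: deviation of an early
    #    trajectory point from the initial insertion direction.
    direction = points[1] - points[0]
    lateral = np.linalg.norm(np.cross(points[2] - points[0], direction)) \
        / max(np.linalg.norm(direction), 1e-9)
    if lateral > max_entry_lateral:
        return False
    # 3. Proximity to obstacles ("no-fly" zones, here reduced to points).
    for obs in np.asarray(obstacles, dtype=float):
        if np.min(np.linalg.norm(points - obs, axis=1)) < min_obstacle_dist:
            return False
    return True
```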
  • the medical instrument is steered toward the target according to the planned trajectory.
  • steering of the medical instrument may be based on an inverse kinematics solution applied to a virtual springs model to calculate the required motion to be imparted to the instrument (or to an end effector of the automated device, as shown, for example, in Fig. 1A) in order for the instrument to follow the planned trajectory, as described in further detail hereinabove.
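The virtual springs formulation itself is described elsewhere in this disclosure; purely as an illustration of the idea, the following sketch assumes springs pulling points along the instrument toward the planned trajectory, with a damped least-squares (pseudo-inverse) inverse-kinematics step recovering the actuator motion:

```python
# Heavily simplified, assumed illustration of a virtual-springs steering step.
import numpy as np

def steering_correction(instrument_pts, trajectory_pts, stiffness, jacobian):
    """One closed-loop correction step.
    instrument_pts, trajectory_pts: (n, 3) current vs. desired positions.
    stiffness: (n,) virtual spring constants.
    jacobian: (3n, m) assumed map from m actuator motions to point displacements.
    Returns the m actuator commands."""
    # Spring forces proportional to the deviation from the planned trajectory.
    forces = (stiffness[:, None] * (trajectory_pts - instrument_pts)).ravel()
    # Damped least-squares inverse kinematics: dq = (J^T J + lam*I)^-1 J^T f
    lam = 1e-3
    JT = jacobian.T
    return np.linalg.solve(JT @ jacobian + lam * np.eye(jacobian.shape[1]),
                           JT @ forces)
```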
  • real-time images of the region of interest may be obtained.
  • the images may be obtained at any suitable format and form, such as, for example, discrete images obtained at spatial and/or temporal intervals (e.g., upon reaching a checkpoint), semi-continuous images (for example, discrete images obtained at a high frequency), continuous images (for example, obtained as a video), and the like, depending on the utilized imaging modality.
  • the estimated tissue movement during the steering procedure is calculated/determined. In some embodiments, tissue movement during the entire procedure may be estimated. In some embodiments, tissue movement during a portion of the procedure (for example, between consecutive checkpoints) may be estimated.
  • tissue movement may include changes in the location of the tissue during the procedure, as well as changes in the anatomical structure of the tissue, such as changes in shape and/or size of the tissue, and the like, or any combination thereof.
  • tissue movement may refer to changes in the location and/or structure of the tissue as determined at specific points in time during the procedure.
  • tissue movement may refer to a movement profile, i.e., to changes in the location and/or structure of the tissue occurring during the entire steering procedure or during a certain time interval during the procedure (i.e., location/structure as a function of time).
  • estimation of tissue movement may take into account tissue movement resulting from the subject’s breathing.
  • the real-time position of the medical instrument and the target may be determined. The determination of the real-time positions may be performed automatically (for example, by a processor performing image analysis or any other suitable algorithm, such as ML/DL model(s)) and/or manually (for example, by a healthcare provider). In some embodiments, step 905 and step 906 may be performed in parallel.
  • the estimated target movement during the steering procedure may be determined/calculated, as further detailed herein.
  • target movement may include changes in the location of the target during the procedure, as well as changes in the anatomical structure of the target, such as changes in shape and/or size of the target, and the like, or any combination thereof.
  • target movement may refer to changes in the location and/or structure of the target as determined at specific points in time during the procedure.
  • target movement may refer to a movement profile, i.e., to changes in the location and/or structure of the target occurring during the entire steering procedure or during a certain time interval during the procedure (i.e., location/structure as a function of time).
  • the expected target position and/or expected target movement profile may be determined utilizing data analytics algorithms, such as, AI models, which may be generated using data obtained from past procedures, as further described hereinbelow.
  • the trajectory may be updated based on the estimated/predicted target movement and/or the estimated/predicted tissue movement, to facilitate the medical instrument reaching the predicted target location.
  • the trajectory may be updated using data- analysis algorithm(s), such as AI models, which may be generated using data obtained from past procedures, as further described hereinbelow.
  • the updated trajectory is evaluated to determine if it is optimal.
  • the determination if the updated trajectory is optimal may include comparing one or more parameters, or associated values, to a predetermined threshold. In some embodiments, the updated trajectory may be determined as optimal if a certain parameter (or associated value) exceeds its predetermined threshold. In some embodiments, the updated trajectory may be determined as optimal if a certain parameter (or associated value) is below its predetermined threshold.
  • the updated trajectory may be determined as optimal if a certain parameter exceeds its respective threshold and a different parameter is below its respective threshold. The determination if the updated trajectory is optimal may be performed automatically and/or manually. If the updated trajectory is determined not to be optimal, step 908 may be repeated, to further update the trajectory. Once a trajectory is found to be optimal, at step 910, the medical instrument is steered toward the target in accordance with the updated trajectory. Next, at step 911 it is determined if the target has been reached by the medical instrument. If the target has been reached, the procedure ends (step 912). If, however, the target has not yet been reached, steps 904-911 may be repeated until the target is reached.
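The control flow of steps 904-911, including the threshold-based optimality check, can be summarized as follows (all callables are placeholders, not APIs defined by this disclosure):

```python
# Illustrative control flow of steps 904-911 (placeholder callables).
def trajectory_is_optimal(params, lower_is_better, thresholds):
    """A trajectory is deemed optimal when every monitored parameter is on
    the correct side of its predetermined threshold (below for some
    parameters, above for others)."""
    for name, value in params.items():
        if lower_is_better[name]:
            if value >= thresholds[name]:
                return False
        elif value <= thresholds[name]:
            return False
    return True

def closed_loop_steering(trajectory, update_trajectory, evaluate,
                         steer_step, target_reached):
    while not target_reached():                                  # step 911
        # Imaging, movement estimation and prediction (steps 904-907) are
        # assumed to happen inside update_trajectory().
        trajectory = update_trajectory(trajectory)               # step 908
        while not trajectory_is_optimal(*evaluate(trajectory)):  # step 909
            trajectory = update_trajectory(trajectory)           # repeat 908
        steer_step(trajectory)                                   # step 910
    return trajectory
```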
  • relevant parameters indicative of the trajectory being optimal may be utilized as feedback in the calculation performed in step 907 (e.g., as feedback to an AI-based target movement model), to increase accuracy and reliability of such calculations in subsequent iterations of steps 904-911, as well as in further procedures.
  • FIG. 9 shows a flowchart 92 illustrating steps of an exemplary method of closed-loop steering of a medical instrument toward a predicted location of a target, utilizing a dynamic trajectory model (an “inference” process), according to some embodiments.
  • parameters related to the medical procedure, and in particular to the steering of a medical instrument, are obtained or identified, based on images of region(s) of interest in the subject’s body obtained using an imaging system (e.g., CT, MRI, ultrasound, CT fluoroscopy, CBCT, etc.).
  • Such parameters include the target to be reached, the entry point for the medical instrument and optionally, "no-fly" zones.
  • the parameters may be identified/determined automatically (for example, by image analysis and/or AI-based algorithms) and/or obtained from a user (for example, a healthcare provider), who may mark one or more of the above parameters on the image(s).
  • a planned trajectory for the medical instrument from the entry point to the target is calculated/determined.
  • the planned trajectory may be determined using a data-analysis algorithm, for example a dynamic trajectory model (see step 926), which takes into account the predicted movement of the tissue, the predicted movement of the target, predicted movement of a tip of the medical instrument, and the like, to predict the location of the target at a desired time/space point (for example, at the end-point of the procedure) and plan the trajectory according thereto.
  • the medical instrument is steered toward the target according to the planned trajectory.
  • steering of the medical instrument may be based on an inverse kinematics solution applied to a virtual springs model to calculate the required motion to be imparted to the instrument (or to an end effector of the automated device) in order for the instrument to follow the planned trajectory, as described in further detail hereinabove.
  • real-time images of the region of interest may be obtained.
  • the images may be obtained at any suitable format and form, such as, for example, discrete images obtained at spatial and/or temporal intervals (for example, at different checkpoints), semi-continuous images (for example, discrete images obtained at a high frequency), continuous images (for example, obtained as a video), and the like, depending on the utilized imaging modality.
  • the real-time position of the medical instrument, the target, and optionally other regions of interest may be determined. In some embodiments, the real-time position of previously determined “no-fly” zones may be determined.
  • the determination of the real-time positions may be performed automatically (for example, by a processor performing image analysis or executing suitable algorithm(s), such as ML/DL model(s)) and/or manually (for example, by a healthcare provider).
  • a dynamic trajectory model is applied to update the trajectory (if needed).
  • the dynamic trajectory model (DTM) may include one or more algorithms and/or AI-based models, each of which may be configured to provide information, predictions, estimations and/or calculations regarding various parameters and variables that may affect tissue, target and/or medical instrument movement and the consequent trajectory.
  • Such algorithms and models may provide parameters such as, predicted/estimated tissue movement, predicted/estimated target movement, and/or predicted/estimated medical instrument movement, to ultimately predict the estimated target spatiotemporal location during and/or at the end of the procedure, to thereby allow the planning and/or updating of a corresponding trajectory to facilitate the medical instrument reaching the target at its predicted location.
  • estimation of tissue movement may take into account tissue movement resulting from the patient’s respiratory cycle.
  • the patient’s respiratory cycle and/or the tissue movement resulting from the patient’s respiratory cycle may be predicted using a separate algorithm/model.
  • estimation of tissue movement may take into account tissue movement resulting from the medical device steering toward the target.
  • the tissue movement resulting from the medical device steering may be predicted using a separate algorithm/model.
  • the DTM may include algorithms/models to predict the movement of previously determined “no-fly” zones and/or algorithms/models to update the “no-fly” zones map according to the predicted tissue and target movement.
  • the DTM may include determining if a calculated trajectory is optimal, based on various parameters as described herein, such that the output of the model is the optimal trajectory. It can be appreciated that different trajectories may be considered as “optimal”, depending on the chosen parameters, the weight given to each parameter, user preferences, etc.
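Illustratively (and only as an assumed structure), a DTM that queries its sub-models and selects the best-scoring candidate trajectory might look like:

```python
# Assumed structure only: a DTM combining sub-model predictions to select
# the candidate trajectory scoring best under the chosen parameters/weights.
def dynamic_trajectory_model(state, sub_models, candidate_trajectories, score):
    predictions = {
        "tissue": sub_models["tissue_movement"].predict(state),
        "target": sub_models["target_movement"].predict(state),
        "no_fly": sub_models["no_fly_zones"].predict(state),
    }
    # Which trajectory is "optimal" depends on the chosen parameters, the
    # weight given to each parameter, user preferences, etc.
    return max(candidate_trajectories,
               key=lambda traj: score(traj, predictions))
```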
  • at step 927, the medical instrument may be steered toward the target in accordance with the updated trajectory.
  • at step 928, it is determined if the target has been reached by the medical instrument. If the target has been reached, the procedure ends (step 929). If, however, the target has not yet been reached, steps 924-928 may be repeated, until the target is reached.
  • FIG. 10 shows a block diagram illustrating an exemplary method of generating (training) a tissue movement model for prediction of tissue movement during a medical procedure, according to some embodiments.
  • tissue movement may include various variables and parameters related to changes in the tissue during (continuously or at discrete stages of) the medical procedure, including, but not limited to: locations of the tissue during the procedure, changes in the anatomical structure of the tissue (e.g., changes in the shape, dimensions, density and/or form of the tissue), and the like, or any combination thereof.
  • tissue movement may refer to a time-dependent profile, i.e., to changes in the location and/or structure of the tissue occurring during the entire procedure or during certain time interval(s) during the procedure (i.e., location/orientation/structure as a function of time).
  • the input data 951 may be used directly with the tissue movement model and/or may be processed/analyzed by a tissue segmentation model 953.
  • the input data may include any relevant parameters and/or datasets from previous procedures which are related to tissue movement, and the target variable (“ground truth”) for training the tissue movement model may be how the tissue actually moved or otherwise changed during these previous procedures.
  • the input data 951 may include various data sets, selected from, for example, but not limited to: data related to the clinical procedure and patient related data, such as, tissues’ characteristics (e.g., types, boundaries between tissue types, dimensions, elasticity, etc.), medical instrument (e.g., needle) type and characteristics (e.g., gauge, length, tip type (e.g., diamond, bevel), etc.), relative angle of the medical instrument (for example, the relative angle of the medical instrument to the patient body and/or to an axial plane and/or to a sagittal plane of images obtained from an imaging system, the relative angle of a tip of the medical instrument to the patient body and/or to an axial plane and/or to a sagittal plane of images obtained from an imaging system, and the like), respiration signals (e.g., from a respiration sensor or from a ventilator, if the patient was ventilated), respiration abnormalities, patient characteristics (age, gender, race, BMI, medical condition, smoking habits, ventilation, intubation, self-breathing, sedation, etc.), data related to the medical device and its operation, including, for example, motors’ current traces (i.e., logs of motors’ performance data), procedure timing, skin to target time, entry and target positions, trajectory length, target movements and paths updates, number and position of checkpoints, errors and correction of checkpoints, images (e.g., CT scans) generated during the procedure (e.g., at checkpoints and/or at milestone points (“secondary targets”)), magnitude of lateral steering of the medical instrument, medical device position, instrument angles, distance of the instrument from the different tissues/organs, instrument insertion speed, patient’s position (e.g., supine, prone, decubitus), any other relevant data influencing tissue movement during the medical procedures (medical instrument steering in the subject’s body), and the like, or any combination thereof.
  • data annotations may further be utilized for model training and validation.
  • the data annotations may include values and/or parameters such as, but not limited to: organs segmentation masks and/or bounding boxes and/or location, tissues segmentation masks and/or bounding boxes and/or location, instrument segmentation masks and/or bounding box and/or location, and the like.
  • each or at least some of the parameters may be attributed an appropriate weight which is taken into account in generating the tissue movement model.
  • the input data 951 may include multi-modal data collected from past procedures and arranged, where possible/applicable, as time-series together with the patient's parameters and medical history.
  • the time-series structure may allow the analysis of time-dependency events in past procedures’ data to better predict the tissue movement during a procedure and better study the impact of the different factors and their correlation to the procedure timeline.
  • a specialized tissue segmentation model 953 may be used to generate meaningful domain-knowledge features that may, in turn, be input to the primary tissue movement model during the training process.
  • tissue segmentation models may utilize image segmentation and/or reconstruction, to identify/detect different tissues (e.g., organs).
  • tissue segmentation may be performed on 2D images (“slices”) which are then reconstructed to generate 3D images of the segmented tissues/organs.
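A minimal sketch of the slice-wise segmentation and 3D reconstruction described above, with a trivial intensity threshold standing in for a trained segmentation model (the threshold value is purely illustrative):

```python
# Sketch of slice-wise 2D segmentation followed by 3D reconstruction.
import numpy as np

def reconstruct_3d_mask(slices, segment_slice):
    """slices: iterable of 2D images; segment_slice: per-slice segmentation
    function returning a boolean mask of the same shape as the slice."""
    return np.stack([segment_slice(s) for s in slices], axis=0)

# e.g., with a crude Hounsfield-unit threshold standing in for a model:
# volume = reconstruct_3d_mask(ct_slices, lambda s: s > -300)
```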
  • the output of the tissue movement model may be tissue movement prediction 955, which may include one or more of predicted tissue location, change of tissue anatomical structure (change in shape, tissue deformation), and the like, during the input (current) procedure.
  • the tissue movement prediction, together with ground-truth annotations regarding tissue movement during a procedure, may be used to calculate a loss function representing the error between the tissue movement prediction and the ground-truth data. During the training process, optimization of this loss function will allow the adjustment of the model’s weights.
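Assuming a PyTorch-style model (an assumption; the disclosure does not name a framework), one such loss-driven weight adjustment step could look like:

```python
# Assumed PyTorch-style training step: the prediction is compared to the
# ground-truth movement via a loss, and optimizing the loss adjusts weights.
import torch

def training_step(model, optimizer, inputs, ground_truth_movement):
    optimizer.zero_grad()
    prediction = model(inputs)                       # tissue movement prediction
    loss = torch.nn.functional.mse_loss(prediction,  # error vs. ground truth
                                        ground_truth_movement)
    loss.backward()                                  # gradients of the loss
    optimizer.step()                                 # adjust the model's weights
    return loss.item()
```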
  • the tissue movement prediction model may be trained in a multi-task and/or multi-output approach.
  • the tissue movement prediction model may be trained to predict the exact movement (or any other related changes, as detailed above) of the tissue at each point in time during the procedure. This may require corresponding time-based annotations of tissue related parameters (for example, location, size, shape, form, etc.) at desired points in time throughout the procedures in the dataset.
  • the tissue movement model may be used in the training or application of various other models, such as, for example, target movement prediction model, trajectory prediction model, and the like.
  • the tissue movement model may be deployed and used to estimate tissue movement during a medical procedure, such as steering of medical instrument toward a moving target.
  • the prediction of the tissue movement model may be for a given instrument trajectory, patient position (e.g., supine, prone, etc.) and/or patient respiration behavior.
  • FIG. 11 shows a block diagram 97 illustrating an exemplary method of generating (training) a target movement model for prediction of target movement during a medical procedure.
  • input data 971 from past procedures is used to train the target movement model 974 to predict target movement (output/result) 975 during a medical procedure.
  • target movement may include various variables and parameters related to changes in the target during (continuously or at discrete stages of) the medical procedure, including, but not limited to: locations of the target during the procedure, changes in the anatomical structure of the target during the procedure (e.g., changes in the dimensions, shape, etc. of the target), and the like, or any combination thereof.
  • target movement may refer to a time-dependent profile, i.e., to changes in the location and/or structure of the target occurring during the entire procedure or during certain time interval(s) during the procedure (i.e., location/structure as a function of time).
  • the input data 971 may include any relevant parameters and/or datasets from previous procedures which are related to target movement, and the target variable (“ground truth”) for training the target movement model may be how the target actually moved or otherwise changed during these past procedures.
  • the input data 971 may include various data sets, selected from, for example, but not limited to: data related to clinical procedure and patient related data, such as, target type, target dimensions, target depth, target shape, tissue characteristics (e.g., types, boundaries, dimensions, elasticity, etc.), medical instrument (e.g., needle) type and characteristics (e.g., gauge, length, tip type (e.g., diamond, bevel)), respiration signals, respiration abnormalities, patient characteristics (age, gender, race, BMI, medical condition, smoking habits, ventilation, intubation, self-breathing, sedation, etc.), data related to the medical device and its operation, including, for example, motors’ current traces (i.e., logs of motors’ performance data), procedure timing, skin to target time, entry and target positions, trajectory length, paths updates, number and position of checkpoints, errors and correction of checkpoints, images (e.g., CT scans) generated during the procedure (e.g., at checkpoints and/or at milestone points (“secondary targets”)), magnitude of lateral steering of the medical instrument, medical device position, angle of the instrument relative to the target, distance of the instrument from the target, instrument insertion speed, final tip-to-target distance, patient’s position (e.g., supine, prone, decubitus), any other relevant dataset influencing target movement during medical procedures (medical instrument steering in the subject’s body), and the like, or any combination thereof.
  • data annotations may further be utilized for model training and validation.
  • the data annotations may include values and/or parameters such as, but not limited to: organs segmentation masks and/or bounding boxes and/or location, tissues segmentation masks and/or bounding boxes and/or location, target contours and/or bounding boxes and/or location, instrument segmentation masks and/or bounding box and/or location, and the like.
  • each or at least some of the parameters may be attributed an appropriate weight which is taken into account in generating the target movement model.
  • the input data 971 may include multi-modal data collected from past procedures and arranged, where possible/applicable, as time-series together with the patient's parameters and medical history.
  • the time-series structure may allow the analysis of time-dependency events in past procedures’ data to better predict the target movement during a procedure and better study the impact of the different factors and their correlation to the procedure timeline.
  • the input data 971 may be first processed by a target detection model 972, configured to detect the target in image(s).
  • the target detection model may include any type of algorithm(s), including, for example, image analysis algorithm(s) and/or ML/DL algorithm(s), allowing the detection of a target in an image or a set of images (for example, a video stream).
  • data from previous procedures may be used, wherein the initial detection of a target may be manual (for example, by marking of a target by a healthcare provider on an image), semi-automatic and/or automatic (for example, by suitable image processing and/or data-analysis algorithms).
  • a combination of manual and automatic target detection may be utilized.
  • initial manual marking is performed by the user (e.g., a healthcare provider) and the identification/detection/recognition of the target in subsequent (following) images may be performed automatically utilizing a suitable algorithm.
  • the user may be required to confirm the automatically detected target.
  • the target detection model may optionally include re-identification of the target to increase accuracy of the model.
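Purely as an illustration of the manual-initialization-plus-automatic-tracking flow above (this disclosure does not prescribe template matching), an OpenCV-based sketch in which a low matching score flags the detection for user confirmation; it assumes grayscale images of identical dtype and a marked point at least `half` pixels from the image border:

```python
# Illustrative template-matching tracker seeded by a manual target marking.
import cv2

def track_target(prev_image, marked_xy, next_image, half=16, confirm_below=0.7):
    x, y = marked_xy
    # Template cut around the user's initial manual marking.
    template = prev_image[y - half:y + half, x - half:x + half]
    # Re-detect the target automatically in the subsequent image.
    result = cv2.matchTemplate(next_image, template, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(result)
    center = (top_left[0] + half, top_left[1] + half)
    needs_confirmation = score < confirm_below  # prompt the user if unsure
    return center, score, needs_confirmation
```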
  • the results of the target detection model may then be used as input to the target movement model 974.
  • the results of the target detection model may also be used as input to a trained tissue movement model 973, the results of which are then used as input to the target movement model 974.
  • the trained tissue movement model 973 may be the tissue movement model described in FIG. 10 hereinabove.
  • Utilization of the tissue movement model in the training of the target movement model may increase accuracy, as it allows accurate target tracking, while minimizing artifacts which may be caused by movement of tissue (for example, neighboring tissues) that may otherwise be erroneously interpreted or considered as target movement.
  • the output of the thus generated target movement model 974 is a target movement prediction 975.
  • the target movement prediction may include, for example, prediction of the location (e.g., spatiotemporal location) of the target at different stages of the procedure (including, for example, at the end-point of the procedure), and prediction of related changes in the target during the procedure (for example, changes in shape, size, structure, orientation, etc.).
  • the target movement prediction together with ground-truth annotations regarding target movement during a procedure, may be used to calculate a loss function representing the error between the target movement prediction and the ground-truth data. During the training process, optimization of this loss function will allow the adjustment of the model’s weights.
  • the target movement prediction model may be trained in a multi-task and/or multi-output approach.
  • the target movement prediction model may be trained to predict the exact movement/location (or any other related changes, as detailed above) of the target at each point in time during the procedure. This may require corresponding time-based annotations of target related parameters (for example, location, size, shape, form, etc.) at desired points in time throughout the procedures in the dataset.
  • the target movement model may be used in the training or application of various other models, such as, for example, a trajectory prediction model, and the like.
  • the target movement model may be deployed and used to estimate target movement during a medical procedure, such as steering of medical instrument toward a moving target.
  • the prediction of the target movement model may be for a given instrument trajectory, patient position (e.g., supine, prone, etc.) and/or patient respiration behavior.
  • FIG. 12 shows a block diagram 100 illustrating an exemplary method of generating (training) a data-analysis (e.g., AI-based) model for outputting a trajectory for steering a medical instrument toward a moving target.
  • input data 1001 is used to train the trajectory model 1005 to output a trajectory prediction 1006.
  • Various types of input data 1001 may be used for training the trajectory model.
  • the input data 1001 may be used directly to train the trajectory model 1005 and may alternatively or additionally be used as input to one or more trained models, such as, tissue movement model 1002, target movement model 1003 and/or "no-fly" zones model 1004.
  • the input data 1001 may include any type of datasets and/or parameters from previous procedures which are relevant for the prediction of the trajectory, and the target variable (“ground truth”) for training the trajectory model may be how the trajectory was actually adjusted/updated during these past procedures.
  • data annotations included in the data used for training and/or validating the trajectory model may include trajectories which were not updated, or were not correctly updated, when tissue/target movement occurred in previous procedures, such that the medical instrument did not reach the target. Such data annotations may also be artificially generated for the purpose of training the trajectory model.
  • the input data may include, for example, data related to clinical procedure and patient related data, such as, target type, target dimensions, target depth, target shape, tissue characteristics (e.g., types, boundaries, dimensions, elasticity, density, etc.), medical instrument type and characteristics (e.g., gauge, length, material, tip type (e.g., diamond, bevel, etc.)), maximal allowable curvature, maximal allowable lateral movement of the instrument at the entry point, respiration signals, respiration abnormalities, patient characteristics (age, gender, race, BMI, etc.), data related to the medical device and its operation, including, for example, motors’ current traces, procedure timing, skin to target time, entry and target positions, trajectory length, target movements, number and position of checkpoints, errors and correction of checkpoints, images (e.g., CT scans) generated during the procedure (e.g., at checkpoints and/or at milestone points (“secondary targets”)), magnitude of lateral steering of the medical instrument, medical device position, relative angles of the medical instrument, distance of the instrument from the target, instrument insertion speed, patient’s position, and the like, or any combination thereof.
  • the input data 1001 may include multi-modal data collected from past procedures and arranged, where possible/applicable, as time-series together with the patient's parameters and medical history.
  • the time-series structure may allow the analysis of time-dependency events in past procedures’ data to better predict trajectory adjustments during a procedure and better study the impact of the different factors and their correlation to the procedure timeline.
  • the relevant input data 1001 is used (directly or indirectly) for the training of the trajectory model, which provides a prediction regarding a trajectory of a medical instrument during a medical procedure, taking into account various variables and parameters, to facilitate the medical instrument reaching a moving target in the most efficient, safe and accurate manner.
  • the trajectory model may be trained based on the input data directly and/or based on output of one or more trained models, such as, for example: tissue movement model 1002 (such as the model described in FIG. 10 above), target movement model 1003 (such as the model described in FIG. 11 above) and/or "no-fly" zones model, which is a model for predicting regions which are to be avoided during the medical procedure.
  • the trajectory may be any type of trajectory, such as, 2D trajectory or 3D trajectory.
  • the generated trajectory prediction, together with ground-truth annotations regarding trajectory adjustments during a procedure, may be used to calculate a loss function representing the error between the trajectory prediction and the ground-truth data. During the training process, optimization of this loss function will allow the adjustment of the model’s weights.
  • the trajectory model may be trained in a multi-task and/or multi-output approach. In some embodiments, the trajectory model may be trained to predict the exact trajectory adjustments required at each point in time during the procedure.
  • the output of the trajectory model facilitates the spatio-temporal reaching of a target by a medical instrument steered in accordance with the output trajectory, in medical procedures which require steering of a medical instrument toward a moving target (e.g., biopsy, ablation, fluid delivery, fluid drainage, etc.).
  • FIG. 13 shows a block diagram 110 illustrating an exemplary method of generating (training) a data-analysis (e.g., AI-based) model for outputting a trajectory in an image-guided procedure of inserting and steering a medical instrument toward a moving internal target, to facilitate the medical instrument accurately reaching the internal target, according to some embodiments.
  • based on real-time images (e.g., scans) obtained during the procedure, the trajectory can be updated as detailed herein.
  • the generated (updated) trajectory may affect several parameters, such as the accuracy of the procedure (e.g., the tip-to-target distance), the tissue and/or target movement resulting from the interaction with the medical instrument as it follows the trajectory toward the target, the duration of the steering phase of the procedure, the risk level of the procedure (e.g., probability of complications), etc. Therefore, in some embodiments, the trajectory model should also take one or more of these parameters into account.
  • the training process of the trajectory model may include one or more phases of training, including, for example, training the basic trajectory model to output a trajectory prediction (as described, for example, in FIG. 12 above), and training one or more individual models, such as an accuracy model, an interaction model, a duration model and/or a risk model, as described below.
  • the target variable (“ground truth”) for training the accuracy model is the actual procedure accuracy (e.g., instrument tip-to-target accuracy).
  • the target variable for training the interaction model is the actual movement of the target and/or tissue due to interaction (direct or indirect) with the medical instrument.
  • the target variable for training the duration model is the actual duration of the steering phase of the procedure.
  • the target variable for training the risk model is the occurrence of complications during the procedure. It can be appreciated that for each individual model the target variable is not included in the input variables used for the training process of the individual model.
  • training the basic trajectory model includes training the model to predict the trajectory as similar as possible to the ground truth trajectory (i.e., with minimal error from the actual successful trajectory in previous procedures).
  • the trajectory model is trained to output an optimized trajectory, which allows reaching the spatio-temporal location of the target in the most accurate, safe, fast, efficient, and/or reliable manner.
  • such optimal trajectory may result in the maximal possible tip-to-target accuracy, minimal movement of tissue and/or target due to insertion of the medical instrument through the tissue, minimal steering phase duration and/or minimal risk for clinical complications during instrument steering.
  • such optimal trajectory may have a minimal trajectory prediction error.
  • such training may be executed using a loss function, e.g., a Multi-Loss scheme.
  • such training may be executed using Ensemble Learning methods.
  • such training may be executed using a Multi-Output regression/classification approach.
  • Multi-Task learning may be used.
  • in the example of FIG. 13, which illustrates training executed using a Multi-Loss scheme, input data 1102, such as the data described above, is used to train the trajectory model 1104 to predict the trajectory 1106.
  • tissue movement model 1130, target movement model 1132 and/or "no-fly" zones model 1134 may also be used in the training of the trajectory model.
  • the individual models’ predictions (1116, 1118, 1120 and/or 1122), together with the trajectory model’s prediction (or respective scores), are then used to calculate a loss function 1124, aimed to minimize the trajectory prediction error, maximize the tip-to-target accuracy, minimize the interaction resultant movement, minimize the procedure duration and minimize the risk.
  • the generated weighted loss represents the model’s prediction error, which may be used to fine-tune or adjust the trajectory model’s 1104 weights as part of the training process.
  • only one or more of the individual models described above are used in the training process of the trajectory model.
  • only the accuracy and duration models may be used, whereas in other embodiments only the accuracy and interaction movement models may be used.
  • the weights/coefficients used in the Multi-Loss function 1124 may be adjusted according to certain needs and/or preferences. For example, if minimal interaction movement and/or minimal duration have a higher priority than minimized risk or tip-to-target accuracy, the interaction movement and duration may be given higher coefficients during the training process, such that they will have a greater impact on the generated trajectory.
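A minimal sketch of such a weighted Multi-Loss combination follows; the coefficient values are hypothetical and merely express the priority example above (interaction movement and duration weighted higher than risk and tip-to-target accuracy):

```python
# Sketch of a weighted Multi-Loss combination with hypothetical coefficients.
def multi_loss(losses, coefficients):
    """Weighted sum of the individual models' loss terms."""
    return sum(coefficients[name] * value for name, value in losses.items())

coefficients = {"trajectory_error": 1.0, "tip_to_target": 0.5,
                "interaction": 2.0, "duration": 2.0, "risk": 0.5}
# total = multi_loss({"trajectory_error": e, "tip_to_target": a,
#                     "interaction": m, "duration": d, "risk": r}, coefficients)
# total.backward()  # fine-tunes/adjusts the trajectory model's weights
```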
  • different trajectory models may be trained for different needs and/or preferences.
  • one trajectory model may be trained to generate a trajectory that will allow the highest achievable tip-to- target accuracy
  • another trajectory model may be trained to generate a trajectory that will result in the lowest achievable interaction movement
  • a further trajectory model may be trained to generate a trajectory that will result in the shortest achievable procedure duration, etc.
  • a single trajectory model may be trained and deployed, and the coefficients used in the Multi-Loss function 1124 may be adjusted during inference, i.e., during use of the trajectory model to generate a trajectory for (or during) a specific procedure.
  • the need/preference upon which the coefficients may be fine-tuned may be associated with, for example, a specific procedure type (e.g., biopsy, ablation, fluid drainage, etc.), a specific target type, a specific user, a specific population, and the like.
  • the systems and methods disclosed herein may allow automatic or semi-automatic steering corrections, for example, if the tip of the medical instrument deviates and/or if the tissue or the target does not move as predicted.
  • the systems and methods disclosed herein may allow automatic steering through different tissue types and crossing tissue layer boundaries.
  • the insertion speed may be automatically or semi-automatically adjusted (e.g., decreased) prior to the instrument crossing a boundary between tissue layers, especially when there is a risk of clinical complications, such as pneumothorax when crossing the pleura.
  • the maximal curvature of the medical instrument may be continuously verified against a pre-set threshold, so as to ensure safety and accuracy.
  • data obtained from various sensors may be utilized with the methods disclosed herein, including, for example, force sensor, respiration sensor, imaging unit sensor, camera, and the like.
  • the trajectory for the medical instrument may be pre- operatively calculated using a trajectory model, i.e., taking into account estimated tissue and target movements during the procedure.
  • steering of the medical instrument may be carried out automatically and continuously from the insertion point to the target, with the steering being paused only if indication of risk to the patient or of substantial deviation from the planned trajectory, etc., is received.
  • indications may be generated using sensors disposed on the medical instrument, on the medical device (e.g., a force sensor) or in the procedure room (for example, on the examining bed or a portable device), an external camera, or the like.
  • respiration monitoring and/or prediction may be utilized for the trajectory calculation and/or for the timing of insertion of the medical instrument, to increase safety, accuracy and reliability of the procedures.
  • respiration synchronization may be performed, either manually (for example, by instructing the subject to hold breath), automatically or semi-automatically, for example, by determining the respiration cycle and synchronizing insertion and/or steering steps therewith.
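As an assumed illustration of such synchronization (signal conventions are hypothetical, not taken from this disclosure), a respiration-sensor signal can gate insertion/steering steps to a chosen phase window of the breathing cycle:

```python
# Illustrative respiration gating from a 1D respiration-sensor signal.
import numpy as np

def respiratory_phase(signal):
    """Crude phase estimate in [0, 1) based on peak-to-peak timing of the
    latest breathing cycle in the signal (peaks = local maxima)."""
    sig = np.asarray(signal, dtype=float)
    peaks = np.flatnonzero((sig[1:-1] > sig[:-2]) & (sig[1:-1] >= sig[2:])) + 1
    if len(peaks) < 2:
        return 0.0                       # not enough cycles observed yet
    period = peaks[-1] - peaks[-2]       # samples per breath
    return ((len(sig) - 1 - peaks[-1]) % period) / period

def insertion_allowed(signal, window=(0.4, 0.6)):
    """Permit an insertion/steering step only within a phase window of the
    respiratory cycle (e.g., near end-exhalation)."""
    return window[0] <= respiratory_phase(signal) <= window[1]
```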
  • one or more of the models disclosed herein, such as the tissue movement model, the target movement model and/or the trajectory model, may take into account the respiration cycle.
  • tissue and/or target movement due to respiration movement may be ignored.
  • FIGS. 14A-14D demonstrate real-time updating of a trajectory and steering a medical instrument according thereto, based on predicted movement of target, according to some embodiments.
  • the exemplary planned and updated trajectories presented may be calculated using a processor executing the models and methods disclosed herein, such as the processor(s) of the insertion system described in FIG. 1B, and the insertion and steering of the medical instrument toward the predicted target location according to the planned and updated trajectories may be executed using an automated insertion device, such as the automated device of FIG. 1A.
  • the automated device may be body-mountable, for example by attachment to a subject’s body using an attachment apparatus, such as the attachment apparatus described in abovementioned co-owned International Patent Application Publication No. WO2019/234748.
  • in FIGS. 14A-14D, the automated insertion device is marked as automated device 150, which is body-mountable.
  • FIGS. 14A-14D are shown on CT image-views; however, it can be appreciated that the planning can be carried out similarly on images obtained from other imaging systems, such as ultrasound, MRI and the like.
  • FIG. 14A shows an automated insertion device 150 mounted on a subject’s body (a cross-section of which is shown in FIGS. 14A-14D) and a planned (initial) trajectory 160 from an entry point toward the initial position of an internal target 162.
  • the trajectory may have checkpoints marked thereon.
  • the planned trajectory 160 is a linear or substantially linear trajectory. In some embodiments, if necessitated (for example, due to obstacles), the planned trajectory may be a non-linear trajectory.
  • the planned trajectory may be updated in real-time based on the real-time position of the medical instrument (for example, the tip thereof) and/or the real-time position of the target and/or the real-time positions of obstacle/s and/or based on predictions generated by one or more machine learning models, such as those detailed herein, for example, in FIGS. 12-13.
  • the initial target location may be obtained manually and/or automatically.
  • the target position may be determined as detailed in FIG. 11 utilizing the target detection model.
  • FIG. 14B shows medical instrument 170 being inserted and steered into the subject’s body, along the planned trajectory 160. As shown in FIG. 14B, the target has moved from its initial position to a new (updated) position 162’ during, and as a result of, the advancement of the medical instrument within the tissue, as detailed herein.
  • the determination of the real-time location of the target may be performed manually by the user, i.e., the user visually identifies the target in images (obtained continuously, or initiated manually or automatically, for example when the instrument reaches a checkpoint), and marks the new target position on the image using the GUI.
  • the determination of the real-time target location may be performed automatically by a processor using image processing techniques and/or data-analysis algorithm(s), such as detailed hereinabove.
  • the trajectory may be updated based on the determined real-time position of the target.
  • the subsequent movement of the target is predicted, for example using a target movement model, as detailed hereinabove, and the trajectory is then updated based on the predicted location (e.g., the end-point location) of the target.
  • the updating of the trajectory based on the predicted location of the target may be performed automatically, by utilizing one or more of the AI models disclosed herein, including the tissue movement model, target movement model, trajectory model and any suitable sub-model (or individual model) disclosed herein.
  • recalculation of the trajectory may also be required if, for example, an obstacle is identified along the trajectory.
  • an obstacle may be an obstacle which was marked (manually or automatically) prior to the calculation of the planned trajectory but tissue movement, e.g., tissue movement resulting from the advancement of the instrument within the tissue, caused the obstacle to move such that it entered the planned path.
  • the obstacle may be a new obstacle, i.e., an obstacle which was not visible in the image (or set of images) based upon which the planned trajectory was calculated, and became visible during the insertion procedure.
  • the user may be prompted to initiate an update (recalculation) of the trajectory.
  • recalculation of the trajectory, if required, is executed automatically by the processor, and the insertion of the instrument automatically continues according to the updated trajectory.
  • recalculation of the trajectory, if required, is executed automatically by the processor; however, the user is prompted to confirm the recalculated trajectory before advancement of the instrument (e.g., to the next checkpoint or to the target) according to the updated trajectory can be resumed.
  • as shown in FIG. 14C, an updated trajectory 160' has been calculated based on the predicted end-point location of the target 162”, to facilitate the medical instrument 170 reaching the target at its end-point location.
  • while the preplanned trajectory 160 was linear, the recalculation of the trajectory (e.g., using the trajectory model) due to movement of the target resulted in the medical instrument 170, specifically the tip of the instrument, following a non-linear trajectory to accurately reach the target.
  • FIG. 14D summarizes the target movement during the procedure shown in FIGS. 14A-14C, from an initial target location 162 to an updated target location 162’ and finally to an end-point target location 162”.
  • the target movement during the procedure may be predicted by the target movement model, which may be further used (optionally with additional models, such as, tissue movement model, “no-fly” zones model) to update the trajectory utilizing the trajectory model, to thereby facilitate the medical instrument 170 reaching the target at its endpoint location in an optimal manner, as detailed herein.
  • also shown in FIG. 14D are the planned trajectory 160 and the updated trajectory 160’, which allowed the medical instrument 170 to reach the moving target, without having to remove and re-insert the instrument.
  • the target, insertion point and, optionally, obstacle/s may be marked manually by the user.
  • the processor of the insertion system (or of a separate system) may be configured to identify and mark at least one of the target, the insertion point and the obstacle/s, and the user may, optionally, be prompted to confirm or adjust the processor’s proposed markings.
  • the target and/or obstacle/s may be identified using known image processing techniques and/or data-analysis models/algorithms, based on data obtained from previous procedures.
  • the insertion point may be suggested based solely on the obtained images, or, alternatively or additionally, on data obtained from previous procedures, using data-analysis models/algorithms.
  • Implementations of the systems, devices and methods described above may further include any of the features described in the present disclosure, including any of the features described hereinabove in relation to other system, device and method implementations.
  • computer-readable storage medium having stored therein data-analysis algorithm(s), executable by one or more processors, for generating one or more models for providing recommendations, operating instructions and/or functional enhancements related to operation of automated medical devices.
  • control unit configured to receive input from a corresponding processor (processing unit), and generate control data in response thereto, for controlling operation of an automated medical device.
  • processing unit and the control unit may be physically and/or functionally associated.
  • the processing unit and the control unit may be part of the same system (for example, insertion system), or separate systems, capable of interacting therewith.
  • the embodiments described in the present disclosure may be implemented in digital electronic circuitry, or in computer software, firmware or hardware, or in combinations thereof.
  • the disclosed embodiments may be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on computer storage medium for execution by, or to control the operation of, one or more data processing apparatus.
  • the computer program instructions may be encoded on an artificially generated propagated signal, for example, a machine-generated electrical, optical or electromagnetic signal, that is generated to encode information for transmission to suitable receiver apparatus for execution by a data processing apparatus.
  • a computer storage medium can be, or be included in, a computer-readable storage device, a computer-readable storage substrate, a random or serial access memory array or device, or a combination of any one or more of the above.
  • while a computer storage medium is not a propagated signal, a computer storage medium can be a source or destination of computer program instructions encoded in an artificially generated propagated signal.
  • the computer storage medium can also be, or be included in, one or more separate physical components or media (for example, multiple CDs, disks, or other storage devices).
  • the operations described in the present disclosure can be implemented as operations performed by a data processing apparatus on data stored on one or more computer-readable storage devices or received from other sources.
  • “processors” and/or “data processing apparatus” as used herein may encompass all types of apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, a system on a chip(s), or combinations thereof.
  • the data processing apparatus can include special purpose logic circuitry, for example, an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
  • the apparatus can also include, in addition to hardware, code that creates an execution environment for the computer program in question, for example, code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a cross-platform runtime environment, a virtual machine, or combinations thereof.
  • the apparatus and execution environment can realize various different computing model infrastructures, such as web services, distributed computing and grid computing infrastructures.
  • a computer program (also referred to as a program, software, software application, script or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a standalone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment.
  • a computer program may, but need not, correspond to a file in a file system.
  • a computer program can be stored in a portion of a file that holds other programs or data, in a single file dedicated to the program in question, or in multiple coordinated files (for example, files that store one or more modules, sub-programs or portions of code).
  • a computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • the processes and logic flows described herein can be performed by one or more programmable processors, executing one or more computer programs to perform actions by operating on input data and generating output.
  • the processes and logic flows can also be performed by, and an apparatus can also be implemented as, special purpose logic circuitry, for example, an FPGA or an ASIC.
  • Processors suitable for the execution of a computer program include both general and special purpose microprocessors, and any one or more processors of any type of digital computer.
  • a processor will receive instructions and data from a read-only memory or a random access memory or both.
  • the essential elements of a computer are a processor for performing actions in accordance with instructions and one or more memory devices for storing instructions and data.
  • a computer may, optionally, also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, for example, magnetic discs, magneto-optical discs, or optical discs.
  • a computer can be embedded in another device, for example, a mobile phone, a tablet, a personal digital assistant (PDA), a game console, a Global Positioning System (GPS) receiver, or a portable storage device (for example, a USB flash drive).
  • non-volatile memory, media and memory devices suitable for storing computer program instructions and data include, by way of example: semiconductor memory devices, for example, EPROM, EEPROM, random access memories (RAMs), including SRAM, DRAM, embedded DRAM (eDRAM) and Hybrid Memory Cube (HMC), and flash memory devices; magnetic discs, for example, internal hard discs or removable discs; magneto-optical discs; read-only memories (ROMs), including CD-ROM and DVD-ROM discs; solid state drives (SSDs); and cloud-based storage.
  • the processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
  • cloud computing is generally used to describe a computing model which enables on-demand access to a shared pool of computing resources, such as computer networks, servers, software applications, and services, and which allows for rapid provisioning and release of resources with minimal management effort or service provider interaction.
  • terms such as “processing”, “computing”, “calculating”, “determining”, “estimating”, “assessing” or the like may refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulate and/or transform data, represented as physical (e.g. electronic) quantities within the computing system’s registers and/or memories, into other data similarly represented as physical quantities within the computing system’s memories, registers or other such information storage, transmission or display devices.
  • the terms “medical instrument” and “medical tool” may be used interchangeably.
  • the term "moving target" relates to a mobile target, i.e., a target that is capable of being moved within the body of the subject, independently of, or at least partially due to or during, a medical procedure.
  • the terms "automated medical device" and "automated device" may be used interchangeably.
  • the terms "image-guided insertion procedure" and "image-guided procedure" may be used interchangeably.
  • in some embodiments, the terms "model", "algorithm", "data-analysis algorithm" and "data-based algorithm" may be used interchangeably.
  • the terms "user", "doctor", "physician", "clinician", "technician", "medical personnel" and "medical staff" are used interchangeably throughout this disclosure and may refer to any person taking part in the performed medical procedure.
  • the terms "subject" and "patient" may be used interchangeably, and they may refer either to a human subject or to an animal subject.
  • the words “include” and “have”, and forms thereof, are not limited to members in a list with which the words may be associated.
  • although steps of methods according to some embodiments may be described in a specific sequence, methods of the disclosure may include some or all of the described steps carried out in a different order. The methods of the disclosure may include a few of the steps described or all of the steps described. No particular step in a disclosed method is to be considered an essential step of that method, unless explicitly specified as such.

Abstract

Provided are systems, devices and methods for generating and/or utilizing data-analysis model(s) for updating in real-time a trajectory for inserting a medical instrument toward a target in a body of a subject, based on the predicted movement of the target, and steering the instrument toward the moving target according to the real-time updated trajectory.

Description

CLOSED-LOOP STEERING OF A MEDICAL INSTRUMENT
TOWARD A MOVING TARGET
FIELD OF THE INVENTION
The present invention relates to methods, devices and systems for closed-loop steering of a medical instrument toward a moving target. More specifically, the present invention relates to real-time target tracking and steering of a medical instrument to facilitate the medical instrument reaching the target at a predicted target location within the body of a subject. Even more specifically, the present invention relates to tracking the target and predicting the end-point location of the target within the subject’s body, to facilitate the medical instrument reaching the target at its predicted end-point location, by steering the medical instrument according to a corresponding trajectory updated in real-time.
BACKGROUND
Various diagnostic and therapeutic procedures used in clinical practice involve the percutaneous insertion of medical tools, such as needles and catheters, into a subject’s body, and in many cases further involve the steering of the medical tools within the body to reach a target region. The target region can be any internal body region, including a lesion, tumor, organ or vessel. Examples of procedures requiring insertion and steering of such medical tools include vaccinations, blood/fluid sampling, regional anesthesia, tissue biopsy, catheter insertion, cryogenic ablation, electrolytic ablation, brachytherapy, neurosurgery, deep brain stimulation, various minimally invasive surgeries, and the like.
The guidance and steering of medical tools, such as needles, in soft tissue is a complicated task that requires good three-dimensional coordination, knowledge of the patient’s anatomy and a high level of experience. Image-guided automated (e.g., robotic) systems have thus been proposed for performing these functions.
Some automated insertion systems are based on manipulating robotic arms and some utilize a robotic device which can be attached to the patient’s body or positioned in close proximity thereto. These automated systems typically assist the physician in aligning the medical instrument with a selected insertion point at a desired location, and the insertion itself is carried out manually by the physician. Some automated systems further include an insertion mechanism that can insert the medical instrument toward the target, typically in a linear manner. More advanced automated systems further include non-linear steering capabilities, as described, for example, in U.S. Patent Nos. 8,348,861, 8,663,130 and 10,507,067, and in co-owned U.S. Patent No. 10,245,110, co-owned U.S. Patent Application Publication No. 2019/290,372, and co-owned International Patent Application No. PCT/IL2020/051219, all of which are incorporated herein by reference in their entireties.
However, inserting a medical instrument through soft tissue typically involves displacement and/or deformation of the tissue, including the target (e.g., lesion) to be reached by the instrument, due to the penetration of the instrument through the tissue, as well as due to the patient’s respiratory motion and other patient movements.
Thus, there is still a need in the art for automated insertion and steering devices and systems capable of reliably reaching a moving target in the most efficient, accurate and safe manner.
SUMMARY
According to some embodiments, the present disclosure is directed to systems, devices and methods for automated insertion and steering of medical instruments/tools (for example, needles) in a subject’s body for diagnostic and/or therapeutic purposes, wherein the steering of the medical instrument within the body of a subject is according to a trajectory (for example, a 2D trajectory or a 3D trajectory) for the medical instrument (for example, for the end or tip thereof) within the body of the subject, wherein the trajectory is determined/calculated, inter alia, according to a predicted spatio-temporal end-point location of the target, to thereby allow safely and accurately reaching the target by the most efficient and safe route. In further embodiments, the systems, devices and methods disclosed herein allow detecting, tracking and predicting the location of the target during and/or at the end of the procedure ("end-point"), such that at the end of the steering of the medical instrument according to a corresponding trajectory, the actual location of the medical instrument (in particular, the end thereof) coincides with the location of the target within the body, to increase the effectiveness, safety and accuracy of the medical procedure. Automatic insertion and steering of medical instruments (such as needles) within the body, and in particular utilizing a trajectory which is determined, inter alia, based on the predicted end-point location of the target, is advantageous over manual insertion of such instruments within the body. For example, by utilizing real-time steering toward the predicted location of the target, the most effective and safe spatio-temporal route of the medical instrument to the end-point location of the target within the body is achieved. Further, the use of closed-loop steering according to a trajectory which takes into account the predicted location of the target (which is determined and/or updated as disclosed herein) increases safety, as it reduces the risk of harming non-target regions and tissues within the subject's body: the trajectory may take into account obstacles or any other regions along the route and, moreover, it may take into account changes in the real-time location of such obstacles, as well as tissue movements during the procedure. Additionally, such automatic steering improves the accuracy of the procedure, which enables reaching small targets and/or targets which are located in areas of the body that are difficult to reach. This can be of particular importance in early detection of malignant neoplasms, for example. In addition, it provides increased safety for the patient, as there is a significantly lower risk of human error. Further, according to some embodiments, such a procedure can be remote-controlled (e.g., from an adjacent control room or even from outside the medical facility), which is safer for the medical personnel, as it minimizes their radiation exposure during the procedure, as well as their exposure to any infectious diseases the patient may carry. Additionally, visualization of the planned and the executed trajectory toward the predicted location of the target vastly improves the user’s ability to supervise and control the medical procedure. Since the automated device can be controlled from a remote site, even from outside of the hospital, there is no longer a need for the physician to be present in the procedure room, according to some embodiments.
In some exemplary embodiments, the automated medical devices are devices for insertion and steering of medical instruments (for example, needles, introducers or probes) in a subject’s body for various diagnostic and/or therapeutic purposes. In some embodiments, the automated insertion device may utilize real-time instrument and target position determination and real-time trajectory updating, as disclosed, for example, in abovementioned co-owned International Patent Application No. PCT/IL2020/051219. In some embodiments, the automated medical devices are configured to insert and steer/navigate a medical instrument (in particular, a tip of the medical instrument) in the body of the subject, to safely and accurately reach a target region within the subject’s body, to perform various medical procedures. In some embodiments, the operation of the medical devices may be controlled by at least one processor configured to provide instructions, in real-time, to steer the medical instrument toward the target, or more particularly, toward a predicted end-point location of the target, according to a planned and/or updated trajectory. In some embodiments, the steering may be controlled by the processor, via a suitable controller. In some embodiments the steering may be controlled in a closed-loop manner, whereby the processor generates motion commands to the steering device via a suitable controller and receives feedback regarding the real-time location of the medical instrument and the target. In some embodiments, the processor(s) may be able to predict future locations and/or the movement pattern/profile of the target. As detailed herein, AI-based algorithm(s) may be used to predict the location and/or movement pattern of the target, of a tissue, and the like. As detailed herein, in some embodiments, AI-based algorithm(s) may be used to determine the trajectory to a predicted target location. In some embodiments, as detailed herein, AI-based algorithm(s) may be used to determine an optimized trajectory to a predicted target location. In some embodiments, the automated medical device may be configured to operate in conjunction with an imaging system. In some embodiments, the imaging system may include any type of imaging system, including, but not limited to: X-ray fluoroscopy, CT, cone beam CT, CT fluoroscopy, MRI, ultrasound, or any other suitable imaging modality. In some embodiments, the processor is configured to calculate a trajectory for the medical instrument based on a target, entry point and, optionally, “no-fly” zones, which include obstacles en route (such as bones or blood vessels), which may be manually marked by the user, or automatically identified by the processor, on one or more obtained images.
According to some embodiments, as disclosed herein, the determination of the target movement profile and/or predicted location of the target (e.g., the end-point target location) may utilize various algorithms (including artificial intelligence (AI) models, such as machine learning (ML) models, deep learning (DL) models, and the like) which take into account various parameters and variables that can affect or influence the movement profile of the target and/or the end-point target location, including, for example, but not limited to: medical procedure (for example, ablation, biopsy, etc.); medical instrument (for example, type, size, gauge, etc.); tissue characteristics (for example, elasticity, location, dimensions, structure, shape, etc.); target characteristics (for example, type, dimensions, shape, location, accessibility); patient specific parameters (for example, age, gender, weight, body structure, etc.); patient related parameters (position, respiration, etc.); trajectory related parameters (“no-fly” zones, checkpoints, length, etc.); and the like, or any combinations thereof.
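By way of illustration only, the following Python sketch shows how parameters of the kinds listed above might be assembled into a numeric feature vector for such a movement-prediction model. The disclosure does not specify a feature schema; every field name and the toy categorical encoding below are hypothetical.

```python
from dataclasses import dataclass, asdict

# Toy categorical encoding; a real system might use one-hot or learned embeddings.
PROCEDURE_CODES = {"biopsy": 0.0, "ablation": 1.0, "drainage": 2.0}

@dataclass
class MovementFeatures:
    procedure_type: str        # e.g., "biopsy" or "ablation"
    instrument_gauge: float    # needle gauge
    tissue_elasticity: float   # e.g., kPa, from elastography or a lookup table
    target_diameter_mm: float
    target_depth_mm: float     # distance from entry point to target
    patient_age: float
    respiration_period_s: float
    trajectory_length_mm: float

def to_vector(f: MovementFeatures) -> list[float]:
    """Flatten one case record into a numeric vector for a movement model."""
    values = asdict(f)  # preserves the dataclass field order
    values["procedure_type"] = PROCEDURE_CODES[values["procedure_type"]]
    return [float(v) for v in values.values()]

print(to_vector(MovementFeatures("biopsy", 18, 4.2, 12.0, 85.0, 63, 4.1, 92.0)))
```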
According to some embodiments, the systems and methods disclosed herein allow the determination of an optimal interception point of the target by the medical instrument (i.e., the optimal spatio-temporal location in which the medical instrument (for example, the tip thereof) reaches the target). Thus, in some embodiments, as referred to herein, the term “location” can relate to a spatial location, a temporal location, or both, i.e., a spatio-temporal location.
According to some embodiments, the systems and methods disclosed herein may be operated automatically and/or semi-automatically (for example, with user confirmation and/or correction, if needed).
According to some embodiments, the systems and computer-implemented methods for estimating target movement and/or predicting target position, and the subsequent utilization of the prediction data to determine a suitable trajectory and steer the medical tool according to the determined trajectory toward the predicted target location, may utilize specific algorithms which may be generated using machine learning tools, deep learning tools, data wrangling tools, and, more generally, AI and data analysis tools. In some embodiments, the specific algorithms may be implemented using artificial neural network(s) (ANN), such as a convolutional neural network (CNN), recurrent neural network (RNN), long short-term memory (LSTM) network, auto-encoder (AE), generative adversarial network (GAN), reinforcement learning (RL) and the like, as further detailed below. In other embodiments, the specific algorithms may be implemented using machine learning methods, such as support vector machines (SVM), decision trees (DT), random forests (RF), boosting algorithms, linear regression, logistic regression, clustering algorithms, Bayesian methods, and the like, or any combination thereof. In some embodiments, supervised, semi-supervised and/or unsupervised methods may be implemented.
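As a non-authoritative illustration of one of the architectures named above, the following PyTorch sketch shows a minimal LSTM that maps a window of past target positions (plus a respiration-phase channel) to a predicted next displacement. The input features, dimensions and architecture are assumptions made for this example only, not the specific network of the disclosure.

```python
import torch
import torch.nn as nn

class TargetMotionLSTM(nn.Module):
    """Predicts the target's next 3D displacement from a window of past samples."""
    def __init__(self, in_dim: int = 4, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(in_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 3)   # predicted (x, y, z) displacement

    def forward(self, seq: torch.Tensor) -> torch.Tensor:
        # seq: (batch, time, in_dim), e.g. x, y, z and a respiration-phase channel
        out, _ = self.lstm(seq)
        return self.head(out[:, -1, :])    # predict from the last hidden state

model = TargetMotionLSTM()
window = torch.randn(8, 20, 4)             # 8 cases, 20 past time steps each
print(model(window).shape)                 # torch.Size([8, 3])
```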
According to some embodiments, there are provided systems for inserting and steering a medical instrument/tool within the body of a subject according to a trajectory updated in real-time based on detected and/or estimated target movement, wherein the system includes an automated insertion and steering device (for example, a robot), a processor and, optionally, a controller. In some embodiments, the insertion and steering device is configured to insert and steer/navigate a medical instrument in the body of the subject, to reach a predicted target location within the subject’s body, according to a planned/determined and/or updated trajectory of the medical instrument, wherein the trajectory may be updated in real time, based on the real-time location of the medical instrument and/or of the target and/or of the tissue and/or various other parameters, and wherein the updating of the determined trajectory is facilitated utilizing the processor, which is further configured to convey real-time steering instructions to the insertion and steering device.
In some embodiments the steering system may be configured to operate in conjunction with an imaging system. In some embodiments, the imaging system may include any type of imaging system (modality), including, but not limited to: X-ray fluoroscopy, CT, cone beam CT, CT fluoroscopy, MRI, ultrasound, or any other suitable imaging modality. In some embodiments, the processor of the system may be further configured to process and show on a display/monitor images, or image-views created from sets of images (or slices), from an imaging system (e.g., CT, MRI).
According to some embodiments, there is provided a computer-implemented method of generating a trajectory model for determining a trajectory for steering a medical instrument toward a moving target in a body of a subject in an image-guided procedure, the method includes: collecting one or more datasets, at least one of the one or more datasets being related to an automated medical device configured to steer a medical instrument toward a target in the body of a patient and/or to operation thereof; creating a training set comprising at least a portion of the one or more datasets and one or more target parameters relating to planned and/or updated and/or executed trajectories in one or more previous image-guided procedures for steering a medical instrument toward a moving target in a body of a patient; training the trajectory model to output a trajectory that will reach a moving target at a predicted location of the target using the training set; calculating a trajectory prediction error; and optimizing the trajectory model using the calculated trajectory prediction error. According to some embodiments, the one or more datasets further include one or more of: clinical procedure related dataset, patient related dataset and administrative related dataset.
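Under illustrative assumptions (trajectories encoded as fixed-length sequences of 3D control points, a simple fully connected network, and executed trajectories from previous procedures as supervision), the training flow described above might look like the following sketch. None of the tensor shapes or the TrajectoryNet architecture are specified by the disclosure.

```python
import torch
import torch.nn as nn

N_POINTS = 16  # control points per trajectory (an assumption for this sketch)

class TrajectoryNet(nn.Module):
    """Maps a case feature vector to a trajectory of 3D control points."""
    def __init__(self, in_dim: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(),
            nn.Linear(128, N_POINTS * 3),
        )

    def forward(self, x):                        # x: (batch, in_dim)
        return self.net(x).view(-1, N_POINTS, 3)

def train(model, loader, epochs=10):
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(epochs):
        for features, executed_traj in loader:   # supervision: executed
            pred = model(features)               # trajectories from past cases
            # Trajectory prediction error: mean distance between predicted
            # and executed control points.
            error = (pred - executed_traj).norm(dim=-1).mean()
            opt.zero_grad()
            error.backward()                     # optimize using the error
            opt.step()

# Synthetic demo data, for illustration only.
feats = torch.randn(64, 32)
trajs = torch.randn(64, N_POINTS, 3)
loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(feats, trajs), batch_size=16)
train(TrajectoryNet(), loader, epochs=2)
```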
According to some embodiments, the trajectory model may be generated utilizing artificial intelligence tools comprising one or more of, but not limited to: machine learning tools, data wrangling tools, deep learning tools, artificial neural network (ANN), deep neural network (DNN), convolutional neural network (CNN), recurrent neural network (RNN), long short-term memory network (LSTM), decision trees or graphs, association rule learning, support vector machines, inductive logic programming, Bayesian networks, instance-based learning, manifold learning, sub-space learning, dictionary learning, reinforcement learning (RL), generative adversarial network (GAN), clustering algorithms, or any combination thereof.
According to some embodiments, training the trajectory model may include using one or more of: a loss function, ensemble learning methods, multi-task learning, multi-output regression and multi-output classification.
According to some embodiments, the method may further include executing one or more of a tissue movement model, a target movement model and a “no-fly” zones model using at least a portion of the one or more datasets.
According to some embodiments, the method may further include executing one or more individual models using at least a portion of the one or more datasets and the trajectory generated by the trajectory model; and obtaining one or more predictions from the one or more individual models.
According to some embodiments, the method may further include calculating a loss function using the trajectory prediction error and the one or more predictions generated by the one or more individual models; and optimizing the trajectory model using the loss function.
According to some embodiments, the method may further include training the one or more individual models.
According to some embodiments, the one or more individual models may be selected from: a model for predicting an accuracy of an image-guided insertion procedure; an interaction model for predicting target and/or tissue movement resulting from an interaction between the medical instrument and the tissue and/or target; a model for predicting a duration of an image-guided insertion procedure or part thereof; a model for predicting a risk level of an image-guided insertion procedure; or any combination thereof.
According to some embodiments, calculating the loss function may include minimizing one or more of the trajectory prediction error, the predicted interaction movement, the predicted duration and the predicted risk.
According to some embodiments, calculating the loss function further includes maximizing the predicted accuracy of the image-guided insertion procedure.
According to some embodiments, the method may further include adjusting one or more coefficients of one or more terms used in the calculation of the loss function, the one or more terms being associated with at least one of the trajectory prediction error and the one or more predictions generated by the one or more individual models.
According to some embodiments, the adjusting of the one or more coefficients is executed during training of the trajectory model. According to some embodiments, the adjusting of the one or more coefficients is executed during execution of the trajectory model.
According to some embodiments, the adjusting of the one or more coefficients may be related to one or more of: a specific procedure type, a specific target type, a specific user and/or a specific population.

According to some embodiments, generating the trajectory model is executed by a training module comprising a memory and one or more processors.
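A minimal sketch of the composite loss and coefficient adjustment described above is given below. The term weights, their default values and the per-context re-weighting table are illustrative assumptions only; they show how the trajectory prediction error and the individual models' predictions could be combined with adjustable coefficients, with accuracy entering with a negative sign since it is maximized.

```python
def composite_loss(traj_error, pred_accuracy, pred_movement, pred_duration,
                   pred_risk, w_err=1.0, w_acc=0.5, w_move=0.2, w_dur=0.1,
                   w_risk=0.3):
    """Weighted multi-objective loss: lower is better. Works for plain floats
    or autograd tensors alike."""
    return (w_err * traj_error + w_move * pred_movement + w_dur * pred_duration
            + w_risk * pred_risk - w_acc * pred_accuracy)

# Hypothetical per-context re-weighting (e.g., per procedure type, target type,
# user or population); the keys and values are placeholders.
COEFFS = {"lung_biopsy": {"w_risk": 0.6}, "liver_ablation": {"w_dur": 0.05}}
print(composite_loss(2.3, 0.8, 1.1, 0.4, 0.2, **COEFFS["lung_biopsy"]))
```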
According to some embodiments, the automated medical device is configured to allow real-time updating of the trajectory in accordance with predicted target movement and to steer the medical instrument toward the target according to the updated trajectory.

According to some embodiments, there is provided a system for generating a trajectory model for determining a trajectory for steering a medical instrument toward a moving target in image-guided procedures, the system includes: a training module comprising: a memory configured to store one or more datasets; and one or more processors configured to execute the method for generating a trajectory model as disclosed herein.
According to some embodiments, the training module may be located on a remote server, an “on premise” (local) server or a computer associated with the automated medical device.
According to some embodiments, the remote server is a cloud server.
According to some embodiments, there is provided a method of closed-loop steering of a medical instrument toward a moving target within a body of a subject, the method includes: calculating a planned trajectory for the medical instrument from an entry point to an initial target location in the body of the subject; steering the medical instrument toward the initial target location according to the planned trajectory; determining the real-time location of the target and the medical instrument; predicting movement of the target; updating the trajectory based on the predicted movement of the target, such that the medical instrument will reach the target at a predicted location of the target; and steering the medical instrument toward the predicted location of the target according to the updated trajectory.
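A minimal control-loop sketch of this method is shown below. Here plan(), localize(), predict_target() and send_motion_command() are hypothetical stand-ins for the trajectory model, the image-based localization, the target-movement model and the device controller; they are not interfaces from the disclosure, and the demo stubs are toy placeholders.

```python
import numpy as np

def closed_loop_steer(entry, target0, plan, localize, predict_target,
                      send_motion_command, reach_tol=1.0, max_steps=500):
    trajectory = plan(entry, target0)              # 1. planned trajectory
    for _ in range(max_steps):
        send_motion_command(trajectory)            # 2./6. steer along it
        tip, target = localize()                   # 3. real-time locations
        if np.linalg.norm(tip - target) < reach_tol:
            return True                            # target reached
        predicted = predict_target(target)         # 4. predicted location
        trajectory = plan(tip, predicted)          # 5. real-time update
    return False

# Toy demo: each cycle advances the tip one step along the current trajectory;
# predict_target is a trivial identity model here.
state = {"tip": np.zeros(3), "target": np.array([0.0, 10.0, 80.0])}
reached = closed_loop_steer(
    entry=np.zeros(3), target0=state["target"],
    plan=lambda a, b: np.linspace(a, b, 10),
    localize=lambda: (state["tip"], state["target"]),
    predict_target=lambda t: t,
    send_motion_command=lambda traj: state.update(tip=traj[1]),
)
print("reached:", reached)
```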
According to some embodiments, predicting the movement of the moving target may be executed using a dynamic trajectory model.
According to some embodiments, the dynamic trajectory model may further comprise predicting a movement of a tissue of the body and/or predicting a movement of the tip of the medical instrument.
According to some embodiments, updating the trajectory is executed using a dynamic trajectory model.
According to some embodiments, calculating the planned trajectory is executed using a trajectory model.
According to some embodiments, the steering of the medical instrument toward the target is executed utilizing an automated medical device.

According to some embodiments, the planned trajectory and/or the updated trajectory are a 2D trajectory or a 3D trajectory.
According to some embodiments, the method for steering a medical instrument may further include obtaining one or more images of a region of interest within the body of the subject by means of an imaging system, selected from: a CT system, an X-ray fluoroscopy system, an MRI system, an ultrasonic system, a cone-beam CT system, a CT fluoroscopy system, an optical imaging system and an electromagnetic imaging system.
According to some embodiments, there is provided a system for steering a medical instrument toward a moving target in a body of a subject, the system comprising: an automated device configured for steering the medical instrument toward a moving target, the automated device comprising one or more actuators and a control head configured for coupling the medical instrument thereto; and a processor configured for executing the method of steering a medical instrument, as disclosed herein.
According to some embodiments, the system may further include a controller configured to control the operation of the device.
According to some embodiments, there is provided a control device configured for steering a medical instrument toward a moving target in a body of a subject, said control device being configured to receive input from a processor (processing unit) configured for executing a method of steering a medical instrument as disclosed herein, and to generate control data in response thereto, for controlling operation of the automated medical device.
Certain embodiments of the present disclosure may include some, all, or none of the above advantages. One or more other technical advantages may be readily apparent to those skilled in the art from the figures, descriptions, and claims included herein. Moreover, while specific advantages have been enumerated above, various embodiments may include all, some, or none of the enumerated advantages.
BRIEF DESCRIPTION OF THE DRAWINGS
Some exemplary implementations of the methods and systems of the present disclosure are described with reference to the accompanying drawings. In the drawings, like reference numbers indicate identical or substantially similar elements.
FIGS. 1A-1B show perspective views of an exemplary device (FIG. 1A) and an exemplary console (FIG. 1B) of a system for inserting a medical instrument toward an internal target, according to some embodiments;
FIG. 2 shows an exemplary trajectory for a medical instrument to reach an internal target within the body of the subject, according to some embodiments;
FIGS. 3A-3D show planning of an exemplary trajectory for inserting and steering a medical instrument toward a target, on CT images, according to some embodiments;
FIG. 4 shows a flow chart of steps in a method for planning and real-time updating a trajectory of a medical instrument, according to some embodiments;
FIG. 5 shows a diagram of a method of generating, deploying and using a data-analysis algorithm, according to some embodiments;
FIGS. 6A-6B show an exemplary training module (FIG. 6A) and an exemplary training process (FIG. 6B) for training a data-analysis algorithm, according to some embodiments;
FIGS. 7A-7B show an exemplary inference module (FIG. 7A) and an exemplary inference process (FIG. 7B) for utilizing a data-analysis algorithm, according to some embodiments;
FIG. 8 shows a flowchart illustrating steps of a method of closed-loop steering of a medical instrument toward a moving target, according to some embodiments;
FIG. 9 shows a flowchart illustrating steps of a method of closed-loop steering of a medical instrument toward a moving target utilizing a dynamic trajectory model, according to some embodiments;
FIG. 10 shows a block diagram illustrating an exemplary method of generating (training) a tissue movement model for prediction of tissue movement during a procedure;
FIG. 11 shows a block diagram illustrating an exemplary method of generating (training) a target movement model for prediction of target movement during a procedure;
FIG. 12 shows a block diagram illustrating an exemplary method of generating (training) a trajectory model for determining a trajectory for steering a medical instrument toward a moving target during a medical procedure;
FIG. 13 shows a block diagram illustrating another exemplary method of generating (training) a trajectory model for determining a trajectory for steering a medical instrument toward a moving target, according to some embodiments; and
FIGS. 14A-14D demonstrate real-time updating of a planned trajectory and steering a medical instrument according thereto, based on predicted movement of the target, according to some embodiments.
DETAILED DESCRIPTION
The principles, uses and implementations of the teachings herein may be better understood with reference to the accompanying description and figures. Upon perusal of the description and figures presented herein, one skilled in the art will be able to implement the teachings herein without undue effort or experimentation. In the figures, same reference numerals refer to same parts throughout.
In the following description, various aspects of the invention will be described. For the purpose of explanation, specific details are set forth in order to provide a thorough understanding of the invention. However, it will also be apparent to one skilled in the art that the invention may be practiced without specific details being presented herein. Furthermore, well-known features may be omitted or simplified in order not to obscure the invention.
According to some embodiments, there are provided systems, devices and methods for insertion and steering of a medical instrument in a subject’s body wherein the steering of the medical instrument within the body of a subject is based on a trajectory for the medical instrument (in particular, the end or tip thereof), within the body of the subject, wherein the trajectory is determined according to a predicted location of the target, to facilitate the safe and accurate reaching of the tip to the internal target region within the subject’s body, by the most efficient and safe route. In further embodiments, there are provided systems, devices and methods allowing the prediction of the end-point location of the target, to allow the safe reaching of the tip of the medical instrument to the moving target, to increase effectiveness, safety and accuracy of various related medical procedures.
In some embodiments, a medical device for inserting and steering a medical instrument into (and within) a body of a subject may include any suitable automated device. The automated steering device may include any type of suitable steering mechanism controlling the movement of an end effector (control head) at any one of desired movement angles or axes. In some embodiments, the automated inserting and steering device may have at least three degrees of freedom, at least four degrees of freedom, or at least five degrees of freedom (DOF).
Reference is now made to FIG. 1A, which shows an exemplary automated medical device for inserting and steering a medical instrument in a body of a subject, according to some embodiments. As shown in FIG. 1A, the device 20 may include a housing (also referred to as “cover”) 21 accommodating therein at least a portion of a steering mechanism. The steering mechanism may include at least one moveable platform (not shown) and at least two moveable arms 26A and 26B, configured to allow or control movement of an end effector (also referred to as “control head”) 24, at any one of desired movement angles or axes, as disclosed, for example, in abovementioned U.S. Patent Application Publication No. 2019/290,372. The moveable arms 26A and 26B may be configured as piston mechanisms. To the end 28 of control head 24, a suitable medical instrument (not shown) may be connected, either directly or by means of a suitable insertion module, such as the insertion module disclosed in co-owned U.S. Patent Application Publication No. 2017/258,489, which is incorporated herein by reference in its entirety. The medical instrument may be any suitable instrument capable of being inserted and steered within the body of the subject, to reach a designated target, wherein the control of the operation and movement of the medical instrument is effected by the control head 24. The control head 24 may include a driving mechanism (also referred to as “insertion mechanism”), or at least a portion thereof, which is configured to advance the medical instrument toward the target in the patient’s body. The control head 24 may be controlled by a suitable control system, as detailed herein.
According to some embodiments, the medical instrument may be selected from, but not limited to: a needle, probe (e.g., an ablation probe), port, introducer, catheter (such as a drainage needle catheter), cannula, surgical tool, fluid delivery tool, or any other suitable insertable tool configured to be inserted into a subject’s body for diagnostic and/or therapeutic purposes. In some embodiments, the medical tool includes a tip at the distal end thereof (i.e., the end which is inserted into the subject’s body). The tool tip may be a diamond tip, a bevel tip, a conical tip, etc.
In some embodiments, the device 20 may have a plurality of degrees of freedom (DOF) in operating and controlling the movement of the medical instrument along one or more axes. For example, the device may have up to six degrees of freedom. For example, the device may have at least five degrees of freedom. For example, the device may have five degrees of freedom, including two linear translation DOF (in a first axis), a longitudinal linear translation DOF (in a second axis substantially perpendicular to the first axis) and two rotational DOF. For example, the device may have forward-backward and left-right linear translations facilitated by two moveable platforms, front-back and left-right rotations facilitated by two moveable arms (e.g., piston mechanisms), and longitudinal translation toward the subject’s body facilitated by the insertion mechanism. In some embodiments, the control system (i.e., processor and/or controller) may be capable of controlling the steering mechanism (including the moveable platforms and the moveable arms) and the insertion mechanism simultaneously, thus enabling non-linear steering of the medical instrument, i.e., enabling the medical instrument to reach the target by following a non-linear trajectory. In some embodiments, the device may have six degrees of freedom, including the five degrees of freedom described above and, in addition, rotation of the medical instrument about its longitudinal axis. In some embodiments, rotation of the medical instrument about its longitudinal axis may be facilitated by a designated rotation mechanism. In some embodiments, the control system (i.e., processor and/or controller) may be capable of controlling the steering mechanism, the insertion mechanism and the rotation mechanism simultaneously.
In some embodiments, the device 20 may further include a base 23, which allows positioning of the device 20 on or in close proximity to the subject’s body. In some embodiments, the device 20 may be configured for attachment to the subject’s body either directly or via a suitable mounting surface, such as the mounting base disclosed in co-owned U.S. Patent Application Publication No. 2019/125,397, or the attachment apparatus disclosed in co-owned International Patent Application Publication No. WO 2019/234,748, both of which are incorporated herein by reference in their entireties. Attachment of the device 20 to the mounting surface may be carried out using dedicated latches, such as latches 27A and 27B. In some embodiments, the device may be couplable to a dedicated arm or base which is secured to the patient’s bed, to a cart positioned adjacent the patient’s bed or to an imaging device (if used), and held on the subject’s body or in close proximity thereto, as described, for example, in abovementioned U.S. Patent No. 10,507,067 and in U.S. Patent No. 10,639,107, which is incorporated herein by reference in its entirety.
In some embodiments, the device may include electronic components and motors (not shown) allowing the controlled operation of the device 20 in inserting and steering the medical instrument. In some exemplary embodiments, the device may include one or more Printed Circuit Boards (PCBs) (not shown) and electrical cables/wires (not shown) to provide electrical connection between a controller (not shown) and the motors of the device and other electronic components thereof. In some embodiments, the controller may be embedded, at least in part, within device 20. In some embodiments, the controller may be a separate component. In some embodiments, the device 20 may include a power supply (e.g., one or more batteries) (not shown). In some embodiments, the device 20 may be configured to communicate wirelessly with the controller and/or processor. In some embodiments, device 20 may include one or more sensors, such as a force sensor and/or an acceleration sensor (not shown). Use of sensor/s for sensing parameters associated with the interaction between a medical instrument and a bodily tissue, e.g., a force sensor, and utilizing the sensor data for monitoring and/or guiding the insertion of the instrument and/or for initiating imaging, is described, for example, in co-owned U.S. Patent Application Publication No. 2018/250,078, which is incorporated herein by reference in its entirety.
In some embodiments, the housing 21 is configured to cover and protect, at least partially, the mechanical and/or electronic components of device 20 from being damaged or otherwise compromised. In some embodiments, the housing 21 may include at least one adjustable cover, and it may be configured to protect the device 20 from being soiled by dirt, as well as by blood and/or other bodily fluids, thus preventing/minimizing the risk of cross-contamination between patients, as disclosed, for example, in co-owned International Patent Application No. PCT/IL2020/051220, which is incorporated herein by reference in its entirety. In some embodiments, the device may further include registration elements disposed at specific locations on the device 20, such as registration elements 29A and 29B, for registration of the device 20 to an image space, in image-guided procedures. In some embodiments, registration elements may be disposed on the mounting surface to which device 20 may be coupled, either instead of or in addition to registration elements 29A-B disposed on device 20. In some embodiments, registration of the device 20 to the image space may be carried out via image processing of one or more components of the device 20, such as the control head 24, and/or of the mounting surface (or at least a portion thereof), which are visible in generated images. In some embodiments, the device may include a CCD/CMOS camera mounted on the device (e.g., the device’s frame), the mounting surface and/or as a separate apparatus, allowing the collection of visual images and/or videos of the patient’s body during a medical procedure.
In some embodiments, the medical instrument is configured to be removably coupleable to the device 20, such that the device can be used repeatedly with new medical instruments. In some embodiments, the medical instruments are disposable. In some embodiments, the medical instruments are reusable.
In some embodiments, device 20 is part of a system for inserting and steering a medical instrument in a subject’s body based on a preplanned and, optionally, real-time updated trajectory, as disclosed, for example, in abovementioned co-owned International Application No. PCT/IL2020/051219. In some embodiments, the system may include the steering and insertion device 20, as disclosed herein, and a control unit (also referred to as “workstation” or “console”) configured to allow control of the operating parameters of device 20. In some embodiments, the user may operate the device 20 using a pedal or an activation button. In some embodiments, the system may include a remote control unit, which may enable the user to activate the device 20 from a remote location, such as the control room adjacent the procedure room (e.g., CT suite), a different location at the medical facility or even a location outside the medical facility. In some embodiments, the user may operate the device using voice commands.
Reference is now made to FIG. 1B, which shows an exemplary console/workstation 25 of an insertion system, according to some embodiments. The workstation 25 may include a display 10 and a user interface (not shown). In some embodiments, the user interface may be in the form of buttons, switches, keys, keyboard, computer mouse, joystick, touch-sensitive screen, and the like. The monitor and user interface may be two separate components, or they may form together a single component (e.g., in the form of a touch screen). The workstation 25 may include one or more suitable processors (for example, in the form of a PC) and one or more suitable controllers, configured to physically and/or functionally interact with device 20, to determine and control the operation thereof. The one or more processors may be implemented in the form of a computer (such as a workstation, a server, a PC, a laptop, a tablet, a smartphone or any other processor-based device). In some embodiments, the workstation 25 may be portable (e.g., by having wheels 12 or being placed on a movable platform).
In some embodiments, the one or more processors may be configured to perform, for example, one or more of: determine the location of the target; determine the predicted location of the target during and/or at the end of the procedure (end-point), determine (plan) a trajectory for the medical instrument to reach the target (for example, at the predicted location of the target); update the trajectory in real-time, for example due to movement of the target from its initial identified position as a result of the advancement of the medical instrument within the patient’s body, respiration motion and/or patient movements; present the planned and/or updated trajectory on the monitor 10; control the movement (insertion/steering) of the medical instrument based on the planned and/or updated trajectory by providing executable instructions (directly or via the one or more controllers) to the device; determine the actual location of the medical instrument (e.g., the tip thereof) using image processing and/or by performing required compensation calculations; receive, process and visualize on the monitor images or image-views created from a set of images (between which the user may be able to scroll), operating parameters and the like; or any combination thereof.
According to some embodiments, the planned trajectory of the medical instrument (in particular, the tip thereof) may be calculated based on a predicted location of the target within the subject’s body and optionally, inter alia, based on one or more inputs from the user, such as the entry point and areas to avoid en route (obstacles or “no-fly” zones), which the user marks on at least one of the obtained images. In some embodiments, the processor may be further configured to identify the target, the actual location of the target, the predicted location of the target, the obstacles and/or the insertion/entry point. In some embodiments, data-analysis algorithms, e.g., AI-based models, may be used by the processor to perform such identifications/calculations.
In some embodiments, the use of AI-based models (e.g., machine-learning and/or deep-learning based models) requires a “training” stage in which collected data is used to create (train) models. The generated (trained) models may later be used for “inference” to obtain specific insights, predictions and/or recommendations when applied to new data during the clinical procedure or at any later time.
In some embodiments, the insertion system and the system creating (training) the algorithms/models may be separate systems (i.e., each of the systems includes a different set of processors, memory modules, etc.). In some embodiments, the insertion system and the system creating the algorithms/models may be the same system. In some embodiments, the insertion system and the system creating the algorithms/models may share one or more resources (such as, processors, memory modules, GUI, and the like). In some embodiments, the insertion system and the system creating the algorithms/models may be physically and/or functionally associated. Each possibility is a separate embodiment.
In some embodiments, the insertion system and the system utilizing the algorithms/models for inference may be separate systems (i.e., each of the systems includes a different set of processors, memory modules, etc.). In some embodiments, the insertion system and the system utilizing the algorithms/models for inference may be the same system. In some embodiments, the insertion system and the system utilizing the algorithms/models for inference may share one or more resources (such as, processors, memory modules, GUI, and the like). In some embodiments, the insertion system and the system utilizing the algorithms/models for inference may be physically and/or functionally associated. Each possibility is a separate embodiment.
In some embodiments, the device may be configured to operate in conjunction with an imaging system, including, but not limited to: X-Ray, CT, cone beam CT, CT fluoroscopy, MRI, ultrasound, or any other suitable imaging modality. In some embodiments, the steering of the medical instrument based on a planned and, optionally, real-time updated 2D or 3D trajectory of the tip of the medical instrument, may be image-guided.
According to some embodiments, during the operation of the automated medical device, various types of data may be generated, accumulated and/or collected, for further use and/or manipulation, as detailed below. In some embodiments, the data may be divided into various types/sets of data, including, for example, data related to operating parameters of the device, data related to clinical procedures, data related to the treated patient, data related to administrative information, and the like, or any combination thereof.
In some embodiments, such collected datasets may be collected from one or more (i.e., a plurality) of automated medical devices, operating under various circumstances (for example, different procedures, different medical instruments, different patients, different locations and operating staff, etc.), to thereby generate a large database ("big data") that can be used, utilizing suitable data-analysis tools and/or AI-based tools, to ultimately generate models or algorithms that allow performance enhancements, automatic control, or affecting control (i.e., by providing recommendations) of the medical devices. Thus, by generating such advantageous and specialized models or algorithms, enhanced control and/or operation of the medical device may be achieved.
Reference is now made to FIG. 2, which schematically shows a trajectory planned using one or more processors, such as the processor(s) of the insertion system described in FIG. 1B, for delivering a medical instrument to an internal target within the body of the subject, using an automated medical device, such as the automated device of FIG. 1A. In some embodiments, the planned trajectory may be linear or substantially linear. In some embodiments, and as shown in FIG. 2, the trajectory may be a non-linear trajectory having any suitable/acceptable degree of curvature.
In some embodiments, the one or more processors may calculate a planned trajectory for the medical instrument to reach the target. The planning of the trajectory and the controlled steering of the instrument according to the planned trajectory may be based on a model of the medical instrument as a flexible beam having a plurality of virtual springs connected laterally thereto to simulate lateral forces exerted by the tissue on the instrument, thereby calculating the trajectory through the tissue on the basis of the influence of the plurality of virtual springs on the instrument, and utilizing an inverse kinematics solution applied to the virtual springs model to calculate the required motion to be imparted to the instrument to follow the planned trajectory. The processor may then provide motion commands to the automated device, for example via a controller. In some embodiments, the one or more processors generate motion commands to the automated device and receive feedback regarding the real-time location of the medical instrument (e.g., the tip thereof), which is then used for real-time trajectory corrections, as disclosed, for example, in abovementioned U.S. Patent No. 8,348,861. For example, if the instrument has deviated from the planned trajectory, the one or more processors may calculate the motion to be applied to the robot to reduce the deviation. The real-time location of the medical instrument and/or the corrections may be calculated and/or applied using data-analysis models/algorithms. In some embodiments, certain deviations of the medical instrument from the planned trajectory, for example deviations which exceed a predetermined threshold, may require recalculation of the trajectory for the remainder of the procedure, as described in further detail hereinbelow.
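A heavily simplified, quasi-static sketch of such a virtual-springs computation is given below, assuming NumPy, small lateral deflections, a needle treated as a cantilever clamped at the entry point, and linear springs; all stiffness and geometry values are illustrative assumptions, and the full model and inverse kinematics solution are those described in the referenced patent, not this sketch.

```python
import numpy as np

E, I = 200e9, 2.0e-14         # Young's modulus (Pa) and area moment (m^4), assumed
L, n = 0.12, 12               # inserted length (m) and number of virtual springs
x = np.linspace(L / n, L, n)  # spring/node positions along the inserted shaft

def influence(xi, a):
    """Deflection at xi of a cantilever clamped at x=0 due to a unit lateral load at a."""
    if xi <= a:
        return xi**2 * (3*a - xi) / (6 * E * I)
    return a**2 * (3*xi - a) / (6 * E * I)

# Influence matrix: y = C @ F relates lateral node forces to node deflections.
C = np.array([[influence(xi, aj) for aj in x] for xi in x])

k = np.full(n, 300.0)         # virtual spring stiffnesses (N/m), assumed uniform
y_rest = np.zeros(n)          # tissue rest positions relative to the planned line
y_rest[n // 2:] = 0.002       # e.g., tissue displaced 2 mm along the distal half

# Springs exert F = -K (y - y_rest); equilibrium gives (I + C K) y = C K y_rest.
K = np.diag(k)
y = np.linalg.solve(np.eye(n) + C @ K, C @ K @ y_rest)
print("predicted tip deflection (mm):", 1e3 * y[-1])
```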
As shown in FIG. 2, a trajectory 32 is planned between an entry point 36 and an internal target 38. The planning of the trajectory 32 may take into account various variables, including, but not limited to: the type of the medical instrument to be used and its characteristics, the dimensions of the medical instrument (e.g., length, gauge), the type of imaging modality (such as CT, CBCT, MRI, X-Ray, CT fluoroscopy, ultrasound and the like), the tissues through which the medical instrument is to be inserted, the location of the target, the size of the target, the insertion point, the angle of insertion (relative to one or more axes), milestone points (“secondary targets” through which the medical instrument should pass) and the like, or any combination thereof. In some embodiments, at least one of the milestone points may be a pivot point, i.e., a predefined point along the trajectory in which the deflection of the medical instrument is prevented or minimized, to maintain minimal pressure on the tissue (even if this results in a larger deflection of the instrument in other parts of the trajectory). In some embodiments, the planned trajectory is an optimal trajectory based on one or more of these parameters. Further taken into account in determining the trajectory may be various obstacles 39A-39C, which may be identified along the path and which should be avoided, to prevent damage to the neighboring tissues and/or to the medical instrument. According to some embodiments, safety margins 34 may be marked along the planned trajectory 32, to ensure a minimal distance between the trajectory 32 and potential obstacles en route. The width of the safety margins may be symmetrical or asymmetrical in relation to the trajectory 32. According to some embodiments, the width of the safety margins 34 may be preprogrammed. According to some embodiments, the width of the safety margins may be automatically set, or recommended to the user, by the processor, based on data obtained from previous procedures using a data-analysis algorithm. According to some embodiments, the width of the safety margins 34 may be determined and/or adjusted by the user. Further shown in FIG. 2 is an end of a control head 30 of the exemplary automated insertion device, to which the medical instrument (not shown in FIG. 2) is coupled, as virtually displayed on the monitor, to indicate its position and orientation.
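As an illustration of the safety-margin concept only, the following sketch checks whether any point of a sampled trajectory falls within the margin distance of any marked obstacle; it assumes the trajectory is sampled as discrete 3D points and obstacles are approximated by center points, and the shapes and margin value are hypothetical.

```python
import numpy as np

def violates_margin(traj_points: np.ndarray, obstacles: np.ndarray,
                    margin_mm: float = 5.0) -> bool:
    """traj_points: (N, 3) sampled trajectory; obstacles: (M, 3) obstacle centers."""
    # Pairwise distances between every trajectory sample and every obstacle center.
    d = np.linalg.norm(traj_points[:, None, :] - obstacles[None, :, :], axis=-1)
    return bool((d < margin_mm).any())

traj = np.linspace([0.0, 0.0, 0.0], [0.0, 20.0, 80.0], 50)   # straight path (mm)
obstacles = np.array([[0.0, 8.0, 30.0], [0.0, 12.0, 55.0]])  # e.g., marked bones
print(violates_margin(traj, obstacles))  # True -> re-plan around the obstacle
```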
The trajectory 32 shown in FIG. 2 is a planar trajectory (i.e., two-dimensional). In some embodiments, steering of the instrument is carried out according to a planar trajectory, for example trajectory 32. In some embodiments, the calculated planar trajectory may be superpositioned with one or more additional planar trajectories, to form a three-dimensional (3D) trajectory. Such additional planar trajectories may be planned on one or more different planes, which may be perpendicular to the plane of the first planar trajectory (e.g., trajectory 32) or otherwise angled relative thereto. According to some embodiments, the 3D trajectory may include any type of trajectory, including a linear trajectory or a non-linear trajectory.
According to some embodiments, the steering of the medical instrument is carried out in a 3D space, wherein the steering instructions are determined on each of the planes of the superpositioned planar trajectories, and are then superpositioned to form the steering in the three-dimensional space. The data/parameters/values thus obtained during the steering of the medical instrument using the automated device can be used as data/parameters/values for the generation/training and/or utilization/inference of the data-analysis model(s)/algorithm(s).
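A minimal sketch of this superposition, under the assumption that both planar trajectories are parameterized by the same insertion depth, is given below; the analytic curve shapes are arbitrary placeholders used only to produce example data.

```python
import numpy as np

z = np.linspace(0.0, 90.0, 64)            # insertion depth samples (mm)
x_of_z = 6.0 * np.sin(np.pi * z / 90.0)   # planar trajectory in the xz plane
y_of_z = 3.0 * (z / 90.0) ** 2            # planar trajectory in the yz plane

# Superposition: the two planar deflections combine into one 3D path.
traj_3d = np.column_stack([x_of_z, y_of_z, z])   # (64, 3)
print(traj_3d[-1])                               # end-point in 3D
```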
Reference is now made to FIGS. 3A-3D, which show planning of an exemplary trajectory for inserting and steering a medical instrument toward a target, according to some embodiments. The exemplary trajectory may be planned using a processor, such as the processor(s) of the insertion system described in FIG. 1B, and the insertion and steering of the medical instrument toward the target according to the planned trajectory may be executed using an automated insertion device, such as the automated device of FIG. 1A.
The planning in FIGS. 3A-3D is shown on CT image-views; however, it can be appreciated that the planning can be carried out similarly on images obtained from other imaging systems, such as ultrasound, MRI and the like. Shown in FIG. 3A are CT image-views of a subject, depicting in the left-hand panel an axial plane view and in the right-hand panel a sagittal plane view. Also indicated in the figure is an internal target 44 and an automated insertion device 40. Further indicated is a vertebra 46. In FIG. 3B, which shows the CT image-views of FIG. 3A, the insertion point 42 is indicated. Subsequently, according to some embodiments, a linear trajectory 48 between the insertion point 42 and the internal target 44 may be calculated and displayed on each of the two views (for example, axial plane view and sagittal plane view). Typically, a linear trajectory is preferred; thus, if the displayed linear trajectory does not pass in close proximity to any potential obstacles, the linear trajectory is determined as the planned trajectory for the insertion procedure. In FIG. 3C, a transverse process 462 of vertebra 46 is detected in close proximity to the calculated linear trajectory, and is identified and marked, in this example on the axial plane view, to allow considering the obstacle when planning the trajectory for the procedure. In FIG. 3D, the trajectory is re-calculated, so as to allow the instrument to avoid contacting the obstacle 462, resulting in a non-linear trajectory 48’. According to some embodiments, the planned trajectory may not be calculated until potential obstacles are marked on the image-view/s, either manually or automatically, until the user confirms that there are no potential obstacles and/or until the user manually initiates trajectory calculation. In such embodiments, if there are obstacles which necessitate a non-linear trajectory, an interim linear trajectory, similar to linear trajectory 48 of FIG. 3B, may not be calculated and/or displayed. According to some embodiments, a maximal allowable curvature level may be pre-set for the calculation of the non-linear trajectory. The maximal curvature threshold may depend, for example, on the trajectory parameters (e.g., distance between the entry point and the target) and on the type of instrument intended to be used in the procedure and its characteristics (for example, type, diameter (gauge), material, and the like). According to some embodiments, a maximal allowable lateral movement of the instrument at the entry point may be pre-set for the calculation of the non-linear trajectory. According to some embodiments, a maximal allowable proximity to obstacle(s) may be pre-set for the calculation of the non-linear trajectory. As further detailed below, the planned trajectory may be updated in real-time based on the real-time position of the medical instrument (for example, the tip thereof) and/or the real-time position of the target and/or the real-time position(s) of obstacle(s). In some embodiments, the planned trajectory may be updated in real-time based on a predicted/estimated position of the target and/or a predicted/estimated position(s) of obstacle(s).
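For illustration only, the following sketch plans a non-linear 2D trajectory around a single marked obstacle and checks it against maximal-curvature and obstacle-clearance limits; the quadratic Bezier formulation, function names and threshold values are assumptions of the example, not the actual planning algorithm.

```python
import numpy as np

def bezier_trajectory(entry, target, obstacle, clearance_mm=5.0, n=200):
    """Plan a simple non-linear 2D trajectory as a quadratic Bezier curve
    whose control point is pushed away from a marked obstacle, then report
    the curve's maximal curvature and minimal obstacle distance so both
    can be checked against pre-set limits (all coordinates in mm)."""
    entry = np.asarray(entry, dtype=float)
    target = np.asarray(target, dtype=float)
    obstacle = np.asarray(obstacle, dtype=float)

    mid = 0.5 * (entry + target)
    away = mid - obstacle
    away /= np.linalg.norm(away)
    ctrl = mid + 2.0 * clearance_mm * away  # push control point off the line

    t = np.linspace(0.0, 1.0, n)[:, None]
    pts = (1 - t) ** 2 * entry + 2 * (1 - t) * t * ctrl + t ** 2 * target

    # Numerical curvature k = |x'y'' - y'x''| / (x'^2 + y'^2)^(3/2);
    # the formula is invariant to the curve parameterization.
    d1 = np.gradient(pts, axis=0)
    d2 = np.gradient(d1, axis=0)
    num = np.abs(d1[:, 0] * d2[:, 1] - d1[:, 1] * d2[:, 0])
    den = np.maximum((d1[:, 0] ** 2 + d1[:, 1] ** 2) ** 1.5, 1e-12)
    max_curvature = (num / den).max()
    min_clearance = np.linalg.norm(pts - obstacle, axis=1).min()
    return pts, max_curvature, min_clearance

MAX_CURVATURE = 0.01   # assumed instrument-specific limit (1/mm)
pts, k_max, dist = bezier_trajectory((0, 0), (0, 100), obstacle=(3, 50))
assert k_max <= MAX_CURVATURE and dist >= 5.0, "re-plan (e.g., new entry point)"
```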
According to some embodiments, the target 44, insertion point 42 and, optionally, obstacle/s, such as transverse process 462, are marked manually by the user. According to other embodiments, the processor of the insertion system (or of a separate system) may be configured to identify and mark at least one of the target, the insertion point and the obstacle/s, and the user may, optionally, be prompted to confirm or adjust the processor’s proposed markings. In such embodiments, the target and/or obstacle/s may be identified using known image processing techniques and/or data-analysis models/algorithms, optionally based also on data obtained from previous procedures. The insertion point may be suggested based solely on the obtained images, or, alternatively or additionally, on data obtained from previous procedures using data-analysis models/algorithms.
According to some embodiments, the trajectory may be calculated based solely on the obtained images and the marked locations of the entry point, target (and, optionally, obstacle/s). According to other embodiments, the calculation of the trajectory may be based also on data obtained from previous procedures, using data-analysis models/algorithms.
According to some embodiments, once the planned trajectory has been determined, checkpoints along the trajectory may be set. Checkpoints (not shown in FIGS. 3A-3D) may be used so that upon the medical instrument reaching a checkpoint, its insertion is paused and imaging of the region of interest is initiated (either manually by the user or automatically by the processor). The imaging serves to verify the position of the instrument (specifically, to verify that the instrument (e.g., the tip thereof) follows the planned trajectory), to monitor the location of the marked obstacles and/or identify previously unmarked obstacles along the trajectory, and to verify the target’s position, such that recalculation of the trajectory may be initiated, if the user chooses to do so, before advancing the instrument to the next checkpoint/the target. The checkpoints may be manually set by the user, or they may be automatically set or recommended by the processor, as described in further detail hereinbelow.
It can be appreciated that although axial and sagittal views are shown in FIGS. 3A-3D, views pertaining to different planes or orientations (e.g., coronal, pseudo axial, pseudo sagittal, pseudo coronal, etc.) or additionally generated views (e.g., trajectory view, tool view, 3D view, etc.), may be used in order to perform and/or display the trajectory planning.
According to some embodiments, recalculation of the trajectory may also be required if the instrument has deviated from the planned trajectory by more than a predetermined deviation threshold. In some embodiments, determining the actual real-time location of the instrument may require applying a correction to the determined location of the tip of the medical instrument, to compensate for deviations due to imaging artifacts. The actual location of the tip may be determined based on an instrument position compensation “look-up” table, which corresponds to the imaging modality and the medical instrument used, as disclosed, for example, in abovementioned co-owned International Patent Application No. PCT/IL2020/051219. In some embodiments, if the real-time location of the medical instrument indicates that the instrument has deviated from the planned trajectory, one or more checkpoints may be added and/or repositioned along the planned trajectory, either manually by the user or automatically by the processor, to direct the instrument back to the planned trajectory. In some embodiments, the processor may prompt the user to add and/or reposition checkpoint/s. In some embodiments, the processor may recommend to the user specific position/s for the new and/or repositioned checkpoints. Such a recommendation may be generated using data-analysis algorithm(s).
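A minimal sketch of such a compensation look-up table is given below; the keys, offset values and function name are invented for the example and are not taken from the referenced application.

```python
import numpy as np

# Illustrative offsets (mm), keyed by (imaging modality, instrument type);
# these values are assumptions for the example only.
TIP_OFFSET_MM = {
    ("CT", "18G-needle"): 1.2,    # apparent tip extends past the actual tip
    ("CT", "22G-needle"): 0.8,
    ("CBCT", "18G-needle"): 1.5,
}

def corrected_tip(detected_tip, instrument_direction, modality, instrument):
    """Shift the detected tip position back along the instrument axis by a
    modality/instrument-specific offset, compensating for imaging artifacts
    (e.g., beam hardening, PSF) that obscure the true tip."""
    offset = TIP_OFFSET_MM.get((modality, instrument), 0.0)
    direction = np.asarray(instrument_direction, dtype=float)
    direction /= np.linalg.norm(direction)
    return np.asarray(detected_tip, dtype=float) - offset * direction

tip = corrected_tip((10.0, 42.5, 88.0), (0.0, 0.3, 1.0), "CT", "18G-needle")
```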
According to some embodiments, recalculation of the trajectory may also be required if, for example, an obstacle is identified along the trajectory. Such an obstacle may be an obstacle which was marked (manually or automatically) prior to the calculation of the planned trajectory but tissue movement, e.g., tissue movement resulting from the advancement of the instrument within the tissue, caused the obstacle to move such that it entered the planned path. In some embodiments, the obstacle may be a new obstacle, i.e., an obstacle which was not visible in the image (or set of images) based upon which the planned trajectory was calculated, and became visible during the insertion procedure.
In some embodiments, if the instrument deviated from the planned trajectory (e.g., above a predetermined deviation threshold), a new or repositioned obstacle is identified along the planned trajectory and/or the target has moved (e.g., above a predetermined threshold), the user may be prompted to initiate an update (recalculation) of the trajectory. In some embodiments, recalculation of the trajectory, if required, is executed automatically by the processor and the insertion of the instrument is automatically resumed based on the updated trajectory. In some embodiments, recalculation of the trajectory, if required, is executed automatically by the processor, however the user is prompted to confirm the recalculated trajectory before advancement of the instrument (e.g., to the next checkpoint) according to the updated trajectory can be resumed.
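As a non-limiting sketch of the threshold-based triggering logic described above (the threshold values, function name and straight-path example are assumptions of the illustration):

```python
import numpy as np

def needs_replanning(tip, planned_points, target, expected_target,
                     tip_threshold_mm=2.0, target_threshold_mm=3.0):
    """Return True if trajectory recalculation should be triggered:
    either the tip strayed from the planned path by more than its
    threshold, or the target moved by more than its threshold (mm)."""
    planned = np.asarray(planned_points, dtype=float)
    tip_dev = np.linalg.norm(planned - np.asarray(tip, dtype=float),
                             axis=1).min()
    target_dev = np.linalg.norm(np.asarray(target, dtype=float)
                                - np.asarray(expected_target, dtype=float))
    return tip_dev > tip_threshold_mm or target_dev > target_threshold_mm

# Example with a straight planned path along z:
path = [(0.0, 0.0, z) for z in range(0, 101, 5)]
trigger = needs_replanning(tip=(1.5, 2.1, 40.0), planned_points=path,
                           target=(0.0, 4.0, 100.0),
                           expected_target=(0.0, 0.0, 100.0))
# trigger -> True (tip deviation ~2.6 mm, target shift 4.0 mm)
```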
According to some embodiments, the trajectory may be updated during the procedure, as detailed herein below. In some embodiments, the trajectory may be updated according to various parameters and variables, including, for example, the actual (real-time) position of the target. In some embodiments, the trajectory may be updated according to a predicted location of the target, as determined/calculated as detailed herein (for example, using suitable machine learning (or deep learning) algorithms and/or image processing techniques).
Reference is now made to FIG. 4, which illustrates steps in an exemplary method for planning and updating a trajectory of a medical instrument toward an internal target in a body of a subject, according to some embodiments. In some embodiments, the trajectory may be updated according to the actual (real-time) location of the target. As shown in FIG. 4, at step 50, the trajectory of the medical instrument is planned from an insertion point on the body of the subject to an internal target. In some embodiments, the target may be identified and marked on obtained image(s) manually by the user. In some embodiments, the target may be identified automatically by the processor, using image processing techniques and/or data-analysis algorithms. In some embodiments, the insertion point may be selected and marked on obtained image(s) manually by the user. In some embodiments, one or more optional (e.g., optimal) insertion point(s) may be identified by the processor, using image processing techniques and/or data-analysis algorithms. In such embodiments, the recommended insertion points may be displayed on the obtained image(s) and the user may be prompted to select one of the entry points and/or adjust the location of a recommended entry point. In some embodiments, the planned trajectory may be any type of trajectory, such as a 2D trajectory or a 3D trajectory. In some embodiments, a planned 3D trajectory may be obtained by planning a route on each of two planes and superpositioning the two 2D routes on said planes, at their intersection line, to form the planned 3D trajectory. In some exemplary embodiments, the two planes are perpendicular. The planning/calculating of the trajectory may take into account various parameters, including but not limited to: type of medical instrument, characteristics of the medical instrument (material, length, gauge, etc.), type of imaging modality (such as, CT, CBCT, MRI, X-Ray, CT fluoroscopy, ultrasound and the like), insertion point, insertion angle, type of tissue(s), location of the internal target, size of the target, shape of the target, obstacles along the route, milestone points (“secondary targets” through which the medical instrument should pass) and the like, or any combination thereof. In some embodiments, at least one of the milestone points may be a pivot point, i.e., a predefined point along the trajectory at which the deflection of the medical instrument is prevented or minimized, to maintain minimal pressure on the tissue (even if this results in a larger deflection of the instrument in other parts of the trajectory). In some embodiments, a maximal allowable curvature level may be pre-set for the planning of the trajectory. In some embodiments, a maximal allowable lateral movement of the instrument at the entry point may be pre-set for the planning of the trajectory. According to some embodiments, a maximal allowable proximity to obstacle(s) may be pre-set for the calculation of the non-linear trajectory. In some embodiments, the planned trajectory is an optimal trajectory based on one or more of these parameters.
Next, at step 52, the medical instrument is inserted into the body of the subject at the designated (selected) entry point and steered (in a suitable space) towards the predetermined target, according to the planned trajectory. As detailed herein, the insertion and steering of the medical instrument is facilitated by an automated device for inserting and steering, such as, for example, device 2 of FIG. 1A.
At step 54, the real-time location/position (and optionally the orientation) of the medical instrument (e.g., the tip thereof) and/or the real-time location of one or more obstacles and/or the location of newly identified one or more obstacles along the trajectory and/or the real-time location of one or more of the milestone points (“secondary targets”) and/or the real-time location of the target are determined. Each possibility is a separate embodiment. In some embodiments, the determination of any of the above may be performed manually by the user. In some embodiments, the determination of any of the above may be performed automatically by one or more processors. In the latter case, the determination may be performed by any suitable methods known in the art, including, for example, using suitable image processing techniques and/or machine learning (or deep learning) algorithms, using data collected in previous procedures (procedures previously performed), as further described hereinbelow. Step 54 may optionally further include correcting the determined location of the tip of the medical instrument, to compensate for deviations due to imaging artifacts, in order to determine the actual location of the tip. Determining the actual location of the tip prior to updating the trajectory can, in some embodiments, vastly increase the accuracy of the procedure. In some embodiments, the determination of the real-time locations may be performed at any spatial and/or temporal distribution/pattern and may be continuous or at any time (temporal) or space (spatial) intervals. In some embodiments, the procedure may be paused, either automatically or selectively by the user, at spatial and/or temporal intervals, to allow processing, determining, changing and/or approving continuation of the procedure. For example, the determination of the real-time locations indicated above may be performed at one or more checkpoints. In some embodiments, the checkpoints may be predetermined and/or determined during the steering procedure. In some embodiments, the checkpoints may include spatial checkpoints (for example, regions or locations along the trajectory, including, for example, specific tissues, specific regions, length or location along the trajectory (for example, every 20-50 mm), and the like). In some embodiments, the checkpoints may be temporal checkpoints, i.e., checkpoints performed at designated time points during the procedure (for example, every 2-5 seconds). In some embodiments, the checkpoints may include both spatial and temporal checkpoints. In some embodiments, the checkpoints may be spaced apart, including the first checkpoint from the entry point and the last checkpoint from the target, at an essentially similar distance along the planned trajectory. According to some embodiments, the checkpoints may be manually set by the user. According to some embodiments, the checkpoints may be automatically set by the processor, using image processing or computer vision algorithms, based on the obtained images and the planned trajectory and/or also on data obtained from previous procedures using machine learning capabilities, as disclosed, for example, in co-owned International Patent Application No. PCT/IL2021/050441, which is incorporated herein by reference in its entirety. In such embodiments, the user may be required to confirm the checkpoints recommended by the processor or choose to adjust their location/timing. Upper and/or lower interval thresholds between checkpoints may be predetermined.
For example, the checkpoints may be automatically set by the processor at, for example, about 20 mm intervals, and the user may be permitted to adjust the distance between each two checkpoints (or between the entry point and the first checkpoint and/or between the last checkpoint and the target) such that the maximal distance between them is, for example, about 30 mm and/or the minimal distance between them is about 3 mm.
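Using the example values above, interval-based checkpoint placement could be sketched as follows (the function name and arc-length parameterization are assumptions of the illustration):

```python
import numpy as np

def place_checkpoints(path_length_mm, interval=20.0, d_min=3.0, d_max=30.0):
    """Place checkpoints at a nominal interval along the planned trajectory
    (expressed as arc length from the entry point), then verify that every
    resulting gap, including entry-to-first-checkpoint and last-checkpoint-
    to-target, stays within the allowed [d_min, d_max] range (mm)."""
    positions = list(np.arange(interval, path_length_mm, interval))
    gaps = np.diff([0.0] + positions + [path_length_mm])
    if np.any(gaps < d_min) or np.any(gaps > d_max):
        raise ValueError("checkpoint spacing outside allowed bounds")
    return positions

checkpoints = place_checkpoints(87.0)   # -> [20.0, 40.0, 60.0, 80.0]
```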
Once the real-time location of any of the above parameters, or at least the real-time position of the target, is determined, it is determined whether there is a deviation from the initial (or expected) position and/or from the planned trajectory, and if a deviation is determined, then, at step 56, the trajectory is updated. The deviation may be determined relative to a previous time point or spatial point, as detailed above. In some embodiments, if a deviation in one or more of the abovementioned parameters is detected, the deviation is compared with a respective threshold, to determine if the deviation exceeds the threshold. The threshold may be, for example, a set value or a percentage reflecting a change in a value. The threshold may be determined by the user. The threshold may be determined by the processor, for example based on data collected in previous procedures and using machine learning algorithms. If a deviation is detected, or if the detected deviation exceeds the set threshold, the trajectory may be updated according to the required change. In some embodiments, updating the trajectory may be executed by calculating the trajectory required to reach the target at its new location. The calculation of the updated trajectory may take into account the various parameters taken into account during the planning of the initial trajectory, as described above. In some embodiments, updating the trajectory may be executed using data-analysis algorithm(s), e.g., AI-based model(s). In some embodiments, if the trajectory is a 3D trajectory, the trajectory may be updated by updating the route according to the required change in each of two planes (for example, two mutually perpendicular planes) and thereafter superpositioning the two updated 2D routes on the two (optionally perpendicular) planes to form the updated 3D trajectory. In some embodiments, the updating of the route on each of the two planes may be performed by any suitable method, including, for example, utilizing a kinematics model. In some embodiments, if the real-time location of the medical instrument indicates that the instrument has deviated from the planned trajectory, the user may add and/or reposition one or more checkpoints along the planned trajectory, to direct the instrument back to the planned trajectory. In some embodiments, the processor may prompt the user to add and/or reposition checkpoint/s. In some embodiments, the processor may recommend to the user specific position/s for the new and/or repositioned checkpoints. Such a recommendation may be generated using image processing techniques and/or machine learning algorithms.
As detailed in step 58, the steering of the medical instrument is then continued, according to the updated trajectory, to facilitate the tip of the instrument reaching the internal target (and secondary targets along the trajectory, if such are required). It can be appreciated that if no deviation in the abovementioned parameters was detected, the steering of the medical instrument can continue according to the planned trajectory.
As indicated in step 59, steps 54-58 may be repeated any number of times, until the tip of the medical instrument reaches the internal target, or until a user terminates the procedure. In some embodiments, the number of repetitions of steps 54-58 may be predetermined or determined in real-time, during the procedure. According to some embodiments, at least some of the steps (or sub-steps) are performed automatically. In some embodiments, at least some of the steps (or sub-steps) may be performed manually, by a user. According to some embodiments, one or more of the steps are performed automatically. According to some embodiments, one or more of the steps are performed manually. According to some embodiments, one or more of the steps are supervised manually and may proceed after being approved by the user.
According to some embodiments, the planning (and/or updating) of the trajectory is a dynamic planning (and/or updating), allowing automatic prediction of changes (for example, a predicted change in the target position), obstacles (for example, bones and/or blood vessels which are to be avoided), milestones along the trajectory, and the like, and adjusting the steering of the medical instrument accordingly, in a fully-automated or at least semi-automated manner. In some embodiments, the dynamic planning proposes a planned and/or updated trajectory to a user for confirmation prior to proceeding with any of the steps. According to some embodiments, the trajectory planning is a dynamic planning, taking into consideration expected cyclic changes in the position of the target, obstacles, etc., resulting from the body motion during the breathing cycle, as described, for example, in co-owned U.S. Patent No. 10,245,110, which is incorporated herein by reference in its entirety. Such dynamic planning may be based on sets of images obtained during at least one breathing cycle of the subject (e.g., using a CT system), or based on a video generated during at least one breathing cycle of the subject (e.g., using a CT fluoroscopy system or any other imaging system capable of continuous imaging).
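By way of a non-limiting illustration, the sketch below predicts the position of a cyclically moving target by phase-wise interpolation over one recorded breathing cycle; the assumption of approximately periodic motion, the function name and the sample values are illustrative only.

```python
import numpy as np

def predict_target_position(times, positions, t_query, period):
    """Predict the position of a cyclically moving target at time t_query
    by interpolating positions recorded over one breathing cycle, assuming
    (approximately) periodic motion with the given period (seconds)."""
    times = np.asarray(times, dtype=float)
    positions = np.asarray(positions, dtype=float)
    phase = np.mod(times, period)
    order = np.argsort(phase)
    phase, positions = phase[order], positions[order]
    q = np.mod(t_query, period)
    return np.array([np.interp(q, phase, positions[:, k], period=period)
                     for k in range(positions.shape[1])])

# One cycle of recorded target motion (4 s breathing period, mm units):
t = np.linspace(0.0, 4.0, 16, endpoint=False)
pos = np.column_stack([np.zeros_like(t),
                       2.0 * np.sin(2 * np.pi * t / 4.0),    # lateral
                       8.0 * np.sin(2 * np.pi * t / 4.0)])   # cranio-caudal
predicted = predict_target_position(t, pos, t_query=9.3, period=4.0)
```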
According to some embodiments, the steering of the medical instrument to the target is achieved by directing the medical instrument (for example, the tip of the medical instrument) to follow, in real-time, the planned trajectory, which may be updated in real-time, during the procedure, as needed. In some embodiments, the directing is effected by a control device/controller unit configured to receive input and generate control data in response thereto, for controlling operation of the automated medical device.
According to some embodiments, the term "real-time trajectory" of a medical instrument relates to the actual path the medical instrument traverses in the body of the subject, i.e., its actual position at each point in time during the steering procedure.
According to some exemplary embodiments, the trajectory planning and updating using the systems disclosed herein is facilitated using any suitable imaging device. In some embodiments, the imaging device is a CT imaging device. In some embodiments, the planning and/or real-time updating of the trajectory is performed based on CT images of the subject obtained before and/or during the procedure. According to some embodiments, when utilizing various imaging modalities in the procedure, inherent difficulties may arise in identifying the actual location of the tip of the medical instrument. In some embodiments, the accurate orientation and position of the tool are important for high-accuracy steering. Further, determining the actual position of the tip increases safety, as it ensures the medical instrument is not inserted beyond the target or beyond what is defined by the user. Depending on the imaging modality, the tissue and the type of medical instrument, artifacts which obscure the actual location of the tip can occur.
For example, when utilizing CT imaging, streaks and dark bands due to beam hardening can occur, which result in a “dark” margin at the end of the scanned instrument. The voxels at the end of the medical instrument may have very low intensity levels even if the actual medium or adjacent objects would normally have higher intensity levels. Additionally, point spread function (PSF) can occur in which the visible borders of the medical instrument are extended beyond their actual boundaries. Such artifacts can depend on the object's materials, size, and medical instrument angle relative to the CT, as well as on the scan parameters (field of view (FOV), beam power values) and reconstruction parameters (kernel and other filters).
Thus, depending on the type of the medical instrument, the imaging modality and/or the tissue, the tip position may not be easily visually detected, and in some cases the determination may deviate significantly, for example by over 2-3 mm. In some embodiments, the actual and relatively exact location of the tip may be determined at a level below the visualized pixel size.
In some embodiments, the determination of the actual position of the tip may depend on the desired/required accuracy level, which may depend on several parameters, including, for example, but not limited to: the clinical indication (for example, biopsy vs. fluid drainage); the target size, target location and/or movement; the lesion size (for a biopsy procedure, for example); the anatomical location (for example, lungs/brain vs. liver/kidneys); the trajectory (for example, if it passes near delicate organs, blood vessels, etc.); and the like, or any combination thereof. According to some embodiments, the determination/correction of the actual location of the tip may be performed in real-time. According to some embodiments, the determination/correction of the actual location of the tip may be performed continuously and/or in time lapses on suitable images obtained from various imaging modalities. According to some embodiments, such artifacts and inaccuracies are compensated in real time in order to determine the actual location of the tip and ensure it meets the target end-point in an accurate and effective spatio-temporal manner.
Reference is now made to FIG. 5, which is a diagram 60 of a method of generating, deploying and using a data-analysis algorithm, according to some embodiments. As shown in FIG. 5, at step 61, automated medical procedure(s) are executed using automated medical device(s). Automated medical procedure(s) involve a plurality of datasets related thereto (as further detailed below). For example, some of the datasets directly relate to the operation of the medical device (such as operating parameters), some of the datasets relate to the clinical procedure, some of the datasets relate to the treated patient and some of the datasets relate to administrative related information. In some embodiments, in addition to the datasets related to or generated during the medical procedure/s, datasets may be generated during training sessions performed by users on a dedicated simulator system. Such a simulator system may be configured to at least partially simulate a medical procedure, including enabling users to plan the procedure on existing images and then simulating the execution of the procedure according to the procedure plan via a virtual automated medical device and a virtual medical instrument. Next, at step 62, at least some of the generated datasets, values thereof and/or parameters related thereto are collected from the medical procedures and/or simulation sessions and stored in a centralized database. The collected datasets may be split/divided for use as training sets, validation sets and/or testing sets. Then, at step 63, the collected data is annotated, to thereby generate and train the data-analysis algorithm, at stage 64. At step 65, the data-analysis algorithm is validated and deployed. Once deployed, the results from the algorithm are obtained, at step 66, and the results are then used to provide, at stage 67, recommendations/operating instructions/predictions/alerts. Subsequent medical procedures executed by automated medical devices may implement at least some of the recommendations/operating instructions/predictions/alerts, thereby returning to step 61 and repeating the method. In some instances, the performance of the validated algorithm is monitored, at stage 68, and is further enhanced/improved, based on data stored in the centralized database and/or on newly acquired data.
According to some embodiments, the various obtained datasets may be used for the training, construction and/or validation of the algorithm. In some embodiments, the datasets may be selected from, but not limited to: medical device related dataset, clinical procedures related dataset, patient related dataset, administrative-related dataset, and the like, or any combination thereof.
According to some exemplary embodiments, the medical device related dataset may include such data parameters or values as, but not limited to: procedure steps timing, overall procedure time, overall steering time (of the medical instrument), entry point of the medical instrument, target point/region, target updates (for example, updating real-time depth and/or lateral position of the target), planned trajectory of the medical instrument, real-time trajectory of the medical instrument, (real-time) trajectory updates, number of checkpoints (CPs) along the planned or real-time-updated trajectory of the medical instrument, CPs positions/locations, CPs updates during the procedure, CPs errors (in 2D and/or in 3D), position of the medical device, insertion angles of the medical instrument (for example, insertion angle in the axial plane and off-axial angle), indication whether the planned (indicated) target has been reached during the procedure, target error (for example, lateral and depth, in 2D and/or in 3D), scans/images, parameters per scan, radiation dose per scan, total radiation dose in the steering phase of the medical instrument, total radiation dose in the entire procedure, errors/warnings indicated during the procedure, software logs, motion control traces, medical device registration logs, medical instrument (such as, needle) detection logs, homing and BIT results, and the like, or any combination thereof. Each possibility is a separate embodiment. In some embodiments, one or more of the values may be configured to be collected automatically by the system. For example, values such as procedure steps timing, overall steering time, entry, target, target updates (depth and lateral), trajectory, trajectory updates, number of CPs, CP positions, CP updates, CP errors (2 planes and/or 3D), robot position, scans/images, parameters per scan, radiation dose, errors/warnings, software logs, motion control traces, medical device registration logs, medical instrument detection logs, homing and BIT results may be collected automatically.
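By way of a non-limiting illustration, a small subset of such automatically collected, device-related values could be represented by a record structure along the following lines (field names and units are illustrative assumptions, not a defined schema):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class DeviceProcedureRecord:
    """Illustrative subset of the device-related dataset; the field names
    are examples invented for this sketch."""
    procedure_id: str
    overall_steering_time_s: float
    entry_point: Tuple[float, float, float]    # mm, image frame
    target_point: Tuple[float, float, float]
    checkpoint_positions: List[Tuple[float, float, float]] = field(
        default_factory=list)
    trajectory_updates: int = 0
    total_radiation_dose_mGy: float = 0.0
    target_reached: bool = False
    warnings: List[str] = field(default_factory=list)
```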
According to some exemplary embodiments, the clinical procedures related dataset may include such data parameters or values as, but not limited to: procedure type (e.g., blood/fluid sampling, regional anesthesia, tissue biopsy, catheter insertion, cryogenic ablation, electrolytic ablation, brachytherapy, neurosurgery, deep brain stimulation, various minimally invasive surgeries, and the like), target organ, target dimensions, target type (tumor, abscess, and the like), type of medical instrument, parameters of medical instrument (e.g., gauge, length, material, tip type, etc.), complications before/during/after the procedure, adverse events before/during/after the procedure, respiration signals of the patient, movement of the patient, and the like, or any combination thereof. Each possibility is a separate embodiment. In some embodiments, one or more of the values may be configured to be collected automatically. For example, the type of medical instrument (for example, type of a needle), parameters of the medical instrument, respiration signal(s) of the patient, movement of the patient, movement traces of the automated medical device and system logs may be collected automatically. In some embodiments, one or more of the values may be configured to be collected manually by requesting the user to insert the data, information and/or visual marking using a graphic-user-interface (GUI), for example.
According to some exemplary embodiments, the patient related dataset may include such data parameters or values as, but not limited to: age, gender, race, relevant medical history, vital signs before/after/during the procedure, body dimensions (height, weight, BMI, circumference, etc.), current medical condition, pregnancy, smoking habits, demographic data, and the like, or any combination thereof. Each possibility is a separate embodiment.
According to some exemplary embodiments, the administrative related dataset may include such data parameters or values as, but not limited to: institution (healthcare facility) in which the procedure is performed, physician, staff, system serial numbers, disposables used, software/operating systems versions, configuration parameters, and the like, or any combination thereof. Each possibility is a separate embodiment.
According to some embodiments, by using one or more values of one or more datasets, and generating a data-analysis algorithm, various predictions, recommendations and/or implementations may be generated that can enhance further medical procedures. In some embodiments, based on the data used, the generated algorithm/s may be customized to a specific procedure, specific patient (or cohort of patients), or any other set of specific parameters.
According to some embodiments, the algorithm/s may be used for enhancing medical procedures, predicting clinical outcome and/or clinical complications and overall increasing safety and accuracy.
According to some exemplary embodiments, the data-analysis algorithms generated by the systems and methods disclosed herein may be used for, but not limited to: Tissue segmentation; Tissue reconstruction; Target detection; Target tracking; Predicting target movement and/or target location during and/or at the end of the procedure; Predicting tissue/organs movement and/or tissue/organs location during and/or at the end of the procedure; Predicting obstacles location and/or movement during and/or at the end of the procedure; Predicting changes in the anatomical structure (e.g., deformation) of tissues/target/obstacles during and/or at the end of the procedure; Determining and/or recommending entry point location; Determining and/or recommending a trajectory for the insertion procedure; Updating a trajectory during the procedure; Optimizing checkpoint positioning along a trajectory (planned and/or updated trajectory), e.g., by recommending the best tradeoff between accuracy and radiation exposure/procedure time, as disclosed, for example, in abovementioned co-owned International Patent Application No. PCT/IL2021/050441; Determining or recommending “no-fly” zones, i.e., areas (obstacles and/or vital anatomical structures) to avoid during instrument insertion, as disclosed, for example, in co-owned International Patent Application No. PCT/IL2021/050437, which is incorporated herein by reference in its entirety; Predicting and/or detecting entrance into defined “no-fly” zones; Predicting “no-fly” zones location and/or changes during and/or at the end of the procedure; Optimizing steering algorithm corrections; Optimizing medical device registration and instrument detection algorithms, thereby improving system accuracy and allowing radiation reduction; Optimizing compensation calculations for determining the actual real-time location of the tip of the medical instrument, as disclosed, for example, in abovementioned co-owned International Application No. PCT/IL2020/051219; Evaluating procedure success (estimated success and/or estimated risk level) based on the current planning and similar past procedures; Utilizing force sensor measurements for evaluation of tissue compliance, early detection of clinical complications and/or optimizing instrument steering; Utilization of additional sensor measurements (e.g., accelerometer, radiation sensor, respiration sensor, etc.); and the like, or any combination thereof. Each possibility is a separate embodiment.
According to some embodiments, generated algorithms may be used for providing recommendations regarding various device functions and operations, including providing optimized routes or modes of operation. According to some embodiments, generated algorithms may be used for providing improved/optimized procedures, while taking into account various variables that may change during the procedure, such as, for example, predicting target movement, correlating body movement (breathing -related) and device operation, etc.
According to some embodiments, a training module (also referred to as "learning module") may be used to train an AI model (e.g., ML or DL-based model) to be used in an inference module, based on the datasets and/or the features extracted therefrom and/or additional metadata, in the form of annotations (e.g., labels, bounding-boxes, segmentation maps, visual locations markings, etc.). In some embodiments, the training module may constitute part of the inference module or it may be a separate module. In some embodiments, a training process (step) may precede the inference process (step). In some embodiments, the training process may be on-going and may be used to update/validate/enhance the inference step (see “active-learning” approach described herein). In some embodiments, the inference module and/or the training module may be located on a local server (“on premise”), a remote server (such as, a server farm or a cloud-based server) or on a computer associated with the automated medical device. According to some embodiments, the training module and the inference module may be implemented using separate computational resources. In some embodiments, the training module may be located on a server (local or remote) and the inference module may be located on a local computational resource (computer), or vice versa. According to some embodiments, both the training module and the inference module may be implemented using common computational resources, i.e., processors and memory components shared therebetween. In some embodiments, the inference module and/or the training module may be located on or associated with a controller (or steering system) of an automated medical device. In such embodiments, a plurality of inference modules and/or learning modules (each associated with a medical device or a group of medical devices) may interact to share information therebetween, for example, utilizing a communication network. In some embodiments, the model(s) may be updated periodically (for example, every 1-36 weeks, every 1-12 months, etc.). In some embodiments, the model(s) may be updated based on other business logic. In some embodiments, the processor(s) of the automated medical device (e.g., the processor of the insertion system) may run/execute the model(s) locally, including updating and/or enhancing the model(s).
According to some embodiments, during training of the model (as detailed below), the learning module (either implemented as a separate module or as a portion of the inference module) may be used to construct a suitable algorithm (such as, a classification algorithm), by establishing relations/connections/patterns/correspondences/correlations between one or more variables of the primary datasets and/or between parameters derived therefrom. In some embodiments, the learning may be supervised learning (e.g., classification, object detection, segmentation and the like). In some embodiments, the learning may be unsupervised learning (e.g., clustering, anomaly detection, dimensionality reduction and the like). In some embodiments, the learning may be reinforcement learning. In some embodiments, the learning may use a self-learning approach. In some embodiments, the learning process is automatic. In some embodiments, the learning process is semi-automatic. In some embodiments, the learning is manually supervised. In some embodiments, at least some variables of the learning process may be manually supervised/confirmed, for example, by a user (such as a physician). In some embodiments, the training stage may be an offline process, during which a database of annotated training data is assembled and used for the creation of data-analysis model(s)/algorithm(s), which may then be used in the inference stage. In some embodiments, the training stage may be performed "online", as detailed herein.
According to some embodiments, the generated algorithm may essentially constitute at least any suitable specialized software (including, for example, but not limited to: image recognition and analysis software, statistical analysis software, regression algorithms (linear, non-linear, or logistic etc.), and the like). According to some embodiments, the generated algorithm may be implemented using an artificial neural network (ANN), such as a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN) and the like, decision trees or graphs, association rule learning, support vector machines, inductive logic programming, Bayesian networks, instance-based learning, manifold learning, sub-space learning, and the like, or any combination thereof. The algorithm or model may be generated using machine learning tools, data wrangling tools, deep learning tools, and, more generally, data science and artificial intelligence (AI) learning tools, as elaborated hereinbelow.
Reference is now made to FIGS. 6A-6B, which show an exemplary training module (FIG. 6A) and an exemplary training process (FIG. 6B), according to some embodiments.
As shown in FIG. 6A, a training module 70 may include two main hardware components/units: at least one memory 72 and at least one processing unit 74, which are functionally and/or physically associated. Training module 70 may be configured to train a model based on data. Memory 72 may include any type of accessible memory (volatile and/or non-volatile), configured to receive, store and/or provide various types of data, to be processed by processing unit 74, which may include any type of at least one suitable processor, as detailed below. In some embodiments, the memory and the processing units may be functionally or physically integrated, for example, in the form of a Static Random Access Memory (SRAM) array. In some embodiments, the memory is a non-volatile memory having stored therein executable instructions (for example, in the form of a code, service, executable program and/or a model file). As shown in FIG. 6A, the memory unit 72 may be configured to receive, store and/or provide various types of data values or parameters related to the data. Memory 72 may store or accept raw (primary) data 722 that has been collected, as detailed herein. Additionally, metadata 724 related to the raw data 722 may also be collected/stored in memory 72. Such metadata may include a variety of parameters/values related to the raw data, such as, but not limited to: the specific device which was used to provide the data, the time the data was obtained, the place the data was obtained (such as a specific procedure/operating room, specific institution, etc.), and the like. Memory 72 may further be configured to store/collect data annotations (e.g., labels) 726. In some embodiments, the collected data may require additional steps for the generation of data annotations that will be used for the generation of the machine-learning models, deep-learning models or other statistical or predictive algorithms as disclosed herein. In some embodiments, such data annotations may include labels describing the clinical procedure’s characteristics, the automated device’s operation and computer-vision related annotations, such as segmentation masks, target marking, organs and tissues marking, and the like. The different annotations may be generated in an “online” manner, which is performed while the data is being collected, or in an “offline” manner, which is performed at a later time after sufficient data has been collected. The memory 72 may further include features database 728. The features database 728 may include a database ("store") of previously known or generated features that may be used in the training/generation of the models. The memory 72 of training module 70 may further, optionally, include pre-trained models 729. The pre-trained models 729 include existing pre-trained algorithms which may be used to automatically annotate a portion of the data and/or to ease training of new models using “transfer-learning” methods and/or to shorten training time by using the pre-trained models as starting points for the training process on new data and/or to evaluate and compare performance metrics of existing versus newly developed models before deployment of a new model to production, as detailed hereinbelow. In some embodiments, processing unit 74 of training module 70 may include at least one processor, configured to process the data and allow/provide model training by various processing steps (detailed in FIG. 6B).
Thus, as shown in FIG. 6A, processing unit 74 may be configured at least to perform pre-processing of the data 742. Pre-processing of the data may include actions for preparing the data stored in memory 72 for downstream processing, such as, but not limited to, checking for and handling null values, imputation, standardization, handling categorical variables, one-hot encoding, resampling, scaling, filtering, outlier removal, etc. Processing unit 74 may further, optionally, be configured to perform feature extraction 744, in order to reduce the raw data dimension and/or add informative domain-knowledge into the training process and allow the use of additional machine-learning algorithms not suitable for training on raw data and/or optimization of existing or new models by training them on both the raw data and the extracted features. Feature extraction may be executed using dimensionality reduction methods, for example, Principal Components Analysis (PCA), Independent Component Analysis (ICA), Linear Discriminant Analysis (LDA), Locally Linear Embedding (LLE), t-distributed Stochastic Neighbor Embedding (t-SNE), Unified Manifold Approximation and Projection (UMAP) and/or Autoencoders, etc. Feature extraction may be executed using feature engineering methods in which mathematical tools are used to extract domain-knowledge features from the raw data, for example, statistical features, such as mean, variance, ratio, frequency, etc., and/or visual features, such as dimension or shape of certain objects in an image. Another optional technique which may be executed by the processing unit 74 to reduce the number of features in the dataset is feature selection, in which the importance of the existing features in the dataset is ranked and the less important features are discarded (i.e., no new features are created). Processing unit 74 may further be configured to execute model training 746.
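As a non-limiting sketch, the pre-processing 742 and feature extraction 744 steps could be composed with scikit-learn as follows; the column names are invented for the example, and OneHotEncoder(sparse_output=False) assumes scikit-learn 1.2 or later.

```python
from sklearn.compose import ColumnTransformer
from sklearn.decomposition import PCA
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Illustrative column names for a tabular procedure dataset.
numeric = ["steering_time_s", "radiation_dose_mGy", "target_depth_mm"]
categorical = ["instrument_type", "imaging_modality"]

# Pre-processing (742): imputation, scaling and one-hot encoding.
preprocess = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer(strategy="median")),
                      ("scale", StandardScaler())]), numeric),
    ("cat", Pipeline([("impute", SimpleImputer(strategy="most_frequent")),
                      ("onehot", OneHotEncoder(handle_unknown="ignore",
                                               sparse_output=False))]),
     categorical),
])

# Feature extraction (744): dimensionality reduction via PCA.
feature_pipeline = Pipeline([("preprocess", preprocess),
                             ("pca", PCA(n_components=5))])
# feature_pipeline.fit_transform(df) would then be applied to a DataFrame
# containing the columns listed above.
```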
Reference is now made to FIG. 6B, which shows steps in an exemplary training process 76, executed by a suitable training module (such as training module 70 of FIG. 6A). As shown in FIG. 6B, at optional step 761, collected datasets may first require an Extract-Transform-Load (ETL) or ELT process that may be used to (1) Extract the data from a single or multiple data sources (including, but not limited to, the automated medical device itself, Picture Archiving and Communication System (PACS), Radiology Information System (RIS), imaging device, healthcare facility’s Electronic Health Record (EHR) system, etc.), (2) Transform the data by applying one or more of the following steps: handling missing values, checking for duplicates, converting data types as needed, encoding values, joining data from multiple sources, aggregating data, translating coded values, etc., and (3) Load the data to a variety of data storage devices (on-premise or at a remote location (such as a cloud server)) and/or to a variety of data stores, such as file systems, SQL databases, no-SQL databases, distributed databases, object storage, etc. In some embodiments, the ETL process may be automatic and triggered whenever new data is collected. In other embodiments, the ETL process may be triggered at a predefined schedule, such as once a day or once a week, for example. In some embodiments, another business logic may be used to decide when to trigger the ETL process.
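A toy sketch of such an ETL pass is given below; the file layout, field names and the SQLite store are assumptions of the example, not a prescribed architecture.

```python
import json
import sqlite3
from pathlib import Path

def run_etl(source_dir: Path, db_path: Path) -> int:
    """Toy ETL pass: extract JSON procedure records, transform them (drop
    duplicates and incomplete rows, normalize types, fill defaults), and
    load the result into a SQL store."""
    # Extract: read every record file from the source directory.
    records = [json.loads(p.read_text()) for p in source_dir.glob("*.json")]

    # Transform: de-duplicate, drop incomplete rows, coerce types.
    seen, cleaned = set(), []
    for r in records:
        key = r.get("procedure_id")
        if key is None or key in seen:
            continue
        seen.add(key)
        cleaned.append((key,
                        float(r.get("steering_time_s", 0.0)),
                        str(r.get("instrument_type", "unknown"))))

    # Load: upsert the cleaned rows into the database.
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS procedures "
                "(procedure_id TEXT PRIMARY KEY, steering_time_s REAL, "
                "instrument_type TEXT)")
    con.executemany("INSERT OR REPLACE INTO procedures VALUES (?, ?, ?)",
                    cleaned)
    con.commit()
    con.close()
    return len(cleaned)
```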
At step 762, the data may be cleaned to ensure high quality data by, for example, removal of duplicates, removal or modification of incorrect and/or incomplete and/or irrelevant data samples, etc. At step 763, the data is annotated. The data annotations may include, for example, labels describing the clinical procedure’s characteristics, the automated device’s operation and computer-vision related annotations, such as segmentation masks, target marking, organs and tissues marking, existence of medical conditions/complications, existence of certain pathologies, etc. The different annotations may be generated in an “online” manner, which is performed while the data is being collected, or in an “offline” manner, which is performed at a later time after sufficient data has been collected. In some embodiments, the data annotations may be generated automatically using an “active learning” approach, in which existing pre-trained algorithms are used to automatically annotate a portion of the data. In some embodiments, the data annotations may be generated using a partially automated approach with a “human in the loop”, i.e., human approval or human annotations will be required in cases where the annotation confidence is low, or per other business logic decision or metric. In some embodiments, the data annotations may be generated in a manual approach, i.e., using human annotators to generate the required annotations using convenient annotation tools. Next, at step 764, the annotated data is pre-processed, for example, by one or more of checking for and handling null values, imputation, standardization, handling categorical variables, one-hot encoding, resampling, scaling, filtering, outlier removal and other data manipulations, to prepare the data for further processing. At optional step 765, extraction (or selection) of various features of the data may be performed, as explained hereinabove. At step 766, the data and/or features extracted therefrom are divided into training data (“training set”), which will be used to train the model, and testing data (“testing set”), which will not be introduced into the model during model training so it can be used as “hold-out” data to test the final trained model before deployment. The training data may be further divided into a “train set” and a “validation set”, where the train set is used to train the model and the validation set is used to validate the model’s performance on unseen data, to allow optimization/fine-tuning of the training process’ configuration/hyperparameters during the training process. Examples of such hyperparameters may be the learning-rate, weights regularization, model architecture, optimizer selection, etc. In some embodiments, the training process may include the use of Cross-Validation (CV) methods, in which the training data is divided into a “train set” and a “validation set”; however, upon training completion, the training process may repeat multiple times with different selections of “train set” and “validation set” out of the original training data. The use of Cross-Validation (CV) may allow a better validation of the model during the training process, as the model is being validated against different selections of validation data. At optional step 767, data augmentation is performed. Data augmentation may include, for example, generation of additional data from/based on the collected or annotated data.
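By way of example, the hold-out split and cross-validation described for step 766 could be sketched as follows (the synthetic dataset and model choice are illustrative only):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, train_test_split

# Synthetic stand-in for an annotated procedure dataset.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Hold out a test set the model never sees during training or tuning.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)

# Cross-validation on the training data: repeated train/validation splits.
model = RandomForestClassifier(random_state=0)
cv_scores = cross_val_score(model, X_train, y_train, cv=5)

# Final fit on all training data; the untouched test set then estimates
# real-world performance before deployment.
model.fit(X_train, y_train)
test_accuracy = model.score(X_test, y_test)
```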
Possible augmentations that may be used for image data are: rotation, flip, noise addition, color distribution change, crop, stretch, etc. Augmentations may also be generated using other types of data, for example by adding noise or applying a variety of mathematical operations. In some embodiments, augmentation may be used to generate synthetic data samples using synthetic data generation approaches, such as distribution based, Monte-Carlo, Variational Autoencoder (VAE), Generative-Adversarial-Network (GAN), etc. Next, at step 768, the model is trained, wherein the training may be performed “from scratch” (i.e., an initial/primary model with initialized weights is trained based on all relevant data) and/or utilizing existing pre-trained models as starting points and training them only on new data. At step 769, the generated model is validated. Model validation may include evaluation of different model performance metrics, such as accuracy, precision, recall, F1 score, AUC-ROC, etc., and comparison of the trained model against other existing models, to allow deployment of the model which best fits the desired solution. The evaluation of the model at this step is performed using the testing data (“test set”) which was not used for model training nor for hyperparameter optimization and best represents the real-world (unseen) data. At step 770, the trained model is deployed and integrated or utilized with the inference module to generate output based on newly collected data, as detailed herein. According to some embodiments, as more data is collected, the training database may grow in size and may be updated. The updated database may then be used to re-train the model, thereby updating/enhancing/improving the model’s output. In some embodiments, the new instances in the training database may be obtained from new clinical cases or procedures or from previous (existing) procedures that have not been previously used for training. In some embodiments, an identified shift in the collected data’s distribution may serve as a trigger for the re-training of the model. In other embodiments, an identified shift in the deployed model’s performance may serve as a trigger for the re-training of the model. In some embodiments, the training database may be a centralized database (for example, a cloud-based database), or it may be a local database (for example, for a specific healthcare facility). In some embodiments, learning and updating may be performed continuously or periodically on a remote location (for example, a cloud server), which may be shared among various users (for example, between various institutions, such as hospitals). In some embodiments, learning and updating may be performed continuously or periodically on a single or on a cohort of medical devices, which may constitute an internal network (for example, of an institution, such as a hospital). For example, in some instances, a validated model may be executed locally on processors of one or more medical systems operating in a defined environment (for example, a designated institution, such as a hospital), or on local online servers of the designated institution. In such case, the model may be continuously updated based on data obtained from the specific institution ("local data"), or periodically updated based on the local data and/or on additional external data, obtained from other resources.
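By way of a non-limiting illustration, a distribution-shift trigger for re-training could be sketched with a two-sample Kolmogorov-Smirnov test; the monitored feature, sample sizes and significance level are assumptions of the example.

```python
import numpy as np
from scipy.stats import ks_2samp

def drift_detected(reference_values, new_values, alpha=0.01):
    """Flag a shift in the distribution of a monitored feature (e.g.,
    target depth, steering time) between the data the deployed model was
    trained on and newly collected data, using a two-sample KS test."""
    stat, p_value = ks_2samp(reference_values, new_values)
    return p_value < alpha

rng = np.random.default_rng(0)
reference = rng.normal(60.0, 10.0, size=2000)   # training-time depths (mm)
recent = rng.normal(66.0, 10.0, size=300)       # newly collected depths
if drift_detected(reference, recent):
    print("distribution shift detected - schedule model re-training")
```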
In some embodiments, federated learning may be used to update a local model with a model that has been trained on data from multiple facilities/tenants without requiring the local data to leave the facility or the institution.
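As a non-limiting sketch of such an aggregation step, in which only model weights (never raw data) leave each facility (the FedAvg-style weighting, function name and toy model are illustrative):

```python
import numpy as np

def federated_average(site_weights, site_sizes):
    """Average per-site model weights, weighted by each site's sample
    count; each facility trains locally and shares only its weights."""
    total = float(sum(site_sizes))
    layers = len(site_weights[0])
    return [sum((n / total) * w[k] for w, n in zip(site_weights, site_sizes))
            for k in range(layers)]

# Two hospitals, one-layer toy model:
site_a = [np.array([0.2, -0.1])]
site_b = [np.array([0.4, 0.3])]
global_weights = federated_average([site_a, site_b], site_sizes=[100, 300])
# -> [array([0.35, 0.2])]
```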
Reference is now made to FIGS. 7A-7B, which show an exemplary inference module (FIG. 7A) and an exemplary inference process (FIG. 7B), according to some embodiments.
As shown in FIG. 7A, inference module 80 may include two main hardware components/units: at least one memory unit 82 and at least one processing unit 84, which are functionally and/or physically associated. Inference module 80 is essentially configured to run collected data through the trained model to calculate/process an output/prediction. Memory 82 may include any type of accessible memory (volatile and/or non-volatile), configured to receive, store and/or provide various types of data and executable instructions, to be processed by processing unit 84, which may include any type of at least one suitable processor. In some embodiments, the memory 82 and the processing unit 84 may be functionally or physically integrated, for example, in the form of a Static Random Access Memory (SRAM) array. In some embodiments, the memory is a non-volatile memory having stored therein executable instructions (for example, in the form of a code, service, executable program and/or a model file containing the model architecture and/or weights) that can be used to perform a variety of tasks, such as data cleaning, required pre-processing steps and inference operation (as detailed below) on new data to obtain the model’s prediction or result. As shown in FIG. 7A, memory 82 may be configured to accept/receive, store and/or provide various types of data values or parameters related to the data as well as executable algorithms (in the case of machine learning based algorithms, these may be referred to as “trained models”). Memory unit 82 may store or accept newly acquired data 822, which may be raw (primary) data that has been collected, as detailed herein. Memory module 82 may further store metadata 824 related to the raw data. Such metadata may include a variety of parameters/values related to the raw data, such as, but not limited to: the specific device which was used to provide the data, the time the data was obtained, the place the data was obtained (such as a specific operating room, specific institution, etc.), and the like. Memory 82 may further store the trained model(s) 826. The trained models may be the models generated and deployed by a training module, such as training module 70 of FIG. 6A. The trained model(s) may be stored, for example, in the form of executable instructions and/or a model file containing the model’s weights, capable of being executed by processing unit 84. Processing unit 84 of inference module 80 may include at least one processor, configured to process the newly obtained data and execute a trained model to provide corresponding results (detailed in FIG. 7B). Thus, as shown in FIG. 7A, processing unit 84 is configured at least to perform pre-processing of the data 842, which may include actions for preparing the data stored in memory 82 for downstream processing, such as, but not limited to, checking for and handling null values, imputation, standardization, handling categorical variables, one-hot encoding, resampling, scaling, filtering, outlier removal, etc. In some embodiments, processing unit 84 may further be configured to extract features 844 from the acquired data, using techniques such as, but not limited to, Principal Components Analysis (PCA), Independent Component Analysis (ICA), Linear Discriminant Analysis (LDA), Locally Linear Embedding (LLE), t-distributed Stochastic Neighbor Embedding (t-SNE), Unified Manifold Approximation and Projection (UMAP) and/or Autoencoders, etc.
Feature extraction may be executed using feature engineering methods, in which mathematical tools are used to extract domain-knowledge features from the raw data, for example: statistical features such as mean, variance, ratio, frequency, etc., and/or visual features such as the dimensions or shape of certain objects in an image. Alternatively, or additionally, the processing unit 84 may be configured to perform feature selection. Processing unit 84 may further be configured to execute the model on the collected data and/or features extracted therefrom, to obtain model results 846. In some embodiments, the processing unit 84 may further be configured to execute a business logic 848, which can provide further fine-tuning of the model results and/or translation of the model’s results into a variety of automated decisions, guidelines or recommendations supplied to the user.
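By way of illustration only, a pre-processing and feature-extraction stage of the kind described above might be composed as in the following Python sketch. The function name is hypothetical and the scikit-learn library is assumed to be available; this is a sketch of the general approach, not the actual implementation (in deployment, the imputer, scaler and PCA would be fit during training and only applied at inference).

import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

def preprocess_and_extract(raw, n_components=8):
    # Pre-processing 842: handle null values (imputation) and standardize.
    x = SimpleImputer(strategy="mean").fit_transform(raw)
    x = StandardScaler().fit_transform(x)
    # Feature extraction 844: dimensionality reduction via PCA.
    pca_features = PCA(n_components=n_components).fit_transform(x)
    # Feature engineering: domain-knowledge statistical features per sample.
    stats = np.column_stack([np.nanmean(raw, axis=1), np.nanvar(raw, axis=1)])
    return np.hstack([pca_features, stats])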
Reference is now made to FIG. 7B, which shows steps in an exemplary inference process 86, executed by a suitable inference module (such as inference module 80 of FIG. 7A). As shown in FIG. 7B, at step 861, new data is acquired/collected from or related to newly executed medical procedures. The new data may include any type of raw (primary) data, as detailed herein. At optional step 862, suitable trained model(s) (generated, for example, by a suitable training module in a corresponding training process) may be loaded, per task(s). This step may be required in instances in which computational resources are limited and only a subset of the required models or algorithms can be loaded into RAM to be used for inference. In such cases, the inference process may require an additional management step, responsible for loading the required models from storage memory for a specific subset of inference tasks/jobs; once inference is completed, the loaded models are replaced with other models, which are loaded to allow an additional subset of inference tasks/jobs. Next, at step 863, the raw data collected in step 861 is pre-processed. In some embodiments, the pre-processing steps may be similar or identical to the pre-processing step performed in the training process (by the training module), to thereby allow the data to be processed similarly by the two modules (i.e., training module and inference module). In some embodiments, this step may include actions such as, but not limited to, checking for and handling null values, imputation, standardization, handling categorical variables, one-hot encoding, etc., to prepare the input data for analysis by the model(s). Next, at optional step 864, extraction of features from the data may be performed using, for example, Principal Components Analysis (PCA), Independent Component Analysis (ICA), Linear Discriminant Analysis (LDA), Locally Linear Embedding (LLE), t-distributed Stochastic Neighbor Embedding (t-SNE), Unified Manifold Approximation and Projection (UMAP) and/or Autoencoders, etc. Alternatively, or additionally, feature selection may be executed. At inference step 865, the results of the model are obtained, i.e., the model is executed on the processed data to provide corresponding results. At optional step 866, fine-tuning of the model results may be performed, whereby post-inference business logic is executed. Execution of post-inference business logic refers to the translation of the model’s results into a variety of automated decisions, guidelines or recommendations supplied to the user. Post-inference business logic may be configured to accommodate specific business and/or clinical needs or metrics, and can vary between different scenarios or institutions, based on users’ or institutions’ requests or needs.
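The flow of steps 861-866, including the optional model-management of step 862 for memory-limited settings, might be orchestrated along the lines of the following Python sketch. All names here (the task objects, the file layout, the joblib serialization) are illustrative assumptions rather than the actual implementation.

import joblib

def run_inference(raw_data, tasks, model_dir="models/"):
    results = {}
    for task in tasks:
        # Step 862 (optional): load only the model needed for this task into RAM.
        model = joblib.load(f"{model_dir}{task.model_file}")
        x = task.preprocess(raw_data)       # step 863: same pre-processing as in training
        x = task.extract_features(x)        # step 864 (optional): feature extraction/selection
        prediction = model.predict(x)       # step 865: obtain the model's results
        results[task.name] = task.business_logic(prediction)  # step 866 (optional)
        del model                           # free RAM before loading the next task's model
    return results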
At step 867, the model results may be utilized in various ways, including, for example, enhancing the operation of the automated medical device (e.g., enabling automatic target tracking and closed-loop steering based on the tracked real-time position of the target, etc.), providing recommendations regarding various device operations (including recommending one or more optimal entry points, recommending optimized trajectories or modes of operation, etc.), providing prediction, prevention and/or early detection of various clinical conditions (e.g., pneumothorax, breathing anomalies, bleeding, etc.), as disclosed, for example, in co-owned International Patent Application No. PCT/IL2021/050438, which is incorporated herein by reference in its entirety, and the like, as further detailed hereinabove.
In some embodiments, the inference operation may be performed on a single data instance. In other embodiments, the inference operation may be performed on a batch of multiple data instances, to receive multiple predictions or results for all data instances in the batch. In some embodiments, an ensemble of models or algorithms can be used for inference, where the same input data is processed by a group of different models and the results are aggregated using averaging, majority voting or the like. In some embodiments, the model can be designed in a hierarchical manner, where input data is processed by a primary model and, based on the prediction or result of the primary model’s inference, the data is processed by a secondary model. In some embodiments, multiple secondary models may be used, and the hierarchy may have more than two levels.
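The ensemble and hierarchical arrangements described above might look as follows in Python; the model interface (a predict method returning an array) is an assumption made for the sketch.

import numpy as np

def ensemble_predict(models, x):
    # Run the same input through a group of models and aggregate by averaging.
    return np.mean([model.predict(x) for model in models], axis=0)

def hierarchical_predict(primary, secondary_models, x):
    # The primary model's result selects which secondary model processes the data.
    route = int(primary.predict(x)[0])
    return secondary_models[route].predict(x)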
According to some embodiments, the methods and systems disclosed herein utilize data-driven methods to create algorithms based on various datasets, including functional, anatomical, clinical, diagnostic, demographic and/or administrative datasets. In some embodiments, artificial intelligence (e.g., machine-learning) algorithms are used to learn the complex mapping/correlation/correspondence between the parameters of the multimodal input datasets (e.g., data obtained from different modalities, such as images, logs, sensory data, etc.) and procedure, clinical, operational, patient-related and/or administrative information, to optimize the clinical procedure’s outcome or any other desired functionalities. In some embodiments, the systems and methods disclosed herein determine such optimal mapping using various approaches, such as, for example, a statistical approach, utilizing machine-learning algorithms to learn the mapping/correlation/correspondence from the training datasets.
In some embodiments, the algorithm may be a generic algorithm, which is agnostic to specific procedure characteristics, such as type of procedure, user, service provider or patient. In some embodiments, the algorithm may be customized to a specific user (for example, preferences of a specific healthcare provider), a specific service provider (for example, preferences of a specific hospital), a specific population (for example, preferences of different age groups), a specific patient (for example, preferences of a specific patient), and the like. In some embodiments, the algorithm may combine a generic portion and a customized portion.
Reference is now made to FIG. 8, which shows a flowchart 90 illustrating the steps of an exemplary method of closed-loop steering of a medical instrument toward a predicted location of a target, according to some embodiments. At step 901, parameters related to the medical procedure, and in particular to the steering of a medical instrument, are obtained or identified, based on image(s) of region(s) of interest in the subject’s body. Such parameters include the target to be reached, the entry point for the medical instrument and, optionally, "no-fly" zones (which are regions to be avoided during the procedure). The parameters may be identified/determined automatically (for example, by image analysis and/or ML/DL algorithms) and/or obtained from a user (for example, a healthcare provider), who may mark one or more of the above parameters on the image(s). In some embodiments, creating AI models to detect/identify/recommend the above parameters may include a preliminary phase, in which one or more individual models are trained. For example, generating a “no-fly” zone model may include training an accuracy estimation model, a procedure duration estimation model and/or a risk estimation model, as disclosed, for example, in abovementioned co-owned International Patent Application No. PCT/IL2021/050437. At step 902, a planned trajectory for the medical instrument from the entry point to the target is calculated/determined. The calculation of the planned trajectory may take into account various parameters, including but not limited to: type and characteristics of the medical instrument, type of imaging modality, selected insertion point, determined “no-fly” zones, type and characteristics of the tissue(s) through which the instrument is intended to advance, the characteristics of the target (type, dimensions, shape, etc.) and its location within the subject’s body, milestone points (i.e., “secondary targets” through which the medical instrument should pass) and the like, or any combination thereof. In some embodiments, a maximal allowable curvature level may be pre-set for the calculation of the planned trajectory. In some embodiments, a maximal allowable lateral movement of the instrument at the entry point may be pre-set for the calculation of the planned trajectory. In some embodiments, a maximal allowable proximity to obstacle(s) (or determined “no-fly” zones) may be pre-set for the calculation of the non-linear trajectory. In some embodiments, once a trajectory is planned, checkpoints may be set along the trajectory, either manually or automatically. At step 903, the medical instrument is steered toward the target according to the planned trajectory. In some embodiments, steering of the medical instrument may be based on an inverse kinematics solution applied to a virtual springs model to calculate the required motion to be imparted to the instrument (or to an end effector of the automated device, as shown, for example, in FIG. 1A) in order for the instrument to follow the planned trajectory, as described in further detail hereinabove. At step 904, real-time images of the region of interest (which may include, for example, relevant tissues and/or the target region) may be obtained.
The images may be obtained in any suitable format and form, such as, for example, discrete images obtained at spatial and/or temporal intervals (e.g., upon reaching a checkpoint), semi-continuous images (for example, discrete images obtained at a high frequency), continuous images (for example, obtained as a video), and the like, depending on the utilized imaging modality. At step 905, the estimated tissue movement during the steering procedure is calculated/determined. In some embodiments, tissue movement during the entire procedure may be estimated. In some embodiments, tissue movement during a portion of the procedure (for example, between consecutive checkpoints) may be estimated. The determination of the estimated tissue movement may be performed by various algorithms, such as, for example, image analysis algorithms, machine learning algorithms, deep learning algorithms, and the like, or combinations thereof, as further detailed hereinbelow. As used herein, the term “tissue movement” may include changes in the location of the tissue during the procedure, as well as changes in the anatomical structure of the tissue, such as changes in shape and/or size of the tissue, and the like, or any combination thereof. In some embodiments, “tissue movement” may refer to changes in the location and/or structure of the tissue as determined at specific points in time during the procedure. In some embodiments, “tissue movement” may refer to a movement profile, i.e., to changes in the location and/or structure of the tissue occurring during the entire steering procedure or during a certain time interval during the procedure (i.e., location/structure as a function of time). In some embodiments, estimation of tissue movement may take into account tissue movement resulting from the subject’s breathing. At step 906, the real-time position of the medical instrument and the target may be determined. The determination of the real-time positions may be performed automatically (for example, by a processor performing image analysis or any other suitable algorithm, such as ML/DL model(s)) and/or manually (for example, by a healthcare provider). In some embodiments, step 905 and step 906 may be performed in parallel. At step 907, the estimated target movement during the steering procedure may be determined/calculated, as further detailed herein. As used herein, the term “target movement” may include changes in the location of the target during the procedure, as well as changes in the anatomical structure of the target, such as changes in shape and/or size of the target, and the like, or any combination thereof. In some embodiments, “target movement” may refer to changes in the location and/or structure of the target as determined at specific points in time during the procedure. In some embodiments, “target movement” may refer to a movement profile, i.e., to changes in the location and/or structure of the target occurring during the entire steering procedure or during a certain time interval during the procedure (i.e., location/structure as a function of time). In some embodiments, the expected target position and/or expected target movement profile may be determined utilizing data analytics algorithms, such as AI models, which may be generated using data obtained from past procedures, as further described hereinbelow.
At step 908, the trajectory may be updated based on the estimated/predicted target movement and/or the estimated/predicted tissue movement, to facilitate the medical instrument reaching the predicted target location. In some embodiments, the trajectory may be updated using data-analysis algorithm(s), such as AI models, which may be generated using data obtained from past procedures, as further described hereinbelow. Next, at step 909, the updated trajectory is evaluated to determine if it is optimal. For the determination if the updated trajectory is optimal, various parameters and calculations may be utilized, including, for example, procedure/steering duration, accuracy (e.g., tip-to-target distance), target movement resulting from instrument-tissue interaction, safety (e.g., probability of complications), and the like, as further detailed hereinbelow. The determination if the updated trajectory is optimal may include comparing one or more parameters, or associated values, to a predetermined threshold. In some embodiments, the updated trajectory may be determined as optimal if a certain parameter (or associated value) exceeds its predetermined threshold. In some embodiments, the updated trajectory may be determined as optimal if a certain parameter (or associated value) is below its predetermined threshold. In some embodiments, the updated trajectory may be determined as optimal if a certain parameter exceeds its respective threshold and a different parameter is below its respective threshold. The determination if the updated trajectory is optimal may be performed automatically and/or manually. If the updated trajectory is determined not to be optimal, step 908 may be repeated, to further update the trajectory. Once a trajectory is found to be optimal, at step 910, the medical instrument is steered toward the target in accordance with the updated trajectory. Next, at step 911, it is determined if the target has been reached by the medical instrument. If the target has been reached, the procedure ends 912. If, however, the target has not yet been reached, steps 904-911 may be repeated until the target is reached. In some embodiments, once an updated trajectory has been determined as optimal, relevant parameters indicative of the trajectory being optimal may be utilized as feedback in the calculation performed in step 907 (e.g., as feedback to an AI-based target movement model), to increase accuracy and reliability of such calculations in subsequent iterations of steps 904-911, as well as in further procedures.
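For illustration, the closed loop of steps 904-911 might be organized as in the following Python sketch. Every callable here (the imager, the device, the estimation and update routines) is a placeholder supplied by the caller and stands in for the components described above; the sketch shows only the control flow, not the actual implementation.

def closed_loop_steering(device, imager, trajectory, estimate_tissue,
                         localize, predict_target, update_trajectory,
                         is_optimal, target_reached):
    while True:
        image = imager.acquire()                                  # step 904
        tissue_mov = estimate_tissue(image)                       # step 905
        instrument_pos, target_pos = localize(image)              # step 906
        target_mov = predict_target(image, tissue_mov)            # step 907
        candidate = update_trajectory(trajectory, instrument_pos,
                                      target_mov, tissue_mov)     # step 908
        while not is_optimal(candidate):                          # step 909
            candidate = update_trajectory(candidate, instrument_pos,
                                          target_mov, tissue_mov) # step 908 repeated
        trajectory = candidate
        device.steer_along(trajectory)                            # step 910
        if target_reached(instrument_pos, target_pos):            # step 911
            return trajectory                                     # procedure ends (912)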
Reference is now made to FIG. 9, which shows a flowchart 92 illustrating steps of an exemplary method of closed-loop steering of a medical instrument toward a predicted location of a target, utilizing a dynamic trajectory model (an “inference” process), according to some embodiments. At step 921, parameters related to the medical procedure, and in particular to the steering of a medical instrument, are obtained or identified, based on images of region(s) of interest in the subject’s body obtained using an imaging system (e.g., CT, MRI, ultrasound, CT fluoroscopy, CBCT, etc.). Such parameters include the target to be reached, the entry point for the medical instrument and, optionally, "no-fly" zones. The parameters may be identified/determined automatically (for example, by image analysis and/or AI-based algorithms) and/or obtained from a user (for example, a healthcare provider), who may mark one or more of the above parameters on the image(s). At step 922, a planned trajectory for the medical instrument from the entry point to the target is calculated/determined. In some embodiments, the planned trajectory may be determined using a data-analysis algorithm, for example a dynamic trajectory model (see step 926), which takes into account the predicted movement of the tissue, the predicted movement of the target, the predicted movement of a tip of the medical instrument, and the like, to predict the location of the target at a desired time/space point (for example, at the end-point of the procedure) and plan the trajectory according thereto. At step 923, the medical instrument is steered toward the target according to the planned trajectory. In some embodiments, steering of the medical instrument may be based on an inverse kinematics solution applied to a virtual springs model to calculate the required motion to be imparted to the instrument (or to an end effector of the automated device) in order for the instrument to follow the planned trajectory, as described in further detail hereinabove. At step 924, real-time images of the region of interest may be obtained. The images may be obtained in any suitable format and form, such as, for example, discrete images obtained at spatial and/or temporal intervals (for example, at different checkpoints), semi-continuous images (for example, discrete images obtained at a high frequency), continuous images (for example, obtained as a video), and the like, depending on the utilized imaging modality. At step 925, the real-time position of the medical instrument, the target, and optionally other regions of interest, may be determined. In some embodiments, the real-time position of previously determined “no-fly” zones may be determined. The determination of the real-time positions may be performed automatically (for example, by a processor performing image analysis or executing suitable algorithm(s), such as ML/DL model(s)) and/or manually (for example, by a healthcare provider). At step 926, a dynamic trajectory model is applied to update the trajectory (if needed). The dynamic trajectory model (DTM) may include one or more algorithms and/or AI-based models, each of which may be configured to provide information, predictions, estimations and/or calculations regarding various parameters and variables that may affect tissue, target and/or medical instrument movement and the consequent trajectory.
Such algorithms and models may provide parameters such as predicted/estimated tissue movement, predicted/estimated target movement and/or predicted/estimated medical instrument movement, to ultimately predict the estimated target spatiotemporal location during and/or at the end of the procedure, to thereby allow the planning and/or updating of a corresponding trajectory to facilitate the medical instrument reaching the target at its predicted location. In some embodiments, estimation of tissue movement may take into account tissue movement resulting from the patient’s respiratory cycle. In some embodiments, the patient’s respiratory cycle and/or the tissue movement resulting from the patient’s respiratory cycle may be predicted using a separate algorithm/model. In some embodiments, estimation of tissue movement may take into account tissue movement resulting from the medical device steering toward the target. In some embodiments, the tissue movement resulting from the medical device steering may be predicted using a separate algorithm/model. In some embodiments, the DTM may include algorithms/models to predict the movement of previously determined “no-fly” zones and/or algorithms/models to update the “no-fly” zones map according to the predicted tissue and target movement. In some embodiments, the DTM may include determining if a calculated trajectory is optimal, based on various parameters as described herein, such that the output of the model is the optimal trajectory. It can be appreciated that different trajectories may be considered “optimal”, depending on the chosen parameters, the weight given to each parameter, user preferences, etc. After the appropriate trajectory has been determined utilizing the dynamic trajectory model, at step 927, the medical instrument may be steered toward the target in accordance with the updated trajectory. Next, at step 928, it is determined if the target has been reached by the medical instrument. If the target has been reached, the procedure ends 929. If, however, the target has not yet been reached, steps 924-928 may be repeated, until the target is reached.
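A DTM composed of such sub-models might be wired together as in the following Python sketch; the class structure and the predict/plan interfaces are assumptions made for illustration only.

class DynamicTrajectoryModel:
    # Hypothetical composition of the sub-models described above.
    def __init__(self, tissue_model, target_model, instrument_model,
                 nofly_model, planner):
        self.tissue_model = tissue_model
        self.target_model = target_model
        self.instrument_model = instrument_model
        self.nofly_model = nofly_model
        self.planner = planner

    def update(self, state):
        tissue = self.tissue_model.predict(state)           # predicted tissue movement
        target = self.target_model.predict(state, tissue)   # predicted target location
        tip = self.instrument_model.predict(state)          # predicted tip movement
        nofly = self.nofly_model.predict(state, tissue)     # updated "no-fly" zone map
        # Plan a trajectory toward the target's predicted spatiotemporal location.
        return self.planner.plan(state.entry_point, target, tip, nofly)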
Reference is now made to FIG. 10, which shows a block diagram 95 illustrating an exemplary method of generating (training) a tissue movement model for prediction of tissue movement during a medical procedure. As shown in FIG. 10, input data 951 from past procedures is used to train the tissue movement model 954 to predict tissue movement (output/result) 955 during a medical procedure. As described hereinabove, the term “tissue movement” may include various variables and parameters related to changes in the tissue during (continuously or at discrete stages of) the medical procedure, including, but not limited to: locations of the tissue during the procedure, changes in the anatomical structure of the tissue (e.g., changes in the shape, dimensions, density and/or form of the tissue), and the like, or any combination thereof. Further, the term “tissue movement” may refer to a time-dependent profile, i.e., to changes in the location and/or structure of the tissue occurring during the entire procedure or during certain time interval(s) during the procedure (i.e., location/orientation/structure as a function of time). The input data 951 may be used directly with the tissue movement model and/or may be processed/analyzed by a tissue segmentation model 953. The input data may include any relevant parameters and/or datasets from previous procedures which are related to tissue movement, and the target variable (“ground truth”) for training the tissue movement model may be how the tissue actually moved or otherwise changed during these previous procedures. The input data 951 may include various data sets, selected from, for example, but not limited to: data related to the clinical procedure and patient-related data, such as tissues’ characteristics (e.g., types, boundaries between tissue types, dimensions, elasticity, etc.), medical instrument (e.g., needle) type and characteristics (e.g., gauge, length, tip type (e.g., diamond, bevel), etc.), relative angle of the medical instrument (for example, the relative angle of the medical instrument to the patient’s body and/or to an axial plane and/or to a sagittal plane of images obtained from an imaging system, the relative angle of a tip of the medical instrument to the patient’s body and/or to an axial plane and/or to a sagittal plane of images obtained from an imaging system, and the like), respiration signals (e.g., from a respiration sensor or from a ventilator, if the patient was ventilated), respiration abnormalities, patient characteristics (age, gender, race, BMI, medical condition, smoking habits, ventilation, intubation, self-breathing, sedation, etc.), data related to the medical device and its operation, including, for example, motors’ current traces (i.e., logs of motors’ performance data), procedure timing, skin-to-target time, entry and target positions, trajectory length, target movements and path updates, number and position of checkpoints, errors and correction of checkpoints, images (e.g., CT scans) generated during the procedure (e.g., at checkpoints and/or at milestone points (“secondary targets”)), magnitude of lateral steering of the medical instrument, medical device position, instrument angles, distance of the instrument from the different tissues/organs, instrument insertion speed, patient’s position (e.g., supine, prone, decubitus), any other relevant data influencing tissue movement during the medical procedure (medical instrument steering in the subject’s body), and the like, or any combination thereof.
In addition, data annotations may further be utilized for model training and validation. The data annotations may include values and/or parameters such as, but not limited to: organ segmentation masks and/or bounding boxes and/or locations, tissue segmentation masks and/or bounding boxes and/or locations, instrument segmentation masks and/or bounding boxes and/or locations, and the like. In some embodiments, each or at least some of the parameters may be assigned an appropriate weight, which is taken into account in generating the tissue movement model.
In some embodiments, the input data 951 may include multi-modal data collected from past procedures and arranged, where possible/applicable, as time-series, together with the patient's parameters and medical history. The time-series structure may allow the analysis of time-dependent events in past procedures’ data, to better predict the tissue movement during a procedure and better study the impact of the different factors and their correlation to the procedure timeline. In some embodiments, a specialized tissue segmentation model 953 may be used to generate meaningful domain-knowledge features that may, in turn, be input to the primary tissue movement model during the training process. Such tissue segmentation models may utilize image segmentation and/or reconstruction to identify/detect different tissues (e.g., organs). In some embodiments, tissue segmentation may be performed on 2D images (“slices”), which are then reconstructed to generate 3D images of the segmented tissues/organs. In some embodiments, the output of the tissue movement model may be a tissue movement prediction 955, which may include one or more of predicted tissue location, change of tissue anatomical structure (change in shape, tissue deformation), and the like, during the input (current) procedure. The tissue movement prediction, together with ground-truth annotations regarding tissue movement during a procedure, may be used to calculate a loss function representing the error between the tissue movement prediction and the ground-truth data. During the training process, optimization of this loss function will allow the adjustment of the model’s weights. In some embodiments, the tissue movement prediction model may be trained in a multi-task and/or multi-output approach. In some embodiments, the tissue movement prediction model may be trained to predict the exact movement (or any other related changes, as detailed above) of the tissue at each point in time during the procedure. This may require corresponding time-based annotations of tissue-related parameters (for example, location, size, shape, form, etc.) at desired points in time throughout the procedures in the dataset. In some embodiments, as further detailed herein, the tissue movement model may be used in the training or application of various other models, such as, for example, a target movement prediction model, a trajectory prediction model, and the like. In some embodiments, once trained, the tissue movement model may be deployed and used to estimate tissue movement during a medical procedure, such as steering of a medical instrument toward a moving target. In some embodiments, the prediction of the tissue movement model may be for a given instrument trajectory, patient position (e.g., supine, prone, etc.) and/or patient respiration behavior.
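A training loop of the kind just described, in which the loss between the tissue movement prediction and the ground truth drives the adjustment of the model's weights, might look as follows in a PyTorch-style Python sketch. The network, dataset and loss choice (mean squared error) are assumptions for illustration, not the actual design.

import torch

def train_tissue_movement_model(model, data_loader, epochs=10, lr=1e-4):
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = torch.nn.MSELoss()  # error between prediction and ground-truth movement
    for _ in range(epochs):
        for features, ground_truth in data_loader:    # past-procedure data + annotations
            prediction = model(features)              # tissue movement prediction 955
            loss = loss_fn(prediction, ground_truth)  # loss vs. ground-truth annotations
            optimizer.zero_grad()
            loss.backward()                           # gradients for weight adjustment
            optimizer.step()                          # adjust the model's weights
    return model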
Reference is now made to FIG. 11, which shows a block diagram 97 illustrating an exemplary method of generating (training) a target movement model for prediction of target movement during a medical procedure. As shown in FIG. 11, input data 971 from past procedures is used to train the target movement model 974 to predict target movement (output/result) 975 during a medical procedure. As described hereinabove, the term “target movement” may include various variables and parameters related to changes in the target during (continuously or at discrete stages of) the medical procedure, including, but not limited to: locations of the target during the procedure, changes in the anatomical structure of the target during the procedure (e.g., changes in the dimensions, shape, etc. of the target), and the like, or any combination thereof. Further, the term “target movement” may refer to a time-dependent profile, i.e., to changes in the location and/or structure of the target occurring during the entire procedure or during certain time interval(s) during the procedure (i.e., location/structure as a function of time). The input data 971 may include any relevant parameters and/or datasets from previous procedures which are related to target movement, and the target variable (“ground truth”) for training the target movement model may be how the target actually moved or otherwise changed during these past procedures. The input data 971 may include various data sets, selected from, for example, but not limited to: data related to the clinical procedure and patient-related data, such as target type, target dimensions, target depth, target shape, tissue characteristics (e.g., types, boundaries, dimensions, elasticity, etc.), medical instrument (e.g., needle) type and characteristics (e.g., gauge, length, tip type (e.g., diamond, bevel)), respiration signals, respiration abnormalities, patient characteristics (age, gender, race, BMI, medical condition, smoking habits, ventilation, intubation, self-breathing, sedation, etc.), data related to the medical device and its operation, including, for example, motors’ current traces (i.e., logs of motors’ performance data), procedure timing, skin-to-target time, entry and target positions, trajectory length, path updates, number and position of checkpoints, errors and correction of checkpoints, images (e.g., CT scans) generated during the procedure (e.g., at checkpoints and/or at milestone points (“secondary targets”)), magnitude of lateral steering of the medical instrument, medical device position, angle of the instrument relative to the target, distance of the instrument from the target, instrument insertion speed, final tip-to-target distance, patient’s position (e.g., supine, prone, decubitus), any other relevant dataset influencing target movement during medical procedures (medical instrument steering in the subject’s body), and the like, or any combination thereof. In addition, data annotations may further be utilized for model training and validation. The data annotations may include values and/or parameters such as, but not limited to: organ segmentation masks and/or bounding boxes and/or locations, tissue segmentation masks and/or bounding boxes and/or locations, target contours and/or bounding boxes and/or locations, instrument segmentation masks and/or bounding boxes and/or locations, and the like. In some embodiments, each or at least some of the parameters may be assigned an appropriate weight, which is taken into account in generating the target movement model.
In some embodiments, the input data 971 may include multi-modal data collected from past procedures and arranged, where possible/applicable, as time-series, together with the patient's parameters and medical history. The time-series structure may allow the analysis of time-dependent events in past procedures’ data, to better predict the target movement during a procedure and better study the impact of the different factors and their correlation to the procedure timeline.
In some embodiments, the input data 971 may first be processed by a target detection model 972, configured to detect the target in image(s). The target detection model may include any type of algorithm(s), including, for example, image analysis algorithm(s) and/or ML/DL algorithm(s), allowing the detection of a target in an image or a set of images (for example, a video stream). For training of the target detection model 972, data from previous procedures may be used, wherein the initial detection of a target may be manual (for example, by marking of a target by a healthcare provider on an image), semi-automatic and/or automatic (for example, by suitable image processing and/or data-analysis algorithms). In some embodiments, a combination of manual and automatic target detection may be utilized. For example, the initial manual marking may be performed by the user (e.g., a healthcare provider), and the identification/detection/recognition of the target in subsequent (following) images may be performed automatically, utilizing a suitable algorithm. In some embodiments, the user may be required to confirm the automatically detected target. In some embodiments, the target detection model may optionally include re-identification of the target, to increase the accuracy of the model. The results of the target detection model may then be used as input to the target movement model 974. In some embodiments, the results of the target detection model may also be used as input to a trained tissue movement model 973, the results of which are then used as input to the target movement model 974. In some embodiments, the trained tissue movement model 973 may be the tissue movement model described in FIG. 10 hereinabove. Utilization of the tissue movement model in the training of the target movement model may increase accuracy, as it allows accurate target tracking, while minimizing artifacts which may be caused by tissue movement (for example, movement of neighboring tissues) that may otherwise be erroneously interpreted or considered as target movement. In some embodiments, the output of the thus-generated target movement model 974 is a target movement prediction 975. The target movement prediction may include, for example, prediction of the location (e.g., spatiotemporal location) of the target at different stages of the procedure (including, for example, at the end-point of the procedure) and prediction of related changes in the target during the procedure (for example, changes in shape, size, structure, orientation, etc.). The target movement prediction, together with ground-truth annotations regarding target movement during a procedure, may be used to calculate a loss function representing the error between the target movement prediction and the ground-truth data. During the training process, optimization of this loss function will allow the adjustment of the model’s weights. In some embodiments, the target movement prediction model may be trained in a multi-task and/or multi-output approach. In some embodiments, the target movement prediction model may be trained to predict the exact movement/location (or any other related changes, as detailed above) of the target at each point in time during the procedure. This may require corresponding time-based annotations of target-related parameters (for example, location, size, shape, form, etc.) at desired points in time throughout the procedures in the dataset.
In some embodiments, as further detailed herein, the target movement model may be used in the training or application of various other models, such as, for example, a trajectory prediction model, and the like. In some embodiments, once trained, the target movement model may be deployed and used to estimate target movement during a medical procedure, such as steering of a medical instrument toward a moving target. In some embodiments, the prediction of the target movement model may be for a given instrument trajectory, patient position (e.g., supine, prone, etc.) and/or patient respiration behavior.
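At inference time, the chaining of the target detection model, the trained tissue movement model and the target movement model described for FIG. 11 might be expressed as in the following Python sketch (all interfaces are assumed for illustration):

def predict_target_movement(images, detection_model, tissue_model, target_model):
    # Target detection model 972: locate the target in the image(s).
    detection = detection_model.detect(images)
    # Trained tissue movement model 973: predict surrounding tissue movement.
    tissue_movement = tissue_model.predict(images, detection)
    # Target movement model 974: predict the target's (spatiotemporal) movement 975,
    # using the tissue prediction to suppress artifacts from neighboring-tissue motion.
    return target_model.predict(detection, tissue_movement)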
Reference is now made to FIG. 12, which shows a block diagram 100 illustrating an exemplary method of generating (training) a data-analysis (e.g., AI-based) model for outputting a trajectory for steering a medical instrument toward a moving target. As shown in FIG. 12, input data 1001 is used to train the trajectory model 1005 to output a trajectory prediction 1006. Various types of input data 1001 may be used for training the trajectory model. The input data 1001 may be used directly to train the trajectory model 1005 and may, alternatively or additionally, be used as input to one or more trained models, such as tissue movement model 1002, target movement model 1003 and/or "no-fly" zones model 1004. The input data 1001 may include any type of datasets and/or parameters from previous procedures which are relevant for the prediction of the trajectory, and the target variable (“ground truth”) for training the trajectory model may be how the trajectory was actually adjusted/updated during these past procedures. In some embodiments, data annotations included in the data used for training and/or validating the trajectory model may include trajectories which were not updated, or were not correctly updated, when tissue/target movement occurred in previous procedures, such that the medical instrument did not reach the target. Such data annotations may also be artificially generated for the purpose of training the trajectory model. The input data may include, for example, data related to the clinical procedure and patient-related data, such as target type, target dimensions, target depth, target shape, tissue characteristics (e.g., types, boundaries, dimensions, elasticity, density, etc.), medical instrument type and characteristics (e.g., gauge, length, material, tip type (e.g., diamond, bevel, etc.)), maximal allowable curvature, maximal allowable lateral movement of the instrument at the entry point, respiration signals, respiration abnormalities, patient characteristics (age, gender, race, BMI, etc.), data related to the medical device and its operation, including, for example, motors’ current traces, procedure timing, skin-to-target time, entry and target positions, trajectory length, target movements, number and position of checkpoints, errors and correction of checkpoints, images (e.g., CT scans) generated during the procedure (e.g., at checkpoints and/or at milestone points (“secondary targets”)), magnitude of lateral steering of the medical instrument, medical device position, relative angles of the medical instrument, distance of the instrument from the target, instrument insertion speed, final tip-to-target accuracy, patient’s position (e.g., supine, prone, decubitus), any other relevant dataset influencing the trajectory during the medical procedure (medical instrument steering in the subject’s body), and the like, or any combination thereof. In some embodiments, the input data 1001 may include multi-modal data collected from past procedures and arranged, where possible/applicable, as time-series, together with the patient's parameters and medical history. The time-series structure may allow the analysis of time-dependent events in past procedures’ data, to better predict trajectory adjustments during a procedure and better study the impact of the different factors and their correlation to the procedure timeline.
As detailed above, the relevant input data 1001 is used (directly or indirectly) for the training of the trajectory model, which provides a prediction regarding a trajectory of a medical instrument during a medical procedure, taking into account various variables and parameters, to facilitate the medical instrument reaching a moving target in the most efficient, safe and accurate manner. As detailed above, the trajectory model may be trained based on the input data directly and/or based on the output of one or more trained models, such as, for example: tissue movement model 1002 (such as the model described in FIG. 10 above), target movement model 1003 (such as the model described in FIG. 11 above) and/or a "no-fly" zones model, which is a model for predicting regions which are to be avoided during the medical procedure. An exemplary "no-fly" zone model is described, for example, in abovementioned co-owned International Patent Application No. PCT/IL2021/050437. In some embodiments, as detailed above, the trajectory may be any type of trajectory, such as a 2D trajectory or a 3D trajectory. The generated trajectory prediction, together with ground-truth annotations regarding trajectory adjustments during a procedure, may be used to calculate a loss function representing the error between the trajectory prediction and the ground-truth data. During the training process, optimization of this loss function will allow the adjustment of the model’s weights. In some embodiments, the trajectory model may be trained in a multi-task and/or multi-output approach. In some embodiments, the trajectory model may be trained to predict the exact trajectory adjustments required at each point in time during the procedure. This may require corresponding time-based annotations of trajectory-related parameters at desired points in time throughout the procedures used for model training. In some embodiments, once deployed, the output of the trajectory model facilitates the spatio-temporal reaching of a target by a medical instrument steered in accordance with the output trajectory, in medical procedures which require steering of a medical instrument toward a moving target (e.g., biopsy, ablation, fluid delivery, fluid drainage, etc.).
Reference is now made to FIG. 13, which shows a block diagram 110 illustrating an exemplary method of generating (training) a data-analysis (e.g., AI-based) model for outputting a trajectory in an image-guided procedure of inserting and steering a medical instrument toward a moving internal target, to facilitate the medical instrument accurately reaching the internal target, according to some embodiments. As described hereinabove, real-time images (e.g., scans) may be obtained during the procedure, and, should there be a need based on the real-time images, the trajectory can be updated as detailed herein. The generated (updated) trajectory may affect several parameters, such as the accuracy of the procedure (e.g., the tip-to-target distance), the tissue and/or target movement resulting from the interaction with the medical instrument as it follows the trajectory toward the target, the duration of the steering phase of the procedure, the risk level of the procedure (e.g., probability of complications), etc. Therefore, in some embodiments, the trajectory model should also take one or more of these parameters into account. To this end, in some embodiments, the training process of the trajectory model may include one or more phases of training, including, for example, training the basic trajectory model to output a trajectory prediction (as described, for example, in FIG. 12 hereinabove), as well as a phase of training one or more of the following individual models: an accuracy estimation model, an interaction movement estimation model, a duration estimation model, a risk estimation model, and any combination thereof. The input for training each of these individual models may include any relevant input obtained from previous procedures, such as, but not limited to, the data described in FIGS. 10-12 hereinabove. In some embodiments, the target variable (“ground truth”) for training the accuracy model is the actual procedure accuracy (e.g., instrument tip-to-target accuracy). In some embodiments, the target variable for training the interaction model is the actual movement of the target and/or tissue due to interaction (direct or indirect) with the medical instrument. In some embodiments, the target variable for training the duration model is the actual duration of the steering phase of the procedure. In some embodiments, the target variable for training the risk model is the occurrence of complications during the procedure. It can be appreciated that, for each individual model, the target variable is not included in the input variables used for the training process of that individual model.
According to some embodiments, training the basic trajectory model includes training the model to predict the trajectory as similar as possible to the ground truth trajectory (i.e., with minimal error from the actual successful trajectory in previous procedures). In some embodiments, the trajectory model is trained to output an optimized trajectory, which allows reaching the spatio-temporal location of the target in the most accurate, safe, fast, efficient and/or reliable manner. In some embodiments, such an optimal trajectory may result in the maximal possible tip-to-target accuracy, minimal movement of tissue and/or target due to insertion of the medical instrument through the tissue, minimal steering phase duration and/or minimal risk for clinical complications during instrument steering. In some embodiments, such an optimal trajectory may have a minimal trajectory prediction error. In some embodiments, such training may be executed using a loss function, e.g., a Multi-Loss scheme. In some embodiments, such training may be executed using Ensemble Learning methods. In some embodiments, such training may be executed using a Multi-Output regression/classification approach. In some embodiments, Multi-Task learning may be used. As shown in FIG. 13, which illustrates training executed using a Multi-Loss scheme, input data 1102, such as the data described above, is used to train the trajectory model 1104 to predict the trajectory 1106. In some embodiments, as shown in FIG. 12, for example, tissue movement model 1130, target movement model 1132 and/or "no-fly" zones model 1134 may also be used in the training of the trajectory model. The trajectory prediction 1106, together with the original input data 1102 (or portions of these datasets and/or additional data not used as input to the trajectory model), is then used as input to the individual trained models 1108-1114, which may include one or more of the accuracy model 1108, interaction model 1110, duration model 1112 and risk model 1114, to generate accuracy prediction 1116, interaction movement prediction 1118, duration prediction 1120 and risk prediction 1122, respectively. The individual models’ predictions (1116, 1118, 1120 and/or 1122), together with the trajectory model’s prediction (or respective scores), are then used to calculate a loss function 1124, aimed to minimize the trajectory prediction error, maximize the tip-to-target accuracy, minimize the interaction-resultant movement, minimize the procedure duration and minimize the risk. The generated weighted loss represents the model’s prediction error, which may be used to fine-tune or adjust the trajectory model’s 1104 weights as part of the training process.
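One assumed form for such a weighted Multi-Loss function 1124 is a linear combination of the individual terms, as in the following Python sketch; the coefficient names and the exact combination are illustrative, not the actual scheme.

def multi_loss(trajectory_error, accuracy_error, interaction_movement,
               duration, risk, w1=1.0, w2=1.0, w3=1.0, w4=1.0, w5=1.0):
    # Weighted sum: minimizing this loss simultaneously minimizes the trajectory
    # prediction error, the interaction-resultant movement, the steering duration
    # and the risk, while maximizing tip-to-target accuracy (via its error term).
    return (w1 * trajectory_error
            + w2 * accuracy_error
            + w3 * interaction_movement
            + w4 * duration
            + w5 * risk)

In such a formulation, raising, say, w3 and w4 relative to the other coefficients would correspond to the preference described below, in which minimal interaction movement and minimal duration outweigh the other criteria.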
In some embodiments, only one or more of the individual models described above are used in the training process of the trajectory model. For example, in some embodiments only the accuracy and duration models may be used, whereas in other embodiments only the accuracy and interaction movement models may be used. Further, the weights/coefficients used in the Multi-Loss function 1124 may be adjusted according to certain needs and/or preferences. For example, if minimal interaction movement and/or minimal duration have a higher priority than minimized risk or tip-to-target accuracy, the interaction movement and duration may be given higher coefficients during the training process, such that they will have a greater impact on the generated trajectory. In some embodiments, different trajectory models may be trained for different needs and/or preferences. For example, one trajectory model may be trained to generate a trajectory that will allow the highest achievable tip-to-target accuracy, another trajectory model may be trained to generate a trajectory that will result in the lowest achievable interaction movement, a further trajectory model may be trained to generate a trajectory that will result in the shortest achievable procedure duration, etc. In some embodiments, a single trajectory model may be trained and deployed, and the coefficients used in the Multi-Loss function 1124 may be adjusted during inference, i.e., during use of the trajectory model to generate a trajectory for (or during) a specific procedure. The need/preference upon which the coefficients may be fine-tuned may be associated with, for example, a specific procedure type (e.g., biopsy, ablation, fluid drainage, etc.), a specific target type, a specific user, a specific population, and the like.
In some embodiments, as detailed herein, the systems and methods disclosed herein may allow automatic or semi-automatic steering corrections, for example, if the tip of the medical instrument deviates and/or if the tissue or the target does not move as predicted.
According to some embodiments, the systems and methods disclosed herein may allow automatic steering through different tissue types and crossing tissue layer boundaries. In some embodiments, the insertion speed may be automatically or semi-automatically adjusted (e.g., decreased) prior to the instrument crossing a boundary between tissue layers, especially when there is a risk of clinical complications, such as pneumothorax when crossing the pleura.
According to some embodiments, the maximal curvature of the medical instrument may be continuously verified against a pre-set threshold, so as to ensure safety and accuracy.
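As a purely illustrative example of such a verification, the curvature of a discretized trajectory can be approximated from consecutive point triplets, using the circumscribed-circle relation kappa = 4 * Area / (|ab| * |bc| * |ca|), and compared against the pre-set threshold. The Python sketch below assumes 2D trajectory points (as NumPy arrays) with distinct consecutive points; it is a sketch of one possible check, not the actual implementation.

import numpy as np

def max_curvature(points):
    # Approximate curvature at each interior point from the triangle formed by
    # three consecutive points: kappa = 4 * Area / (a * b * c).
    kappas = []
    for a, b, c in zip(points[:-2], points[1:-1], points[2:]):
        ab = np.linalg.norm(b - a)
        bc = np.linalg.norm(c - b)
        ca = np.linalg.norm(a - c)
        area = 0.5 * abs((b - a)[0] * (c - a)[1] - (b - a)[1] * (c - a)[0])
        kappas.append(4.0 * area / (ab * bc * ca))
    return max(kappas)

def curvature_within_limit(points, kappa_max):
    return max_curvature(points) <= kappa_max  # verify against the pre-set threshold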
In some embodiments, data obtained from various sensors may be utilized with the methods disclosed herein, including, for example, force sensor, respiration sensor, imaging unit sensor, camera, and the like.
In some embodiments, the trajectory for the medical instrument may be pre-operatively calculated using a trajectory model, i.e., taking into account estimated tissue and target movements during the procedure. In some such embodiments, steering of the medical instrument may be carried out automatically and continuously from the insertion point to the target, with the steering being paused only if an indication of risk to the patient, of substantial deviation from the planned trajectory, etc., is received. In some embodiments, such indications may be generated using sensors disposed on the medical instrument or the medical device (e.g., a force sensor), sensors in the procedure room (for example, on the examination bed or on a portable device), an external camera, or the like. According to some embodiments, respiration monitoring and/or prediction may be utilized for the trajectory calculation and/or for the timing of insertion of the medical instrument, to increase the safety, accuracy and reliability of the procedures. In some embodiments, respiration synchronization may be performed, either manually (for example, by instructing the subject to hold breath), automatically or semi-automatically, for example, by determining the respiration cycle and synchronizing insertion and/or steering steps therewith. According to some embodiments, one or more of the models disclosed herein (for example, the tissue movement model, the target movement model, the trajectory model) may take into account the respiration cycle. According to some embodiments, if insertion of the medical instrument is synchronized with the respiration cycle, such that the instrument is advanced only at a specific point (or stage) of the respiration cycle, tissue and/or target movement due to respiration movement may be ignored.
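Respiration-synchronized advancement of this kind might be gated as in the following Python sketch, where the instrument is advanced only within a chosen window of the (normalized) respiration cycle; the sensor interface, the device interface and the window values are hypothetical assumptions.

def respiration_gated_advance(device, respiration_sensor, step_mm,
                              phase_window=(0.35, 0.55)):
    # Normalized respiration phase in [0, 1) over the cycle.
    phase = respiration_sensor.current_phase()
    if phase_window[0] <= phase <= phase_window[1]:
        device.advance(step_mm)   # advance only at the chosen stage of the cycle
        return True
    return False                  # outside the window: hold and wait for the next cycle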
Reference is now made to FIGS. 14A-14D, which demonstrate real-time updating of a trajectory and steering of a medical instrument according thereto, based on predicted movement of the target, according to some embodiments. As detailed above, the exemplary planned and updated trajectories presented may be calculated using a processor executing the models and methods disclosed herein, such as the processor(s) of the insertion system described in FIG. 1B, and the insertion and steering of the medical instrument toward the predicted target location according to the planned and updated trajectories may be executed using an automated insertion device, such as the automated device of FIG. 1A. In some embodiments, the automated device may be body-mountable, for example by attachment to a subject’s body using an attachment apparatus, such as the attachment apparatus described in abovementioned co-owned International Patent Application Publication No. WO2019/234748. In FIGS. 14A-14D, the automated insertion device is marked as body-mountable automated device 150.
The trajectories shown in FIGS. 14A-14D are shown on CT image-views; however, it can be appreciated that the planning can be carried out similarly on images obtained from other imaging systems, such as ultrasound, MRI and the like.
FIG. 14A shows an automated insertion device 150 mounted on a subject’s body (a cross-section of which is shown in FIGS. 14A-14D) and a planned (initial) trajectory 160 from an entry point toward the initial position of an internal target 162. The trajectory may have checkpoints marked thereon. In some embodiments, the planned trajectory 160 is a linear or substantially linear trajectory. In some embodiments, if necessitated (for example, due to obstacles), the planned trajectory may be a non-linear trajectory. As further detailed below, the planned trajectory may be updated in real-time based on the real-time position of the medical instrument (for example, the tip thereof) and/or the real-time position of the target and/or the real-time positions of obstacle/s, and/or based on predictions generated by one or more machine learning models, such as those detailed herein, for example, in FIGS. 12-13. The initial target location may be obtained manually and/or automatically. In some embodiments, the target position may be determined, as detailed in FIG. 11, utilizing the target detection model. FIG. 14B shows medical instrument 170 being inserted and steered into the subject’s body, along the planned trajectory 160. As shown in FIG. 14B, the target has moved from its initial position to a new (updated) position 162’ during, and as a result of, the advancement of the medical instrument within the tissue, as detailed herein. In some embodiments, the determination of the real-time location of the target may be performed manually by the user, i.e., the user visually identifies the target in images (obtained continuously, or initiated manually or automatically, for example, when the instrument reaches a checkpoint), and marks the new target position on the image using the GUI. In some embodiments, the determination of the real-time target location may be performed automatically, by a processor using image processing techniques and/or data-analysis algorithm(s), such as detailed hereinabove. In some embodiments, the trajectory may be updated based on the determined real-time position of the target. In some embodiments, the subsequent movement of the target is predicted, for example using a target movement model, as detailed hereinabove, and the trajectory is then updated based on the predicted location (e.g., the end-point location) of the target. In some embodiments, the updating of the trajectory based on the predicted location of the target may be performed automatically, by utilizing one or more of the AI models disclosed herein, including the tissue movement model, target movement model, trajectory model and any suitable sub-model (or individual model) disclosed herein.
According to some embodiments, recalculation of the trajectory may also be required if, for example, an obstacle is identified along the trajectory. Such an obstacle may be an obstacle which was marked (manually or automatically) prior to the calculation of the planned trajectory but tissue movement, e.g., tissue movement resulting from the advancement of the instrument within the tissue, caused the obstacle to move such that it entered the planned path. In some embodiments, the obstacle may be a new obstacle, i.e., an obstacle which was not visible in the image (or set of images) based upon which the planned trajectory was calculated, and became visible during the insertion procedure.
In some embodiments, the user may be prompted to initiate an update (recalculation) of the trajectory. In some embodiments, recalculation of the trajectory, if required, is executed automatically by the processor and the insertion of the instrument automatically continues according to the updated trajectory. In some embodiments, recalculation of the trajectory, if required, is executed automatically by the processor; however, the user is prompted to confirm the recalculated trajectory before advancement of the instrument (e.g., to the next checkpoint or to the target) according to the updated trajectory can be resumed.
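The three recalculation behaviors described above can be summarized as a simple dispatch, sketched below; the policy names and callback signatures are illustrative assumptions, not disclosed interfaces.

```python
from enum import Enum, auto

class ReplanPolicy(Enum):
    PROMPT_USER = auto()        # user is prompted to initiate recalculation
    AUTO_RESUME = auto()        # processor replans and resumes automatically
    AUTO_WITH_CONFIRM = auto()  # processor replans; user confirms to resume

def handle_replan(policy, replan_fn, prompt_fn, confirm_fn, resume_fn):
    """Dispatch the trajectory-recalculation behavior (illustrative sketch)."""
    if policy is ReplanPolicy.PROMPT_USER:
        if prompt_fn("Trajectory update required. Recalculate?"):
            resume_fn(replan_fn())
    elif policy is ReplanPolicy.AUTO_RESUME:
        resume_fn(replan_fn())
    elif policy is ReplanPolicy.AUTO_WITH_CONFIRM:
        trajectory = replan_fn()
        if confirm_fn(trajectory):
            resume_fn(trajectory)
```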
As shown in FIG. 14C, an updated trajectory 160' has been calculated based on the predicted end-point location 162” of the target, to facilitate the medical instrument 170 reaching the target at its end-point location. As shown, although the preplanned trajectory 160 was linear, recalculating the trajectory due to the movement of the target, e.g., using the trajectory model, resulted in the medical instrument 170, specifically its tip, following a non-linear trajectory to accurately reach the target.
FIG. 14D summarizes the target movement during the procedure shown in FIGS. 14A-14C, from an initial target location 162 to an updated target location 162’ and finally to an end-point target location 162”. As detailed herein, the target movement during the procedure may be predicted by the target movement model, which may further be used (optionally with additional models, such as the tissue movement model and the “no-fly” zones model) to update the trajectory utilizing the trajectory model, thereby facilitating the medical instrument 170 reaching the target at its end-point location in an optimal manner, as detailed herein. Also shown in FIG. 14D are the planned trajectory 160 and the updated trajectory 160’, which allowed the medical instrument 170 to reach the moving target without having to remove and re-insert the instrument.
According to some embodiments, the target, insertion point and, optionally, obstacle/s, may be marked manually by the user. According to other embodiments, the processor of the insertion system (or of a separate system) may be configured to identify and mark at least one of the target, the insertion point and the obstacle/s, and the user may, optionally, be prompted to confirm or adjust the processor’s proposed markings. In such embodiments, the target and/or obstacle/s may be identified using known image processing techniques and/or data-analysis models/algorithms, based on data obtained from previous procedures. The insertion point may be suggested based solely on the obtained images, or, alternatively or additionally, on data obtained from previous procedures using data-analysis models/algorithms.
Implementations of the systems, devices and methods described above may further include any of the features described in the present disclosure, including any of the features described hereinabove in relation to other system, device and method implementations.
According to some embodiments, there is provided a computer-readable storage medium having stored therein data-analysis algorithm(s), executable by one or more processors, for generating one or more models for providing recommendations, operating instructions and/or functional enhancements related to operation of automated medical devices.
According to some embodiments, there is provided a control device (control unit) configured to receive input from a corresponding processor (processing unit) and to generate control data in response thereto, for controlling operation of an automated medical device. According to some embodiments, the processing unit and the control unit may be physically and/or functionally associated. In some embodiments, the processing unit and the control unit may be part of the same system (for example, an insertion system) or separate systems capable of interacting with each other.
The embodiments described in the present disclosure may be implemented in digital electronic circuitry, or in computer software, firmware or hardware, or in combinations thereof. The disclosed embodiments may be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on a computer storage medium for execution by, or to control the operation of, one or more data processing apparatus. Alternatively or in addition, the computer program instructions may be encoded on an artificially generated propagated signal, for example, a machine-generated electrical, optical or electromagnetic signal, that is generated to encode information for transmission to suitable receiver apparatus for execution by a data processing apparatus. A computer storage medium can be, or be included in, a computer-readable storage device, a computer-readable storage substrate, a random or serial access memory array or device, or a combination of any one or more of the above. Furthermore, while a computer storage medium is not a propagated signal, a computer storage medium can be a source or destination of computer program instructions encoded in an artificially generated propagated signal. The computer storage medium can also be, or be included in, one or more separate physical components or media (for example, multiple CDs, disks, or other storage devices).
The operations described in the present disclosure can be implemented as operations performed by a data processing apparatus on data stored on one or more computer-readable storage devices or received from other sources.
The terms “processor” and/or “data processing apparatus” as used herein may encompass all types of apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, a system on a chip/s, or combinations thereof. The data processing apparatus can include special purpose logic circuitry, for example, an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit). The apparatus can also include, in addition to hardware, code that creates an execution environment for the computer program in question, for example, code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a cross-platform runtime environment, a virtual machine, or combinations thereof. The apparatus and execution environment can realize various different computing model infrastructures, such as web services, distributed computing and grid computing infrastructures.
A computer program (also referred to as a program, software, software application, script or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a standalone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment. A computer program may, but need not, correspond to a file in a file system. A computer program can be stored in a portion of a file that holds other programs or data, in a single file dedicated to the program in question, or in multiple coordinated files (for example, files that store one or more modules, sub-programs or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
The processes and logic flows described herein can be performed by one or more programmable processors executing one or more computer programs to perform actions by operating on input data and generating output. The processes and logic flows can also be performed by, and an apparatus can also be implemented as, special purpose logic circuitry, for example, an FPGA or an ASIC. Processors suitable for the execution of a computer program include both general and special purpose microprocessors, and any one or more processors of any type of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for performing actions in accordance with instructions and one or more memory devices for storing instructions and data. A computer may, optionally, also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, for example, magnetic discs, magneto-optical discs, or optical discs. Moreover, a computer can be embedded in another device, for example, a mobile phone, a tablet, a personal digital assistant (PDA), a game console, a Global Positioning System (GPS) receiver, or a portable storage device (for example, a USB flash drive). Devices suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including semiconductor memory devices, for example, EPROM, EEPROM, random access memories (RAMs), including SRAM, DRAM, embedded DRAM (eDRAM) and Hybrid Memory Cube (HMC), and flash memory devices; magnetic discs, for example, internal hard discs or removable discs; magneto-optical discs; read-only memories (ROMs), including CD-ROM and DVD-ROM discs; solid state drives (SSDs); and cloud-based storage. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
The processes and logic flows described herein may be performed in whole or in part in a cloud computing environment. For example, some or all of a given disclosed process may be executed by a secure cloud-based system comprised of co-located and/or geographically distributed server systems. The term “cloud computing” is generally used to describe a computing model which enables on-demand access to a shared pool of computing resources, such as computer networks, servers, software applications, and services, and which allows for rapid provisioning and release of resources with minimal management effort or service provider interaction.
Unless specifically stated otherwise, as apparent from the disclosure, it is appreciated that, according to some embodiments, terms such as “processing”, “computing”, “calculating”, “determining”, “estimating”, “assessing” or the like, may refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulate and/or transform data, represented as physical (e.g. electronic) quantities within the computing system’s registers and/or memories, into other data similarly represented as physical quantities within the computing system’s memories, registers or other such information storage, transmission or display devices.
In some embodiments, the terms “medical instrument” and “medical tool” may be used interchangeably.
In some embodiments, the term “moving target” relates to a mobile target, i.e., a target that is capable of being moved within the body of the subject, independently of, or at least partially due to or during, a medical procedure.
In some embodiments, the terms “automated medical device” and “automated device” may be used interchangeably.
In some embodiments, the terms “image-guided insertion procedure” and “image-guided procedure” may be used interchangeably.
In some embodiments, the terms “model”, “algorithm”, “data-analysis algorithm” and “data-based algorithm” may be used interchangeably.
In some embodiments, the terms “user”, “doctor”, “physician”, “clinician”, “technician”, “medical personnel” and “medical staff” are used interchangeably throughout this disclosure and may refer to any person taking part in the performed medical procedure.
It can be appreciated that the terms “subject” and “patient” may be used interchangeably, and they may refer either to a human subject or to an animal subject.
In the description and claims of the application, the words “include” and “have”, and forms thereof, are not limited to members in a list with which the words may be associated.
Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. In case of conflict, the patent specification, including definitions, governs. As used herein, the indefinite articles “a” and “an” mean “at least one” or “one or more” unless the context clearly dictates otherwise.
It is appreciated that certain features of the disclosure, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the disclosure, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable sub-combination or as suitable in any other described embodiment of the disclosure. No feature described in the context of an embodiment is to be considered an essential feature of that embodiment, unless explicitly specified as such.
Although steps of methods according to some embodiments may be described in a specific sequence, methods of the disclosure may include some or all of the described steps carried out in a different order. The methods of the disclosure may include a few of the steps described or all of the steps described. No particular step in a disclosed method is to be considered an essential step of that method, unless explicitly specified as such.
The phraseology and terminology employed herein are for descriptive purpose and should not be regarded as limiting. Citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the disclosure. Section headings are used herein to ease understanding of the specification and should not be construed as necessarily limiting.

Claims

What is claimed is:
1. A computer-implemented method of generating a trajectory model for determining a trajectory for steering a medical instrument toward a moving target in a body of a subject in an image-guided procedure, the method comprising:
collecting one or more datasets, at least one of the one or more datasets being related to an automated medical device configured to steer a medical instrument toward a target in the body of a subject and/or to operation thereof;
creating a training set comprising at least a portion of the one or more datasets and one or more target parameters relating to planned and/or updated trajectories in one or more previous image-guided procedures for steering a medical instrument toward a moving target in a body of a subject;
training the trajectory model, using the training set, to output a trajectory that will reach a moving target at a predicted location;
calculating a trajectory prediction error; and
optimizing the trajectory model using the calculated trajectory prediction error.
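For illustration only, a minimal training-loop sketch for the method of claim 1 follows, assuming a PyTorch model and a dataset of feature/trajectory pairs; the model architecture, the dataset layout and the use of mean squared error as the trajectory prediction error are assumptions made here and are not fixed by the claim.

```python
import torch
from torch import nn

def train_trajectory_model(model, train_loader, epochs=10, lr=1e-3):
    """Train a trajectory model on (features, true trajectory) pairs
    drawn from previous image-guided procedures (illustrative sketch)."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    mse = nn.MSELoss()  # trajectory prediction error, here taken as MSE
    for _ in range(epochs):
        for features, trajectory_true in train_loader:
            trajectory_pred = model(features)             # predicted waypoints
            loss = mse(trajectory_pred, trajectory_true)  # prediction error
            optimizer.zero_grad()
            loss.backward()   # optimize the trajectory model using the error
            optimizer.step()
    return model
```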
2. The computer-implemented method of claim 1, wherein the one or more datasets further comprise one or more of: a clinical procedure-related dataset, a subject-related dataset and an administrative-related dataset.
3. The computer-implemented method of any one of the previous claims, wherein training the trajectory model comprises using one or more of: loss function, Ensemble Learning methods, Multi-Task Learning, Multi-Output regression and Multi-Output classification.
4. The computer-implemented method of any one of the previous claims, further comprising executing one or more of a tissue movement model, a target movement model and a “no-fly” zones model using at least a portion of the one or more datasets.
5. The computer-implemented method of any one of the previous claims, further comprising: executing one or more individual models using at least a portion of the one or more datasets and the trajectory generated by the trajectory model; and obtaining one or more predictions from the one or more individual models.
6. The computer-implemented method of claim 5, further comprising: calculating a loss function using the trajectory prediction error and the one or more predictions generated by the one or more individual models; and optimizing the trajectory model using the loss function.
7. The computer-implemented method of either one of claims 5 or 6, further comprising training the one or more individual models.
8. The computer-implemented method of any one of claims 5 to 7, wherein the one or more individual models are selected from: a model for predicting an accuracy of an image-guided insertion procedure; a model for predicting target and/or tissue movement resulting from an interaction between the medical instrument and the tissue and/or the target; a model for predicting a duration of an image-guided insertion procedure or part thereof; a model for predicting a risk level of an image-guided insertion procedure; or any combination thereof.
9. The computer-implemented method of claim 8, wherein calculating the loss function comprises minimizing one or more of the trajectory prediction error, the predicted interaction movement, the predicted duration of an image-guided insertion procedure or part thereof and the predicted risk of an image-guided insertion procedure.
10. The computer-implemented method of claim 9, wherein calculating the loss function further comprises maximizing the predicted accuracy of the image-guided insertion procedure.
11. The computer-implemented method of any one of claims 6 to 10, further comprising adjusting one or more coefficients of one or more terms used in the calculation of the loss function, the one or more terms being associated with at least one of the trajectory prediction error and the one or more predictions generated by the one or more individual models.
12. The computer-implemented method of claim 11, wherein the adjusting of the one or more coefficients is executed during training of the trajectory model.
13. The computer-implemented method of claim 11, wherein the adjusting of the one or more coefficients is executed during execution of the trajectory model.
14. The computer-implemented method of claim 11, wherein the adjusting of the one or more coefficients is related to one or more of: a specific procedure type, a specific target type, a specific user, a specific population.
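A minimal sketch of the multi-term loss of claims 6 to 14 follows, assuming scalar outputs from the individual models; the term names and the per-term coefficient scheme are illustrative assumptions, not language fixed by the claims.

```python
def composite_trajectory_loss(traj_error, predictions, weights):
    """Weighted loss combining the trajectory prediction error with the
    individual-model predictions (illustrative sketch; works on floats
    or autograd tensors alike).

    predictions: dict with keys "interaction_movement", "duration",
                 "risk" and "accuracy" (assumed names).
    weights: per-term coefficients, adjustable per procedure type, target
             type, user or population (claims 11-14).
    """
    loss = weights["trajectory"] * traj_error
    loss = loss + weights["movement"] * predictions["interaction_movement"]
    loss = loss + weights["duration"] * predictions["duration"]
    loss = loss + weights["risk"] * predictions["risk"]
    # Predicted accuracy is to be maximized, so it enters the minimized
    # loss with a negative sign (claim 10).
    loss = loss - weights["accuracy"] * predictions["accuracy"]
    return loss
```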
15. The computer-implemented method of any one of the previous claims, wherein generating the trajectory model is executed by a training module comprising a memory and one or more processors.
16. The computer-implemented method of any one of the previous claims, wherein the automated medical device is configured to allow real-time updating of the trajectory in accordance with predicted target movement and steer the medical instrument toward the target according to the updated trajectory.
17. The computer-implemented method of any one of the previous claims, wherein the automated medical device is configured to allow real-time updating of the trajectory in accordance with predicted movement of tissues along the trajectory and steer the medical instrument toward the target according to the updated trajectory.
18. A system for generating a trajectory model for determining a trajectory for steering a medical instrument toward a moving target in image-guided procedures, the system comprising:
a training module comprising:
a memory configured to store one or more datasets; and
one or more processors configured to execute the method of any one of claims 1 to 17.
19. The system of claim 18, wherein the training module is located on a remote server, an “on premise” server or a computer associated with the automated medical device.
20. The system of claim 19, wherein the remote server is a cloud server.
21. A method of closed-loop steering of a medical instrument toward a moving target within a body of a subject, the method comprising:
calculating a planned trajectory for the medical instrument from an entry point to an initial target location in the body of the subject;
steering the medical instrument toward the initial target location according to the planned trajectory;
determining the real-time location of the target and the medical instrument;
predicting movement of the target;
updating the trajectory based on the predicted movement of the target, such that the medical instrument will reach the target at a predicted location of the target; and
steering the medical instrument toward the predicted location of the target according to the updated trajectory.
22. The method according to claim 21, wherein the predicted location comprises a temporal and/or spatial location.
23. The method according to claim 21, wherein predicting the movement of the target is executed using a dynamic trajectory model.
24. The method according to any one of claims 21-23, wherein updating the trajectory is executed using a dynamic trajectory model.
25. The method according to any one of claims 21-24, wherein calculating the planned trajectory is executed using a trajectory model.
26. The method according to any one of claims 23-24, wherein the dynamic trajectory model further comprises predicting a movement of a tissue of the body and/or predicting a movement of a tip of the medical instrument.
27. The method according to any one of claims 21 to 26, wherein the trajectory model is trained using the method of any one of claims 1 to 17.
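For illustration, one way to read the dynamic trajectory model of claims 23 to 26 is sketched below; all callables are hypothetical placeholders standing in for the trained models referenced in the claims.

```python
def dynamic_replan(tip_pos, target_tracker, tissue_model, tip_model, planner):
    """Predict target, tissue and instrument-tip movement, then update the
    trajectory accordingly (illustrative sketch)."""
    target_next = target_tracker.predict_next()   # target movement (claim 23)
    tissue_shift = tissue_model.predict(tip_pos)  # tissue movement (claim 26)
    tip_next = tip_model.predict(tip_pos)         # tip movement (claim 26)
    # Update the trajectory so the tip meets the target at its predicted
    # location while compensating for predicted tissue displacement.
    return planner.replan(tip_next, target_next, tissue_shift)
```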
28. The method according to any one of claims 21 to 27, wherein the steering of the medical instrument toward the target is executed utilizing an automated medical device.
29. The method according to any one of claims 21 to 28, wherein the planned trajectory and/or the updated trajectory are a 2D trajectory or a 3D trajectory.
30. The method according to any one of claims 21 to 29, further comprising obtaining one or more images of a region of interest within the body of the subject by means of an imaging system, selected from: a CT system, an X-ray fluoroscopy system, an MRI system, an ultrasonic system, a cone-beam CT system, a CT fluoroscopy system, an optical imaging system and an electromagnetic imaging system.
31. A system for steering a medical instrument toward a moving target in a body of a subject, the system comprising:
an automated device configured for steering the medical instrument toward a moving target, the automated device comprising one or more actuators and a control head configured for coupling the medical instrument thereto; and
a processor configured for executing the method of any one of claims 21 to 30.
32. The system according to claim 31, further comprising a controller configured to control the operation of the device.
PCT/IL2022/050581 2021-06-02 2022-06-01 Closed-loop steering of a medical instrument toward a moving target WO2022254436A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202163195736P 2021-06-02 2021-06-02
US63/195,736 2021-06-02

Publications (1)

Publication Number Publication Date
WO2022254436A1 true WO2022254436A1 (en) 2022-12-08

Family

ID=84322914

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2022/050581 WO2022254436A1 (en) 2021-06-02 2022-06-01 Closed-loop steering of a medical instrument toward a moving target

Country Status (1)

Country Link
WO (1) WO2022254436A1 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150150591A1 (en) * 2012-06-26 2015-06-04 Canon Kabushiki Kaisha Puncture control system and method therefor
WO2020056086A1 (en) * 2018-09-12 2020-03-19 Orthogrid Systems, Inc. An artificial intelligence intra-operative surgical guidance system and method of use


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22815502

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 28/03/2024)