CN116456924A - Automated procedure assessment - Google Patents


Info

Publication number
CN116456924A
Authority
CN
China
Prior art keywords
robotic
user
counting
video
basket
Legal status
Pending
Application number
CN202180077710.2A
Other languages
Chinese (zh)
Inventor
E. Ayvali
H. Rafii-Tari
Current Assignee
Auris Health Inc
Original Assignee
Auris Surgical Robotics Inc
Application filed by Auris Surgical Robotics Inc
Priority claimed from PCT/IB2021/060596 (WO2022106991A1)
Publication of CN116456924A


Abstract

The present disclosure provides a robotic system configured to evaluate an identified stage of a medical procedure. The robotic system includes: a video capture device; a robotic manipulator; one or more sensors; an input device; a data store; and a control circuit. The control circuit is configured to: determine a first state of the robotic manipulator based on sensor data from the one or more sensors; identify a first input from the input device for initiating a first action of the robotic manipulator; perform a first analysis of video of the patient site captured by the video capture device; identify a first phase of the medical procedure based at least in part on the first state of the robotic manipulator, the first input, and the first analysis of the video; and generate an assessment of the first phase of the medical procedure based on one or more metrics associated with the first phase.

Description

Automated procedure assessment
Cross Reference to Related Applications
The present application claims the benefit of U.S. provisional application No. 63/116,798, filed on November 20, 2020, and U.S. provisional application No. 63/132,875, filed on December 31, 2020, each of which is hereby incorporated by reference in its entirety.
Background
Technical Field
The present disclosure relates to the field of medical devices and procedures and artificial intelligence assisted data processing.
Description of the Related Art
Various medical procedures involve the use of robotic systems together with one or more medical instruments configured to penetrate the human anatomy to reach a treatment site. Certain procedures may involve inserting the one or more medical instruments through the skin or an orifice of the patient to reach the treatment site and withdraw an object (such as a urinary stone) from the patient.
Disclosure of Invention
One or more systems, devices, and/or methods are described herein for assisting a physician or other medical professional in controlling access of medical instruments to an object located within the human anatomy (such as a urinary tract stone).
For purposes of summarizing the present disclosure, certain aspects, advantages, and novel features have been described. It will be appreciated that not necessarily all such advantages may be achieved in accordance with any particular embodiment. Thus, the disclosed embodiments may be carried out in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other advantages as may be taught or suggested herein.
A system having one or more computers may be configured to perform particular operations or actions by having software, firmware, hardware, or a combination thereof installed on the system that in operation causes the system to perform the actions. One or more computer programs may be configured to perform particular operations or acts by including instructions that, when executed by a data processing apparatus, cause the apparatus to perform the acts. One general aspect includes a robotic system for evaluating an identified stage of a medical procedure performed by the robotic system. The robotic system further includes a video capture device; a robotic manipulator; one or more sensors configured to determine a configuration of the robotic manipulator; an input device configured to receive one or more user interactions and initiate one or more actions of the robotic manipulator; a data store configured to store metrics associated with phases of a medical procedure; and control circuitry communicatively coupled to the input device and the robotic manipulator. The control circuit is configured to: determining a first state of the robotic manipulator based on sensor data from the one or more sensors; identifying a first input from the input device for initiating a first action of the robotic manipulator; performing a first analysis of video of the patient site captured by the video capture device; identifying a first phase of the medical procedure based at least in part on the first state of the robotic manipulator, the first input, and the first analysis of the video; and generating an assessment of the first phase of the medical procedure based on one or more metrics associated with the first phase. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each computer program configured to perform the actions of these methods.
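To make the flow of the control circuitry described above more concrete, the following Python sketch shows one way the three inputs (manipulator state, user input, and a video-analysis label) might be combined to identify a phase and score it against a stored metric. All names, labels, and rules here are hypothetical illustrations, not part of the disclosed system.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class ProcedureEvent:
    manipulator_state: str   # derived from the one or more sensors, e.g. "scope_loaded"
    user_input: str          # from the input device, e.g. "open_basket"
    video_label: str         # from video analysis, e.g. "stone_visible"

def identify_phase(event: ProcedureEvent) -> str:
    """Greatly simplified phase identification from the three data sources."""
    if event.user_input == "open_basket" and event.video_label == "stone_visible":
        return "ureteroscopy_basketing"
    if event.manipulator_state == "needle_loaded":
        return "percutaneous_needle_insertion"
    return "ureteroscopy_driving"

def assess_phase(phase: str, metrics: Dict[str, float],
                 evaluators: Dict[str, Callable[[Dict[str, float]], float]]) -> float:
    """Score the identified phase using the metric function stored for that phase."""
    return evaluators[phase](metrics)

# Hypothetical usage with made-up numbers.
event = ProcedureEvent("scope_loaded", "open_basket", "stone_visible")
phase = identify_phase(event)
score = assess_phase(
    phase,
    {"basket_ops": 4, "scope_retractions": 2},
    {"ureteroscopy_basketing":
         lambda m: m["basket_ops"] / max(m["scope_retractions"], 1)},
)
print(phase, score)
```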
Implementations may include one or more of the following features. The first stage of the medical procedure may include one of ureteroscopy driving, ureteroscopy laser lithotripsy, ureteroscopy basketing, and percutaneous needle insertion. The first stage may include ureteroscopy basketing, and generating the assessment may include: counting the number of basketing operations; counting the number of ureteroscope retractions; determining a ratio of the number of basketing operations to the number of ureteroscope retractions; and comparing the determined ratio to other ratios from previous ureteroscopy basketing operations. The first stage may include ureteroscopy driving, and generating the assessment may include: counting the number of times the user manually drives the scope; counting the number of times the user robotically drives the scope; determining a ratio of the number of times the user manually drives the scope to the number of times the user robotically drives the scope; and comparing the determined ratio to other ratios from previous ureteroscopy operations. The first stage may include percutaneous needle insertion, and generating the assessment may include: counting the number of times a user attempts to insert a needle until the user successfully inserts the needle; and comparing the counted number of times with recorded needle insertion attempts from previous percutaneous needle insertion operations. The first stage may include percutaneous needle insertion, and generating the assessment may include: counting the time spent surveying the kidney before selecting a target calyx for percutaneous access; and comparing the counted time with recorded times from previous percutaneous needle insertion operations. The first stage may include percutaneous needle insertion, and generating the assessment may include: counting the number of times the navigation field generator used for tracking the needle is repositioned; and comparing the counted number of times with the recorded number of repositionings from previous percutaneous needle insertion operations. The first stage may include percutaneous needle insertion, and generating the assessment may include: counting the number of times that automated alignment of an end effector of the robotic manipulator with the catheter is initiated; and comparing the counted number of times with the recorded number of automated alignments from previous operations. The first stage may include ureteroscopy laser lithotripsy, and generating the assessment may include: counting the laser lithotripsy time for a stone; determining the size of the stone; and comparing the ratio of the laser lithotripsy time to the size of the stone with previous ratios from other operations. The first stage may include ureteroscopy laser lithotripsy, and generating the assessment may include: determining the type of the stone; and aggregating statistics based on the type of the stone across surgical procedures. The first stage may include ureteroscopy laser lithotripsy, and generating the assessment may include: counting the number of times the view of the video capture device becomes obscured by dust from stone fragmentation; and comparing the counted number of times with recorded numbers of dust occlusions from previous operations. Implementations of the described technology may include hardware, methods or processes, or computer software on a computer-accessible medium.
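As one hedged illustration of the basketing metric described above (the ratio of basketing operations to ureteroscope retractions, compared against prior cases), a small Python sketch might look like the following; the numbers and function names are invented for illustration only.

```python
from statistics import mean

def basketing_ratio(basket_ops: int, scope_retractions: int) -> float:
    """Ratio of basketing operations to ureteroscope retractions for one case."""
    return basket_ops / max(scope_retractions, 1)

def deviation_from_history(current_ratio: float, previous_ratios: list[float]) -> float:
    """How far the current case deviates from the average of previous cases."""
    return current_ratio - mean(previous_ratios)

# Hypothetical values: 5 basketing operations and 2 retractions in the current case.
current = basketing_ratio(5, 2)
delta = deviation_from_history(current, [1.5, 2.0, 1.8])
print(f"ratio={current:.2f}, deviation from prior cases={delta:+.2f}")
```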
One general aspect includes a method for evaluating an identified stage of a medical procedure performed by a robotic system, which may include a video capture device. The method also includes determining a first state of the robotic manipulator based on sensor data from the one or more sensors; identifying a first input from the input device for initiating a first action of the robotic manipulator; performing a first analysis of video of the patient site captured by the video capture device; identifying a first phase of the medical procedure based at least in part on the first state of the robotic manipulator, the first input, and the first analysis of the video; and generating an assessment of the first phase of the medical procedure based on one or more metrics associated with the first phase. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each computer program configured to perform the actions of these methods.
Implementations may include one or more of the following features. In the method, the first stage may include ureteroscopy basketing, and generating the assessment may include: counting the number of basketing operations; counting the number of ureteroscope retractions; determining a ratio of the number of basketing operations to the number of ureteroscope retractions; and comparing the determined ratio to other ratios from previous ureteroscopy basketing operations. The first stage may include ureteroscopy driving, and generating the assessment may include: counting the number of times the user manually drives the scope; counting the number of times the user robotically drives the scope; determining a ratio of the number of times the user manually drives the scope to the number of times the user robotically drives the scope; and comparing the determined ratio to other ratios from previous ureteroscopy operations. The first stage may include percutaneous needle insertion, and generating the assessment may include: counting the number of times a user attempts to insert a needle until the user successfully inserts the needle; and comparing the counted number of times with recorded needle insertion attempts from previous percutaneous needle insertion operations. The first stage may include percutaneous needle insertion, and generating the assessment may include: counting the time spent surveying the kidney before selecting a target calyx for percutaneous access; and comparing the counted time with recorded times from previous percutaneous needle insertion operations. The first stage may include percutaneous needle insertion, and generating the assessment may include: counting the number of times the navigation field generator used for tracking the needle is repositioned; and comparing the counted number of times with the recorded number of repositionings from previous percutaneous needle insertion operations. The first stage may include percutaneous needle insertion, and generating the assessment may include: counting the number of times that automated alignment of an end effector of the robotic manipulator with the catheter is initiated; and comparing the counted number of times with the recorded number of automated alignments from previous operations. The first stage may include percutaneous antegrade ureteroscopy laser lithotripsy, and generating the assessment may include: counting the laser lithotripsy time for a stone; determining the size of the stone; and comparing the ratio of the laser lithotripsy time to the size of the stone with previous ratios from other operations. The first stage may include ureteroscopy laser lithotripsy, and generating the assessment may include: determining the type of the stone; and aggregating statistics based on the type of the stone across surgical procedures. The first stage may include ureteroscopy laser lithotripsy, and generating the assessment may include: counting the duration for which the view of the video capture device is obscured by dust from stone fragmentation; and comparing the counted duration with recorded durations from previous operations. Implementations of the described technology may include hardware, methods or processes, or computer software on a computer-accessible medium.
One general aspect includes a control system for a robotic device for evaluating an identified stage of a medical procedure. The control system also includes a communication interface configured to receive sensor data, user input data, and video data from the robotic device; a memory configured to store the sensor data, the user input data, and the video data; and one or more processors configured to: determining a first state of a manipulator of the robotic device based on the sensor data; identifying a first input from the user input data for initiating a first action of the manipulator; performing a first analysis of the video data of the patient site; and identifying a first phase of the medical procedure based at least in part on the first state of the manipulator, the first input, and the first analysis of the video data. The system may also generate an assessment of the first phase of the medical procedure based on one or more metrics associated with the first phase. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each computer program configured to perform the actions of these methods.
Drawings
For purposes of illustration, various embodiments are depicted in the drawings and should in no way be construed to limit the scope of the disclosure. In addition, various features of the different disclosed embodiments can be combined to form additional embodiments that are part of the present disclosure. Throughout the drawings, reference numerals may be repeated to indicate corresponding relationships between reference elements.
FIG. 1 illustrates an exemplary medical system for performing or assisting in performing a medical procedure according to certain embodiments.
Fig. 2A-2B illustrate perspective views of a medical system performing a urinary stone capture procedure according to certain embodiments.
Fig. 3 illustrates a block diagram of a control system of a medical system having associated inputs and outputs, according to some embodiments.
Fig. 4A illustrates a block diagram of a control system configured to generate output from video data utilizing machine learning, in accordance with certain embodiments.
FIG. 4B illustrates a block diagram of a control system configured to generate output from several types of data using machine learning, in accordance with certain embodiments.
Fig. 5 is a flow chart of a stage identification process according to some embodiments.
Fig. 6 is a flow chart of a triggering process for an automated robotic action, according to some embodiments.
Fig. 7 is a diagram illustrating different types of triggered actions of a robotic system according to some embodiments.
FIG. 8 is a flow chart of an evaluation process for tasks performed during an identified phase, according to some embodiments.
Fig. 9 is a flow chart of a scoring process for medical tasks according to some embodiments.
Fig. 10 is a flow chart of another scoring process for medical tasks according to some embodiments.
Fig. 11 illustrates exemplary details of a robotic system according to certain embodiments.
Fig. 12 shows exemplary details of a control system according to certain embodiments.
Detailed Description
The headings provided herein are for convenience only and do not necessarily affect the scope or meaning of the disclosure. Although certain preferred embodiments and examples are disclosed below, the present subject matter extends beyond the specifically disclosed embodiments to other alternative embodiments and/or uses and modifications and equivalents thereof. Therefore, the scope of the claims that may appear herein is not limited by any one of the specific embodiments described below. For example, in any method or process disclosed herein, the acts or operations of the method or process may be performed in any suitable sequence and are not necessarily limited to any particular disclosed sequence. Various operations may be described as multiple discrete operations in a manner that may be helpful in understanding particular embodiments. However, the order of description should not be construed to imply that these operations are order dependent. Additionally, the structures, systems, and/or devices described herein may be embodied as integrated components or as separate components. For purposes of comparing various embodiments, specific aspects and advantages of these embodiments are described. Not necessarily all such aspects or advantages may be achieved by any particular implementation. Thus, for example, various embodiments may be carried out in a manner that achieves or optimizes one advantage or group of advantages taught herein without necessarily achieving other aspects or advantages as may be taught or suggested herein.
With respect to the preferred embodiments, certain standard positional anatomical terms may be used herein to refer to the anatomy of an animal (i.e., a human). Although specific spatially relative terms such as "exterior," "interior," "upper," "lower," "below," "above," "vertical," "horizontal," "top," "bottom," and the like are used herein to describe the spatial relationship of one device/element or anatomical structure to another device/element or anatomical structure, it should be understood that these terms are used herein for convenience of description to describe the positional relationship between the elements/structures as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the elements/structures in use or operation in addition to the orientation depicted in the figures. For example, an element/structure described as being "above" another element/structure may represent a position below or beside such other element/structure relative to an alternative orientation of the subject patient or element/structure, and vice versa.
Summary
The present disclosure relates to techniques and systems for collecting and analyzing data from robotic-assisted medical procedures, such as those performed by robotic systems for stone management (e.g., retrieving urinary tract stones, aspirating stone fragments, etc.) or for performing other medical procedures. A medical procedure may advance through several different stages. For example, in ureteroscopy, the stages may include percutaneous insertion of medical instruments into the body, advancement to the site of a urinary stone, laser lithotripsy of the stone, and/or basketing of the broken stone fragments. Robotic systems typically have several sensors and input devices, allowing large amounts of data to be generated during a medical procedure. The procedure data may be used to automatically determine the different phases of an operation. By identifying these phases, the robotic system can anticipate actions and prepare accordingly for the medical professional operating the robotic system during the medical procedure.
A medical system including a robotic system may also allow video footage of a procedure to be annotated with metadata identifying the different phases. This allows a user to more easily review the video footage and allows for finer-grained analysis of the footage using artificial intelligence (AI). This may make it easier to evaluate and score actions performed by a user or operator by comparing those actions with similar actions from corresponding stages performed during other procedures. For example, the video footage and associated data may be analyzed by the AI system to generate statistics for the operation, such as the number of attempts before success for each stage or the entire procedure, the time spent in each stage, the number of articulation commands provided by the operator, the accuracy of needle insertion, and the like. Furthermore, the data may be aggregated over several operations and used to generate statistics for a general operation type, such as success rate, average operation time per stage or for the whole procedure, etc. Such medical systems may also provide additional benefits, such as generating case summaries.
In one exemplary scenario, there are different phases during percutaneous kidney access or other procedures. In an exemplary workflow, the user drives the scope to a desired calyx, marks the papilla, and retracts the scope to see the target papilla. The user then grasps the needle, selects an insertion site, and aligns the needle trajectory with the target papilla using a graphical user interface ("GUI"). Finally, the user inserts the needle to access the kidney through the target papilla while following the graphical user interface. To improve procedure efficiency and evaluate user skill, the medical system may mark the beginning and end of these events and obtain ground truth data regarding whether percutaneous access ("perc") attempts were successful.
After dividing the case data into different phases and generating a phase transition chart showing the phases, the transition chart may be used to evaluate the procedure. For example, one exemplary transition chart may show that the physician selected a target and an insertion site but, rather than moving forward to the aiming step, instead drove to a different calyx to select a new target. The chart may show that the physician did not obtain visual confirmation of access on the first percutaneous access attempt and drove the scope to locate the needle. The chart may show that the physician made another percutaneous access attempt on the same target and that visual confirmation was obtained this time. Such charts may be displayed on a GUI of the medical system, as a digital or printed report, on a mobile application, and/or on a similar type of output.
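A simple way to represent such a transition chart in software is as an ordered list of timestamped phase events. The sketch below is a hypothetical rendering of the re-targeting scenario described above, not an interface of the disclosed system; phase names and timings are invented.

```python
from dataclasses import dataclass

@dataclass
class PhaseEvent:
    phase: str
    start_s: float
    end_s: float

def transition_chart(events: list[PhaseEvent]) -> str:
    """Render a simple textual transition chart of the identified phases."""
    lines = []
    for prev, cur in zip(events, events[1:]):
        lines.append(f"{prev.phase} ({prev.end_s - prev.start_s:.0f}s) -> {cur.phase}")
    return "\n".join(lines)

# Hypothetical case: the user re-selects a target before a confirmed insertion.
events = [
    PhaseEvent("target_selection", 0, 60),
    PhaseEvent("site_selection", 60, 90),
    PhaseEvent("target_selection", 90, 150),   # drove to a different calyx
    PhaseEvent("needle_insertion", 150, 210),  # first attempt, no confirmation
    PhaseEvent("needle_insertion", 210, 240),  # second attempt, confirmed
]
print(transition_chart(events))
```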
Another potential benefit is providing ground truth (success/failure) annotations. Phase segmentation may enable prediction of whether a given percutaneous access attempt was successful, thus serving as a ground truth value for the case. The medical system may track a set of feature descriptors during the needle insertion phase to determine whether percutaneous access has been successful. The feature descriptors may include various parameters or metrics measured by the medical system, such as needle and scope speed and the relative pose of the needle with respect to the scope. They may also include scope articulation commands and features detected by a computer vision algorithm that determines whether the needle is visible in the camera view and quantifies how much anatomical motion is present. For example, there may be a direct correlation between visual confirmation and success. In one scenario, if the computer vision algorithm detects the needle in the endoscopic view, the percutaneous access attempt may be annotated or otherwise indicated as successful. In another scenario, the distance between the needle and the scope may be very small, but there is no visual confirmation of the needle in the scope view. If the scope then starts to move, this suggests that the percutaneous access attempt was unsuccessful and the user is looking for the needle or driving to another calyx to select a new target. Thus, detection of scope movement in this case may be used to annotate or otherwise indicate the percutaneous access attempt as unsuccessful.
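The rule described above (visual confirmation implies success; a small needle-to-scope distance followed by scope movement implies failure) could be sketched as a simple heuristic. The threshold and labels below are assumptions for illustration, not values from the disclosure.

```python
def annotate_attempt(needle_visible: bool,
                     needle_to_scope_mm: float,
                     scope_moving: bool) -> str:
    """Heuristic ground-truth label for a percutaneous access attempt.

    needle_visible     -- computer-vision detection of the needle in the scope view
    needle_to_scope_mm -- relative distance between needle tip and scope tip
    scope_moving       -- scope articulation/driving detected after the attempt
    """
    if needle_visible:
        return "success"      # visual confirmation in the endoscopic view
    if needle_to_scope_mm < 5.0 and scope_moving:
        return "failure"      # close to the scope, but the user is searching or re-targeting
    return "undetermined"

print(annotate_attempt(needle_visible=False, needle_to_scope_mm=3.2, scope_moving=True))
```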
Another potential benefit is providing skill assessment. Phase segmentation may enable phase-specific data analysis to be run either intra-operatively or post-operatively to evaluate physician skill and calculate case statistics. The following table shows some post-operative measurements of the percutaneous access stage.

Phase | Example metrics | Example comparisons
Scope driving | Number of articulation commands | Per case; across cases; across physicians
Needle insertion | Success rate; needle insertion accuracy | By kidney pole; against expert averages
Site selection | Site selection time; average tract length | Against expert averages; by kidney pole; by patient BMI

For example, by knowing when needle insertion begins (e.g., identified via video capture, sensor data, etc.), the medical system may determine an entry point on the skin (e.g., using kinematic data, video analysis, etc.) and calculate a site selection metric such as tract length (e.g., the distance from the skin to the papilla).
For example, during the scope driving phase, the user's skill may be evaluated based on the number of articulation commands received by the system. If fewer commands are received, the operation has proceeded smoothly, indicating higher skill. If more commands are received, several attempts had to be made, indicating that there is room for improvement. These metrics may also provide information about which portions of the anatomy the user is struggling to navigate. The number of articulation commands may be recorded and/or displayed for the operation or for multiple operations (all cases, all cases in a particular period of time, all cases performed by a user, etc.). For example, the medical system may generate metrics that are compared by location for a given case, over time across multiple operations, across physicians, and/or for the same physician.
In another example, in the needle insertion phase, the skill of the user may be assessed based on success rate and/or needle insertion accuracy. The success rate may further be broken down by kidney position, such as at the inferior, middle, or superior pole. The needle insertion accuracy may be compared to an expert average. Needle insertion accuracy may be recorded and/or displayed for the operation or for multiple operations (e.g., all cases in a particular period of time, all cases performed by a user, etc.).
In another example, in the site selection phase, the skill of the user may be assessed based on the site selection time (the time spent by the user selecting a site) and the average tract length. The site selection time may be compared to an expert average. The site selection time may be recorded and/or displayed for the operation or for multiple operations (all cases, all cases in a particular period of time, all cases performed by a user, etc.). The average tract length may further be broken down by kidney position, such as at the inferior, middle, or superior pole. The patient's tract length can be used as an indicator of the patient's Body Mass Index (BMI). This may allow case performance to be aggregated based on patient population characteristics (such as BMI values or ranges).
The table above shows just a few examples of possible metrics that may be evaluated. Furthermore, it shows only some of the ways in which these metrics can be broken down; a breakdown applied to one metric may also be applied to other metrics. In some embodiments, needle insertion accuracy may be further broken down by kidney position. Success rates may be shown in more detail by comparison with expert averages or across multiple operations (e.g., all cases within a particular period of time, all cases performed by a user, etc.).
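As a rough sketch of how such per-case skill metrics might be aggregated and compared with an expert average, the following Python fragment uses invented field names and numbers purely for illustration.

```python
from statistics import mean
from typing import Dict, List

def summarize_metric(cases: List[Dict[str, float]], metric: str,
                     expert_average: float) -> Dict[str, float]:
    """Aggregate one skill metric across cases and compare it with an expert average."""
    values = [c[metric] for c in cases if metric in c]
    avg = mean(values)
    return {"average": avg, "delta_vs_expert": avg - expert_average, "n": len(values)}

# Hypothetical per-case data for one physician.
cases = [
    {"articulation_commands": 42, "needle_accuracy_mm": 3.1, "site_selection_s": 95},
    {"articulation_commands": 35, "needle_accuracy_mm": 2.4, "site_selection_s": 80},
]
print(summarize_metric(cases, "needle_accuracy_mm", expert_average=2.0))
```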
Another potential benefit of such medical systems is workflow optimization. Workflow analysis may show a correlation between the order of workflow steps and the success and efficiency of percutaneous access. For example, the algorithm may compare cases in which site selection is performed prior to target selection with cases in which target selection is performed prior to site selection, and evaluate the impact on percutaneous access time and accuracy.
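One hypothetical way to run the workflow-order comparison described above is to split cases by step order and compare average percutaneous access times; the sketch below uses invented data and field names.

```python
from statistics import mean

def compare_workflows(cases: list[dict]) -> dict:
    """Split cases by which step came first and compare percutaneous access times."""
    target_first = [c["access_time_s"] for c in cases if c["order"] == "target_first"]
    site_first = [c["access_time_s"] for c in cases if c["order"] == "site_first"]
    return {"target_first_avg_s": mean(target_first),
            "site_first_avg_s": mean(site_first)}

# Hypothetical case records.
cases = [
    {"order": "target_first", "access_time_s": 240},
    {"order": "target_first", "access_time_s": 200},
    {"order": "site_first", "access_time_s": 310},
]
print(compare_workflows(cases))
```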
Such medical systems can be used for several types of procedures, including ureteroscopy. Kidney stone disease (also known as urolithiasis) is a relatively common medical condition that involves the formation of solid masses in the urinary tract, known as "kidney stones" or "urinary stones". Urinary stones can form and/or be found in the kidneys, the ureters, and the bladder (where they are referred to as "vesical stones"). Such urinary stones form as a result of concentrated minerals and can cause significant abdominal pain once they reach a size sufficient to obstruct the flow of urine through the ureter or urethra. Urinary stones may be formed from calcium, magnesium, ammonia, uric acid, cystine, and/or other compounds.
To remove urinary stones from the bladder and ureter, a surgeon may insert a ureteroscope into the urinary tract through the urethra. Typically, the ureteroscope includes an endoscope at its distal end configured to enable visualization of the urinary tract. The ureteroscope may also include a lithotomy mechanism (such as a basket retrieval device) to capture or break up urinary stones. During a ureteroscopy procedure, one physician/technician may control the position of the ureteroscope, while another physician/technician may control the stone removal mechanism.
In many embodiments, these techniques and systems are discussed in the context of a minimally invasive procedure. However, it should be understood that these techniques and systems may be implemented in the context of any medical procedure, including, for example, percutaneous procedures, non-invasive procedures, therapeutic procedures, diagnostic procedures, non-percutaneous procedures, or other types of procedures that access a target location through a puncture and/or small incision in the body to insert a medical instrument. For example, such techniques may be used for tumor biopsy or ablation in urology and bronchoscopy, where an automated biopsy procedure may be triggered when the system detects that a suspicious site is being approached. Endoscopic procedures may include bronchoscopy, ureteroscopy, gastroscopy, nephroscopy, nephrolithotomy, and the like. Furthermore, in many embodiments, these techniques and systems are discussed as being implemented in a robot-assisted procedure. However, it should also be appreciated that these techniques and systems may be implemented in other procedures, such as fully robotic medical procedures.
For ease of illustration and discussion, these techniques and systems are discussed in the context of removing urinary tract stones (such as kidney stones) from the kidneys. However, as noted above, these techniques and systems may be used to perform other procedures.
Medical system
FIG. 1 illustrates an exemplary medical system 100 for performing or assisting in performing a medical procedure in accordance with one or more embodiments. Embodiments of the medical system 100 may be used in surgical and/or diagnostic procedures. The medical system 100 includes a robotic system 110 configured to engage and/or control a medical instrument 120 to perform a procedure on a patient 130. The medical system 100 also includes a control system 140 configured to interact with the robotic system 110, provide information about the procedure, and/or perform a variety of other operations. For example, the control system 140 may include a display 142 to present a user interface 144 to assist the physician 160 in using the medical instrument 120. Further, the medical system 100 may include a table 150 configured to hold the patient 130 and/or an imaging sensor 180, such as a camera, x-ray, computed tomography (CT), magnetic resonance imaging (MRI), or positron emission tomography (PET) device, or the like.
In some embodiments, the physician performs a minimally invasive medical procedure, such as ureteroscopy. The physician 160 may interact with the control system 140 to control the robotic system 110 to navigate the medical device 120 (e.g., basket retrieval device and/or scope) from the urethra to the kidney 170 where the stone 165 is located. The control system 140 may provide information about the medical instrument 120 via the display 142 to assist the physician 160 in navigating, such as real-time images from the medical instrument 120 or the imaging sensor 180. Once at the site of the kidney stones, the medical device 120 may be used to break up and/or capture the urinary tract stones 165.
In some implementations using the medical system 100, the physician 160 may perform a percutaneous procedure. To illustrate, if the patient 130 has a kidney stone 165 in the kidney 170 that is too large to be removed via the urinary tract, the physician 160 can perform a procedure to remove the kidney stone via a percutaneous access point on the patient 130. For example, physician 160 may interact with control system 140 to control robotic system 110 to navigate medical device 120 (e.g., a scope) from the urethra to the kidney 170 where the stone 165 is located. The control system 140 may provide information about the medical instrument 120 via the display 142 to assist the physician 160 in navigating the medical instrument 120, such as real-time images from the medical instrument 120 or the imaging sensor 180. Once the site of the kidney stone is reached, the medical device 120 may be used to designate a target location for percutaneous access to the kidney (e.g., a desired point of entry into the kidney) for a second medical device (not shown). To minimize damage to the kidney, the physician 160 may designate a particular papilla as the target location for accessing the kidney with the second medical device. However, other target locations may be specified or determined. Once the second medical device has reached the target location, physician 160 may use the second medical device and/or another medical device to remove the kidney stone from patient 130 (such as through the percutaneous access point). Although the above-described percutaneous procedure is discussed in the context of using the medical device 120, in some implementations, the percutaneous procedure may be performed without the assistance of the medical device 120. In addition, the medical system 100 may be used to perform various other procedures.
Minimally invasive surgery offers the possibility of video recording of the procedure, as a camera (e.g., the scope of medical instrument 120) may be inserted into the body during surgery. Additional cameras and sensors located outside the body may be used to capture video and/or data about the patient and the medical system 100. For example, an operating room (OR) camera may capture video of activity in the operating room, such as movement of an operator's or physician's hands, needle position, replacement of a fluid bag, bleeding of a patient, and so forth. Details such as the number of contrast agent injections during fluoroscopy may also be captured by the OR camera and used to estimate radiation exposure to the patient. Audio recorded in the video may also be used to aid in stage recognition. For example, some robotic systems beep or otherwise emit audible noise when laser lithotripsy occurs. These videos may be archived and used later for purposes such as cognitive training, skill assessment, and workflow analysis.
Computer vision, a form of artificial intelligence (AI), allows quantitative analysis of video by a computer to identify objects and patterns. For example, in endoscopic surgery, AI video systems may be used for gesture/task classification, skill assessment, tool type recognition, and shot/event detection and retrieval. AI systems can view video of surgical procedures to track the movement and timing of instruments used during the procedure. AI systems may use metrics to track the timing of tool use, such as when and which instrument is used and for how long. Additionally, the AI system may track the path of an instrument, which may be used to evaluate the procedure or identify stages in the procedure. The AI system may determine the distance a tool travels within the surgical field, which may be related to the quality of the surgery, as better surgeons tend to keep instruments in the focal region. The AI system may also determine metrics for measuring aspects of the performance of a medical professional, including their economy of motion, how frequently they switch between instruments, and their efficiency at each step of the procedure.
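Two of the video-derived metrics mentioned above, the path length of a tracked tool and how often the operator switches instruments, can be computed directly from per-frame tracking output. The sketch below assumes hypothetical tracking data and is not the AI system described here.

```python
import math

def path_length(points: list[tuple[float, float]]) -> float:
    """Total 2D distance traveled by a tracked tool tip across video frames."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def switch_count(tool_per_frame: list[str]) -> int:
    """Number of times the detected instrument changes between consecutive frames."""
    return sum(1 for a, b in zip(tool_per_frame, tool_per_frame[1:]) if a != b)

# Hypothetical tracking output.
print(path_length([(0, 0), (3, 4), (3, 10)]))                  # 11.0
print(switch_count(["basket", "basket", "laser", "basket"]))   # 2
```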
In the example of fig. 1, the medical device 120 is implemented as a basket retrieval device. Thus, for ease of discussion, the medical device 120 is also referred to as a "basket retrieval device 120". However, medical device 120 may be implemented as various types of medical devices including, for example, a scope (sometimes referred to as an "endoscope"), a needle, a catheter, a guidewire, a lithotripter, a clamp, a vacuum, a scalpel, combinations thereof, and the like. In some embodiments, the medical device is a steerable device, while in other embodiments the medical device is a non-steerable device. In some embodiments, a surgical tool refers to a device, such as a needle, scalpel, guidewire, or the like, configured to puncture or be inserted through the human anatomy. However, surgical tools may refer to other types of medical instruments. In some embodiments, a variety of medical devices may be used. For example, an endoscope may be used together with the basket retrieval device 120. In some embodiments, medical device 120 may be a composite device incorporating several devices, such as a vacuum, a basket retrieval device, a scope, or various combinations of devices.
In some embodiments, medical device 120 may include a radio frequency identification (RFID) chip for identifying the medical device 120. The medical system 100 may include an RFID reader to read the RFID chip in a medical instrument to help identify the instrument. Such information may be used to help identify the procedure and its stages. For example, if the RFID data identifies the instrument as a needle, the stage is likely related to needle insertion, but determining the exact stage may require combining the RFID data with additional data such as video, device status, and telemetry (e.g., magnetic tracking, robotic data, fluid data, etc.).
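A toy example of combining RFID-based instrument identity with video and robot-state data to narrow down the stage might look as follows; the labels and rules are hypothetical assumptions, not part of the disclosed system.

```python
def infer_stage(rfid_tool: str, video_label: str, robot_state: str) -> str:
    """Combine RFID tool identity with video and robot state to narrow down the stage.

    RFID alone only identifies the instrument; the stage is resolved by adding
    the other data sources, mirroring the combination described above.
    """
    if rfid_tool == "needle":
        if robot_state == "arm_aligned" and video_label == "papilla_targeted":
            return "percutaneous_needle_insertion"
        return "needle_preparation"
    if rfid_tool == "basket" and video_label == "stone_visible":
        return "ureteroscopy_basketing"
    return "unknown"

print(infer_stage("needle", "papilla_targeted", "arm_aligned"))
```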
The robotic system 110 may be configured to facilitate medical procedures. The robotic system 110 may be arranged in a variety of ways, depending on the particular procedure. The robotic system 110 may include one or more robotic arms 112 (A), 112 (B), 112 (C) to engage with and/or control the medical instrument 120 to perform a procedure. As shown, each robotic arm 112 may include a plurality of arm segments coupled to joints, which may provide a plurality of degrees of movement. In the example of fig. 1, robotic system 110 is positioned proximate to the lower torso of patient 130, and robotic arm 112 is actuated to engage and position medical instrument 120 in order to access an access point (such as the urethra of patient 130). With the robotic system 110 properly positioned, the medical device 120 may be inserted into the patient 130 by the physician 160 robotically using the robotic arm 112, manually, or by a combination of both.
The robotic system 110 may also include a base 114 coupled to the one or more robotic arms 112. The base 114 may include various subsystems such as control electronics, power supplies, pneumatics, light sources, actuators (e.g., motors for moving the robotic arm), control circuitry, memory, and/or a communication interface. In some embodiments, base 114 includes an input/output (I/O) device 116 configured to receive input (such as user input for controlling robotic system 110) and provide output (such as patient status, medical instrument position, etc.). The I/O device 116 may include a controller, mouse, keyboard, microphone, touch pad, other input device, or a combination thereof. The I/O device may include an output component such as a speaker, a display, a haptic feedback device, other output devices, or a combination of the above. In some embodiments, the robotic system 110 is movable (e.g., the base 114 includes wheels) such that the robotic system 110 can be positioned in a location suitable or desired for the procedure. In other embodiments, robotic system 110 is a stationary system. Furthermore, in some embodiments, robotic system 110 is integrated into workstation 150.
The robotic system 110 may be coupled to any component of the medical system 100, such as the control system 140, the table 150, the imaging sensor 180, and/or the medical instrument 120. In some embodiments, the robotic system is communicatively coupled to the control system 140. In one example, robotic system 110 may receive control signals from control system 140 to perform operations, such as positioning robotic arm 112 in a particular manner, maneuvering a scope, and the like. In response, the robotic system 110 may control components of the robotic system 110 to perform the operations. In another example, robotic system 110 may receive an image from the scope depicting the internal anatomy of patient 130 and/or send the image to control system 140 (where it may then be displayed). Further, in some embodiments, the robotic system 110 is coupled to a component of the medical system 100 (such as the control system 140) to receive data signals, power, and the like. Other devices (such as other medical instruments, IV bags, blood bags, etc.) may also be coupled to the robotic system 110 or other components of the medical system 100, depending on the medical procedure being performed.
The control system 140 may be configured to provide various functionalities to assist in performing medical procedures. In some embodiments, the control system 140 may be coupled to the robotic system 110 and interoperate with the robotic system 110 to perform a medical procedure on the patient 130. For example, control system 140 may communicate with robotic system 110 via a wireless or wired connection (e.g., to control robotic system 110, basket retrieval device 120, receive images captured by a scope, etc.), control the flow of fluid through robotic system 110 via one or more fluid channels, provide power to robotic system 110 via one or more electrical connections, provide optical signals to robotic system 110 via one or more optical fibers or other components, and so forth. Further, in some embodiments, control system 140 may communicate with the scope to receive sensor data. Further, in some embodiments, the control system 140 may communicate with the table 150 to position the table 150 in a particular orientation or otherwise control the table 150.
As shown in FIG. 1, the control system 140 includes various I/O devices configured to assist a physician 160 or other person in performing a medical procedure. In some embodiments, the control system 140 includes an input device 146 employed by a physician 160 or another user to control the basket retrieval device 120. For example, the input device 146 may be used to navigate the basket retrieval device 120 within the body of the patient 130. The physician 160 may provide input via the input device 146 and, in response, the control system 140 may send control signals to the robotic system 110 to manipulate the medical instrument 120.
In some implementations, the input device 146 is a controller similar to a game controller. The controller may have a plurality of axes and buttons that may be used to control the robotic system 110. Although the input device 146 is shown as a controller in the example of fig. 1, the input device 146 may be implemented as various types of I/O devices or combinations thereof, such as a touch screen/pad, mouse, keyboard, microphone, smart speaker, etc. As also shown in fig. 1, the control system 140 may include a display 142 to provide various information regarding the procedure. For example, control system 140 may receive real-time images captured by the scope and display the real-time images via display 142. Additionally or alternatively, the control system 140 may receive signals (e.g., analog signals, digital signals, electrical signals, acoustic/sonic signals, pneumatic signals, tactile signals, hydraulic signals, etc.) from medical monitors and/or sensors associated with the patient 130, and the display 142 may present information regarding the health of the patient 130 and/or the environment of the patient 130. Such information may include information displayed via a medical monitor, including, for example, heart rate (e.g., electrocardiogram (ECG), heart rate variability (HRV), etc.), blood pressure/rate, muscle biosignals (e.g., electromyogram (EMG)), body temperature, oxygen saturation (e.g., SpO2), carbon dioxide (CO2), brain waves (e.g., electroencephalogram (EEG)), ambient temperature, and the like.
Fig. 1 also illustrates various anatomical structures of a patient 130 associated with certain aspects of the present disclosure. Specifically, patient 130 includes a kidney 170 fluidly connected to bladder 171 via ureter 172, and a urethra 173 fluidly connected to bladder 171. As shown in the enlarged view of kidney 170, the kidney includes calyces 174 (including major and minor calyces), renal papillae (including renal papilla 176, also referred to as "papilla 176"), and renal pyramids (including renal pyramid 178). In these examples, the kidney stone 165 is located near the papilla 176. However, kidney stones may be located elsewhere within kidney 170.
As shown in fig. 1, to remove kidney stones 165 in an exemplary minimally invasive procedure, physician 160 may position robotic system 110 at the foot of table 150 to initiate delivery of medical device 120 into patient 130. In particular, the robotic system 110 may be positioned near the lower abdominal region of the patient 130 and aligned to directly linearly access the urethra 173 of the patient 130. The robotic arm 112 (B) may be controlled from the bottom of the table 150 to provide access to the urethra 173. In this example, the physician 160 inserts the medical device 120 at least partially into the urethra along the direct linear access path (sometimes referred to as a "virtual track"). Medical device 120 may include a lumen configured to receive a scope and/or basket retrieval device, thereby facilitating insertion of these devices into the anatomy of patient 130.
Once robotic system 110 is properly positioned and/or medical device 120 is at least partially inserted into urethra 173, the scope may be robotically, manually, or a combination of both inserted into patient 130. For example, physician 160 may connect medical instrument 120 to robotic arm 112 (C). The physician 160 may then interact with the control system 140 (such as the input device 146) to navigate the medical instrument 120 within the patient 130. For example, physician 160 may provide input via input device 146 to control robotic arm 112 (C) to navigate basket retrieval device 120 through urethra 173, bladder 171, ureter 172, and up to kidney 170.
The control system 140 may include various components (sometimes referred to as "subsystems") to facilitate its functionality. For example, the control system 140 may include various subsystems such as control electronics, power supplies, pneumatic devices, light sources, actuators, control circuitry, memory, and/or communication interfaces. In some embodiments, control system 140 comprises a computer-based control system that stores executable instructions that, when executed, perform various operations. In some embodiments, the control system 140 is mobile, as shown in fig. 1, while in other embodiments, the control system 140 is a stationary system. Although various functions and components implemented by control system 140 are discussed, any of these functions and/or components may be integrated into and/or performed by other systems and/or devices, such as robotic system 110 and/or workstation 150.
The medical system 100 may provide a variety of benefits, such as providing guidance to assist a physician in performing a procedure (e.g., instrument tracking, patient status, etc.), enabling a physician to perform a procedure from an ergonomic position without awkward arm movements and/or positions, enabling a single physician to perform a procedure using one or more medical instruments, avoiding radiation exposure (e.g., associated with fluoroscopy techniques), enabling a procedure to be performed in a single surgical environment, providing continuous aspiration to more effectively remove an object (e.g., remove kidney stones), etc. Furthermore, the medical system 100 may provide non-radiation-based navigation and/or positioning techniques to reduce physician exposure to radiation and/or to reduce the amount of equipment in the operating room. Further, the medical system 100 may divide functionality between the control system 140 and the robotic system 110, each of which may be independently movable. Such partitioning of functionality and/or mobility may enable the control system 140 and/or the robotic system 110 to be placed at a location that is optimal for a particular medical procedure, which may maximize the work area around the patient and/or provide an optimal location for a physician to perform the procedure. For example, many aspects of the procedure may be performed by the robotic system 110 (which is positioned relatively close to the patient), while the physician manages the procedure from the comfort of the control system 140 (which may be positioned farther away).
In some embodiments, the control system 140 may function even if located in a different geographic location than the robotic system 110. For example, in a telemedicine implementation, the control system 140 is configured to communicate with the robotic system 110 over a wide area network. In one scenario, the physician 160 may be located in one hospital with the control system 140, while the robotic system 110 is located in a different hospital. The physician may then remotely perform the medical procedure. This may be beneficial if the remote hospital (such as a hospital in a rural area) has limited expertise in particular procedures. Such hospitals can then rely on more experienced physicians elsewhere. In some implementations, the control system 140 can be paired with various robotic systems 110, for example, by selecting a particular robotic system and forming a secure network connection (e.g., using a password, encryption, authentication token, etc.). Thus, a physician at one location can perform medical procedures at a variety of different locations by establishing a connection with the robotic system 110 at each of these different locations.
In some embodiments, robotic system 110, workstation 150, medical device 120, needle, and/or imaging sensor 180 are communicatively coupled to each other by a network, which may include a wireless network and/or a wired network. Exemplary networks include one or more Personal Area Networks (PANs), one or more Local Area Networks (LANs), one or more Wide Area Networks (WANs), one or more Internet Area Networks (IAN), one or more cellular networks, the internet, and the like. Further, in some embodiments, control system 140, robotic system 110, workstation 150, medical device 120, and/or imaging sensor 180 are connected via one or more support cables for communication, fluid/gas exchange, power exchange, and the like.
Although not shown in fig. 1, in some embodiments, the medical system 100 includes and/or is associated with a medical monitor configured to monitor the health of the patient 130 and/or the environment in which the patient 130 is located. For example, the medical monitor may be located in the same environment as the medical system 100, such as an operating room. The medical monitor may be physically and/or electrically coupled to one or more sensors configured to detect or determine one or more physical, physiological, chemical, and/or biological signals, parameters, attributes, states, and/or conditions associated with the patient 130 and/or the environment. For example, the one or more sensors may be configured to determine/detect any type of physical attribute, including temperature, pressure, vibration, tactile (haptic) characteristics, sound, optical level or characteristics, load or weight, flow rate (e.g., of a target gas and/or liquid), amplitude, phase, and/or orientation of magnetic and electric fields, concentration of constituents associated with a substance in gas, liquid, or solid form, and the like. The one or more sensors may provide sensor data to the medical monitor, and the medical monitor may present information regarding the health of the patient 130 and/or the environment of the patient 130. Such information may include information displayed via the medical monitor, including, for example, heart rate (e.g., ECG, HRV, etc.), blood pressure/rate, muscle biosignals (e.g., EMG), body temperature, oxygen saturation (e.g., SpO2), CO2, brain waves (e.g., EEG), ambient temperature, etc. In some embodiments, the medical monitor and/or the one or more sensors are coupled to the control system 140, and the control system 140 is configured to provide information regarding the health of the patient 130 and/or the environment of the patient 130.
Urinary stone capture
Fig. 2A-2B illustrate perspective views of the medical system 100 while performing a urinary stone capture procedure. In these examples, medical system 100 is disposed in an operating room to remove kidney stones from within patient 130. In many examples of this procedure, the patient 130 is positioned in a modified supine position, with the patient 130 tilted slightly to one side so that the back or side of the patient 130 can be accessed. As shown in fig. 1, the urinary stone capture procedure can also be performed with the patient in a normal supine position. Although fig. 2A-2B illustrate the use of the medical system 100 to perform a minimally invasive procedure to remove kidney stones from the body of the patient 130, the medical system 100 may be used to remove kidney stones in other ways and/or to perform other procedures. In addition, the patient 130 may be placed in other positions as required by the procedure. Various actions performed by the physician 160 are described in fig. 2A-2B and throughout the disclosure. It should be appreciated that these actions may be performed directly by the physician 160, indirectly by the physician via the medical system 100, by a user under the direction of the physician, by another user (e.g., a technician), and/or by any other user.
Although a particular robotic arm of robotic system 110 is shown as performing particular functions in the context of fig. 2A-2B, any robotic arm 112 may be used to perform these functions. Further, any additional robotic arms and/or systems may be used to perform this procedure. Further, the robotic system 110 may be used to perform other parts of the procedure.
As shown in fig. 2A, the basket retrieval device 120 is maneuvered into the kidney 170 to access the urinary stone 165. In some cases, the physician 160 or other user uses the input device 146 to directly control movement of the basket retrieval device 120. Such directly controlled movements may include insertion/retraction, bending the basket retrieval device 120 to the left or right, rotation, and/or opening/closing of the basket. Using various movements, the basket retrieval device 120 is placed adjacent to the stone.
In some embodiments, a laser, shock wave device, or other device is used to break up the stone. The laser or other device may be incorporated into the basket retrieval device 120, or may be a separate medical instrument. In some cases, the stones 165 are small enough that breaking up the stones into smaller pieces is not required.
As shown in fig. 2B, the open basket is maneuvered to enclose the urethral stone 165 or smaller pieces of the urethral stone. Basket retrieval device 120 is then removed from kidney 170 and then removed from the patient.
If additional stones (or large fragments of broken stones 165) are present, the basket retrieval device 120 may be reinserted into the patient to capture the remaining large fragments. In some embodiments, a vacuum instrument may be used to facilitate removal of debris. In some cases, the stones may be small enough that the patient can naturally expel the stones.
Stage segmentation and stage identification
Automated surgical workflow analysis can be used to detect different stages in a procedure and evaluate surgical skills and procedure efficiency. Data (e.g., video data) collected during a procedure may be partitioned into multiple portions using, for example, machine learning methods, including but not limited to Hidden Markov Models (HMMs) and Long Short-Term Memory (LSTM) networks.
In surgical phase segmentation, captured medical procedure data is automatically segmented into phases, with each phase identified using input data from the operating room. Segmentation may be done in real time during the procedure or performed on recorded data after surgery. In one embodiment, the surgical data may be pre-processed using dynamic time warping to divide the phases into equal, comparable segments. The input data may include instrument signals, annotations, tracking of instruments (e.g., EM tracking), or information obtained from video.
The identification of the surgical workflow may be performed at different granularity levels, depending on the procedure. Stages and steps may be identified (higher level) or gestures and activities may be identified (lower level). Surgical phase recognition may be performed on time series, kinematic data, and video data using machine learning methods such as HMMs, Gaussian Mixture Models (GMMs), and Support Vector Machines (SVMs), as well as deep learning-based methods that use Convolutional Neural Networks (CNNs) for phase recognition from video data. For surgical gesture and activity recognition, similar methods (SVMs, Markov models) may be used, primarily on video data or combinations of video and kinematic data, as well as more recent deep learning-based methods (such as CNNs) that can recognize tool presence, tasks, and activities in video data. Phase segmentation may use multiple data sources to segment procedure data into different subtasks (as shown in fig. 3) or a single data source (such as video) to classify the current phase (as shown in fig. 4). In fig. 4, additional data (e.g., sensor data or UI data) may then be incorporated to further refine the output produced by the control system 140.
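As a concrete illustration of the HMM-style segmentation mentioned above, the following Python sketch decodes a per-frame phase sequence from frame-level phase likelihoods using a Viterbi pass. It is a minimal, hypothetical example: the phase names, transition matrix, and likelihoods are placeholders, not values from the disclosed system.

```python
# Minimal sketch (not the patented implementation): decode a phase sequence
# from per-frame observation likelihoods with an HMM-style Viterbi pass.
# Phase names, the transition matrix, and probabilities are illustrative.
import numpy as np

PHASES = ["survey", "laser_lithotripsy", "basketing"]

# Transition matrix: probability of moving from one phase to another between
# consecutive frames (phases tend to persist from frame to frame).
TRANS = np.array([
    [0.98, 0.01, 0.01],
    [0.01, 0.98, 0.01],
    [0.01, 0.01, 0.98],
])

def viterbi(obs_likelihoods: np.ndarray, start: np.ndarray) -> list[str]:
    """obs_likelihoods: (n_frames, n_phases) per-frame likelihoods, e.g.
    produced by a frame-level classifier over video features."""
    n_frames, n_phases = obs_likelihoods.shape
    log_delta = np.log(start) + np.log(obs_likelihoods[0])
    back = np.zeros((n_frames, n_phases), dtype=int)
    for t in range(1, n_frames):
        scores = log_delta[:, None] + np.log(TRANS)   # scores[i, j]: prev i -> curr j
        back[t] = scores.argmax(axis=0)
        log_delta = scores.max(axis=0) + np.log(obs_likelihoods[t])
    path = [int(log_delta.argmax())]
    for t in range(n_frames - 1, 0, -1):              # backtrack the best path
        path.append(int(back[t, path[-1]]))
    return [PHASES[i] for i in reversed(path)]
```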
In fig. 3, the control system 140 receives various input data from the medical system 100. Such inputs may include video data 305 captured by the imaging sensor 180, robot sensor data 310 from one or more sensors of the robotic system 110, and User Interface (UI) data received from the input device 146.
Video data 305 may include video captured from a scope deployed in a patient, video captured from a camera in an operating room, and/or video captured by a camera of robotic system 110. The robot sensor data 310 may include kinematic data (e.g., using vibration, accelerometer, positioning, and/or gyroscopic sensors), device status, temperature, pressure, vibration, tactile/haptic characteristics, sound, optical level or characteristics, load or weight, flow rate (e.g., of a target gas and/or liquid), amplitude, phase, and/or orientation of magnetic and electric fields, concentration of constituents associated with a substance in gas, liquid, or solid form, etc. from the robotic system 110. UI data 315 may include button presses, menu selections, page selections, gestures, voice commands, etc., made by a user and captured by an input device of medical system 100. Patient sensor data, such as those described above in fig. 1, may also be used as input to the control system 140.
The control system 140 may analyze the video data 305 (e.g., using a machine learning algorithm) and identify phases of the medical procedure using the robot sensor data 310 and the UI data 315. In one example, a medical procedure such as ureteroscopy includes several tasks (e.g., task 1 through task 5). Each task may be performed in one or more phases of a medical procedure. In the example shown in fig. 3, task 1 is performed in phase 1. Task 2 is performed in phases 2 and 4. Task 3 is performed in stage 3 and stage 5. Task 4 is performed in stages 6 and 8. Task 5 is performed in phase 7. Time 1 (T1) represents the time taken to complete phase 1, time 2 (T2) represents the time taken to complete phase 2, and time 3 (T3) represents the time taken to complete phase 3. Other protocols may have different numbers of tasks and/or different numbers of phases.
For robotic procedures where manual and automated tasks exist, surgical stage detection can be used to automatically and seamlessly transition between manual and automated tasks. For example, T1 may correspond to a manual task, T2 may be an automated task, and T3 may likewise be a manual task. In one embodiment, the target selection step may be performed autonomously by the robot driving the scope while the target selection phase is active. Alternatively, the user may perform site selection by picking points on the skin using EM markers, and the robot may autonomously align the needle to the target insertion trajectory.
Fig. 4A illustrates a block diagram of the control system 140 configured to generate output from video data of a medical procedure utilizing machine learning, in accordance with certain embodiments. In some embodiments, the control system 140 is configured to first process the video data 305 using a machine learning algorithm. In one embodiment, the video data 305 is processed by the CNN 405 to identify features recorded in the video, such as surgical tools, stones, human anatomy (e.g., the papilla), and the like. Such identified features 415 may be provided as input to a Recurrent Neural Network (RNN) 410 along with the original video. The RNN 410 can then process the video data 305 and the identified features 415 to generate an output 412 identifying a stage 420 in the medical procedure.
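The following PyTorch sketch illustrates the general two-stage arrangement described for fig. 4A, with a per-frame CNN proposing feature logits and an RNN consuming feature sequences to predict the phase. Layer sizes, feature counts, and phase counts are assumptions for illustration only and are not taken from the disclosure.

```python
# Minimal PyTorch sketch (assumed layer sizes and class counts) of a CNN that
# proposes per-frame features (e.g., tool/stone/papilla presence) and an RNN
# that consumes the feature sequence to predict the current phase.
import torch
import torch.nn as nn

class FrameFeatureCNN(nn.Module):
    def __init__(self, n_features: int = 16):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(64, n_features)   # per-frame feature logits

    def forward(self, frames):                  # frames: (batch, 3, H, W)
        return self.head(self.backbone(frames))

class PhaseRNN(nn.Module):
    def __init__(self, n_features: int = 16, n_phases: int = 8):
        super().__init__()
        self.rnn = nn.LSTM(n_features, 64, batch_first=True)
        self.classifier = nn.Linear(64, n_phases)

    def forward(self, feature_seq):             # (batch, time, n_features)
        out, _ = self.rnn(feature_seq)
        return self.classifier(out)             # per-frame phase logits
```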
Supplemental data such as robot sensor data 310 or UI data 315 may then be used to further refine (e.g., increase accuracy or increase the number of identifications) the identified features 415 and the identified stages 420. In other embodiments, the robot sensor data 310 and/or the UI data 315 may be used to narrow down possible options considered by the control system 140 before the video data 305 is processed by the control system 140. For example, the supplemental data may be used to identify a particular procedure, which reduces the population of possible tasks and phases to those corresponding to the particular procedure. The control system 140 may then limit the identified features 415 and the identified phases 420 to those corresponding to the particular procedure. For example, if a task is initially identified by the control system 140 in the video data 305, but the task is not associated with a particular procedure, the control system 140 may reprocess the video until the task is re-identified as a task corresponding to the particular procedure.
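A minimal sketch of the narrowing step described above follows: once supplemental data identifies the procedure, candidate phases detected in the video are restricted to those valid for that procedure. The procedure-to-phase mapping is illustrative, not exhaustive.

```python
# Sketch of restricting detected phases to those associated with the
# identified procedure. The mapping below is illustrative only.
PROCEDURE_PHASES = {
    "ureteroscopy": {"survey", "laser_lithotripsy", "basketing"},
    "percutaneous_access": {"target_selection", "site_selection", "needle_insertion"},
}

def candidate_phases(procedure: str, detected_phases: list[str]) -> list[str]:
    allowed = PROCEDURE_PHASES.get(procedure, set())
    # Phases detected in video but not valid for this procedure are dropped
    # (or flagged for re-processing, as described above).
    return [p for p in detected_phases if p in allowed]
```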
After completing the processing of the video data 305, the control system 140 may generate an annotated video that includes the identified features 415 and/or the identified stages 420. Such annotations may be stored as part of the video (e.g., in the same video file), as metadata stored with the video, in a database and/or other data format.
By creating metadata-enhanced video, the video becomes easier to use for reviewing medical procedures. For example, instead of manually searching for the time at which a particular phase occurs, the viewer may jump forward or backward to the particular phase of interest. In addition, multiple videos may be more easily processed to aggregate data and generate metrics. For example, multiple videos may be searched for instances of a particular stage (e.g., needle insertion or stone capture) and analyzed to generate metrics about that stage (e.g., success rate, average number of attempts, etc.).
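The following sketch illustrates one way such aggregation could be computed from annotated videos; the annotation schema (phase name, start/end time, success flag) is hypothetical.

```python
# Sketch of aggregating metrics over annotated videos. Assumes each video has
# an annotation dict like {"phases": [{"name": ..., "start_s": ..., "end_s": ...,
# "success": ...}]}; the schema is hypothetical.
def phase_metrics(annotations: list[dict], phase_name: str) -> dict:
    instances = [p for a in annotations for p in a["phases"] if p["name"] == phase_name]
    if not instances:
        return {"count": 0}
    durations = [p["end_s"] - p["start_s"] for p in instances]
    successes = [p for p in instances if p.get("success")]
    return {
        "count": len(instances),
        "mean_duration_s": sum(durations) / len(durations),
        "success_rate": len(successes) / len(instances),
    }
```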
Although fig. 4A shows the video data 305 being processed by the control system 140, other types of data may be processed by the control system 140, either serially or in parallel with each other. For example, such data may include instrument positioning as measured by electromagnetic tracking sensors, and robotic system 110 data such as the distance of scope insertion, the manner in which the scope is articulated, whether the basket is open or closed, the distance of basket insertion, and/or the connection status of the robotic system. The data may be provided as input to a single neural network or to multiple neural networks. For example, each different type of sensor (e.g., video, device status, telemetry such as magnetic tracking, robotic data, and/or fluid data) may have its own network, and the outputs of these networks may be concatenated prior to the final stage classification layer to obtain a single stage prediction.
Fig. 4B illustrates one such embodiment, wherein different types of data from different devices and/or sensors are processed by different neural networks. The video data 305 may be processed by a first neural network 425 (e.g., CNN and/or RNN as described in fig. 4A), the robot sensor data 310 may be processed by a second neural network 430, and the UI data may be processed by a third neural network 435. The outputs from the different neural networks may then be combined to generate an output 412 (e.g., phase prediction) for the medical system 100.
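A minimal sketch of the late-fusion arrangement of fig. 4B follows: each data stream is processed by its own small network and the outputs are concatenated before a final phase-classification layer. Feature dimensions and phase counts are assumptions for illustration.

```python
# Sketch (assumed sizes) of late fusion: per-stream networks whose outputs are
# concatenated before a final phase classifier.
import torch
import torch.nn as nn

class FusionPhaseClassifier(nn.Module):
    def __init__(self, video_dim=64, robot_dim=32, ui_dim=16, n_phases=8):
        super().__init__()
        self.video_net = nn.Sequential(nn.Linear(video_dim, 32), nn.ReLU())
        self.robot_net = nn.Sequential(nn.Linear(robot_dim, 16), nn.ReLU())
        self.ui_net = nn.Sequential(nn.Linear(ui_dim, 8), nn.ReLU())
        self.classifier = nn.Linear(32 + 16 + 8, n_phases)

    def forward(self, video_feat, robot_feat, ui_feat):
        fused = torch.cat(
            [self.video_net(video_feat), self.robot_net(robot_feat), self.ui_net(ui_feat)],
            dim=-1,
        )
        return self.classifier(fused)   # single phase prediction per sample
```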
Stage identification process
Fig. 5 is a flow diagram of a stage identification process 500 according to some embodiments. The phase identification process 500 may be performed by the control system 140 or another component of the medical system 100 of fig. 1. While one possible sequence of the process is described below, other embodiments may perform the process in a different order or may include additional steps or may exclude one or more of the steps described below.
At block 505, the control system 140 identifies an input from the UI of the robotic system. For example, input may be received from an input device 146, such as a controller or touch screen. Possible inputs may include selection of a procedure stage or of a UI screen associated with a particular procedure stage. For example, a first screen may list options for a first procedure and a second screen may list options for a second procedure. If the user makes selections on the first screen, these selections indicate that the user is performing the first procedure. If the user makes selections on the second screen, these selections indicate that the user is performing the second procedure. Thus, by organizing the screens of the UI to correspond to particular phases, the control system 140 may obtain phase information based on the user's selections. In another example, one embodiment of the medical system 100 may include a UI with a first screen showing alternative stone management procedures, such as ureteroscopy, percutaneous access, or minimally invasive percutaneous nephrolithotomy (PCNL). If the user selects ureteroscopy, the control system 140 may determine that the phases are associated with ureteroscopy (e.g., basket loading, laser lithotripsy, and/or surveying the kidney). Likewise, selection of another stone management procedure indicates that the phases are associated with the corresponding procedure.
At block 510, the control system 140 determines a procedure from a set of procedures based on at least one of the UI input and the sensor data. As described above, input from the UI can be used to identify the currently possible procedure phases. In addition, robot sensor data may also be used to identify the procedure. For example, if it is determined that an arm of the robotic system 110 is approaching the patient while holding a surgical instrument, the control system 140 may determine that the current procedure is related to insertion of the medical instrument.
At block 515, the control system 140 may narrow down the set of identifiable procedure phases to a subset of procedure phases based on the determined procedure. For example, laser lithotripsy may be associated with a task or stage such as activating a laser or stopping a laser. Basket loading may be associated with tasks or stages such as capturing stones or retrieving baskets. Insertion of the medical instrument 120 may be associated with aligning the instrument with a target and inserting the instrument into a target site. In one example, if the control system 140 determines that the current procedure is basket loading during ureteroscopy, the control system 140 may narrow the possible stages to capture stones or retrieve baskets.
At block 520, the control system 140 may determine a position of a robotic manipulator (e.g., robotic arm 112) from sensor data of the robotic system 110. As depicted in fig. 3, various types of sensors may be used to generate sensor data, which may then be used to determine position.
At block 525, the control system 140 may perform an analysis of the captured video. In some embodiments (such as those described in fig. 4), machine learning algorithms are used to perform the analysis and generate output, such as identified features and a temporary stage identification. The output may include an identification of a physical object, such as a surgical tool or a portion of an anatomical structure. For example, if the control system 140 identifies a ureter in the captured video, this indicates that the stage is not related to percutaneous access. Similarly, identifying the papilla indicates that the stage is not associated with basket loading. Identification of other types of anatomical structures may similarly be used to eliminate the possibility of certain stages.
At block 530, the control system 140 may identify phases from a subset of the protocol phases based at least on the position of the robotic manipulator and the analysis performed. For example, if the control system 140 is receiving basket input via the controller, the control system 140 may determine that this phase is one of the basket phases. Additionally, if the performed analysis identifies that the captured video is showing a basket proximate to broken kidney stones, the control system 140 may determine that the current stage is capturing stones. In another example, if the performed analysis identifies that the captured video is showing a basket being removed from broken kidney stones, the control system 140 may determine that the current stage is to retract the basket into the sheath. In another example, the kinematic data from the robotic system 110 may indicate that the medical instrument is being removed from the patient, and the control system 140 may determine that the current stage is to retract the basket into the sheath.
At block 535, the control system 140 may generate a video marker for the identified stage of the captured video. The video marker may be embedded as metadata in the same file as the video, stored as a separate file associated with the video file, stored as metadata in a database for video annotations, and so forth.
In some implementations, the video file is annotated so that a viewer of the video file can jump to a particular stage in the video. For example, a video may be divided into chapters or segments corresponding to different phases. In one embodiment, a search bar of a video may be marked with color segments corresponding to different phases, where each phase is represented by a different color.
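One possible (hypothetical) way to store such stage markers is as a JSON sidecar file next to the video, which a review tool could read to offer chapter-style navigation; the file layout and field names below are illustrative only.

```python
# Sketch of writing stage markers as a JSON sidecar so a viewer can jump to a
# stage; the file layout and field names are illustrative only.
import json

def write_stage_markers(video_path: str, stages: list[dict]) -> str:
    # stages: e.g., [{"name": "basketing", "start_s": 812.0, "end_s": 1040.5}]
    sidecar = video_path + ".stages.json"
    with open(sidecar, "w") as f:
        json.dump({"video": video_path, "stages": stages}, f, indent=2)
    return sidecar
```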
At block 550, the control system 140 may determine whether the end of the video has been reached. If so, the process 500 may end. If not, the process 500 may loop back to block 520 to continue identifying additional phases. For example, the process 500 may cycle one, two, three, or more times to identify a first stage, a second stage, a third stage, or more stages. The captured video may then end up with one or more video markers, depending on the number of identified phases.
Triggering an automated action
Fig. 6 is a flow diagram of a triggering process 600 for an automated robotic action, according to some embodiments. The triggering process 600 may be performed by the control system 140 or another component of the medical system 100 of fig. 1. While one possible sequence of the process is described below, other embodiments may perform the process in a different order or may include additional steps or may exclude one or more of the steps described below.
At block 605, the control system 140 may determine a state of a robotic manipulator (e.g., robotic arm 112) from sensor data (e.g., kinematic data) of the robotic system 110. As depicted in fig. 3, various types of sensors may be used to generate sensor data, which may then be used to determine the position or other status of the robotic manipulator.
At block 610, the control system 140 may determine an input to initiate an action of the robotic manipulator. For example, the input may be from a user manipulating the controller to control the basket apparatus. In another example, the input may be a screen selection or a menu selection on a UI of the medical system 100.
At block 615, the control system 140 may perform an analysis of the captured video. In some embodiments (such as those described in fig. 4), machine learning algorithms are used to perform analysis and generate output, such as identified features and temporary stage identification.
At block 620, the control system 140 may identify a stage of the medical procedure based at least on the state of the manipulator, the identified input, and the performed analysis. For example, if the control system 140 is receiving basket input via the controller, the control system 140 may determine that this phase is one of the basket phases. Additionally, if the performed analysis identifies that the captured video is showing a basket proximate to broken kidney stones, the control system 140 may determine that the current stage is capturing stones. In another example, if the performed analysis identifies that the captured video is showing a basket being removed from broken kidney stones, the control system 140 may determine that the current stage is to retract the basket into the sheath. In another example, the kinematic data from the robotic system 110 may indicate that the medical instrument is being removed from the patient, and the control system 140 may determine that the current stage is to retract the basket into the sheath.
At block 625, the control system 140 may trigger an automatic action of the robotic system 110 based on the identified stage. The action triggered may vary based on the type of procedure being performed. Some possible actions are shown in blocks 630, 635, and 640. At block 630, the robotic system 110 performs an action during ureteroscopic laser lithotripsy. At block 635, the robotic system 110 performs an action during insertion of a medical instrument (such as a needle). At block 640, the robotic system 110 performs an action during ureteroscopic basket loading. After causing the action of the robotic system 110, the triggering process 600 may end. Fig. 7 depicts additional details regarding specific actions that may be triggered.
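The following sketch illustrates, under assumed stage and action names, how an identified stage might be dispatched to a fully automatic or confirmation-gated robotic action; the robot interface shown is hypothetical.

```python
# Sketch of dispatching an automated action once a stage is identified
# (block 625). Stage names, action names, and the robot interface are
# assumptions for illustration.
TRIGGERS = {
    "capture_stone":  {"action": "actuate_basket",      "needs_confirmation": False},
    "retract_basket": {"action": "retract_into_sheath", "needs_confirmation": False},
    "needle_insert":  {"action": "insert_needle",       "needs_confirmation": True},
}

def trigger_action(stage: str, user_confirmed: bool, robot) -> None:
    cfg = TRIGGERS.get(stage)
    if cfg is None:
        return  # no automated action for this stage
    if cfg["needs_confirmation"] and not user_confirmed:
        return  # wait for the user before, e.g., inserting a needle
    robot.execute(cfg["action"])  # hypothetical robot interface
```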
Fig. 7 is a diagram illustrating different types of triggered actions of robotic system 110 according to some embodiments. These actions may be triggered in response to identifying the current stage of the operation or identifying a user action. In some embodiments, the action may be fully automatic and performed without requiring additional input from the user. In other embodiments, the actions may be partially automated, requiring confirmation from the user prior to execution by the robotic system 110. Different combinations of stages may be performed based on the procedure being performed by the robotic system 110. Some exemplary procedures include (retrograde) ureteroscopy, percutaneous nephrolithotomy (PCNL), minimally invasive PCNL, and the like. For example, ureteroscopy may include a reconnaissance phase (not shown), a laser lithotripsy phase, and a basket phase. PCNL may include a percutaneous access phase, a reconnaissance phase, a laser lithotripsy phase, and a basket phase. Minimally invasive PCNL may include additional alignment and/or aspiration phases.
For example, during laser lithotripsy 705, actions that may be triggered include applying laser light to the stone 710 and stopping the laser light 715 when the laser light is not directed at the stone. In one scenario, by using various sensors (e.g., cameras), the robotic system 110 may detect when the laser is directed at a stone. It may then determine the size of the stone, for example, by using machine learning algorithms that have been trained using a record of previous ureteroscopy procedures, or by using conventional computer vision algorithms (e.g., comparing the known size of the basket with the size of the stone). Based on the determined size, the robotic system 110 may then determine an initial laser lithotripsy time based on the laser lithotripsy times recorded for stones of similar size and/or type. The robotic system 110 may then stop the laser after the determined laser lithotripsy time or if it detects that the stone has broken. In other scenarios, the user may provide additional input, such as setting a laser break time or providing permission for activation of the laser by the robotic system.
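As a rough illustration of the size estimate and time lookup described above, the sketch below uses the basket's known physical width as a pixel-to-millimetre scale and averages recorded lasing times for similarly sized stones. The basket width, size tolerance, and fallback time are illustrative assumptions.

```python
# Sketch: estimate stone size from the basket's known width, then look up an
# initial lasing time from recorded (size, time) pairs. Values are illustrative.
def estimate_stone_size_mm(stone_px: float, basket_px: float,
                           basket_width_mm: float = 15.0) -> float:
    return stone_px * (basket_width_mm / basket_px)

def initial_lasing_time_s(stone_mm: float, history: list[tuple[float, float]]) -> float:
    """history: (stone size in mm, recorded lasing time in s) pairs."""
    similar = [t for size, t in history if abs(size - stone_mm) <= 2.0]
    return sum(similar) / len(similar) if similar else 60.0  # fallback default
```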
In other scenarios, the application laser may be triggered by a user, while the stop laser is triggered automatically by the robotic system 110. For example, using its sensors, the robotic system 110 may detect when the aiming of the laser drifts from the stone or is otherwise not focused on the stone and in response, stop the laser.
In another example, during basket loading 725, actions that may be triggered include capturing a stone within the basket 730 and retracting the basket into the sheath 735. In one scenario, when the robotic system 110 detects that the basket 730 is aligned with a stone and within a specified distance, it may trigger actuation of the basket 730. The basket 730 may then be actuated to capture the stone. Then, using its sensors (e.g., a camera or pressure sensor), the robotic system 110 may determine whether a stone is captured inside the basket 730 and trigger retraction of the basket into the sheath 735. The user may then withdraw the sheath from the patient, thereby removing the stone. In another example, during percutaneous access 740, actions that may be triggered include target (calyx) selection 745, insertion site selection 750, and needle insertion 755 into the target site. In one scenario, the robotic system 110 may determine the target and the insertion site for reaching the target (e.g., marked by the user or identified by the system). The robotic system 110 may then wait for confirmation from the user to proceed. After receiving the confirmation, the robotic system 110 may then insert a needle (or other instrument) into the target site.
In another example, during a minimally invasive PCNL procedure, additional stages may include robotic alignment 765 with the PCNL sheath and laser lithotripsy 770 with active irrigation and aspiration. Triggered actions in these stages may include aligning the instrument with the PCNL sheath and increasing aspiration. For example, if the robotic system 110 detects an increase in stone fragments during laser lithotripsy, or otherwise detects an increase in dust that limits visibility, the robotic system 110 may increase suction or aspiration to remove more stone fragments. Once visibility or the field of view improves, the robotic system 110 may reduce suction.
Although some examples and scenarios of automatic actions of robotic system 110 that may be triggered based on the identified phases have been discussed above, triggerable actions are not limited to the actions discussed above. The robotic system 110 may be programmed to perform other triggerable actions based on the needs of the user and patient.
Evaluating tasks performed during a phase
FIG. 8 is a flow diagram of an evaluation process 800 for tasks performed during an identified phase, according to some embodiments. The evaluation process 800 may be performed by the control system 140 or another component of the medical system 100 of fig. 1. While one possible sequence of the process is described below, other embodiments may perform the process in a different order or may include additional steps or may exclude one or more of the steps described below.
At block 805, the control system 140 may determine a state of a robotic manipulator (e.g., robotic arm 112) from sensor data of the robotic system 110. As depicted in fig. 3, various types of sensors may be used to generate sensor data, which may then be used to determine the position or other status of the robotic manipulator.
At block 810, the control system 140 may determine an input to initiate an action of the robotic manipulator. For example, the input may be from a user manipulating the controller to control the basket apparatus. In another example, the input may be a screen selection or a menu selection on a UI of the medical system 100.
At block 815, the control system 140 may perform an analysis of the captured video. In some embodiments (such as those described in fig. 4), machine learning algorithms are used to perform analysis and generate output, such as identified features and temporary stage identification.
At block 820, the control system 140 may identify a stage of the medical procedure based at least on the state of the manipulator, the identified input, and the performed analysis. For example, if the control system 140 is receiving basket input via the controller, the control system 140 may determine that this phase is one of the basket phases. Additionally, if the performed analysis identifies that the captured video is showing a basket proximate to broken kidney stones, the control system 140 may determine that the current stage is capturing stones. In another example, if the performed analysis identifies that the captured video is showing a basket being removed from broken kidney stones, the control system 140 may determine that the current stage is to retract the basket into the sheath. In another example, the kinematic data from the robotic system 110 may indicate that the medical instrument is being removed from the patient, and the control system 140 may determine that the current stage is to retract the basket into the sheath.
At block 825, the control system 140 may generate an evaluation of the identified phases based on the one or more metrics. The stage being evaluated may vary based on the type of procedure being performed. Some possible stages are shown in blocks 830, 835 and 840. At block 830, the control system 140 evaluates the ureteroscopic laser lithotripsy phase. At block 835, the control system 140 evaluates the medical instrument insertion phase. At block 840, the control system 140 evaluates the ureteroscopy basket phase. Some specific examples of the various evaluations are described below.
Fig. 9 is a flow diagram of a scoring process 900 for medical tasks according to some embodiments. The scoring process 900 may be performed by the control system 140 or another component of the medical system 100 of fig. 1. While one possible sequence of the process is described below, other embodiments may perform the process in a different order or may include additional steps or may exclude one or more of the steps described below.
At block 905, the control system 140 counts the number of times the first protocol task is performed. At block 910, the control system 140 counts the number of times the second protocol task is performed. At block 915, the control system 140 determines a ratio of the count of the first protocol task to the count of the second protocol task. At block 920, the control system 140 may compare the determined ratio to a historical ratio. For example, the historical ratios may be generated by analyzing the history of the same protocol to determine an average or median ratio.
In one example, during ureteroscopic basket loading, the control system 140 may count the number of basket operations and count the number of ureteroscope retractions. The control system 140 can then determine a ratio of the number of basket operations to the number of ureteroscope withdrawals, and compare the determined ratio to other ratios from previous ureteroscopy basket procedures.
In one example of ureteroscopic drive, control system 140 may count the number of times the user manually drives the scope and the number of times the user robotically drives the scope. Manual actuation is commonly used to survey the kidneys. At the same time, the scope is typically docked to a robotic system for performing basket loading. Control system 140 may then determine a ratio of the number of times the user manually drives the scope to the number of times the user robotically drives the scope, and compare the determined ratio to other recorded ratios from previous ureteroscopy procedures. The ratio may measure the level of user adaptation to the robotic ureteroscopy.
In another example, during ureteroscopic laser lithotripsy, the control system 140 may measure the laser lithotripsy time for a stone and determine the size and/or type of the stone. The control system 140 may then determine a ratio of the laser lithotripsy time to the size of the stone and compare the determined ratio to previous ratios from other operations. By determining the type of stone (e.g., uric acid, calcium oxalate monohydrate, struvite, cystine, brushite, etc.), the control system 140 can aggregate statistics across surgical procedures based on the type of stone. For example, laser lithotripsy duration and procedure duration may be broken down by stone type.
At block 925, the control system 140 may generate an output of the comparison. Such output may be reports, visual indicators, guidelines, scores, charts, and the like. For example, the control system 140 may indicate that the user is performing at, below, or above the median or average ratio as compared to the recorded ratios from previous operations. In some embodiments, the output may compare the current user with records of the user's previous operations to track the user's personal performance. In some embodiments, the output may compare the user to other medical professionals.
In one embodiment, the output may include a real-time indicator showing a comparison of the user's current performance with a previous operation. Such output may assist the user during surgery by giving user input regarding the length of time to perform laser lithotripsy, e.g., based on the size of the stone. Other outputs may provide other relevant information to the user.
The scoring process 900 may be used to evaluate various types of procedure tasks. For example, some ratios may include the number of basket operations compared to the number of ureteroscope retractions, the number of times the user manually drives the scope compared to the number of times the user robotically drives the scope, and the laser lithotripsy time of the stone compared to the size of the stone.
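A minimal sketch of this ratio-based scoring follows, using the basket example: the current case's ratio of basket operations to scope retractions is compared against historical ratios. The numbers are illustrative.

```python
# Sketch of ratio-based scoring: compare the current case's ratio of a first
# task count to a second task count against historical ratios. Illustrative only.
from statistics import median

def score_ratio(count_a: int, count_b: int, historical_ratios: list[float]) -> dict:
    ratio = count_a / max(count_b, 1)
    hist_median = median(historical_ratios)
    return {
        "ratio": ratio,
        "historical_median": hist_median,
        "assessment": "above median" if ratio > hist_median else "at or below median",
    }

# e.g., 12 basket operations vs 3 scope retractions in the current case
print(score_ratio(12, 3, historical_ratios=[3.5, 4.0, 5.2, 4.4]))
```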
Fig. 10 is a flow chart of another scoring process for medical tasks according to some embodiments. Scoring process 1000 may be performed by control system 140 or another component of medical system 100 of fig. 1. While one possible sequence of the process is described below, other embodiments may perform the process in a different order or may include additional steps or may exclude one or more of the steps described below.
At block 1005, the control system 140 may count a first protocol task. At block 1010, the control system 140 may compare the first protocol task to a historical count for the first protocol. For example, during percutaneous needle insertion, the control system 140 may count the number of times the user attempts to insert the needle until the user is successful, and compare the count to recorded needle insertion attempts from previous percutaneous needle insertion operations.
In another example, during percutaneous needle insertion, the control system 140 may count the time it takes to survey the kidney before selecting a target cup for percutaneous access, and compare the counted time with the recorded time from a previous percutaneous needle insertion operation. The control system 140 may also count the number of times that automated alignment of the robotic manipulator with the catheter is initiated during percutaneous needle insertion, and compare the counted number of times with the recorded number of automated alignments from previous operations.
During minimally invasive PCNL alignment, the control system 140 may count the number of times that automated alignment of the end effector of the robotic manipulator with the catheter or sheath is initiated and compare the counted number of times with the recorded number of automated alignments from previous operations. In another example, during ureteroscopic laser lithotripsy, the control system 140 may count the number of times the view of the video capture device becomes obscured by dust from the stone fragmentation, and compare the counted number of times to the recorded number of dust obscurations from the previous operation.
At block 1015, the control system 140 may generate an output of the comparison. Such output may be reports, visual indicators, guidelines, scores, charts, and the like. For example, the control system 140 may indicate that the user is performing at, below, or above a median or average value as compared to the recorded metrics from previous operations. In some embodiments, the output may compare the current user with a record of previous operations of the user to track the user's personal performance. In some implementations, the output may compare the user with other users.
In one embodiment, the output may include a real-time indicator showing a comparison of the user's current performance with a previous operation. Such output may assist the user during surgery by, for example, indicating whether the amount of dust from the fragmentation exceeds a normal value. Other outputs may provide other relevant information to the user.
Scoring process 1000 may be used to evaluate various types of procedure tasks. For example, some tasks may include: the number of times the user attempts to insert the needle until the user successfully inserts the needle; counting the time it takes to survey the kidneys before selecting a target cup for percutaneous access; counting the number of times the navigation field generator for tracking the needle is repositioned; counting the number of times that automated alignment of the robotic manipulator with the catheter is initiated; and counting the number of times the view of the video capture device becomes obscured by dust from stone fragmentation.
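A minimal sketch of the count-based comparison follows: the current count is placed against the distribution of recorded counts from prior cases, here expressed as a simple percentile. The example values are illustrative.

```python
# Sketch of the count-based comparison: place the current count against the
# distribution of recorded counts from prior cases. Values are illustrative.
def compare_count(current: int, historical: list[int]) -> dict:
    below = sum(1 for h in historical if h < current)
    percentile = 100.0 * below / len(historical)
    return {"count": current, "percentile_vs_history": percentile}

# e.g., 4 needle-insertion attempts vs recorded attempts from prior cases
print(compare_count(4, historical=[2, 3, 3, 5, 6, 2, 4]))
```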
Exemplary robot System
Fig. 11 illustrates exemplary details of the robotic system 110 in accordance with one or more embodiments. In this example, the robotic system 110 is shown as a mobile cart-type robotic-enabled system. However, the robotic system 110 may be implemented as a stationary system, integrated into a table top, etc.
The robotic system 110 may include a support structure 114 that includes an elongated section 114 (a) (sometimes referred to as a "column 114 (a)") and a base 114 (B). The column 114 (a) may include one or more brackets, such as bracket 1102 (alternatively referred to as "arm support 1102"), for supporting the deployment of one or more robotic arms 112 (three shown in the figures). The bracket 1102 may include individually configurable arm mounts that rotate along a vertical axis to adjust the base of the robotic arm 112 for positioning relative to the patient. The bracket 1102 may also include a bracket interface 1104 that allows the bracket 1102 to translate vertically along the column 114 (a). The bracket interface 1104 is connected to the column 114 (a) by slots, such as slot 1106, positioned on opposite sides of the column 114 (a) to guide vertical translation of the bracket 1102. The slot 1106 includes a vertical translation interface to position and maintain the bracket 1102 at various vertical heights relative to the base 114 (B). The vertical translation of the bracket 1102 allows the robotic system 110 to adjust the reach of the robotic arm 112 to meet various table heights, patient sizes, physician preferences, and the like. Similarly, the individually configurable arm mounts on the bracket 1102 allow the robotic arm base 1108 of the robotic arm 112 to be angled in a variety of configurations. The column 114 (a) may internally include mechanisms (such as gears and/or motors) designed to mechanically translate the bracket 1102 using vertically aligned lead screws in response to control signals generated in response to user inputs (such as inputs from the I/O device 116).
In some embodiments, the slot 1106 may be supplemented with a slot cover that is level with and/or parallel to the slot surface to prevent dust and/or fluid from entering the interior chamber of the column 114 (a) and/or the vertical translation interface as the bracket 1102 is vertically translated. The slot cover may be unwound by a pair of spring spools positioned near the vertical top and bottom of the slot 1106. As the bracket 1102 translates vertically upward and downward, the cover may wind within the spools until unwound to extend and retract from its wound state. The spring loading of the spools can provide a force to retract the cover into the spool as the bracket 1102 translates toward the spool, while also maintaining a tight seal as the bracket 1102 translates away from the spool. The cover may be connected to the bracket 1102 using, for example, brackets in the bracket interface 1104 to ensure that the cover is properly extended and retracted as the bracket 1102 translates.
The base 114 (B) may balance the weight of the column 114 (a), the bracket 1102, and/or the arm 112 on a surface such as a floor. Thus, base 114 (B) may house heavier components, such as one or more electronics, motors, power supplies, etc., as well as components that enable robotic system 110 to move and/or be stationary. For example, base 114 (B) may include rollable wheels 1116 (also referred to as "casters 1116") that allow robotic system 110 to move within a room for a procedure. After reaching the proper position, the casters 1116 may be immobilized using a wheel lock to hold the robotic system 110 in place during the procedure. As shown, the robotic system 110 also includes a handle 1118 to assist in maneuvering and/or stabilizing the robotic system 110.
The robotic arm 112 may generally include a robotic arm base 1108 and an end effector 1110 separated by a series of links 1112, which are connected by a series of joints 1114. Each joint 1114 may include an independent actuator, and each actuator may include an independently controllable motor. Each independently controllable joint 1114 represents an independent degree of freedom available to the robotic arm 112. For example, each arm 112 may have seven joints, providing seven degrees of freedom. However, any number of joints may be implemented with any degree of freedom. In an example, multiple joints may produce multiple degrees of freedom, allowing for "redundant" degrees of freedom. The redundant degrees of freedom allow the robotic arms 112 to position their respective end effectors 1110 at specific locations, orientations, and/or trajectories in space using different link positions and/or joint angles. In some embodiments, the end effector 1110 may be configured to engage and/or control a medical instrument, device, subject, or the like. The freedom of movement of the arm 112 may allow the robotic system 110 to position and/or guide medical instruments from a desired point in space, and/or allow a physician to move the arm 112 to a clinically advantageous position away from the patient to form a passageway while avoiding arm collisions.
As shown in FIG. 11, robotic system 110 may also include I/O device 116. The I/O device 116 may include a display, a touch screen, a touch pad, a projector, a mouse, a keyboard, a microphone, a speaker, a controller, a camera (e.g., for receiving gesture input), or another I/O device for receiving input and/or providing output. The I/O device 116 may be configured to receive touch, voice, gestures, or any other type of input. The I/O device 116 may be positioned at a vertical end of the column 114 (a) (e.g., a top of the column 114 (a)) and/or provide a user interface for receiving user input and/or for providing output. For example, the I/O device 116 may include a touch screen (e.g., a dual-purpose device) to receive input and provide pre-operative and/or intra-operative data to a physician. Exemplary pre-operative data may include pre-operative planning, navigation, and/or mapping data derived from pre-operative Computed Tomography (CT) scans, and/or records derived from pre-operative patient interviews. Exemplary intraoperative data may include optical information provided from tools/instruments, sensors, and/or coordinate information from sensors, as well as important patient statistics (such as respiration, heart rate, and/or pulse). The I/O device 116 may be positioned and/or tilted to allow a physician to access the I/O device 116 from various locations, such as the side of the post 114 (a) opposite the bracket 1102. From this location, the physician may view the I/O device 116, the robotic arm 112, and/or the patient while operating the I/O device 116 from behind the robotic system 110.
The robotic system 110 may include a variety of other components. For example, the robotic system 110 may include one or more control electronics/circuitry, a power source, a pneumatic device, a light source, an actuator (e.g., a motor for moving the robotic arm 112), a memory, and/or a communication interface (e.g., for communicating with another device). In some implementations, the memory may store computer-executable instructions that, when executed by the control circuitry, cause the control circuitry to perform any of the operations discussed herein. For example, the memory may store computer-executable instructions that, when executed by the control circuitry, cause the control circuitry to receive input and/or control signals related to manipulation of the robotic arm 112 and, in response, control the robotic arm 112 to position and/or navigate a medical instrument connected to the end effector 1110 in a particular arrangement.
In some embodiments, robotic system 110 is configured to engage and/or control a medical instrument, such as basket retrieval device 120. For example, robotic arm 112 may be configured to control the position, orientation, and/or tip articulation of a scope (e.g., a sheath and/or guide of the scope). In some embodiments, robotic arm 112 may be configured/configurable to manipulate a scope using an elongate moving member. The elongate moving member may include one or more pull wires (e.g., pull or push wires), cables, fibers, and/or flexible shafts. To illustrate, robotic arm 112 may be configured to actuate a plurality of pull wires coupled to the scope to deflect the tip of the scope. The pull wires may comprise any suitable or desired material, such as metallic and/or non-metallic materials, such as stainless steel, Kevlar, tungsten, carbon fiber, and the like. In some embodiments, the scope is configured to exhibit non-linear behavior in response to forces applied by the elongate moving member. The non-linear behavior may be based on the stiffness and compressibility of the scope, as well as the relaxation or variability in stiffness between different elongate moving members.
Exemplary control System
FIG. 12 illustrates exemplary details of the control system 140 in accordance with one or more embodiments. As shown, the control system 140 may include one or more of the following components, devices, modules, and/or units (referred to herein as "components"), separately/individually and/or in combination/collectively: control circuitry 1202, data storage/memory 1204, one or more communication interfaces 1206, one or more power supply units 1208, one or more I/O components 1210, and/or one or more wheels 1212 (e.g., casters or other types of wheels). In some embodiments, the control system 140 may include a housing/casing configured and/or sized to house or contain at least a portion of one or more components of the control system 140. In this example, the control system 140 is shown as a cart-type system that is movable by the one or more wheels 1212. In some cases, after the proper position is reached, the one or more wheels 1212 may be immobilized using a wheel lock to hold the control system 140 in place. However, the control system 140 may be implemented as a fixed system, integrated into another system/device, or the like.
Although certain components of the control system 140 are shown in fig. 12, it should be understood that additional components not shown may be included in embodiments according to the present disclosure. For example, a Graphics Processing Unit (GPU) or other dedicated embedded chip for running a neural network may be included. Furthermore, certain illustrated components may be omitted in some embodiments. Although the control circuit 1202 is shown as a separate component in the diagram of fig. 12, it should be understood that any or all of the remaining components of the control system 140 may be at least partially embodied in the control circuit 1202. That is, the control circuitry 1202 may include various devices (active and/or passive), semiconductor materials and/or regions, layers, regions and/or portions thereof, conductors, leads, vias, connections, etc., wherein one or more other components of the control system 140 and/or portions thereof may be at least partially formed and/or implemented in/by such circuit components/devices.
The various components of the control system 140 may be electrically and/or communicatively coupled using some connection circuitry/devices/features that may or may not be part of the control circuit 1202. For example, the connection features may include one or more printed circuit boards configured to facilitate the mounting and/or interconnection of at least some of the various components/circuits of the control system 140. In some implementations, two or more of the control circuitry 1202, the data storage/memory 1204, the communication interface 1206, the power supply unit 1208, and/or the input/output (I/O) component 1210 can be electrically and/or communicatively coupled to each other.
As shown, the memory 1204 may include an input device manager 1216 and a user interface component 1218 configured to facilitate the various functions discussed herein. In some implementations, the input device manager 1216 and/or the user interface component 1218 can include one or more instructions executable by the control circuitry 1202 to perform one or more operations. While many embodiments are discussed in the context of components 1216-1218 including one or more instructions executable by control circuitry 1202, any of components 1216-1218 may be implemented, at least in part, as one or more hardware logic components, such as one or more Application Specific Integrated Circuits (ASICs), one or more Field Programmable Gate Arrays (FPGAs), one or more Application-Specific Standard Products (ASSPs), one or more Complex Programmable Logic Devices (CPLDs), and the like. Furthermore, although components 1216-1218 are shown as being included within control system 140, any of components 1216-1218 may be at least partially implemented within another device/system, such as robotic system 110, workstation 150, or another device/system. Similarly, any other components of the control system 140 may be implemented at least partially within another device/system.
The input device manager 1216 may be configured to receive input from the input device 146 and convert it into actions executable by the robotic system 110. For example, preprogrammed movements (such as quick-open, quick-close, and jolt movements) may be stored in the input device manager 1216. These preprogrammed movements can then be assigned to desired inputs (e.g., single or double button presses, voice commands, joystick movements, etc.). In some implementations, the preprogrammed movement is determined by the manufacturer. In other implementations, the user may be able to modify existing preprogrammed movements and/or create new movements.
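A small sketch of how such an input device manager might bind inputs to preprogrammed movements follows; the binding names and callback style are assumptions for illustration.

```python
# Sketch of mapping UI inputs to preprogrammed movements; names and the
# callback style are illustrative assumptions.
from typing import Callable

class InputDeviceManager:
    def __init__(self):
        self._bindings: dict[str, Callable[[], None]] = {}

    def bind(self, input_name: str, movement: Callable[[], None]) -> None:
        self._bindings[input_name] = movement  # e.g., "double_press_A" -> quick-open

    def handle(self, input_name: str) -> None:
        action = self._bindings.get(input_name)
        if action:
            action()

mgr = InputDeviceManager()
mgr.bind("double_press_A", lambda: print("quick-open basket"))
mgr.handle("double_press_A")
```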
The user interface component 1218 may be configured to facilitate one or more user interfaces (also referred to as "graphical user interface(s)"). For example, the user interface component 1218 may generate: a configuration menu for assigning a preprogrammed movement to an input; or a setup menu for enabling certain modes of operation or disabling selected preprogrammed movements in certain situations. The user interface component 1218 can also provide user interface data 1222 for display to a user.
The one or more communication interfaces 1206 may be configured to communicate with one or more devices/sensors/systems. For example, the one or more communication interfaces 1206 may transmit/receive data wirelessly and/or by wire via a network. Networks according to embodiments of the present disclosure may include Local Area Networks (LANs), Wide Area Networks (WANs) (e.g., the Internet), Personal Area Networks (PANs), Body Area Networks (BANs), and the like. In some embodiments, the one or more communication interfaces 1206 may implement wireless technology, such as Bluetooth, Wi-Fi, Near Field Communication (NFC), and the like.
The one or more power supply units 1208 may be configured to manage power for the control system 140 (and/or the robotic system 110 in some cases). In some embodiments, the one or more power supply units 1208 include one or more batteries, such as lithium-based batteries, lead-acid batteries, alkaline batteries, and/or another type of battery. That is, the one or more power supply units 1208 may include one or more devices and/or circuits configured to provide power and/or provide power management functionality. Further, in some embodiments, the one or more power supply units 1208 include a main power connector configured to couple to an Alternating Current (AC) or Direct Current (DC) main power source.
One or more of the I/O components 1210 may include various components to receive input and/or provide output for interaction with a user. The one or more I/O components 1210 may be configured to receive touch, voice, gesture, or any other type of input. In an example, one or more I/O components 1210 may be used to provide input regarding control of the device/system in order to control the robotic system 110, navigate a scope or other medical instrument attached to the robotic system 110, control the table 150, control the fluoroscopy device 190, and so forth. As shown, the one or more I/O components 1210 may include the one or more displays 142 (sometimes referred to as the "one or more display devices 142") configured to display data. The one or more displays 142 may include one or more Liquid Crystal Displays (LCDs), light Emitting Diode (LED) displays, organic LED displays, plasma displays, electronic paper displays, and/or any other type of technology. In some embodiments, the one or more displays 142 include one or more touch screens configured to receive input and/or display data. Further, the one or more I/O components 1210 may include one or more input devices 146, which may include a touch screen, a touch pad, a controller, a mouse, a keyboard, a wearable device (e.g., an optical head-mounted display), a virtual or augmented reality device (e.g., a head-mounted display), and so forth. In addition, the one or more I/O components 1210 may include: one or more speakers 1226 configured to output sound based on the audio signal; and/or one or more microphones 1228 configured to receive sound and generate an audio signal. In some embodiments, the one or more I/O components 1210 include or are implemented as a console.
Although not shown in fig. 12, control system 140 may include and/or may control other components, such as one or more pumps, flow meters, valve controls, and/or fluid access components, to provide controlled irrigation and/or aspiration capabilities to a medical device (e.g., a scope), a device deployable through a medical device, and the like. In some embodiments, irrigation and aspiration capabilities may be delivered directly to the medical device through separate cables. Further, the control system 140 may include a voltage and/or surge protector designed to provide filtered and/or protected power to another device (such as the robotic system 110), thereby avoiding placement of power transformers and other auxiliary power components in the robotic system 110, resulting in a smaller, more mobile robotic system 110.
The control system 140 may also include support devices for sensors deployed throughout the medical system 100. For example, control system 140 may include an optoelectronic device for detecting, receiving, and/or processing data received from an optical sensor and/or camera. Such optoelectronic devices may be used to generate real-time images for display in any number of devices/systems, including in control system 140.
In some embodiments, control system 140 may be coupled to robotic system 110, workstation 150, and/or medical instrument (such as scope and/or basket retrieval device 120) by one or more cables or connectors (not shown). In some implementations, the support functionality from the control system 140 may be provided through a single cable, thereby simplifying and eliminating confusion in the operating room. In other implementations, specific functions may be coupled in separate cables and connectors. For example, while power may be provided through a single power cable, support for control, optical, fluid, and/or navigation may be provided through separate cables for control.
The term "control circuit" is used herein in accordance with its broad and ordinary meaning and may refer to one or more processors, processing circuits, processing modules/units, chips, dies (e.g., semiconductor die, including one or more active and/or passive devices and/or connection circuits), microprocessors, microcontrollers, digital signal processors, microcomputers, central processing units, graphics processing units, field programmable gate arrays, programmable logic devices, state machines (e.g., hardware state machines), logic circuits, analog circuits, digital circuits, and/or any devices that manipulate signals (analog and/or digital) based on hard coding of circuits and/or operational instructions. The control circuitry may also include one or more memory devices, which may be embodied in a single memory device, multiple memory devices, and/or embedded circuitry of the device. Such data storage devices may include read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, data storage registers, and/or any device that stores digital information. It should be noted that in embodiments where the control circuitry includes a hardware state machine (and/or implements a software state machine), analog circuitry, digital circuitry, and/or logic circuitry, the data storage/registers storing any associated operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry.
The term "memory" is used herein in accordance with its broad and ordinary meaning and may refer to any suitable or desired type of computer-readable medium. For example, a computer-readable medium may include one or more volatile data storage devices, nonvolatile data storage devices, removable data storage devices, and/or non-removable data storage devices implemented using any technology, layout, and/or data structure/protocol, including any suitable or desired computer-readable instructions, data structures, program modules, or other types of data.
Computer-readable media that may be implemented in accordance with embodiments of the present disclosure include, but are not limited to, phase change memory, static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transitory medium which can be used to store information for access by a computing device. As used in some contexts herein, computer-readable media may not generally include communication media such as modulated data signals and carrier waves. Accordingly, computer-readable media should generally be understood to refer to non-transitory media.
Additional embodiments
Depending on the implementation, particular actions, events, or functions of any of the processes or algorithms described herein may be performed in a different order, or may be added, combined, or omitted entirely. Thus, not all described acts or events are necessary for the practice of the processes in certain embodiments.
Unless specifically stated otherwise or otherwise understood within the context of use, conditional language such as "may," "can," "could," "might," "for example," and the like, as used herein, carries its ordinary meaning and is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements, and/or steps. Thus, such conditional language is not generally intended to imply that features, elements, and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements, and/or steps are included in or are to be performed in any particular embodiment. The terms "comprising," "including," "having," and the like are synonymous, are used in their ordinary sense, are used inclusively in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Moreover, the term "or" is used in its inclusive sense (rather than in its exclusive sense) such that, when used, for example, to connect a list of elements, the term "or" means one, some, or all of the elements in the list. Conjunctive language such as the phrase "at least one of X, Y, and Z" is, unless specifically stated otherwise, understood in the context generally used to convey that an item, term, element, etc. may be X, Y, or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y, and at least one of Z to each be present.
It should be appreciated that in the foregoing description of embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, this method of disclosure should not be construed as reflecting an intention that any claim requires more features than are expressly recited in that claim. Furthermore, any of the components, features, or steps illustrated and/or described in a particular embodiment herein may be applied to or used with any other embodiment. Furthermore, no element, feature, step, or group of elements, features, or steps is essential or necessary for each embodiment. Therefore, it is intended that the scope of the invention herein disclosed and hereinafter claimed should not be limited by the particular embodiments described above, but should be determined only by a fair reading of the claims that follow.
It should be appreciated that a particular ordinal term (e.g., "first" or "second") may be provided for ease of reference and does not necessarily imply physical properties or ordering. Thus, as used herein, ordinal terms (e.g., "first," "second," "third," etc.) for modifying an element such as a structure, a component, an operation, etc., do not necessarily indicate a priority or order of the element relative to any other element, but may generally distinguish the element from another element having a similar or identical name (but for use of the ordinal term). In addition, as used herein, the indefinite articles "a" and "an" may indicate "one or more" rather than "one". Furthermore, operations performed "based on" a certain condition or event may also be performed based on one or more other conditions or events not explicitly recited.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which exemplary embodiments belong. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Unless explicitly stated otherwise, comparative and/or quantitative terms such as "less," "more," "greater," and the like are intended to encompass the concept of equality. For example, "less" may mean not only "less" in the strictest mathematical sense, but also "less than or equal to."

Claims (22)

1. A robotic system for evaluating an identified stage of a medical procedure performed by the robotic system, the robotic system comprising:
a video capture device;
a robotic manipulator;
one or more sensors configured to detect a configuration of the robotic manipulator;
an input device configured to receive one or more user interactions and initiate one or more actions by the robotic manipulator;
a data store configured to store metrics associated with phases of a medical procedure; and
a control circuit communicatively coupled to the input device and the robotic manipulator, the control circuit configured to:
determining a first state of the robotic manipulator based on sensor data from the one or more sensors;
identifying a first input from the input device for initiating a first action of the robotic manipulator;
performing a first analysis of video of a patient site captured by the video capture device;
identifying a first phase of the medical procedure based at least in part on the first state of the robotic manipulator, the first input, and the first analysis of the video; and
generating an assessment of the first phase of the medical procedure based on one or more metrics associated with the first phase.
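As a non-limiting illustration of the control flow recited in claim 1, the sketch below combines a manipulator state, a user input, and a video-derived cue to label a phase and then looks up the metrics for that phase. All signal names, rules, and metric functions here are assumptions for illustration only; the claim does not prescribe this particular fusion.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Optional


@dataclass
class ObservationWindow:
    """Signals gathered over a short window of the procedure (names assumed)."""
    basket_drive_engaged: bool    # derived from manipulator/sensor state
    laser_pedal_pressed: bool     # derived from the input device
    stone_visible_in_video: bool  # derived from video analysis
    needle_guide_docked: bool     # derived from manipulator/sensor state


def identify_phase(obs: ObservationWindow) -> str:
    """Rule-based fusion of robot state, user input, and video cues."""
    if obs.needle_guide_docked:
        return "percutaneous_needle_insertion"
    if obs.laser_pedal_pressed and obs.stone_visible_in_video:
        return "laser_lithotripsy"
    if obs.basket_drive_engaged:
        return "basketing"
    return "ureteroscopy_driving"


# Per-phase metric functions mapping an event log to a score (assumed).
METRICS: Dict[str, Callable[[List[str]], float]] = {
    "basketing": lambda events: float(events.count("basket_grab")),
    "laser_lithotripsy": lambda events: float(events.count("laser_on")),
}


def evaluate_phase(phase: str, events: List[str]) -> Optional[float]:
    """Generate an assessment for the identified phase, if a metric is defined."""
    metric = METRICS.get(phase)
    return metric(events) if metric else None
```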
2. The robotic system of claim 1, wherein the first stage comprises one of a ureteroscopy drive, a ureteroscopy laser lithotripsy, ureteroscopy basketing, and a percutaneous needle insertion.
3. The robotic system of claim 1, wherein the first stage comprises ureteroscopy basketing, and generating the assessment comprises:
counting the number of basket operations;
counting the number of ureteroscope retractions;
determining a ratio of the number of basket operations to the number of ureteroscope retractions; and
comparing the determined ratio to ratios from previous ureteroscopy basketing procedures.
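A minimal sketch of the metric in claim 3, assuming the system logs discrete events such as "basket_grab" and "scope_retraction" (hypothetical names) and stores ratios from earlier cases for comparison.

```python
from bisect import bisect_left
from typing import Iterable, List


def basketing_ratio(events: Iterable[str]) -> float:
    """Ratio of basket operations to ureteroscope retractions in one phase."""
    event_list = list(events)
    baskets = event_list.count("basket_grab")           # hypothetical event name
    retractions = event_list.count("scope_retraction")  # hypothetical event name
    if retractions == 0:
        return float("inf") if baskets else 0.0
    return baskets / retractions


def percentile_among(value: float, historical: List[float]) -> float:
    """Where the current ratio falls relative to ratios from previous procedures."""
    ordered = sorted(historical)
    if not ordered:
        return 0.0
    return 100.0 * bisect_left(ordered, value) / len(ordered)
```

The same count-a-ratio-then-compare pattern carries over directly to the manual-versus-robotic driving ratio of the next claim.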
4. The robotic system of claim 1, wherein the first stage comprises a ureteroscopy drive, and generating the assessment comprises:
counting a number of times a user manually drives the scope;
counting a number of times the user robotically drives the scope;
determining a ratio of the number of times the user manually drives the scope to the number of times the user robotically drives the scope; and
comparing the determined ratio to ratios from previous ureteroscopy driving procedures.
5. The robotic system of claim 1, wherein the first stage comprises percutaneous needle insertion, and generating the assessment comprises:
counting the number of times a user attempts to insert a needle until the user successfully inserts the needle; and
comparing the counted number of times with recorded numbers of needle insertion attempts from previous percutaneous needle insertion operations.
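One hypothetical way to realize the counting step of claim 5, assuming each insertion attempt is logged with an outcome flag; the record structure is an assumption, not part of the claim.

```python
from dataclasses import dataclass
from typing import Iterable


@dataclass
class NeedleAttempt:
    """A single logged insertion attempt (structure assumed for illustration)."""
    timestamp_s: float
    successful: bool


def attempts_until_success(attempts: Iterable[NeedleAttempt]) -> int:
    """Count attempts up to and including the first successful insertion.

    If no attempt succeeds, the total number of attempts is returned.
    """
    count = 0
    for attempt in attempts:
        count += 1
        if attempt.successful:
            break
    return count
```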
6. The robotic system of claim 1, wherein the first stage comprises percutaneous needle insertion, and generating the assessment comprises:
counting the time it takes to survey the kidneys before selecting a target cup for percutaneous access; and
comparing the counted time to recorded times from previous percutaneous needle insertion operations.
7. The robotic system of claim 1, wherein the first stage comprises percutaneous needle insertion, and generating the assessment comprises:
counting the number of times the navigation field generator for tracking the needle is repositioned; and
comparing the counted number to recorded numbers of repositionings from previous percutaneous needle insertion operations.
8. The robotic system of claim 1, wherein the first stage comprises percutaneous needle insertion, and generating the assessment comprises:
counting the number of times an automated alignment of the end effector of the robotic manipulator with a catheter is initiated; and
comparing the counted number of times with recorded numbers of automated alignments from previous operations.
9. The robotic system of claim 1, wherein the first stage comprises ureteroscopic laser lithotripsy, and generating the assessment comprises:
counting a laser lithotripsy time for a stone;
determining the size of the stone; and
comparing a ratio of the laser lithotripsy time to the size of the stone with previous ratios from other operations.
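The comparison in claim 9 can be read as a lasing-time-per-stone-size efficiency figure. The sketch below assumes seconds and millimetres as units and a simple mean-based comparison; both are assumptions made only for illustration.

```python
from statistics import mean
from typing import List


def lithotripsy_efficiency(lasing_seconds: float, stone_size_mm: float) -> float:
    """Seconds of laser lithotripsy per millimetre of stone (assumed units)."""
    if stone_size_mm <= 0:
        raise ValueError("stone size must be positive")
    return lasing_seconds / stone_size_mm


def deviation_from_history(current_ratio: float, previous_ratios: List[float]) -> float:
    """Difference between the current ratio and the mean of prior ratios."""
    if not previous_ratios:
        return 0.0
    return current_ratio - mean(previous_ratios)
```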
10. The robotic system of claim 1, wherein the first stage comprises ureteroscopic laser lithotripsy, and generating the assessment comprises:
determining the type of the stone; and
aggregating statistics across surgical procedures based on the type of the stone.
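A hypothetical per-stone-type aggregation corresponding to claim 10; the case record keys ("stone_type", "lasing_seconds", "fragments") are illustrative assumptions and do not appear in the disclosure.

```python
from collections import defaultdict
from statistics import mean
from typing import Dict, List


def aggregate_by_stone_type(cases: List[dict]) -> Dict[str, dict]:
    """Aggregate statistics across surgical procedures, keyed by stone type."""
    grouped: Dict[str, List[dict]] = defaultdict(list)
    for case in cases:
        grouped[case["stone_type"]].append(case)

    summary: Dict[str, dict] = {}
    for stone_type, group in grouped.items():
        summary[stone_type] = {
            "cases": len(group),
            "mean_lasing_seconds": mean(c["lasing_seconds"] for c in group),
            "mean_fragments": mean(c["fragments"] for c in group),
        }
    return summary
```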
11. The robotic system of claim 1, wherein the first stage comprises ureteroscopic laser lithotripsy, and generating the assessment comprises:
counting the number of times the view of the video capture device becomes obscured by dust from stone fragmentation; and
comparing the counted number with recorded numbers of dust obscurations from previous operations.
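Detecting frames in which the endoscopic view is obscured by stone dust can be approximated with a simple image-sharpness measure. The sketch below uses the variance of a Laplacian-filtered grayscale frame with an arbitrary threshold; this specific analysis, the threshold value, and OpenCV as the library are assumptions for illustration and are not stated in the disclosure.

```python
import cv2
import numpy as np


def is_obscured(frame_bgr: np.ndarray, sharpness_threshold: float = 60.0) -> bool:
    """Flag a frame whose detail falls below a (hypothetical) sharpness threshold."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    return sharpness < sharpness_threshold


def count_obscuration_events(frames: list) -> int:
    """Count transitions from a clear view to an obscured view."""
    events = 0
    previously_obscured = False
    for frame in frames:
        obscured = is_obscured(frame)
        if obscured and not previously_obscured:
            events += 1
        previously_obscured = obscured
    return events
```

Counting the cumulative duration of obscured frames, as in claim 21, follows by summing frame times instead of counting transitions.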
12. A method for evaluating an identified stage of a medical procedure performed by a robotic system, the robotic system including a video capture device, a robotic manipulator, one or more sensors, and an input device, the method comprising:
determining a first state of the robotic manipulator based on sensor data from the one or more sensors;
identifying a first input from the input device for initiating a first action of the robotic manipulator;
performing a first analysis of video of a patient site captured by the video capture device;
identifying a first phase of the medical procedure based at least in part on the first state of the robotic manipulator, the first input, and the first analysis of the video; and
generating an assessment of the first phase of the medical procedure based on one or more metrics associated with the first phase.
13. The method of claim 12, wherein the first stage comprises ureteroscopy basketing, and generating the assessment comprises:
counting the number of basket operations;
counting the number of ureteroscope retractions;
determining a ratio of the number of basket operations to the number of ureteroscope retractions; and
comparing the determined ratio to ratios from previous ureteroscopy basketing operations.
14. The method of claim 12, wherein the first stage comprises a ureteroscopy drive, and generating the assessment comprises:
counting a number of times a user manually drives the scope;
counting a number of times the user robotically drives the scope;
determining a ratio of the number of times the user manually drives the scope to the number of times the user robotically drives the scope; and
comparing the determined ratio to ratios from previous ureteroscopy driving operations.
15. The method of claim 12, wherein the first stage comprises percutaneous needle insertion, and generating the assessment comprises:
counting the number of times a user attempts to insert a needle until the user successfully inserts the needle; and
comparing the counted number of times with recorded numbers of needle insertion attempts from previous percutaneous needle insertion operations.
16. The method of claim 12, wherein the first stage comprises percutaneous needle insertion, and generating the assessment comprises:
counting the time it takes to survey the kidney before selecting a target calyx for percutaneous access; and
comparing the counted time to recorded times from previous percutaneous needle insertion operations.
17. The method of claim 12, wherein the first stage comprises percutaneous needle insertion, and generating the assessment comprises:
counting the number of times the navigation field generator for tracking the needle is repositioned; and
comparing the counted number to recorded numbers of repositionings from previous percutaneous needle insertion operations.
18. The method of claim 12, wherein the first stage comprises percutaneous needle insertion, and generating the assessment comprises:
counting the number of times an automated alignment of the end effector of the robotic manipulator with a catheter is initiated; and
comparing the counted number of times with recorded numbers of automated alignments from previous operations.
19. The method of claim 12, wherein the first stage comprises percutaneous antegrade ureteroscopy laser lithotripsy, and generating the assessment comprises:
counting a laser lithotripsy time for a stone;
determining the size of the stone; and
comparing a ratio of the laser lithotripsy time to the size of the stone with previous ratios from other operations.
20. The method of claim 12, wherein the first stage comprises ureteroscopic laser lithotripsy, and generating the assessment comprises:
determining the type of the stone; and
aggregating statistics across surgical procedures based on the type of the stone.
21. The method of claim 12, wherein the first stage comprises ureteroscopic laser lithotripsy, and generating the assessment comprises:
counting a duration for which a view of the video capture device becomes obscured by dust from stone fragmentation; and
comparing the counted duration with recorded durations from previous operations.
22. A control system for a robotic device for evaluating an identified stage of a medical procedure, the control system comprising:
a communication interface configured to receive sensor data, user input data, and video data from the robotic device;
a memory configured to store the sensor data, the user input data, and the video data; and
one or more processors configured to:
determining a first state of a manipulator of the robotic device based on the sensor data;
identifying a first input from the user input data for initiating a first action of the manipulator;
performing a first analysis of video of a patient site included in the video data;
identifying a first phase of the medical procedure based at least in part on the first state of the manipulator, the first input, and the first analysis of the video; and
generating an assessment of the first phase of the medical procedure based on one or more metrics associated with the first phase.
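Read as software architecture, claim 22 separates data ingestion (the communication interface), buffering (the memory), and analysis (the processors). The dataclass below is one hypothetical shape for the buffered streams; the field names are assumptions and are not taken from the claim.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List


@dataclass
class ProcedureBuffer:
    """In-memory store for the three streams received from the robotic device."""
    sensor_samples: List[Dict[str, Any]] = field(default_factory=list)
    user_inputs: List[Dict[str, Any]] = field(default_factory=list)
    video_frames: List[bytes] = field(default_factory=list)

    def ingest(self, sensor: Dict[str, Any], user_input: Dict[str, Any], frame: bytes) -> None:
        """Append one time-aligned sample from each stream."""
        self.sensor_samples.append(sensor)
        self.user_inputs.append(user_input)
        self.video_frames.append(frame)
```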
CN202180077710.2A 2020-11-20 2021-11-16 Automated procedure assessment Pending CN116456924A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US63/116798 2020-11-20
US202063132875P 2020-12-31 2020-12-31
US63/132875 2020-12-31
PCT/IB2021/060596 WO2022106991A1 (en) 2020-11-20 2021-11-16 Automated procedure evaluation

Publications (1)

Publication Number Publication Date
CN116456924A (en) 2023-07-18

Family

ID=87126014

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180077710.2A Pending CN116456924A (en) 2020-11-20 2021-11-16 Automated procedure assessment

Country Status (1)

Country Link
CN (1) CN116456924A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination