WO2015095715A1 - Simulator system for medical procedure training - Google Patents
- Publication number
- WO2015095715A1 (PCT/US2014/071521)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- simulation
- surgical
- signals
- instruments
- processing component
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B23/00—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
- G09B23/28—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B23/00—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
- G09B23/28—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
- G09B23/285—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine for injections, endoscopy, bronchoscopy, sigmoidscopy, insertion of contraceptive devices or enemas
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
Definitions
- Disclosed features concern medical training equipment and methods, and more particularly medical training equipment and methods used for training in minimally invasive surgical procedures and techniques.
- Medical procedures on patients can involve a variety of different tasks by one or more medical personnel. Some medical procedures are minimally invasive surgical procedures performed using one or more devices, including teleoperated medical devices. In some such systems, a surgeon operates controls via a console, which remotely and precisely control surgical instruments that interact with the patient to perform surgery and other procedures. In some systems, various other components of the system can also be used to perform a procedure. For example, the surgical instruments can be provided on a separate instrument device or cart that is positioned near or over a patient, and a video output device and other equipment and devices can be provided on one or more additional units.
- A simulator unit, for example, can be coupled to a surgeon console in place of the other actual system components to provide a surgeon with a simulation of performing the procedure. With such a system, the surgeon can learn how simulated instruments respond to manipulation of the console controls.
- Assistants may move and position teleoperated arms and instruments of an instrument unit in the correct positions, which can have a significant effect on the procedure. It can be beneficial for assistants to also be able to find needed information quickly during a procedure. In addition, it can be beneficial to quantify training and performance of such tasks by surgeons and assistants, thereby enabling such personnel to track progress and improve performance.
- A system includes a simulation processing component including at least one processor and generating a virtual environment using position signals that describe at least one of a position and a configuration of a physical surgical instrument relative to a physical surgical site.
- The simulation processing component updates the virtual environment according to changes in the position signals and according to control signals corresponding to inputs by a user of the system.
- The updating of the virtual environment comprises moving a virtual surgical instrument within the virtual environment, where an interaction of the virtual surgical instrument with a virtual surgical site of the virtual environment is defined at least partly by a physical relationship between the physical surgical instrument and the physical surgical site.
- The simulation processing component outputs a representation of a simulation state signal indicative of a current state of the virtual environment.
- Implementations can include a dummy instrument, anatomical model, control console, display device, teleoperable medical device, and/or other variations.
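- The processing loop implied by the above can be sketched as follows. This is an illustrative sketch only, not the patent's implementation; all class and function names (PositionSignal, ControlSignal, VirtualEnvironment, simulation_step) are hypothetical.

```python
# Minimal sketch of the update cycle described above; names are hypothetical.
from dataclasses import dataclass, field


@dataclass
class PositionSignal:
    instrument_id: str
    position: tuple      # (x, y, z) of the physical instrument tip
    orientation: tuple   # e.g., roll/pitch/yaw


@dataclass
class ControlSignal:
    instrument_id: str
    delta: tuple         # commanded motion from the console controls


@dataclass
class VirtualEnvironment:
    # current pose of each virtual instrument, keyed by instrument id
    instruments: dict = field(default_factory=dict)

    def apply_position(self, sig: PositionSignal):
        # the physical instrument pose constrains the virtual instrument pose
        self.instruments[sig.instrument_id] = (sig.position, sig.orientation)

    def apply_control(self, sig: ControlSignal):
        # move the virtual instrument according to console input
        pos, orient = self.instruments.get(sig.instrument_id, ((0, 0, 0), (0, 0, 0)))
        new_pos = tuple(p + d for p, d in zip(pos, sig.delta))
        self.instruments[sig.instrument_id] = (new_pos, orient)


def simulation_step(env, position_signals, control_signals):
    """One update cycle: integrate signals, then emit a state snapshot."""
    for sig in position_signals:
        env.apply_position(sig)
    for sig in control_signals:
        env.apply_control(sig)
    # "simulation state signal": a serializable snapshot for output devices
    return {"instruments": dict(env.instruments)}
```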
- A method includes coordinating a simulated medical procedure using a simulation processing component and receiving position signals based on one or more positions of elements of a teleoperable medical device moved by at least one trainee during the simulated medical procedure.
- The elements are physically positionable relative to a physical surgical site in order to perform the simulated medical procedure.
- Simulation state signals are determined based on the position signals, where the simulation state signals are indicative of a current state of the simulated medical procedure including integration of the position signals from the teleoperable medical device.
- The simulation state signals are sent to at least one output device operative to output a representation of the simulation state signals.
- Various implementations of the method can include receiving the position signals in a simulated setup procedure for setup tasks performed by a trainee, and/or in a simulated surgical operation following the simulated setup procedure, outputting real-time feedback information to at least one trainee performing the tasks, and other variations.
- A method includes receiving position signals indicating positions of one or more physical surgical instruments relative to a physical simulated surgical site in a simulated medical procedure.
- A virtual environment is updated based on the position signals, where the virtual environment includes a virtual surgical site corresponding to the physical surgical site.
- Control signals are received from a control console and indicate manipulation of one or more input controls of the control console by a user.
- The method updates the virtual environment based on the control signals, including moving one or more virtual surgical instruments within the virtual environment.
- Interaction of the virtual instruments with the virtual surgical site is based on the positions of the one or more physical surgical instruments relative to the physical surgical site.
- Simulation state signals are output to at least one output device to cause output of a representation of the simulation state signals, where the simulation state signals are indicative of a current state of the virtual environment.
- Various implementations of the method can include physical surgical instruments being coupled to associated manipulator arms of a teleoperated medical device, or physical surgical instruments being manually operated by one or more users relative to a physical anatomical model, and other variations.
- Fig. 1 is a diagrammatic illustration of an example simulation system including a teleoperated medical system, according to some implementations;
- Fig. 2 is a block diagram illustrating an example of a simulation processing component and communication with other components of the simulation system;
- Fig. 3 is a flow diagram illustrating an example method for providing a simulated setup procedure according to one or more implementations described herein;
- Fig. 4 is a flow diagram illustrating an example method for providing a simulated surgical operation according to one or more implementations described herein;
- Fig. 5 is a diagrammatic illustration of aspects of an example system which can be used for automated evaluation of simulated medical procedures;
- Figs. 6A and 6B are examples of training image screens which can be displayed on one or more display screens of a simulation system;
- Fig. 7A shows one example simulation system including examples of several components described herein;
- Fig. 7B shows an example display screen provided on the surgeon console of Fig. 7A;
- Fig. 8 is a perspective view of an example teleoperated medical device and anatomical model;
- Fig. 9A shows another example simulation system including examples of several components described herein;
- Fig. 9B shows an example display screen provided on the surgeon console of Fig. 9A;
- Figs. 10A-10C illustrate examples related to tracking instruments within an anatomical model;
- Figs. 11A and 11B are diagrammatic illustrations of one example of the use of an anatomical model in simulated medical procedures that include the use of both teleoperated and manual surgical instruments;
- Fig. 12 is a flow diagram illustrating an example method for using an anatomical model with reference to Figs. 11A-11B;
- Figs. 13A and 13B are diagrammatic illustrations of a second example of the use of an anatomical model in simulated medical procedures that include the use of manual surgical instruments;
- Fig. 14 is a flow diagram that illustrates an example method for using an anatomical model with reference to Figs. 13A-13B.
- The present application discloses features relating to simulated surgical procedures and training exercises.
- Various disclosed implementations of simulation systems and methods provide and teach realistic setup procedures for positioning and placing simulation equipment for particular surgical procedures, as well as realistic use of such equipment for the actual surgical operation.
- Simulations can involve some or all of the components involved in every stage of actual medical procedures and can involve any personnel in such procedures, to provide highly realistic training.
- Various tasks performed during all of these simulated medical procedures can be recorded and evaluated, with appropriate feedback on performance provided, allowing a high degree of analysis of the details of the procedures and enabling trainees for every function of a medical procedure to improve their skills more efficiently.
- Various simulation features described herein can allow users to learn and practice, and can allow quantification of user performance and tracking of user progress.
- Features described herein can be implemented, for example, using a teleoperated medical system such as a da Vinci® Surgical System (e.g., a Model IS3000, marketed as the da Vinci® Si™ HD™ Surgical System) commercialized by Intuitive Surgical, Inc. of Sunnyvale, California.
- Features disclosed herein may be embodied and implemented in various ways, including teleoperated and, if applicable, non-teleoperated (e.g., manual) embodiments, and with other da Vinci® Surgical Systems (e.g., the Model IS2000, commercialized as the da Vinci® S™ HD™ Surgical System).
- a "setup procedure” or “surgical setup procedure” refers to setup tasks that configure system components to perform one or more later surgical operations.
- a “surgical operation” or “ surgical site procedure” refers to the actual surgical operation including surgical tasks at a surgical site.
- a "simulated medical procedure” or “simulated surgical procedure” can refer to the entire simulated procedure including setup procedure and surgical operation, or can include just setup procedure or surgical operation.
- the term “teleoperated medical system” refers to a system of one or more components used to perform surgical procedures using one or more master controller devices and one or more slave teleoperated medical devices.
- a “teleoperated medical device” can be a slave device controlled by a remote master device and can include one or more elements, such as manipulator arms and/or surgical instruments, that can be moved or manipulated in response to signals provided by one or more of the master controller devices, such as a control console or surgeon console operated by a user remotely from the teleoperated medical device.
- a "wet-lab” exercise refers to any exercise on actual (real) tissue, such as tissue samples, a porcine model, or cadaver.
- a “dry-lab” exercise refers to an exercise using non-tissue models or objects, including
- Fig. 1 is a diagrammatic illustration of an example simulation system 100 including a teleoperated medical system, according to some implementations.
- Simulation system 100 can be used to simulate actual medical procedures without using actual patients. Any simulated medical procedure or training activity that does not take place on an actual human patient can be performed using simulation system 100 or a variation thereof. For example, simulations of dry-lab training tasks (e.g., inanimate exercises) and/or wet-lab training tasks (e.g., exercises on real tissue, porcine model, or cadaver) can be performed.
- Simulation system 100 can include a simulation processing component (or "processing component") 102, a surgeon console 104, a patient side cart 106, and a vision side cart 108.
- Other components can additionally or alternatively be included in the simulation system 100, as described in various implementations herein.
- Simulation processing component 102 can coordinate, control, and/or implement a simulation that involves the various other components of the simulation system 100.
- The simulation simulates a medical procedure environment involving the system components as if an actual patient were to be, or were being, operated upon.
- The simulation processing component implements and controls the display of a virtual environment that includes a virtual surgical site depicting one or more elements of an actual physical surgical site.
- Implementations can include a physical surgical site that includes a physical model and/or objects.
- The processing component 102 can coordinate simulation components of the simulation system and/or monitor and record parameters obtained during simulated medical procedures.
- The simulation is an interactive one that involves the simulation processing component 102 receiving a number of inputs from the other components of the system based on user manipulation of those components within the simulation.
- The simulation processing component also provides a number of outputs based on those inputs, where the outputs can coordinate the components of the simulation system and provide output to users of the system via various types of output devices (display screens, audio speakers, motors, etc.) provided on one or more of the components of the system 100.
- The simulation processing component can provide output to users via simulation state signals provided to one or more output devices.
- The simulation processing component can also provide feedback information to users via signals that it outputs.
- The simulation state signals can be indicative of a current state of the simulated medical procedure, including integration of (e.g., influence from) inputs from one or more system components.
- The current "state" of the simulated medical procedure is the current point of progress or status in the performance of the medical procedure as influenced by the inputs of the components of the simulation system.
- Simulation state signals can indicate the current positions of physical teleoperated surgical instruments of a teleoperated medical device of patient side cart 106 with respect to a physical surgical site, where these current instrument positions indicate the current state of progress in positioning the surgical instruments relative to the site in the simulated medical procedure, e.g., in a setup procedure.
- The simulation state signals can also indicate events in the medical procedure, such as collisions of any instruments with other instruments or surfaces, or mis-positioning of component elements.
- The current state of the simulated medical procedure can include a current state of a virtual environment, such as a virtual surgical site, implemented by the simulation processing component.
- The simulation state signals can include data that describes the virtual environment, including virtual representations and current positions of surgical instruments.
- Simulation state signals can indicate current positions of virtual surgical instruments in the virtual environment based on control input from a surgeon console 104, where the positions of the virtual surgical instruments indicate the current state of a surgical task of a simulated medical procedure.
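- One possible layout for such a simulation state signal is sketched below, assuming JSON serialization over the connections shown in Fig. 1; the field names are hypothetical and not taken from the patent.

```python
# Illustrative only: a possible structure for a simulation state signal.
import json
from dataclasses import dataclass, field, asdict
from typing import List


@dataclass
class InstrumentState:
    instrument_id: str
    pose: List[float]          # x, y, z, roll, pitch, yaw
    docked: bool = False


@dataclass
class SimulationStateSignal:
    timestamp: float
    phase: str                                         # e.g., "setup" or "surgical_operation"
    instruments: List[InstrumentState] = field(default_factory=list)
    events: List[str] = field(default_factory=list)    # e.g., "collision:arm2/arm3"

    def serialize(self) -> bytes:
        """Encode for transmission to an output device (display, console, etc.)."""
        return json.dumps(asdict(self)).encode("utf-8")


signal = SimulationStateSignal(
    timestamp=12.5,
    phase="setup",
    instruments=[InstrumentState("endoscope", [0.1, 0.0, 0.2, 0, 0, 0], docked=True)],
    events=["collision:arm1/arm2"],
)
payload = signal.serialize()   # sent to the surgeon console or vision side cart
```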
- The output device(s) can output a representation of the simulation state signals.
- The representation can be output using a variety of types of output, such as graphical (e.g., fully virtual/synthetic images, fully camera images, or combined camera/virtual images), tactile, haptic, aural, etc.
- The output representation can include graphical representations of physical instruments, displayed visual statuses, notifications, visual text and markers, audio cues and other output, haptic responses, and/or other output.
- The output representation can include a displayed environment at the surgical site, such as a virtual environment or images of a physical site.
- An initial state of a virtual environment can be selected by the simulation processing component by providing various controlling output signals to the other components, and users can experience current updates to the virtual environment via the simulation state signals based on user inputs via components such as the surgeon console 104 and/or patient side cart 106.
- The output device(s) can also output representations of signals providing feedback information.
- The simulation processing component 102 can be implemented using one or more processors (e.g., microprocessors, integrated circuits, logic, and/or other processing circuitry), as well as memory, input/output interfaces, and other components, as described below.
- Simulation processing component 102 can be implemented as a particular external or standalone unit that is separate from the other components in the simulation system. In other implementations, the processing component 102 can be provided within or as a part of one of the other components of the simulation system 100, and/or distributed within multiple other components of the system 100.
- One or more master consoles 104 can be included in system 100 to provide a user, such as a surgeon trainee, with input controls by which surgical instruments can be teleoperated, as well as various other controls.
- Surgeon console 104 can also include output devices such as visual, audio, and/or haptic output devices.
- A user operates the controls to provide control input signals to the simulation processing component.
- Control input signals can also be provided from a surgeon console 104 to one or more of the other components of the simulation system, such as the patient side cart 106 and/or vision side cart 108.
- Teleoperated slave instrument arms of the patient side cart 106 can be controlled, e.g., each surgical instrument operated by one or more corresponding master controls of the surgeon console.
- The surgeon console 104 communicates with the simulation processing component 102 as indicated by connection 105.
- Connection 105 can be any type of communication channel, such as one or more wires or cables, wireless connections, etc.
- The surgeon console 104 outputs signals indicative of the manipulation of the controls of the console 104. For example, if a user moves levers, joysticks, or dials, selects particular buttons or a touchscreen, or selects other controls, corresponding signals are provided to the simulation processing component 102. In some implementations, these signals can be standard signals provided to the other components of a teleoperated medical system during an actual medical procedure, such as patient side cart 106 and/or vision side cart 108, where the simulation processing component 102 can process these same signals. In other implementations, simulation signals which are specific to the simulation can be output by the surgeon console 104.
- The simulation processing component 102 can use the inputs to update a virtual environment of the simulation, for example.
- The surgeon console 104 also can output signals to one or more components of the simulation system 100, such as the patient side cart 106 and/or the vision side cart 108.
- Signals received by the simulation processing component 102 can be relayed to these other components by the simulation processing component.
- The surgeon console 104 can have separate, direct connections similar to connection 105 with one or more of the other components of the simulation system.
- The output signals can drive the operation of these other components similarly to a teleoperated medical system that does not use a simulation processing component 102.
- Surgeon console 104 receives signals on connection 105 from the simulation processing component 102.
- Received signals include signals that would normally be received by the surgeon console 104 in an actual medical procedure, including simulation state signals used to update visual, audio, and/or haptic output devices of the surgeon console that provide a representation of the simulation state signals via video, audio, and haptic output to its user.
- These signals can be generated by the simulation processing component 102 to describe a current state of a simulated, virtual environment provided by the simulation processing component 102 and displayed at the surgeon console.
- Received signals can include signals provided by one or more of the other components of the simulation system, such as signals received at the simulation processing component 102 from the patient side cart 106 and/or the vision side cart 108 and then relayed to the surgeon console 104 from the simulation processing component 102.
- The simulation processing component 102 can receive signals from one or more of the other components and can process or change these signals based on the simulation run by the simulation processing component. The processed signals can then be sent to the surgeon console 104 for its use.
- The simulation processing component 102 can create augmented reality data that is combined with or integrated into data received from the other components of the simulation system, such as an image or video feed from an endoscope or other imaging device at the surgical site, and the combined data can then be sent to the surgeon console 104 as simulation state signals (an illustrative compositing sketch follows below).
- The surgeon console 104 can have additional separate, direct connections similar to connection 105 with one or more of the other components of the simulation system to receive the signals from those other components similarly to a teleoperated medical system that does not use a simulation processing component 102.
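- The augmented-reality compositing mentioned above could, for example, be an alpha blend of a rendered overlay onto an endoscope frame, as in the following sketch; the arrays and blending rule are illustrative stand-ins, not the patent's method.

```python
# Sketch: blend a synthetic overlay onto an endoscope video frame before the
# combined image is sent to the surgeon console. numpy arrays stand in for
# real video frames.
import numpy as np


def composite_frame(camera_frame: np.ndarray,
                    overlay_rgb: np.ndarray,
                    overlay_alpha: np.ndarray) -> np.ndarray:
    """Blend a rendered overlay (e.g., virtual markers or instrument ghosts)
    onto a camera image. overlay_alpha is per-pixel opacity in [0, 1]."""
    alpha = overlay_alpha[..., np.newaxis]            # broadcast over RGB channels
    blended = (1.0 - alpha) * camera_frame + alpha * overlay_rgb
    return blended.astype(camera_frame.dtype)


# Example with dummy data standing in for a 480x640 endoscope frame.
frame = np.zeros((480, 640, 3), dtype=np.float32)
overlay = np.ones((480, 640, 3), dtype=np.float32)
alpha = np.zeros((480, 640), dtype=np.float32)
alpha[200:280, 300:340] = 0.6                         # highlight a region of interest
augmented = composite_frame(frame, overlay, alpha)
```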
- Multiple master consoles 104 can be included in some implementations.
- Such multiple consoles can each be operated by a dedicated user simultaneously during a medical procedure, e.g., to have each user control particular device instruments, to have one user assist the other in surgical exercises, etc.
- Each such surgeon console 104 can send signals to the simulation processing component 102 and can receive signals from the simulation processing component, e.g., describing a virtual environment.
- Some simulation implementations can allow a user at a console 104 to pass control of one or more surgical instruments (virtual and/or physical), or pass control of other components or inputs, to a different user of a different console 104 or other device (e.g., another control panel in system 100); a bookkeeping sketch of such a handoff is shown below.
- Signals appropriate to each surgeon console can be received at that console, e.g., outputting a different visual perspective on a simulated surgical site at each console 104 based on which instruments that particular console controls.
- Some implementations can include features specific to simulations having more than one console 104. For example, virtual pointers can be generated and displayed on display screens of the consoles 104, where one operator at one console 104 (e.g., an expert) can control the pointer and point to displayed objects as viewed by the other operator at the other console 104 (e.g., a new trainee).
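- A minimal bookkeeping sketch for such an instrument-control handoff between consoles is shown below; the ControlAssignment class and its policy are hypothetical illustrations, not part of the disclosed system.

```python
# Hypothetical bookkeeping for passing instrument control between consoles.
class ControlAssignment:
    def __init__(self):
        # maps instrument id -> console id currently controlling it
        self._owner = {}

    def assign(self, instrument_id: str, console_id: str):
        self._owner[instrument_id] = console_id

    def transfer(self, instrument_id: str, from_console: str, to_console: str) -> bool:
        """Pass control only if the requesting console currently owns the instrument."""
        if self._owner.get(instrument_id) != from_console:
            return False
        self._owner[instrument_id] = to_console
        return True

    def instruments_of(self, console_id: str):
        return [i for i, c in self._owner.items() if c == console_id]


assignments = ControlAssignment()
assignments.assign("needle_driver", "console_1")
assignments.transfer("needle_driver", "console_1", "console_2")  # expert takes over
```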
- One or more patient side carts 106 can be included in simulation system 100 to provide realistic physical interactions of controlled devices that are made during an actual teleoperated medical procedure.
- One or more users, such as trainee assistants who operate the patient side cart 106, can be trained during a simulated medical procedure using the actual patient-side devices used in actual procedures. Some trainees (e.g., other, surgeon trainees) can be trained at the same time using other components of the system.
- Such features enable users to be realistically, accurately, and effectively trained during simulated medical procedures.
- The patient side cart 106 can be a standalone device separate from the other components of the system 100.
- Cart 106 can include a variety of different mechanisms and devices to enable teleoperated medical surgery on patients.
- The cart 106 includes one or more manipulable elements, such as multiple controlled manipulator arms 114 that each can have one or more surgical instruments removably attached thereto.
- Such arms and their surgical instruments can be driven within particular ranges and modes of motion such as to allow a user of the surgeon console 1 04 to manipulate the instruments to perform a surgical medical operation on a patient.
- Actuators (e.g., motors) in the arms and/or instruments of the cart 106 can be controlled by signals from the console 104 and can drive movement of the instruments to perform surgical tasks.
- Additional patient side carts 106 can be included in simulations.
- Some patient side carts can include teleoperated medical devices, while others can include other types of devices (other surgical instruments, video displays, operating room tables, etc.). Still others can include both teleoperated medical devices and non-teleoperated devices.
- A trainee user of the patient side cart 106 can perform a setup procedure involving the cart 106 to permit a (e.g., simulated) surgical operation to take place.
- This setup procedure can include tasks such as moving the cart to a proper position and moving each arm 114 to a proper position.
- The setup of the patient side cart 106 can be performed with reference to a physical anatomical model 120.
- The anatomical model 120 can simulate a portion of a human patient or other subject, and can include various features allowing the surgical instruments of the patient side cart 106 to be positioned properly.
- The user places surgical instruments of the cart 106 within appropriate apertures of the anatomical model 120 (e.g., designated via port placement), so that the instruments obtain access to a physical surgical site simulated within the interior of the model 120.
- Other setup tasks may also be performed, such as installing the correct surgical instruments on the arms 114, selecting and operating particular controls of the cart 106 to enable needed functions, adjusting the positioning of the manipulator arms to achieve patient clearance or avoid collisions, etc.
- The patient side cart 106 communicates with the simulation processing component 102 as indicated by connection 107.
- Connection 107 can be any type of communication channel, such as one or more wires or cables, wireless connections, etc.
- The patient side cart 106 can receive signals from the simulation processing component 102 which control its teleoperated functions, such as the movement of arms 114 and the manipulation of surgical instruments attached to the arms 114 and/or otherwise coupled to the cart 106.
- The patient side cart 106 can receive other signals, such as simulation state signals (e.g., data) creating output from visual, audio, or other output devices on the cart 106 to the user of the cart.
- Signals received by the patient side cart 106 can be generated by the simulation processing component 102 based on an implemented virtual environment, and/or can be provided by the surgeon console and passed through to the cart 106 by the simulation processing component 102.
- The patient side cart 106 also sends signals on connection 107 to the simulation processing component 102.
- Such signals can include data describing the current states of the cart 106, including positions and orientations of the arms 114 and surgical instruments of the cart 106 as determined by sensors of the cart 106.
- For example, joint position sensors, servo motor position encoders, fiber Bragg grating shape sensors, etc., can be used to determine kinematic information (position and/or orientation) associated with the manipulator arms.
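- As an illustration of how such sensor readings yield kinematic information, the following sketch computes a tip position for a simple planar arm from joint angles; real manipulator kinematics are considerably more complex, and the function shown is hypothetical.

```python
# Sketch: convert joint sensor readings into an instrument-tip position for a
# planar 3-joint arm (illustration only).
import math


def forward_kinematics(joint_angles, link_lengths):
    """Return the (x, y) tip position of a planar serial arm given joint
    angles (radians) reported by position sensors/encoders."""
    x = y = 0.0
    cumulative_angle = 0.0
    for angle, length in zip(joint_angles, link_lengths):
        cumulative_angle += angle
        x += length * math.cos(cumulative_angle)
        y += length * math.sin(cumulative_angle)
    return x, y


# Example: three links, angles read from joint encoders.
tip = forward_kinematics([0.3, -0.2, 0.1], [0.30, 0.25, 0.10])
```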
- The signals can also include data describing a visual image of the physical surgical site as captured by an endoscope or other imaging instrument of the patient side cart 106, and/or other images describing the surgical site or simulated patient (e.g., rendered ultrasound images, patient vital signs, etc.).
- Other signals can also be sent, such as input data describing the cart user's actions or messages, audio data from a microphone or generated by interactions of the cart's devices, and other forms of data.
- Signals describing states can also be sent, such as the states of particular cart controls, functions, etc.
- These signals can be standard signals provided to the surgeon console 104 of a teleoperated medical system for an actual medical procedure, where the simulation processing component 102 can process these same signals.
- In other implementations, simulation signals specific to a simulation can be output by the patient side cart 106.
- The simulation processing component 102 can use the signals to update the virtual environment of the simulation, for example.
- The patient side cart 106 can send its signals to the simulation processing component 102, which generates appropriate signals in response which are sent to the surgeon console. In some cases or implementations, the simulation processing component can relay one or more of the signals from cart 106 directly to the surgeon console 104 via connection 125.
- The patient side cart 106 can have additional direct connections to the surgeon console 104, vision side cart 108, and/or other system components.
- The anatomical model 120 can include its own sensors and can provide signals to and/or receive signals from the simulation processing component 102 on a connection similar to connection 107.
- A connection 121 can provide signals between the anatomical model 120 and the simulation processing component 102.
- The model 120 can connect to patient side cart 106, which can relay signals between the model 120 and simulation processing component 102.
- Such sensors on the model 120 can allow manual surgical instruments to be tracked by the simulation, as described in greater detail below.
- System 100 can include other operating room equipment (e.g., an operating table supporting the model 120, assistive tables or carts for additional surgery or support functions, etc.) which can include connections and communication to the simulation processing component 102 similarly to the anatomical model 120.
- Such other equipment can be included in, and its use evaluated during, simulation tasks and procedures described herein.
- One or more vision side carts 108 can be included in some implementations of simulation system 100 to provide output information to assistant users of the simulation system, and/or to hold equipment such as vision and data processing hardware.
- The vision side cart 108 can be a standalone device separate from the other components of the system 100.
- A vision side cart 108 can be used by an assistant, such as the assistant that sets up and operates the patient side cart 106.
- The vision side cart 108 includes one or more visual output devices, such as display screens, which can output a variety of information useful to the medical procedure being performed.
- The display screen can display a view of the surgical site as captured by an endoscopic camera provided on a surgical instrument of the patient side cart 106, which allows the assistant user to adjust the camera to positions needed for the surgical operation.
- The display screen can also display other output information, such as the states of one or more controls being activated by the surgeon at the surgeon console, the states of other devices used in the medical procedure, etc.
- The vision side cart 108 communicates with the simulation processing component 102 as indicated by connection 109.
- Connection 109 can be any type of communication channel, such as one or more wires or cables, wireless connections, etc.
- The vision side cart 108 can receive signals from the simulation processing component 102 which control its functions, such as simulation state signals causing display of a virtual environment simulating the surgical site or display of images captured at the physical surgical site, display of status information related to various system components, and output of any other types of output (audio, haptic, etc.) via appropriate output devices of the cart 108.
- The vision side cart 108 can receive such signals provided by the surgeon console 104 and relayed through to the cart 108 by the simulation processing component 102.
- The vision side cart 108 also sends signals on connection 109 to the simulation processing component 102 and/or other components of the system 100.
- Such signals can include data describing the current states of controls or other input devices on the vision side cart which were activated by a user.
- The signals can include data received by the vision side cart 108 from other components, such as patient side cart 106, and relayed by the cart 108 to the simulation processing component 102 and/or surgeon console 104.
- Signals output by cart 108 can be standard signals provided to the surgeon console 104 of a teleoperated medical system, where the simulation processing component 102 can process these same signals.
- Specific simulation signals can also be output by the vision side cart 108.
- The simulation processing component 102 can use the signals to update the virtual environment of the simulation, for example.
- The vision side cart 108 can send its signals to the simulation processing component 102, which generates appropriate signals in response which are sent to the surgeon console 104 and/or to the patient side cart 106.
- The simulation processing component 102 can relay one or more of the signals from cart 108 directly to other components, such as to the surgeon console 104 via connection 127.
- The vision side cart 108 can have additional direct connections to the surgeon console 104, patient side cart 106, and/or other components.
- A variety of physical surgical instruments can be used by the simulation system 100 to more fully simulate an actual medical procedure.
- These surgical instruments can include complete, actual surgical instruments that are used in the actual medical procedure being simulated.
- Standard manual surgical instruments such as a cannula 130 and laparoscopic instrument 132 can be used, which can be instruments not requiring the teleoperated patient side cart 106.
- Surgical instruments used with the patient side cart 106, such as surgical instrument 134 and sterile adapter/drape instrument 136, can be used, which are removably attached to teleoperated manipulator arms of the patient side cart 106.
- Simulation system 100 can also or alternatively use non-operational "fake" surgical instruments 138.
- These can be instruments that are dummies used only for the simulation system and do not provide the full instrument functionality.
- The non-operational instruments 138 can include portions of instruments that can be attached to manipulator arms 114 like fully operational instruments, but need only be inserted in cannulas or apertures of the anatomical model 120.
- A shaft and end effector can be removed from a dummy instrument, and/or dummy instruments can be hollow instruments with no mechanism, or other non-operational versions of instruments that provide a user the experience of setting up and using such instruments during a simulated medical procedure.
- Fig. 2 is a block diagram illustrating an example of a simulation processing component 102 and communication with other components of the simulation system 100.
- Simulation processing component 102 can include an input processing block 202 which can perform various functions of the simulation. In some implementations, the input processing block 202 can implement one or more virtual environments for simulations provided by the simulation system.
- The virtual environment can provide a two-dimensional (2D) or three-dimensional (3D) environment that can simulate a physical surgical site or a portion thereof.
- A portion of a human body can be simulated, including virtual models of skin surfaces and internal organs or other body structures, as well as virtual models of the surgical instruments and other objects used in an actual surgical operation.
- The virtual models can be changed and updated by the simulation processing block 202 based on signals 210 provided by the surgeon console 104, which indicate the manipulation of master controls on the surgeon console that direct how the surgical instruments on the teleoperated arms of the patient side cart are to be moved and manipulated.
- The signals 210 also can indicate other commands, such as entering particular usage modes, activating other surgical features (e.g., fluid spray, suction, etc.), or performing other functions.
- The input processing block 202 can receive signals 212 from patient side cart 106. These signals can include the positions and orientations of the manipulator arms and surgical instruments of the patient side cart, as well as statuses of various controls on the cart 106, as described above.
- The input processing block 202 can also receive signals 214 from the vision side cart 108, which can include statuses of various controls on the cart 108, etc., as described above.
- The input processing block 202 can also receive signals 215 from tracked devices 218, which for example can include one or more sensors of the anatomical model 120 that track manually operated surgical instruments.
- Other components of the simulation system (not shown) can similarly provide signals to the input processing block 202, such as operating room sensors that track component positions, etc.
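- A minimal sketch of how an input processing block might route the signals 210, 212, 214, and 215 described above is given below; the handler names and stored fields are hypothetical, not the patent's design.

```python
# Sketch: route incoming signals from each source to a handler that updates
# the simulation's working state.
class InputProcessingBlock:
    def __init__(self, environment):
        self.environment = environment
        # map a source label to the routine that consumes its signals
        self._handlers = {
            "surgeon_console": self._handle_console,        # signals 210
            "patient_side_cart": self._handle_cart,         # signals 212
            "vision_side_cart": self._handle_vision_cart,   # signals 214
            "tracked_devices": self._handle_tracked,        # signals 215
        }

    def receive(self, source: str, payload: dict):
        handler = self._handlers.get(source)
        if handler is None:
            raise ValueError(f"unknown signal source: {source}")
        handler(payload)

    def _handle_console(self, payload):
        # master control motions -> virtual instrument commands
        self.environment.setdefault("control_inputs", []).append(payload)

    def _handle_cart(self, payload):
        # manipulator arm / instrument poses from cart sensors
        self.environment.setdefault("arm_poses", []).append(payload)

    def _handle_vision_cart(self, payload):
        self.environment.setdefault("vision_controls", []).append(payload)

    def _handle_tracked(self, payload):
        # e.g., manual instruments sensed by the anatomical model
        self.environment.setdefault("tracked_instruments", []).append(payload)
```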
- The input processing block 202 can also receive signals 216 from a simulator user interface (UI) 220 in some implementations.
- The simulator interface 220 can present one or more options or selections to user(s) of the simulation system 100 to customize and/or select features of the simulation of the medical procedure.
- The simulator interface 220 can be presented on one or more of the components of the simulation system, such as surgeon console 104, patient side cart 106, and/or vision side cart 108.
- The simulator interface can be implemented on its own dedicated device, such as a computer system (desktop computer, laptop computer, server, portable device, etc.).
- The simulator interface 220 can display options to a user, such as a number of different medical procedures to simulate, as well as various options, settings, and preferences for those medical procedures and for the components used in the simulation system. These selections can be provided in signals 216 to the input processing block 202. In some implementations, a single interface 220 can present options for simulated setup procedures as well as simulated surgical operations, thus allowing a unified interface to control simulated aspects of all stages of teleoperated medical procedures.
- The simulation processing component 102 can also include an output block 204.
- This block can provide signals to control or drive various components of the simulation system 100, as instructed by the simulation processing block 202.
- Some signals can command functions on a component, such as signals controlling actuators on the patient side cart to move telemanipulator arms or to command another medical function (air suction, etc.).
- Some signals can be simulation state signals that cause an output to the user.
- The output block 204 can send an output signal 230 to the surgeon console 104 that provides video output on a display of the surgeon console, such as data causing a display of a virtual surgical site and virtual surgical instruments at the site that move in correspondence with a user's manipulation of the controls of the surgeon console 104.
- The output block 204 can send signal 232 to patient side cart 106, signal 234 to vision side cart, and signal 236 to simulator interface 220 to drive video displays on these components that are relevant to their functions.
- Patient side cart 106 and/or vision side cart 108 can display a graphical virtual environment showing the surgical site based on a position of one or more endoscope instruments or other imaging instruments.
- Other visual output can be provided as well, such as status messages.
- Output block 204 can send signals 235 to tracked devices to provide statuses, updates, etc. Other types of output can also be caused by signals to components, such as audio and haptic output.
- Simulator interface 220 can display an interface that can update its visual appearance based on input received from a user as provided in signal 236, such as a graphical user interface displaying graphical menu items and/or other selections and options, or other type of interface.
- Simulation processing component 102 can also include memory 206 in communication with the simulation processing block 202.
- Memory 206 can store various data needed by the simulation processing block 202 and simulation system 100.
- Program instructions for implementing simulations, and data describing one or more virtual environments, three-dimensional models, and various settings, can be stored in memory 206.
- The simulation processing component 102 can monitor parameters based on events and actions occurring during simulated procedures, and can store such parameters in memory 206. For example, parameters such as time taken to perform a task of the procedure, positions of components during procedures, etc., can be stored, as described below.
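- One way such parameters could be recorded is sketched below, assuming timestamped per-task events; the ProcedureRecorder class and its fields are hypothetical.

```python
# Sketch: record task timing and parameters for later evaluation.
import time


class ProcedureRecorder:
    def __init__(self):
        self.events = []          # chronological log of recorded parameters
        self._task_start = {}

    def start_task(self, task: str):
        self._task_start[task] = time.monotonic()

    def end_task(self, task: str, **data):
        elapsed = time.monotonic() - self._task_start.pop(task)
        self.events.append({"task": task, "elapsed_s": elapsed, **data})

    def record(self, task: str, **data):
        """Record an instantaneous parameter, e.g., a component position."""
        self.events.append({"task": task, "t": time.monotonic(), **data})


recorder = ProcedureRecorder()
recorder.start_task("port_placement")
recorder.record("port_placement", port="camera", position=(0.12, 0.04, 0.0))
recorder.end_task("port_placement", ports_placed=3)
```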
- Fig. 3 is a flow diagram illustrating an example method 300 for providing a simulated medical procedure according to one or more implementations described herein.
- Method 300 can be controlled and/or coordinated by the simulation processing component 102.
- A simulated setup procedure is described for configuring one or more components of the simulation system before, and in preparation for, a simulated surgical operation that can be performed after the setup procedure.
- This example assumes the use of a patient side cart 106 having manipulator arms in the simulated setup procedure.
- Other implementations can include similar or equivalent setup components or tasks.
- The simulated setup procedure of method 300 can be performed while one or more user trainees are performing the setup tasks. For example, a single trainee can be required to perform all the tasks to obtain comprehensive training.
- Multiple trainees can be simultaneously or otherwise required to perform setup tasks for the simulated procedure as in an actual surgical procedure.
- One trainee may be required to position components in the operating room, another trainee to place ports, and another trainee to position manipulator arms for docking.
- Advantages of the simulation system include the ability to train multiple trainees on a single system and/or at the same time.
- The simulation processing component receives simulation option selections. These can be various selections to configure the setup procedure.
- Selections can include the type of surgical operation that is to be set up for simulation, such as procedures designed for general, urologic, gynecologic, transoral, cardiac, thoracic, and/or pediatric surgical operations. Selections can also include the particular system components to be used in the setup procedure, the experience level of the user trainee(s) involved, a difficulty level of the simulation (novice, standard, expert, etc.), time parameters for the procedure, etc. In some implementations, this interface can be the same interface used for the simulated surgical operations performed after setup (e.g., as described in Fig. 4).
- The simulation processing component receives and records signals indicating that the user is positioning one or more components of the simulation system.
- Such components may be required to be positioned in particular locations in the simulation area, e.g., absolute positions in the area or positions relative to each other.
- The patient side cart 106 can be placed relative to an operating table and/or anatomical model, and/or the vision side cart 108 can be positioned relative to the patient side cart 106, surgeon console 104, or other components.
- Additional components can be positioned during setup simulation, such as the surgeon console and any other components being used (anesthesia table, etc.).
- The component positions can be tracked using sensors, such as sensors or cameras positioned over the physical simulation area, sensors detecting the motion of the components, etc., and these positions can be sent to, monitored, and recorded by the simulation processing component 102.
- A user can also indicate to the processing component that he or she has completed placing the components of the system, e.g., by providing input via the vision side cart 108 or other component.
- The simulation processing component 102 can record parameters such as the received signals and times taken to complete tasks, and can output signals causing feedback to be provided during this block.
- Feedback can include visual and/or audio display of instructions as to placement, graphical spatial diagrams or maps of actual and/or desired component placement, alerts when the user has deviated too much from appropriate placement, warnings when specific measures are not taken (e.g., moving the patient side cart without placing the arms up), etc.
- Feedback can be displayed on one or more of the output devices of the system components, in some implementations.
- The method can receive signals indicating that the user is positioning a model for the setup procedure. For example, in some implementations, static registration techniques can be used, where the user can move manipulator arms and instruments of the patient side cart 106 to touch the surface of the model at one or more known locations.
- The simulation processing component can determine the position and orientation of the model in 3-D space relative to the elements of the patient side cart, such as manipulator arms and/or surgical instruments. For example, this allows a virtual scene of the operating room and/or surgical site to be rendered, and also can allow the simulator system to provide directed feedback, e.g., suggestions, evaluation, and/or scoring, on which port(s) the user is currently using and how to move to the correct port, if appropriate.
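- One standard way to compute such a registration from the touched points is a rigid point-set alignment (SVD/Kabsch), sketched below; this is an illustrative technique, not necessarily the method used by the disclosed system.

```python
# Sketch: fit a rigid transform (rotation R, translation t) mapping known
# model landmark coordinates to the instrument-tip positions measured when
# the user touches them.
import numpy as np


def register_rigid(model_points: np.ndarray, measured_points: np.ndarray):
    """Both inputs are Nx3 arrays of corresponding points. Returns (R, t) such
    that measured_i ~= R @ model_i + t."""
    model_centroid = model_points.mean(axis=0)
    measured_centroid = measured_points.mean(axis=0)
    H = (model_points - model_centroid).T @ (measured_points - measured_centroid)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = measured_centroid - R @ model_centroid
    return R, t


# Example: three known landmarks on the model and their touched positions.
model_pts = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.1, 0.0]])
touched_pts = np.array([[0.50, 0.20, 0.05], [0.60, 0.20, 0.05], [0.50, 0.30, 0.05]])
R, t = register_rigid(model_pts, touched_pts)
```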
- The simulation processing component 102 can record parameters such as sensor signals and times taken to complete tasks, and can provide feedback on the user's progress in the tasks performed during this block, such as by updating visual displays.
- The positioning of the anatomical model can be sensed at a later time in method 300 instead of at block 306.
- The position and orientation of the model relative to the teleoperated medical device can be sensed after docking in block 310 using sensors of the teleoperated arms, and/or using sensors of the model.
- The simulation processing component receives and records signals indicating that the user is selecting or placing ports at a physical surgical site for use in a surgical operation.
- The ports can be placed in an anatomical model positioned at and/or including the physical surgical site.
- The ports are apertures or other locations in the model through which surgical instruments will be inserted, and the ports have specific pattern or distance requirements depending on the target anatomy and surgical operation selected for simulation.
- Placing ports can include placing cannulas in selected apertures of the model (e.g., which can be detected from sensors in the model in some implementations), such as camera cannulas and operating instrument cannulas, so that the desired surgical site portions are in view of an endoscopic or other camera surgical instrument and are in operating range of operating instruments to be placed in the cannulas.
- The system can detect the placement of ports using sensors within the anatomical model, and/or using sensors in the teleoperated arms after docking (described below). The user can indicate to the simulation processing component that he or she has completed placing the ports.
- The simulation processing component 102 can record parameters such as sensor signals and times taken to complete tasks, and can provide feedback on user progress or correctness of the tasks performed during this block (such as the correctness of the positions of placed ports), including updating visual displays.
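- Feedback on port placement correctness could, for example, compare sensed cannula positions against recommended locations for the selected procedure, as in the sketch below; the target coordinates and tolerance are hypothetical.

```python
# Sketch: report ports that are missing or out of tolerance relative to
# procedure-specific recommended locations (model coordinates, meters).
import math

RECOMMENDED_PORTS = {
    "camera": (0.00, 0.00, 0.05),
    "instrument_left": (-0.08, 0.02, 0.05),
    "instrument_right": (0.08, 0.02, 0.05),
}


def check_port_placement(detected_ports: dict, tolerance_m: float = 0.015):
    """detected_ports maps port name -> sensed (x, y, z). Returns feedback
    messages for ports that are missing or too far from their targets."""
    feedback = []
    for name, target in RECOMMENDED_PORTS.items():
        placed = detected_ports.get(name)
        if placed is None:
            feedback.append(f"{name}: port not yet placed")
            continue
        error = math.dist(placed, target)
        if error > tolerance_m:
            feedback.append(f"{name}: off target by {error * 100:.1f} cm")
    return feedback


messages = check_port_placement({"camera": (0.01, 0.0, 0.05)})
```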
- The simulation processing component receives and records signals indicating that the user is positioning ("docking") the manipulator arms and/or other elements of the patient side cart in appropriate positions and locations above or within selected ports of the model. For example, the user can position manipulator arms at particular angles, distances from each other, etc., in view of parameters such as mutual manipulator collision avoidance and required instrument range of motion.
- The simulation processing component receives signals from sensors in the arms of the patient side cart, which indicate the positions and orientations of the manipulator arms.
- Various implementations allow the user to manually move the arms or other elements by hand, and/or with remote control.
- The user can indicate to the simulation processing component that he or she has completed the docking.
- The simulation processing component 102 can record parameters such as sensor signals and times taken to complete tasks, and can provide feedback on user progress or correctness of the tasks performed during this block, such as updating visual displays.
- The simulation processing component receives and records signals indicating that a user is inserting the surgical instruments of the patient side cart into ports.
- The simulation processing component receives signals from sensors in the arms of the patient side cart and/or from sensors of the surgical instruments, which indicate the positions of the surgical instruments relative to the arms and/or surfaces of the model. Requirements can include particular distances or amounts of insertion, locking an instrument in place, etc.
- Sensors in the model can detect instruments within cannulas.
- In some implementations, the surgical instruments are dummy instruments that are not functional as surgical instruments.
- Sensors of the model and/or at other locations in the operating room can be used to detect such instruments.
- The simulation processing component 102 can record parameters such as sensor signals and times taken to complete tasks during block 310.
- The processing component can also provide feedback in block 312. This can include the simulation processing component 102 outputting signals to cause a video display of various virtual images, progress indicators, suggestions, hints, warnings, etc., regarding instrument insertion by one or more display screens of the simulation system.
- An assistant user can view the display (e.g., at a vision side cart 108) to assist in determining whether arms and/or surgical instruments are properly positioned during the setup procedure.
- A display of the surgical site can include the current positions of surgical instruments and other objects at the surgical site.
- In some implementations, the display shows captured images of the physical surgical site at the patient side cart and/or model, such as captured by an endoscope instrument or other imaging instrument of the patient side cart (e.g., ultrasound sensors, patient vital sign sensors, etc.), model cameras or sensors, and/or other visual sensors directed at the physical site.
- This setup can be used for training on inanimate/dry-lab models or live tissue models, such as porcine or cadaveric training protocols.
- In other implementations, the display shows a virtual environment and virtual surgical site generated by the simulation processing component and based on detected positions of surgical instruments and/or other objects at the physical surgical site.
- The surgical instrument positions can be known from sensors in their manipulator arms, and positions of other objects at the site can be known from captured images sent by camera(s).
- The simulation processing component generates the virtual environment based on these known images and positions.
- The virtual surgical instruments and objects can be displayed to appear similar to their physical counterparts (if any), or can be displayed as virtual objects with different appearances in the virtual environment.
- The virtual environment can include display of realistic surroundings such as would be seen in an actual medical procedure.
- The background of the displayed surgical site can include body walls, blood vessels, or other realistic surroundings.
- The virtual environment can include accurate representations of the physical objects at the site, while the background and surroundings of the site can be made to look realistic, as in an actual medical procedure (e.g., as shown in Fig. 8B).
- The simulation processing component can output feedback information such as final parameters, metrics, a score, and/or other feedback related to the setup procedure.
- Feedback information may also or alternatively be displayed to the trainee during the performance of, or upon completion of, one or more tasks or exercises (e.g., in blocks 304-312), so that the trainee can monitor his or her progress or can compare his or her performance against other persons ranging from novice to expert.
- Metrics can be determined from recorded parameters and can include the times expended by an assistant for various tasks during the setup procedure, as well as a summary of the placement positions of the components and instruments used in the setup.
- An evaluation and score can also be determined by the simulation processing component based on the tasks completed by the user during the setup procedure, as described in greater detail below.
- The simulation processing component can output feedback indicating the result of the evaluation, such as how well the trainee performed tasks, as well as hints or instructions for performing the tasks better. Some or all of this information can be output on one or more displays or other output devices of the simulation system.
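- An evaluation of recorded setup parameters could, for instance, apply simple weighted penalties to produce a score and feedback messages, as sketched below; the metric names and weights are hypothetical.

```python
# Sketch: turn recorded setup metrics into a score and feedback messages.
def score_setup(metrics: dict) -> dict:
    """metrics example:
    {"total_time_s": 410, "target_time_s": 300,
     "mean_port_error_cm": 1.2, "collisions": 1}"""
    score = 100.0
    feedback = []

    overrun = max(0.0, metrics["total_time_s"] - metrics["target_time_s"])
    score -= 0.05 * overrun
    if overrun:
        feedback.append(f"Setup took {overrun:.0f}s longer than the target time.")

    score -= 5.0 * metrics.get("mean_port_error_cm", 0.0)
    score -= 10.0 * metrics.get("collisions", 0)
    if metrics.get("collisions"):
        feedback.append("Avoid manipulator arm collisions when docking.")

    return {"score": max(0.0, round(score, 1)), "feedback": feedback}


result = score_setup({"total_time_s": 410, "target_time_s": 300,
                      "mean_port_error_cm": 1.2, "collisions": 1})
```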
- the method checks whether the setup is complete. For example, the simulation processing component 102 can evaluate the resulting positioning of the system components and determine whether the system components have been correctly positioned.
- If the setup is not complete, the simulation processing component causes the user to repeat the appropriate stages or blocks of the setup procedure.
- Fig. 4 is a flow diagram illustrating an example method 400 for providing a simulated surgical operation according to one or more implementations described herein. In this example, a simulated surgical operation is described for performing surgical tasks at a surgical site (virtual site and/or physical site). Method 400 can be controlled and coordinated by the simulation processing component 102.
- method 400 can be performed after the setup procedure of Fig. 3.
- such implementations can offer the ability to use a simulation framework on a teleoperated medical system with a separate surgeon console and patient-side cart to monitor and track progress and to display output and feedback, all under a single software and user interface (UI) framework.
- Method 400 is described assuming the setup of the simulation system components has been completed as described in Fig. 3.
- the simulated procedure of method 400 can be performed with one or more user trainees performing the surgical tasks.
- a single trainee can be required to perform simulated surgical tasks using the surgeon console.
- multiple trainees can be simultaneously or otherwise required to perform surgical tasks for the simulated procedure as in an actual surgical procedure.
- one (surgeon) trainee may be required to control instruments using the surgeon console, while a different (assistant) trainee may be required to control an additional manual instrument at the surgical site, or perform some other assistant function (e.g., exchange instruments, adjust positions of arms on patient side cart, adjust brightness of the endoscope feed, adjust ports, pass sutures to a teleoperated instrument using a manual laparoscopic instrument, coordinate a uterine manipulator to assist the console surgeon, etc.).
- Two (or more) surgeon trainees at two (or more) surgeon consoles can divide control of tasks, exchange control of instruments, and/or provide training to each other, in other examples.
- Advantages of the simulation system include the ability to train multiple trainees on a single system and/or at the same time.
- the simulation processing component can receive options and selections for the simulation. Such selections can include the type of surgical operation to be performed, particular stages or sub-stages of the operation to be performed, the particular components and/or instruments to be used, etc. For example, the same graphical interface used to provide selections to the setup procedure of Fig. 3 can be used for the surgical operation.
- the method checks whether the simulation is using only the surgeon console, without other components such as a patient-side cart and vision side cart. For example, the user may have designated a console-only simulation in block 402. If only the surgeon console is to be used, then in block 406, the simulation processing component 102 outputs signals (such as simulation state signals) to display a virtual environment on the console display device.
- This virtual environment can depict the surgical site at which the user of the console will be operating.
- a 3-D virtual environment can be displayed, including virtual anatomical structures that appear similarly to corresponding real anatomical structures.
- virtual surgical instruments controlled by the user of the console are also displayed in the virtual environment.
- the particular virtual anatomical structures and virtual surgical instruments displayed in the virtual environment can be based on the selections made in block 402, where the simulation processing component can determine the appropriate environment based on the type of procedure selected and other selections made by the user.
- the simulation processing component provides other signals to the console.
- the signals can include simulation state signals such as audio data informing the user of any events or interactions occurring in the virtual environment, haptic data for outputting haptic output at the console, and/or any other applicable data.
- the provided signals can also include performance feedback information such as metrics, scores, instructions, alerts, hints, or other information.
- the simulation processing component 102 receives signals from the console. These signals can include directional or positioning signals based on user manipulation of controls on the console, such as hand grips, buttons, foot pedals, and other controls.
- the simulation processing component can update the virtual environment based on the signals received from the console, including moving virtual surgical instruments to correspond with the user input as if the user were controlling physical teleoperated surgical instruments.
- Simulation processing component 102 can also determine interactions of the virtual surgical instruments with virtual anatomical structures, e.g., according to a physics model.
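- A minimal sketch (not part of the original disclosure) of how console control signals might update virtual instrument positions and how simple interactions with virtual structures could be detected; the class and field names are illustrative assumptions, and the sphere-based interaction test stands in for a full physics model.

```python
# Hypothetical sketch: updating virtual instrument positions from console control
# signals and detecting simple interactions with virtual anatomical structures.
# All class and field names are illustrative assumptions.
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]


@dataclass
class VirtualInstrument:
    name: str
    tip: Vec3 = (0.0, 0.0, 0.0)


@dataclass
class VirtualStructure:
    name: str
    center: Vec3
    radius: float  # structures treated as spheres for this toy interaction check


def apply_console_signal(instr: VirtualInstrument, delta: Vec3) -> None:
    """Move a virtual instrument tip by the displacement decoded from console controls."""
    instr.tip = tuple(p + d for p, d in zip(instr.tip, delta))


def find_interactions(instruments: List[VirtualInstrument],
                      structures: List[VirtualStructure]) -> List[Tuple[str, str]]:
    """Return (instrument, structure) pairs whose tip lies inside a structure."""
    hits = []
    for instr in instruments:
        for s in structures:
            dist = sum((a - b) ** 2 for a, b in zip(instr.tip, s.center)) ** 0.5
            if dist <= s.radius:
                hits.append((instr.name, s.name))
    return hits
```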
- the simulation processing component records parameters such as signals and events communicated during the simulation procedure.
- signals and events can be signals sent and received in blocks 406-410, alerts or other performance feedback provided in these blocks, user input provided in these blocks, the times expended by trainee(s) to complete surgical tasks, etc.
- the simulation processing component can check whether the simulated surgical operation is complete. For example, the user can indicate that the simulation is over via input by console controls. If the operation is not complete, the method returns to block 408 to receive signals from the console and continue the simulation in the virtual environment. If the operation is complete, the method continues to block 438, described below.
- If the method finds that the simulated surgical operation is not using only a surgeon console, then the method continues to block 416, in which the method checks whether the simulation will display a virtual environment.
- a virtual environment can be displayed on a console display and other displays of the simulation system, such as a display on the vision side cart, and can be a similar virtual environment as described above for block 406. If a virtual environment is to be displayed, then in block 418 the simulation processing component 102 outputs signals (such as simulation state signals) to one or more components to display the virtual environment on displays. The method then proceeds to block 426, detailed below.
- If the method determines that a non-virtual environment is to be displayed, then images of a physical surgical site as captured by a camera are to be received and displayed.
- the method checks if an augmented display is to be output.
- An augmented display allows the display of computer-generated graphics over the images captured by a camera. If not, the process continues to block 424. If an augmented display is to be used, in block 422 the simulation processing component processes visual overlay data to be overlaid on received images.
- visual overlays can include text, graphics, and interface objects which provide feedback information such as alerts, instructions, etc.
- the visual overlays can provide indicators to instruct where arms, surgical instruments, or other components of the system should be positioned.
- the simulation processing component 102 receives camera data from one or more endoscopes and/or other imaging instruments (e.g., ultrasound sensors, vital signs sensors, etc.) providing images of the physical surgical site and/or simulated patient.
- an endoscope can be one of the surgical instruments provided on an arm of the patient side cart.
- the simulation processing component 102 also determines and outputs signals (such as simulation state signals) to display the images of the surgical site on display devices of the system, such as display screens at the surgeon console 104 and the vision side cart 108 (if being used). If no augmented images are being used, the output signals include only the camera data. Otherwise, the augmented visual overlay signals of block 422 are combined with the camera images and the combined signals are output for display. Later in the simulation, the signals are used to update the displays.
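- A minimal sketch (not part of the original disclosure) of combining overlay graphics with a captured camera frame using alpha blending; it assumes frames are 8-bit RGB numpy arrays, which is an implementation assumption rather than the system's actual video path.

```python
# Hypothetical sketch: alpha-blending visual overlay data (alerts, instructions)
# onto a captured camera frame before it is sent to the display devices.
# Assumes frames are 8-bit RGB numpy arrays; not the system's actual video pipeline.
import numpy as np


def composite_overlay(camera_frame: np.ndarray,
                      overlay_rgb: np.ndarray,
                      overlay_alpha: np.ndarray) -> np.ndarray:
    """Blend overlay pixels over the camera frame.

    camera_frame: (H, W, 3) uint8, overlay_rgb: (H, W, 3) uint8,
    overlay_alpha: (H, W) float in [0, 1] (0 = transparent overlay pixel).
    """
    cam = camera_frame.astype(np.float32)
    ovl = overlay_rgb.astype(np.float32)
    a = overlay_alpha[..., None]          # broadcast alpha over the color channels
    out = a * ovl + (1.0 - a) * cam
    return out.clip(0, 255).astype(np.uint8)


if __name__ == "__main__":
    frame = np.zeros((480, 640, 3), dtype=np.uint8)
    overlay = np.zeros_like(frame)
    alpha = np.zeros((480, 640), dtype=np.float32)
    overlay[10:40, 10:200] = (255, 255, 0)   # a yellow "alert" banner region
    alpha[10:40, 10:200] = 0.7
    combined = composite_overlay(frame, overlay, alpha)
```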
- the simulation processing component 102 provides any other signals to the surgeon console and to other system components.
- the signals can include audio data for outputting audio at system components, haptic data for outputting haptic output at system components, etc.
- these signals can include control signals received from the surgeon console which the simulation processing component relays to the patient side cart and/or vision side cart.
- the signals can include position signals and/or control signals received from the patient side cart which the simulation processing component relays to the surgeon console and/or to the vision side cart.
- the provided signals can also include feedback information provided to system components, such as metrics, scores, instructions, alerts, hints, or other information (and/or feedback information can be included in augmented visual data in block 422).
- the simulation processing component 102 receives signals from the surgeon console 104 and other system components, such as the patient side cart 106 and the vision side cart 108 (if present).
- control signals from the surgeon console can indicate the manipulations of controls on the console by the console user, such as master levers used to move and otherwise manipulate the arms and surgical instruments of the patient side cart 1 06.
- Signals from the patient side cart can include position signals from sensors in teleoperated arms and surgical instruments indicating the position and orientation of those arms and instruments.
- Signals from the vision side cart can include control signals from controls on that cart operated by an assistant user.
- the simulation processing component can receive signals from the model, which can include sensor signals indicating positions of manual surgical instruments inserted in or contacting the model.
- the simulation processing component records parameters such as signals and events communicated during the simulation procedure.
- signals and events can be signals sent and received in blocks 418-428, alerts or other performance feedback provided in these blocks, user input provided in these blocks, the times expended by trainee(s) to complete surgical tasks, etc.
- the blocks 406-410 or blocks 416-428 can be implemented during a performance of one or more surgical tasks and/or exercises of the simulated surgical operation.
- a user such as a trainee surgeon can perform a simulated exercise at a simulated surgical site by teleoperating the surgical instruments inserted through the cannulas in the model, or by controlling virtual instruments.
- Such exercises can include suturing, manipulating objects, etc. at a virtual or physical surgical site, or one or more other simulated tasks.
- Assistant trainees can be performing patient-side tasks simultaneously or between tasks in some implementations.
- the simulation processing component checks whether an incorrect setup has been in place during the surgical operation simulation. This can be the case when a setup procedure was simulated before the surgical operation and included incorrect selection of model ports, positioning of system components (e.g., teleoperating arms, surgical instruments, or anatomical model), or included other incorrect settings or selections. Such incorrect setup settings can have a significant effect on a following surgical operation. For example, collisions may occur between arms or instruments, ranges of motion may be blocked, limits of motion may be prematurely reached, etc. If the simulated surgical operation was performed using this incorrect setup, then in block 434 the simulation processing component outputs feedback to the one or more users of the simulation indicating the incorrect setup and how this incorrect setup has affected the surgical operation.
- output feedback information can indicate that a mis-positioned arm or surgical instrument caused surgical tasks to be missed or performed poorly, unintended changes to be made to simulated patient tissue, etc.
- This feedback can also indicate the correct setup to allow the users to correct any errors.
- such feedback can be output at any point during the simulated surgical operation.
- the method continues to block 436, in which the method checks whether the simulated surgical operation is complete. For example, this can be indicated by one or more users providing input to the system to indicate the operation is over, or the simulation processing component can automatically determine that the operation is over based on evaluating component positions, the images of the surgical site, etc. If the simulated operation is not complete, the method returns to block 416 to continue displaying the surgical site environment and communicate signals between system components. If the simulated operation is complete, then the method continues to block 438. In block 438, the simulation processing component outputs feedback information such as final parameters, metrics, score, and/or other feedback to the users of the simulation.
- feedback information may also be displayed to trainee(s) during the performance of or upon completion of one or more exercises (e.g., in blocks 406-410 or 416-428), so that a trainee can monitor his or her progress or can compare his or her performance against a database of other persons ranging from novice to expert skill levels.
- metrics can be determined from parameters and can include the times taken by the users for various tasks during the surgical operation, a summary of the placement positions of the components and instruments used in the surgical operation, etc.
- An evaluation and/or score can also be determined by the simulation processing component based on the surgical tasks completed by the user during the surgical operation, as described below.
- the simulation processing component can output feedback indicating the result of evaluation, such as how well the trainee performed tasks, as well as hints or instructions for performing the tasks better. Some or all of this information can be output on one or more displays and/or other output devices of the simulation system.
- the system can determine metrics and perform an automatic evaluation associated with one or more trainees' performances during the simulation based on recorded parameters.
- the system can also provide real-time performance feedback to trainees during the procedure and based on evaluations, in order to provide guidance during the procedure and for later procedures.
- an evaluation can include automatically comparing the parameters recorded during these procedures (and metrics determined therefrom) to stored reference parameters and metrics for the corresponding tasks.
- the reference parameters can be correct or optimal parameters for these tasks or parameters previously-recorded during previous simulated medical procedures.
- Parameters associated with relevant skills may be evaluated to measure trainee improvement or to compare one trainee's performance parameters to corresponding parameters demonstrated by other trainees (concurrent or historic) or by persons considered to have expert skill levels. Thus a trainee's skill level in a particular parameter may be evaluated relative to peers (e.g., to determine the trainee's progress with reference to anticipated improvement) or relative to experts (e.g., to identify deviations from a high skill level).
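- A minimal sketch (not part of the original disclosure) of comparing one trainee's recorded metric against peer and expert reference samples; the statistics, labels, and example times are illustrative assumptions only.

```python
# Hypothetical sketch: comparing one trainee's metric to peer and expert
# reference samples. Reference data, thresholds, and labels are illustrative only.
from statistics import mean, pstdev
from typing import Sequence


def relative_standing(value: float, reference: Sequence[float]) -> float:
    """Z-score of the trainee's value against a reference sample (for times, lower is better)."""
    mu, sigma = mean(reference), pstdev(reference)
    return 0.0 if sigma == 0 else (value - mu) / sigma


def skill_estimate(task_time: float,
                   peer_times: Sequence[float],
                   expert_times: Sequence[float]) -> str:
    """Very coarse skill label from deviation relative to peers and experts."""
    vs_peers = relative_standing(task_time, peer_times)
    vs_experts = relative_standing(task_time, expert_times)
    if vs_experts <= 1.0:
        return "near expert"
    if vs_peers <= 0.0:
        return "ahead of peers"
    return "developing"


# Example usage with made-up completion times (seconds):
label = skill_estimate(95.0, peer_times=[120, 140, 110, 150], expert_times=[80, 85, 90])
```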
- Patient-side skills of Figs. 3 and 4 associated with actions physically near the patient's location (e.g., manipulator arm position and orientation setup, cannula port placement, docking, assisting during surgical tasks, and the like) can be evaluated.
- Surgeon-side skills associated with performing surgical tasks in the surgical operation of Fig. 4 (e.g., teleoperating or manually positioning an endoscopic camera and moving tissue instruments at the surgical site) can also be evaluated.
- An evaluation component can measure parameters associated with the tasks performed by a trainee, such as the overall completion time of all tasks, completion time of particular tasks, the position and orientation of manipulators or instruments, as well as other parameters of the actions taken by the trainee.
- an evaluation can include determining one or more scores based on predetermined criteria associated with the parameters and the comparisons, where scores can indicate a performance level or skill of the trainee based on the performance in associated tasks. For example, a score can be based on the time needed to perform one or more tasks during the procedure and/or positioning or movement of system components during the one or more tasks.
- a trainee skill level associated with a specified parameter can be automatically scored by using kinematic and/or other sensor information obtained from a teleoperated medical system, such as from sensors of manipulator arms and of surgical instruments.
- an evaluation system may use sensor information to determine positions and orientations of instruments directed during an exercise.
- the sensor information can include kinematic information from the manipulator arms obtained during the performance of blocks 304-310 (e.g., using remote center positions of the surgical instruments and setup joint values), as well as sensor information from other sensors used in the procedure.
- a kinematic setup template can be created that defines a specific effective or ideal manipulator position and orientation for a specific surgical task. Data associated with a trainee's surgical task exercise performance is compared against the template to create a performance score.
- This comparison can be used to determine if a trainee has properly selected ports for a specific surgical task exercise, if manipulator arm setup joints and other structures are properly configured to place the associated manipulator arms at a proper position and orientation, if cannula ports are properly positioned and spaced to allow effective surgical site access while minimizing manipulator collisions, etc.
- the ideal template information can be, for example, clustered or averaged positions, movements, and/or placements from prior performances of trainees and/or experts, or known optimal positions for instruments, arm components, etc.
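- A minimal sketch (not part of the original disclosure) of building a kinematic setup template by averaging prior expert poses and scoring a trainee pose by its deviation from the template; the pose representation, tolerance values, and example numbers are illustrative assumptions.

```python
# Hypothetical sketch: building a kinematic setup template by averaging prior
# expert manipulator poses, then scoring a trainee pose by its deviation from
# the template. Pose representation and tolerances are illustrative assumptions.
import numpy as np


def build_template(expert_poses: np.ndarray) -> np.ndarray:
    """expert_poses: (N, J) array of N recorded setups x J joint/position values."""
    return expert_poses.mean(axis=0)


def pose_score(trainee_pose: np.ndarray,
               template: np.ndarray,
               tolerance: np.ndarray) -> float:
    """Score in [0, 100]: 100 when within tolerance on every value, falling off linearly."""
    deviation = np.abs(trainee_pose - template) / tolerance   # normalized per-value error
    per_value = np.clip(1.0 - deviation, 0.0, 1.0)
    return float(100.0 * per_value.mean())


# Example with made-up 4-value poses (e.g., setup joint angles in degrees):
experts = np.array([[10.0, 45.0, -5.0, 90.0],
                    [12.0, 43.0, -4.0, 88.0],
                    [11.0, 44.0, -6.0, 92.0]])
template = build_template(experts)
score = pose_score(np.array([15.0, 40.0, -2.0, 85.0]), template,
                   tolerance=np.array([5.0, 5.0, 5.0, 10.0]))
```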
- a task exercise time parameter may be measured by starting a timer at the beginning of a cannula docking exercise and stopping the timer when the system senses that all manipulators have been properly docked to an associated cannula.
- a task exercise manipulator collision avoidance parameter may be measured by comparing sensor information from each docked manipulator arm against template sensor information to determine how close a trainee has come to placing the manipulators in prescribed ideal positions and orientations or within prescribed position and orientation envelopes.
- sensor information from the manipulator arms in conjunction with known physical dimensions of an anatomical model 120 can be used to determine if a trainee has properly positioned the cannulas in a correct port placement pattern in the model, or if the remote center of motion for each cannula (the location on each cannula that stays stationary in space as the manipulator arm moves) is correctly positioned so as to minimize tissue trauma at a patient's body wall.
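- A minimal sketch (not part of the original disclosure) of checking each cannula's remote center of motion against the model's known port locations; the coordinate frame, tolerance, and example values are illustrative assumptions.

```python
# Hypothetical sketch: checking cannula remote-center placement against the
# model's known port locations. Coordinates and tolerances are illustrative.
from typing import Dict, Tuple

Vec3 = Tuple[float, float, float]


def check_port_placement(remote_centers: Dict[str, Vec3],
                         port_locations: Dict[str, Vec3],
                         tolerance_mm: float = 10.0) -> Dict[str, bool]:
    """For each cannula, report whether its remote center of motion sits within
    tolerance of its assigned port (i.e., at the simulated body wall)."""
    result = {}
    for cannula, rc in remote_centers.items():
        port = port_locations.get(cannula)
        if port is None:
            result[cannula] = False            # docked to an unselected port
            continue
        dist = sum((a - b) ** 2 for a, b in zip(rc, port)) ** 0.5
        result[cannula] = dist <= tolerance_mm
    return result


# Example with made-up positions in millimetres (model coordinate frame):
ok = check_port_placement(
    remote_centers={"cannula_1": (100.0, 52.0, 201.0)},
    port_locations={"cannula_1": (100.0, 50.0, 200.0)})
```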
- Scores can be determined in a variety of ways. A trainee may be scored, for example, on how well port placement is selected for a selected surgical operation, or how long it takes to determine the correct port placement. Or, a trainee may be scored on how well the manipulator arms are coupled to the placed cannulas (e.g., manipulator arm collision avoidance), or how long it takes a trainee to couple the manipulator arms to cannulas placed in an anatomical model.
- Metrics also may be sampled and/or determined to indicate a trainee's performance as he or she completes the exercise, and these intermediate evaluations may be plotted against a template to obtain a score. For example, historic data may indicate that specific acts should be completed in a certain order to complete a task most effectively, sensor data may be used to show the actual order in which a trainee performed the acts, and differences between the recommended and actual order of completed acts are used to determine a trainee's score.
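- A minimal sketch (not part of the original disclosure) of scoring how closely the trainee's actual order of acts matches a recommended order; the inversion-counting penalty and the example act names are illustrative assumptions.

```python
# Hypothetical sketch: scoring how closely a trainee's actual order of acts
# matches a recommended order derived from historic data. The penalty counts
# out-of-order pairs; the weighting is an illustrative assumption.
from typing import List


def order_score(recommended: List[str], actual: List[str]) -> float:
    """Return 0-100; 100 when the acts were completed in the recommended order."""
    rank = {act: i for i, act in enumerate(recommended)}
    performed = [a for a in actual if a in rank]
    # Count inversions: pairs performed in the opposite order of the recommendation.
    inversions = sum(
        1
        for i in range(len(performed))
        for j in range(i + 1, len(performed))
        if rank[performed[i]] > rank[performed[j]]
    )
    max_inversions = len(performed) * (len(performed) - 1) / 2
    return 100.0 if max_inversions == 0 else 100.0 * (1.0 - inversions / max_inversions)


# Example: port selection done after cannula placement, against a recommended order.
score = order_score(
    recommended=["select_ports", "place_cannulas", "dock_arms", "insert_instruments"],
    actual=["place_cannulas", "select_ports", "dock_arms", "insert_instruments"])
```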
- the system can similarly record and determine parameters such as completion time of one or more tasks of the exercise and arm positions based on kinematic data for computing metrics (e.g., movement volume, errors in the exercise, economy of motion of the instruments, etc.). Performance parameters can be measured at multiple times during the performance of surgical tasks during exercises and metrics determined from those parameters.
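- A minimal sketch (not part of the original disclosure) of deriving movement-volume and economy-of-motion metrics from sampled instrument tip positions; the sampling format and the bounding-box notion of "movement volume" are illustrative assumptions.

```python
# Hypothetical sketch: deriving economy-of-motion and movement-volume metrics
# from sampled instrument tip positions (kinematic data).
import numpy as np


def path_length(tip_samples: np.ndarray) -> float:
    """Total distance travelled by the instrument tip; tip_samples is (T, 3)."""
    return float(np.linalg.norm(np.diff(tip_samples, axis=0), axis=1).sum())


def movement_volume(tip_samples: np.ndarray) -> float:
    """Volume of the axis-aligned bounding box swept by the tip."""
    extents = tip_samples.max(axis=0) - tip_samples.min(axis=0)
    return float(np.prod(extents))


def economy_of_motion(tip_samples: np.ndarray) -> float:
    """Ratio of straight-line start-to-end distance to actual path length (1.0 = most direct)."""
    direct = float(np.linalg.norm(tip_samples[-1] - tip_samples[0]))
    travelled = path_length(tip_samples)
    return 1.0 if travelled == 0 else direct / travelled
```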
- a training exercise can require that the trainee pick up a ring with an operating instrument, move the ring along the pathway to a finish position (transferring the ring to another instrument controlled by a different hand as needed) without dropping the ring, while moving the camera to keep the ring and instrument tips in the center of view at all times, and while repositioning controllers to keep the trainee's hands in central controlling positions.
- the trainee can be required to drive a needle in a predetermined pathway of suture holes in the component while keeping the site in view of the camera, or suture an opening closed with spatial requirements as to the locations of the sutures.
- Patient-side tasks during a surgical operation (e.g., an assistant trainee guiding one or more instruments at the surgical site, controlling accessory equipment, etc.) can be evaluated in a similar manner.
- Scores and/or other results of the surgical operation evaluation indicate an estimated level or skill of the trainee for the evaluated surgical exercise.
- Some implementations can provide graphical feedback, e.g., indicating how close the operating instrument end effectors are to ideal or correct positions for the surgical task, and/or ideal locations for sutures, cuts of tissue, etc.
- Some implementations can output real-time feedback during the task performance, such as indicators of correct or incorrect sutures, instrument positions, hints to the trainee, etc.
- Some real-time feedback can be instructional, indicating how instruments should be placed, moved, or positioned.
- parameters and/or scores can be determined a first time based on a particular user's or team's performance, and the same parameters and/or scores can then be recorded at a second time during the performance of the same type of procedure. These parameters can then be compared to evaluate the same user's or team's performance for a particular procedure. In other examples, the parameters can be recorded for different users or teams, and compared to evaluate and compare the different users and teams.
- Such scoring based on simulation procedures allows a trainee to be scored in relation to other trainees or in relation to historic data in order to determine how well the trainee can perform the required task, to evaluate the trainee's relative learning speed and effectiveness, and/or to determine the trainee's skill level.
- aggregate historical scoring may reveal that trainees have difficulty performing a certain task, and so training can be modified to improve a training program for that task.
- the methods of Figs. 3 and 4 can also be used to measure and evaluate performances of multiple trainees or teams at once and in various roles during a training exercise.
- the simulation system can provide training for teams of persons, such as one or more surgeons, assistants, nurses, etc.
- one or more assistant trainees can perform patient-side tasks for the methods of Fig. 3 and/or Fig. 4, and a surgeon trainee can perform surgical tasks in the method of Fig. 4 while operating a console.
- Trainees other than the surgeon can use an anatomical model to practice patient-side skills (e.g., port placement, docking, system setup, camera and instrument insertion) since they will often perform these activities in the operating room.
- the team can also train their communication to perform and coordinate various tasks such as exchange instruments, adjust ports, pass sutures using a conventional laparoscopic tool, coordinate a uterine manipulator to assist the console surgeon, etc.
- the evaluation and scoring methodology described above can be extended to evaluate the performance of operating room teams in addition to individual trainees. For example, various scores can be output indicating the performance level or skill for coordinated team tasks. Such evaluation can be assisted by automated metrics to track progress and compare to historical data similarly as described above. These features can help provide proficiency standards for teams to understand their efficiency and how they can improve.
- Fig. 5 is a diagrammatic illustration of aspects of an example system 500 which can be used for automated evaluation of simulated medical procedures using simulation system 100.
- a medical device 502 is used, which in this example can include an input device such as a surgeon console 104, and/or a teleoperated medical device such as patient side cart 106, or other system that is capable of providing data concerning the position and/or orientation of one or more medical instruments.
- the medical device 502 provides parameter information 504 to be stored in a memory 506 that is included in an evaluation system 508.
- evaluation system 508 can be implemented in the simulation processing component 102 and memory 506 can be implemented in memory 206, for example.
- Parameter information 504 can include performance parameters for a trainee's performance and/or related data, such as kinematic information or other sensor information as described above. Information 504 may be provided, for example, via an application program interface (API) in the simulation system. Parameter information 504 can be provided from a patient-side cart 106, and/or from other components of the system, such as information describing position and/or orientation of controls for an operator (such as a surgeon or trainee) on a surgeon console 104, vision side cart 108, etc.
- anatomical model information 510 (e.g., physical dimensions, locations of possible cannula ports, location of surgical manipulators or instruments, etc.) associated with an anatomical model 120 is also input to the memory 506.
- Template information 512 can also be input into memory 506, indicating baseline, desired, and/or correct parameters and data for comparison to trainee performance parameters.
- Other parameter information can also be stored in memory 506, such as event data, e.g., recorded times related to trainee tasks and task completions, etc., which can be collected and/or determined by other components of system 100 or 500 such as processor 514, sensors of the system, etc.
- memory 506 can be one or more physical memory locations that can store information that evaluation system 508 uses to carry out an evaluation of a trainee's performance.
- processor 514 can be one or more information processing devices (e.g., microprocessor(s) or other processing circuitry) that can be used to carry out the evaluation.
- the evaluation results such as one or more scores, guidance feedback, and/or other information, can be output via an output device 516, such as a visible display on a display screen or other display device, a physical printout from a printer, or other output.
- the individual exercise results may be added to historic data 520 (e.g., depending on an input at operator selection input 518), which in turn may be used to modify template information 512.
- an operator input device 518 enables a training system operator to input various selections related to training exercises, such as identifying a particular surgical exercise task to be carried out, and/or identifying a particular anatomical model that is being used.
- the evaluation system can automatically select the appropriate information (e.g., proper template information 512) to use to carry out the evaluation.
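- A minimal sketch (not part of the original disclosure) of the data flow around the evaluation system of Fig. 5: parameter, model, and template information held together, a template selected per the operator's input, and per-parameter deviations produced for output; all field and method names are illustrative assumptions keyed to the reference numerals.

```python
# Hypothetical sketch of the Fig. 5 data flow: parameter information, anatomical
# model information, and template information are held together and evaluated
# to produce deviation results. Field and method names are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class EvaluationMemory:                     # stands in for memory 506
    parameter_info: Dict[str, float] = field(default_factory=dict)   # trainee parameters (504)
    model_info: Dict[str, float] = field(default_factory=dict)       # anatomical model data (510)
    template_info: Dict[str, float] = field(default_factory=dict)    # reference/ideal values (512)


def select_template(exercise: str, templates: Dict[str, Dict[str, float]]) -> Dict[str, float]:
    """Pick the template matching the operator-selected exercise (input 518)."""
    return templates.get(exercise, {})


def evaluate(memory: EvaluationMemory) -> Dict[str, float]:
    """Compare each recorded parameter against its template value (processor 514)."""
    report = {}
    for name, value in memory.parameter_info.items():
        ref = memory.template_info.get(name)
        if ref is not None:
            report[name] = abs(value - ref)   # deviation from the reference value
    return report


# Example usage: results would then go to an output device (516) and may be
# appended to historic data (520), all with made-up numbers.
mem = EvaluationMemory(parameter_info={"dock_time_s": 75.0},
                       template_info=select_template("docking", {"docking": {"dock_time_s": 60.0}}))
results = evaluate(mem)
```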
- Embodiments of an evaluation system 508 may be embedded in teleoperated medical systems (e.g., with outputs displayed via the system's displays) or may be implemented, for example, on a separate small computer system, such as a laptop computer or other electronic device. Such evaluation systems may also be networked to a central database to facilitate data collection from a number of medical devices and from a population of medical personnel (e.g., surgeons) and to facilitate data and/or scoring comparison within the trainee or surgeon population.
- Scoring aspects for training can be adapted for training in such manual tasks, such as ability to reach locations at the surgical site, instrument interference, camera position, surgeon comfort, etc.
- Automated scoring aspects can be based on sensing a position of one or more components, such as cannulas, surgical instruments, etc. by various sensors in an anatomical model and/or in other locations as described above.
- Various implementations can provide results of evaluations and/or guidance feedback to trainees during and after the simulated procedures, indicating differences from ideal or desired positions, times, metrics and/or scores, trainee level and/or skill, and suggestions and/or instructions for achieving the correct or desired results.
- graphical diagrams can be displayed on a display device indicating how close the manipulator arms are positioned to ideal or correct positions for the surgical task.
- some implementations can output real-time feedback during the performance of tasks, such as indicators of correct or incorrect placements and positions of surgical instruments, hints to the trainee, graphical indications of correct positioning and orientation and the acceptable range of motions and placements for particular instruments, etc.
- Some real-time feedback can be instructional, indicating where and when in the procedure that instruments should be placed or positioned.
- the system can provide tutorials to persons, demonstrating how to select ports in a model, position components, and dock manipulator arms.
- Figs. 6A and 6B are examples of training image screens 600 which can be displayed on one or more display screens of a simulation system as described above.
- the images on screen 600 can be displayed to assist and guide the user in placing the manipulator arms of the patient side cart during a setup procedure.
- display screen 600 shows an image 602 of an example patient side cart.
- the image 602 includes images of three manipulator arms 604.
- a user physically moves the physical arms corresponding to the arm images 604, where this motion is sensed by the simulation processing component (or other processing component) via arm sensors, and the processing component causes the arm images 604 displayed on screen 600 to move in correspondence with the physical arms.
- Indications can also be displayed to indicate the status of the arms relative to desired positions in a particular stage or block of a setup procedure.
- the image 602 can indicate that the position of one or more of the arms of the physical side cart is currently incorrect or suboptimal.
- display screen 600 can display an enclosed line or border 606 around an area of the image 602 that is incorrect in position.
- the left manipulator arm 604 is shown to be incorrect in an area of its joints which are circled by the line 606.
- the line 606 thus directs the viewer's attention to the incorrectly positioned area.
- a more precise indication such as a highlight 608 of a particular joint can specifically point out the incorrect positioning.
- the highlight 608 can be of a particular color, pattern, or other distinguishing mark.
- a legend 610 can indicate the particular problem indicated by highlight 608, which in this case is that the left manipulator arm (Patient Side Manipulator, PSM) 604 is not facing forward enough, e.g., toward an anatomical model. Additional explanations of the incorrect positioning can be displayed in some implementations, if desired by the user.
- Fig. 6B shows another example of display screen 600, in which a different portion of the physical patient side cart is incorrectly positioned as indicated in the displayed image 602 of the patient side cart.
- a line 616 indicates an area of the image 602 having inaccurate positioning, which in this example is an endoscopic center arm (ECM), i.e., the center arm 604 of the patient side cart.
- a highlight 618 indicates the particular joint of the arm 604 which should be corrected.
- legend 610 informs the viewer that highlight 618 indicates that the center endoscopic manipulator 604 is not positioned in a "sweet spot" which allows the instrument of that arm to provide accurate or optimal views when inserted in a patient or model for the currently selected surgical operation.
- Other types of lines, borders, and/or indicators can be displayed in other implementations, including visual, audio, haptic, or other forms.
- Virtual reality or other generated images such as those of screen 600, and/or augmented reality (AR) ghost images overlaid on camera images, can be displayed on system display devices to indicate or highlight system areas of concern or interest to users.
- manipulator arm setup joints can have incorrect positions highlighted, as described above.
- reachability limits of manipulator arms and/or instruments can be displayed.
- spatial areas where internal or external collisions between arms and/or instruments may occur can be highlighted as zones for the user to be aware of.
- feedback information such as suggestions can be displayed on screen 600.
- text suggestions can indicate an estimated amount of movement that would move a highlighted arm into a correct position.
- Graphical suggestions can display correct positions (e.g., in a different color or other format) on the same display 600 relative to the current, incorrect positions. Broader hints can also be provided to allow the user to exercise judgment or make decisions. Such suggestions can guide the user in a training exercise to learn the correct way to perform tasks.
- a variety of other types of feedback information can also be displayed on one or more display screens during setup procedures and surgical operations to provide guidance on system setup and accurate positioning of the manipulator arms, as well as guidance on surgical tasks.
- text information messages such as instructions and alerts can be displayed to inform users of correct or incorrect actions or positioning during procedures.
- Some feedback can be provided in other forms, such as audio output or haptic output (e.g., on motor-controlled manipulator arms and/or surgical instruments).
- the simulation system can thus provide guidance and feedback to trainees for system setup and skills during procedures such as dry-lab or wet-lab patient-side training scenarios. This can also reduce the burden on training assistants to constantly catch mistakes during simulation procedures, especially when training multiple trainees simultaneously.
- the system can also provide guidance and feedback during surgical operations and tasks, e.g., to assistant users operating the patient side cart (and other components) and/or manual instruments at an anatomical model.
- the simulation can include a setup procedure using the patient side cart 106, vision side cart 108, and/or any other system components needed for simulating a setup for a surgical operation.
- only the patient side cart 106 is used in the setup simulation.
- a user can set up the position of the cart, the arms of the cart, and the surgical instruments of the cart, and the simulation processing component 102 can read the positions of these elements and provide current simulation state and feedback as to how well the setup was performed.
- an output device such as a display screen on the patient side cart or on another component, and/or audio speakers, can be used to output representations of the current states of the setup procedure and/or feedback regarding user performance of the setup procedure.
- Some implementations of the simulated setup procedure can include the use of an anatomical model.
- An assistant trainee can set up an anatomical model on an operating table.
- the model can be an inanimate model made of a rigid material and can be approximately shaped like a portion of a human patient. This model can be set up with a particular configuration for a particular surgical operation.
- exercise devices can be placed within the model, such as beads or rings to be manipulated on wires, rubber or foam pieces of material to be sutured, cut, or otherwise manipulated with surgical instruments.
- the model can be a biological specimen such as a porcine or cadaveric model, and/or one or more biological specimens can be placed within an inanimate model.
- the assistant trainee can position a patient side cart in an operating position next to the anatomical model.
- the simulation processing component 102 can receive signals indicating the position of the patient side cart 106 and the simulation processing component can provide signals to a display such as a display screen on the patient side cart 106 or vision side cart 108 which provide indications of the current states of the procedure and feedback to the assistant trainee as to the correctness of the positioning of the patient side cart relative to the anatomical model.
- the assistant can position the arms of the patient side cart to contact the model surface at multiple points on the surface, which locates the model relative to the patient side cart. The assistant trainee can then position the arms and instruments of the patient side cart relative to the anatomical model. The assistant can select appropriate apertures in the model, place cannulas into the selected apertures of the model, and then place the surgical instruments into the appropriate cannulas.
- fully functional surgical instruments are provided on the arms of the patient side cart, and the trainee can insert the surgical instruments in ports on the model.
- one or more dummy instruments are provided on the arms of the patient side cart. These dummy instruments can include base portions of instruments which can be inserted into cannulas, but do not include end effectors such as claws, scissors, or scalpels.
- manual instruments can also be used in the medical procedure in connection with the anatomical model.
- the assistant trainee can place manual instruments in appropriate apertures of the model and the positions of these instruments are tracked by the simulation processing component.
- a vision side cart 108 can also or alternatively be used in the setup simulation.
- the vision side cart is used with an anatomical model and a patient side cart in the setup procedure.
- the user can view a model and/or the arms and instruments of a teleoperated medical device on a display screen of the vision side cart as captured by one or more cameras, e.g., positioned over and/or within the model.
- the user can view a display screen of the vision side cart to determine if surgical instruments have been correctly positioned, based on the view of an endoscope which provides images of the surgical site to the display screen after the user has positioned the endoscope instrument.
- a virtual environment can be displayed that models the anatomical model, physical surgical site, and/or teleoperated arms and instruments.
- the user can be required to use controls on the vision side cart to control one or more functions in the simulated procedure.
- the vision side cart is used in the setup procedure without the anatomical model and/or without the patient side cart.
- a user can be tested in positioning of the vision side cart within the operating area, and/or the positioning of the vision side cart relative to other components such as a surgeon console.
- one or more surgeon consoles can be included in a simulated setup procedure.
- the positioning of the surgeon console within the operating area can be simulated.
- a surgeon trainee can also be required to perform some tasks during the setup procedure, such as port selection on an anatomical model.
- the surgeon console can be used singly, or in conjunction with other components, such as a patient side cart, vision side cart, and/or anatomical model.
- a setup procedure can include only the surgeon console and an anatomical model, where the positioning of the components in the operating area, and the setup of instruments on the anatomical model are simulated.
- a virtual surgical site based on the physical site of the model can be displayed by the surgeon console while a user sets up manual instruments in the model.
- any other components of the simulation system can also be used singly or in conjunction with other components in a setup simulation.
- the simulation processing component can monitor the used system components and the assistant can be required to position each system component correctly within an operating area or room.
- the setup procedure can be the only simulation performed. In other implementations, a simulation of a surgical operation can be performed after the setup procedure, as in an actual surgical procedure.
- a simulated surgical operation can be performed on its own, without a simulation of a setup procedure.
- the simulation of the surgical operation can be performed using a variety of implementations.
- only the surgeon console is used and the simulation component provides a virtual environment simulation in which virtual surgical instruments are displayed manipulating virtual structures at a virtual surgical site based on user input provided by the surgeon trainee operating the console.
- the teleoperated surgical instruments can be virtual instruments displayed in a virtual environment by the simulation processing component 102 and controlled by the console trainee.
- One or more real manual instruments can be inserted in the model and controlled by an assistant trainee, where the manual instrument position is tracked by the simulation processing component using sensors of the model, allowing the simulation processing component to display a virtual version of the manual instrument tip, or the image-captured manual instrument, within the virtual environment alongside the virtual teleoperated surgical instruments.
- the surgeon console and the patient side cart 106 are used in the surgical operation.
- a virtual environment is displayed by the simulation processing component 102, e.g., on a display of the surgeon console 104.
- the trainee operating the surgeon console provides input controlling the physical arms and surgical instruments of the patient side cart, however, the simulation processing component displays corresponding virtual versions of these instruments at the surgical site on the displays of the simulation system.
- real or dummy instruments can be used for the surgical instruments of the patient side cart.
- One example of such an implementation is shown below with respect to Figs. 7A and 7B.
- surgeon console and patient side cart are used in a simulated surgical operation in which the physical surgical site is displayed on a display device of the surgeon console.
- real, fully functional surgical instruments are inserted in the physical model, including one or more endoscopes having cameras which capture images of the physical surgical site (or other imaging instruments).
- the images are displayed on the console display.
- the console user thus sees the actual instruments he or she is manipulating.
- the simulation system can coordinate the simulation, including recording parameters, providing guidance and evaluations, etc.
- One example of such an implementation is shown below with respect to Figs. 9A and 9B.
- images displayed to users can be combinations of generated virtual graphics and captured images of a physical surgical site.
- generated virtual instruments can be displayed alongside captured images of other, physical instruments, or images of physical instruments can be displayed alongside a virtual background generated to look like an actual surgical site.
- images of the physical surgical site can be combined with augmented images that are displayed over portions of the images of the physical surgical site.
- graphics can be overlaid on the image of the physical site to provide feedback information such as statuses, instructions, alerts, and other information before, during, and after the medical procedure.
- a vision side cart is included in the system. Any of the above displays of the surgical site can also be displayed on one or more displays on the vision side cart that is viewed by an assistant user.
- the surgeon console and vision side cart can display different images or views in some implementations.
- some display screens of the simulation system can display an endoscopic or camera view of the physical surgical site, while other display screens of the system can display a virtual environment that corresponds to the physical surgical site.
- a camera view of the physical site can be displayed by the vision side cart for the assistant user to operate manual instruments at the site.
- a virtual environment corresponding to the physical site can be displayed on the surgeon console.
- the vision side cart can display instructional feedback instead of or in addition to images of the surgical site displayed by the surgeon console.
- Fig. 7 A shows one example of a simulation system 700 including examples of several components described herein.
- a surgeon console 704 can provide controls for a user such as a surgeon or surgeon trainee, who sits at the console to manipulate the controls, and can also include a display screen (shown in Fig. 7B).
- a patient side cart 706 includes a number of manipulator arms 714 which include surgical instruments at the ends of the arms and which are responsive to the controls operated by the user at the surgeon console 704.
- An operating room table 722 is positioned adjacent to the patient side cart 706, and can include an anatomical model 720 which can receive the surgical instruments of the patient side cart (the model in this example is covered by the cloth over the operating table).
- a vision side cart 708 can include a display screen 726 and other components, such as electronic equipment.
- the display screen 726 displays a virtual surgical site generated by the simulation processing component.
- the model 720 can be only a surface or object having one or more apertures and not having any interior physical surgical site, where the virtual surgical site on screen 726 is not based on any physical corresponding site.
- the virtual surgical site shown on screen 726 can correspond at least partially to a physical site included within the model 720.
- the simulation processing component 102 can be located in one or more of the components of system 700, such as the surgeon console 704, the patient side cart 706, etc., or can be located in its own housing (not shown).
- FIG. 7B an example display screen 740 is shown that is provided on the surgeon console 704.
- two stereoscopic display screens 740 can be provided to show a 3-D view, and/or screen 740 can be a touch-responsive screen.
- display screen 740 displays a virtual environment generated by the simulation processing component 102.
- Virtual instrument tips 742 (e.g., end effectors or other end portions) are displayed in the virtual environment and are moved based on user manipulation of the controls at console 704.
- These displayed virtual instrument tips 742 also track the physical instrument tips at the patient side cart 706, which are moved within the model 720.
- Objects in the environment are also displayed, such as a thread 744 grasped by the instrument tips 742 used in suturing a portion of the object 746.
- In some implementations, the virtual thread 744 is generated within the virtual environment, and the object 746 is generated as a new virtual object different from any physical objects within the model 720.
- the virtual thread 744 can correspond to a physical thread being manipulated within the model 720 by the physical instrument tips.
- the manipulated object 746 can also correspond to a physical object within the model 720.
- the display screen 726 on the vision side cart 708 can display the same environment displayed on the screen 740 of the console 704. This allows an assistant user to view the scene that the console user is viewing, allowing greater assistance during the operation procedure.
- an endoscope or other imaging device on the patient side cart 706 can capture images of the physical site within the model 720, and these images of the actual physical site can be displayed on display screen 740 and/or display screen 726 instead of a generated virtual environment, or in combination with some virtual, generated objects.
- Fig. 8 is a perspective view of an example teleoperated medical device 800 that can be included in the patient side cart 106, similar to the patient side cart 706 shown in Fig. 7A, and an example anatomical model.
- Device 800 can include multiple manipulator arms, where each arm is coupled to one or more surgical instruments.
- each arm can be considered a teleoperated manipulator that can be coupled ("docked") to each port or cannula in a model or patient, and the manipulator controls both the cannula and the instrument that extends through the cannula and into the model or patient to reach the physical surgical site.
- one instrument 802 can be a camera or endoscope instrument and the three other instruments 804, 806, 808 can be surgical operation instruments.
- Model 820 can include multiple holes 822 and a top surface simulating a surface of a patient and through which cannulas and surgical instruments are inserted. In some implementations, one cannula and instrument can be inserted in each hole, while in other implementations, multiple cannulas and/or instruments can be inserted through a single hole (e.g., single-site).
- the model 820 can include a hollow space underneath or within, which can hold one or more physical surgical sites 824 at which physical exercises can take place manipulating exercise objects, such as flexible materials, thread, beads on wires, etc.
- Model 820 is placed on an operating table (such as table 722 described above) at a location corresponding to a patient's position on the table.
- different surgical operations may require various different port placements, and a user being trained may have to position the device 800 in one location for one surgical operation (e.g., at the foot of the operating table, simulating a location between the patient's legs) and in a second location for another surgical operation (e.g., to the side of the operating table).
- Some examples of anatomical model 820 and exercises are described in copending Patent Application no. 13/968,253, entitled "Anatomical Model and Method for Surgical Training," which is incorporated herein by reference in its entirety.
- Fig. 9A shows another example 900 of a simulation system including examples of several of the components described herein.
- a surgeon console 904 can provide controls for a user and can also include one or more display screens (example shown in Fig. 9B).
- a patient side cart 906 includes a number of manipulator arms 914 which include surgical instruments at the ends of the arms and which are responsive to the controls operated by the user at the surgeon console 904.
- An operating room table 922 is positioned adjacent to the patient side cart 906, and can include an anatomical model 920 similarly as described above.
- a vision side cart 908 can include a display screen 926 and other components, such as electronic equipment.
- the display screen 926 displays a virtual environment similar to the environment displayed on the screen of the surgeon console as described below in Fig. 9B.
- a component corresponding to simulation processing component 102 can be located in one or more of the components of system 900 similarly as described above.
- Simulation system 900 can also include manual surgical instruments, such as manual instrument 930 which is shown as a laparoscopic instrument.
- a manual instrument 930 can be guided and manipulated by an assistant user into or relative to model 920 during simulated surgical operations, while a surgeon user controls teleoperated surgical instruments using the surgeon console 904.
- the surgeon trainee and an assistant trainee can train together during a simulation.
- the surgeon trainee can operate the surgeon console 904 and can also operate one or more manual instruments 930, e.g., where one or more telemanipulated instruments can be operated by the simulation system.
- Some implementations can enable the simulation of a surgical operation using teleoperated instruments, and then enable the simulation of that same surgical operation using one or more manual instruments. The results of these two simulations can then be compared by the system, and results summarized.
- Some implementations including manual instrument 930 are described below.
- FIG. 9B an example display screen 940 is shown that can be provided on the surgeon console 904.
- display screen 940 displays a virtual environment generated by the simulation processing component 102, which can be a 2D or 3D environment, and can be displayed on a touch-responsive screen in some implementations.
- the virtual environment presents a realistic background simulating an actual surgical site within a patient, including body tissue and other body components, instead of the generated exercise environment of Fig. 7B.
- Virtual instrument tips 942 are displayed and are moved on the display screen 940 based on user manipulation of the controls at console 904.
- the displayed instrument tips 942 also track the physical instrument tips of the patient side cart 906, which are moved at a physical surgical site within the model 920.
- Objects in the virtual environment are also displayed, such as a ring 944 grasped by an instrument tip or end effector 942a and moved along a track 946 following a virtual object 948 as an exercise.
- the ring 944 and wire track 946 can have physical corresponding objects provided in the model 920 which are manipulated by the physical instruments corresponding to the virtual instruments 942.
- none of the virtual objects need correspond to physical instruments, where the physical instruments of the patient side cart can be dummy instruments.
- just the virtual instruments can correspond with physical instruments, which interact with nothing in the model.
- instrument tip 942b can be grasping a virtual object 950 which has no physical corresponding object at the physical site in the model 920.
- haptic output can be provided on the controls of the surgeon console 904 using one or more actuators of the surgeon console, to provide the user the sensation of manipulating the object 950.
- an instrument tip 952 can be displayed on screen 940 and within the virtual environment.
- tip 952 can correspond to a manual instrument, such as manual instrument 930 shown in Fig. 9A, that has been inserted in the anatomical model 920.
- the physical end or tip of instrument 930 can be tracked within model 920 as described in some implementations herein, and its corresponding virtual tip 952 moved accordingly within the virtual environment of screen 940.
- the virtual tip 952 can be displayed to interact with virtual objects that correspond with physical objects in the model 920, and/or with virtual objects with no corresponding physical objects.
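- Mapping a tracked manual instrument tip into the virtual environment can be illustrated, under the assumption that a model-to-virtual rigid transform is already available from registration, by a single frame change; the names below are illustrative only:

```python
import numpy as np

def to_virtual_frame(tip_in_model, R_model_to_virtual, t_model_to_virtual):
    """Map a tracked tip position from the model frame into the virtual scene.

    R (3x3 rotation) and t (3-vector translation) are assumed to come from an
    earlier registration of the anatomical model to the virtual environment.
    """
    return R_model_to_virtual @ np.asarray(tip_in_model, dtype=float) + t_model_to_virtual

# Example: identity registration, tip 5 cm above the model base.
virtual_tip = to_virtual_frame([0.0, 0.0, 0.05], np.eye(3), np.zeros(3))
```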
- Figs. 10A-10C illustrate examples related to tracking instruments within an anatomical model.
- Fig. 10A is a diagrammatic illustration of an example patient side cart 1002 and anatomical model 1004.
- Manipulator arms 1006a, 1006b, and 1006c of the patient side cart 1002 include operating instruments that are surgical instruments 1008a, 1008b, and 1008c, respectively, and manipulator arm 1006d includes an operating instrument that is an endoscopic instrument 1008d.
- Each of the instruments 1008a-d is inserted in an associated cannula 1010a, 1010b, 1010c, or 1010d, respectively (e.g., the instrument can be a trocar within a cannula 1010, or a cannula 1010 can be part of a trocar 1008 in some examples, such as for an initial insertion in the model 1004).
- the cannulas 1010 are inserted in apertures of the model 1004.
- the endoscope instrument 1008d can have its own sensing reference origin 1014 relative to which it can sense the instruments and cannulas inserted in the model 1004.
- the endoscope camera can capture images of the cannulas 1010 within the model when they are moved into the view of the camera.
- the model 1004 can also include its own sensing system for tracking instruments inserted in (or otherwise interacting with) the model 1004.
- one or more sensors are provided within the model 1004 to sense the cannulas 1010.
- a camera system 1020 is positioned on the interior base of the model 1004 to sense the interior of the model 1004.
- a camera system 1020 can be positioned near or within a patient side element (PSE) such as model 1004, or can be positioned at other locations of the bottom or sides of the model.
- Camera system 1020 can thus continually capture images showing the positions of cannulas 1010 being inserted in the model, as well as images showing the positions of surgical instruments 1008 inserted through the cannulas 1010.
- Camera system 1020 thus has its own sensing reference origin 1022, which is the reference point for images captured by the camera system.
- two cameras are shown in the camera system 1020 to allow stereo triangulation in determining positions of the cannulas 1010 in the anatomic model.
- camera system 1020 can include a single camera, or other types of sensors to capture position or motion of cannulas and instruments.
- Fig. 10B shows example views 1050 of a camera system within an anatomical model, such as camera system 1020 within the model 1004 of Fig. 10A.
- the camera system 1020 includes two cameras, and a left view 1052 is the view of one of the cameras, and a right view 1054 is the view of the other camera.
- the top surface 1056 and bottom surface 1058 are shown, as well as apertures 1060 in the top of the model.
- Cannulas 1010 can be viewed inserted through specific holes of the model. In this example having two cameras, stereo triangulation can be used to accurately determine the position of each cannula 1010 with reference to the origin system of the cameras.
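- Such stereo triangulation can be illustrated with a standard linear (DLT) formulation; this is a generic sketch rather than the disclosed implementation, and the two 3x4 projection matrices are assumed to be known from camera calibration in the camera system's reference frame:

```python
import numpy as np

def triangulate_point(P_left, P_right, uv_left, uv_right):
    """Linear (DLT) triangulation of one feature seen by two calibrated cameras.

    P_left, P_right: 3x4 projection matrices for the left and right cameras.
    uv_left, uv_right: pixel coordinates of the same cannula feature in each view.
    Returns the 3-D point in the camera system's reference frame.
    """
    u1, v1 = uv_left
    u2, v2 = uv_right
    A = np.vstack([
        u1 * P_left[2] - P_left[0],
        v1 * P_left[2] - P_left[1],
        u2 * P_right[2] - P_right[0],
        v2 * P_right[2] - P_right[1],
    ])
    _, _, vt = np.linalg.svd(A)   # least-squares solution is the last right singular vector
    X = vt[-1]
    return X[:3] / X[3]           # de-homogenize
```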
- each cannula 1010 can be distinguished from each other cannula 1010 with individual marks or other characteristics.
- each cannula 1010 can have a different exterior color to allow the sensing system 1020 to easily distinguish each cannula 1010.
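- A simple illustration of such color-based identification (the reference colors and labels are invented for this sketch, and a real sensing system would first segment each cannula in the image) is a nearest-reference-color match:

```python
import numpy as np

# Hypothetical reference colors (RGB) assigned to each cannula sleeve.
CANNULA_COLORS = {
    "cannula_1": (220, 40, 40),    # red
    "cannula_2": (40, 200, 60),    # green
    "cannula_3": (50, 80, 220),    # blue
    "cannula_4": (230, 200, 30),   # yellow
}

def identify_cannula(mean_rgb):
    """Label a detected cannula by the closest reference color (Euclidean distance)."""
    sample = np.asarray(mean_rgb, dtype=float)
    return min(
        CANNULA_COLORS,
        key=lambda name: np.linalg.norm(sample - np.asarray(CANNULA_COLORS[name], dtype=float)),
    )

# Example: a slightly desaturated red blob is matched to cannula_1.
label = identify_cannula((200, 60, 55))
```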
- Fig. 10C shows a plan view of the exterior of the top surface of model 1004, including holes 1060 in the surface.
- Marks 1070 indicate particular holes through which the cannulas 1010 were detected by the camera system 1020 to have been inserted.
- Such a view can be created by the simulation processing component 104 based on the sensing views of the camera system 1020 and (for example) a 3-D computer-aided design (CAD) model of the anatomical model 1004 used in visualization software.
- the view of Fig. 10C can be used to display port placement for the model for instructional or guidance purposes during a simulated medical procedure. For example, the view showing marked used ports can be displayed next to a similar view that displays the correct ports to be used in the particular medical procedure being simulated.
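- The comparison of the ports actually used against the ports prescribed for the simulated procedure could be reported as a simple set difference; a sketch with hypothetical port labels for the holes 1060:

```python
def check_port_placement(used_ports, expected_ports):
    """Compare detected port usage against the ports prescribed for the exercise.

    Port identifiers are hypothetical labels for apertures in the model.
    Returns guidance that could be displayed next to the plan view.
    """
    used, expected = set(used_ports), set(expected_ports)
    return {
        "correct": sorted(used & expected),
        "unexpected": sorted(used - expected),   # cannula inserted in the wrong hole
        "missing": sorted(expected - used),      # prescribed port not yet used
    }

# Example for a hypothetical procedure that calls for ports A, B, and D.
report = check_port_placement(used_ports=["A", "B", "C"],
                              expected_ports=["A", "B", "D"])
```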
- Manual instruments can be tracked by sensing system 1020 similarly to the cannulas and teleoperated instruments described above.
- a manual laparoscopic tool can be tracked.
- Other instruments can include a uterine manipulator, retraction tool, needle-passing tools, another manipulator arm or instrument attached to a separate component of the simulation system, or other instruments where an instrument or device separate from the patient side cart is being tracked and incorporated into the simulation environment.
- Figs. 11A and 11B are diagrammatic illustrations of one example of the use of an anatomical model 1100 in simulated medical procedures that include the use of both teleoperated and manual surgical instruments.
- Fig. 11A is an exterior view of the model 1100 and inserted instruments, and Fig. 11B is an interior view of the model 1100.
- Model 1100 can be similar to models described above and includes apertures 1102 in an upper shelf portion of the model, through which cannulas 1104 have been inserted during a setup procedure.
- Teleoperated surgical instruments, such as laparoscopic instruments and endoscopes, can be inserted in the cannulas 1104.
- Manual surgical instruments, such as manual laparoscopic instrument 1110, can also be inserted into the model 1100.
- sensors can be provided within the model 1100 to sense the cannulas 1104 and manual instruments such as instrument 1110.
- camera system 1112 is positioned on the interior base of the model 1100 to sense the interior of the model 1100 similarly as described for Fig. 10A. Camera system 1112 can thus capture images showing when cannulas 1104 have been inserted in the model, as well as images showing when surgical instruments have been inserted in the cannulas 1104.
- dummy instruments can be used, e.g., which do not extend into the interior hollow portion of model 1100.
- electromagnetic sensors can be used to sense cannulas and manual surgical instruments.
- other optical sensors can be used to sense cannulas and manual surgical instruments.
- Fig. 12 is a flow diagram illustrating an example method for using an anatomical model with both teleoperated surgical instruments and manual surgical instruments in one or more simulated surgical procedures, with reference to Figs. 11A-11B.
- blocks 1202 to 1208 can be performed during a simulated setup procedure
- blocks 1210 and 1212 can be performed during a simulated surgical operation (block 1210 can also be performed during a simulated setup procedure).
- the simulation processing component 102 can receive a position of the model 1100 relative to the teleoperated arms of the patient side cart. For example, a teleoperated instrument at the end of one of the arms can be moved to contact (register) the model in multiple locations to establish the location of the model in 3-D space.
- block 1202 can be omitted or performed at a later time, e.g., the model location can be determined relative to the teleoperated instruments after docking instruments to cannulas in block 1206 below by using sensors in the teleoperated arms.
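- The contact-based registration of block 1202 can be illustrated with a standard least-squares rigid fit (Kabsch) between the known touch-point coordinates on the model and the same points as reported by the arm kinematics; this is a generic sketch rather than the disclosed method:

```python
import numpy as np

def register_rigid(points_model, points_robot):
    """Least-squares rigid fit (Kabsch) between corresponding 3-D point sets.

    points_model: known landmark coordinates on the anatomical model (model frame).
    points_robot: the same landmarks touched by a teleoperated instrument,
                  as reported by the arm kinematics (robot frame).
    Returns (R, t) such that robot_point ~= R @ model_point + t.
    """
    A = np.asarray(points_model, dtype=float)
    B = np.asarray(points_robot, dtype=float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)                 # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t
```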
- the simulation processing component senses the insertion of cannulas 1104 in the model 1100 and estimates the position and orientation of the cannulas 1104.
- sensors like camera system 1112 can send signals to the simulation processing component.
- the simulation processing component senses docking and insertion of teleoperated dummy instruments in cannulas 1104, e.g., based on signals from the sensors in the manipulator arms of the patient side cart. In other implementations, full surgical instruments can be docked and inserted in the cannulas 1104.
- the sensors (such as camera system 1112) and simulation processing component sense insertion of one or more manual instruments in cannulas 1104, such as instrument 1110.
- Blocks 1206 and 1208 can be performed in any order and/or at least partially simultaneously.
- the simulation processing component generates a virtual environment and generates virtual surgical instruments in the virtual environment corresponding to the teleoperated surgical instruments and the manual surgical instruments.
- the simulation processing component runs the simulated surgical operation based on console signals, sensed teleoperated instruments, and sensed manual surgical instruments.
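- The overall flow of blocks 1202-1212 can be summarized in a structural sketch; the `sim`, `sensors`, and `console` objects and their methods are hypothetical stand-ins for the simulation processing component, the model/arm sensing systems, and the surgeon console, and are not defined by the disclosure:

```python
def run_simulated_procedure(sim, sensors, console):
    """Structural sketch of the Fig. 12 flow under assumed interfaces."""
    # Simulated setup procedure (blocks 1202-1208).
    model_pose = sim.register_model(console.touch_points())          # block 1202
    cannula_poses = sim.estimate_cannulas(sensors.camera_images())   # block 1204
    docked = sim.sense_docking(sensors.arm_signals())                # block 1206
    manual = sim.sense_manual_instruments(sensors.camera_images())   # block 1208

    # Simulated surgical operation (blocks 1210-1212).
    scene = sim.build_virtual_environment(model_pose, cannula_poses,
                                          docked, manual)            # block 1210
    while not sim.exercise_complete():
        scene.update(console.signals(), sensors.instrument_poses())  # block 1212
    return sim.metrics()
```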
- method 1200 can use the sensing system of the anatomical model and/or the teleoperated medical device in conjunction with displaying a virtual environment.
- a generic picture of the model can be displayed without the teleoperated arms and without the exact pose of the model relative to the arms being known. The port locations placed by a user can be identified using cameras inside the model without using the teleoperated arm kinematics. Instruction can be given to the user to adjust incorrect port locations before beginning to dock the teleoperated arms to the model.
- the arm kinematics can be used to estimate the position and orientation of the model relative to the teleoperated medical device. (Other implementations can use sensors placed in or on the anatomical model to estimate the pose and location of the model relative to the patient side cart, rather than using the teleoperated arm sensors.) An entire scene of the surgical site and/or operating room can then be displayed to trainees.
- sensors in the model can track surgical instruments to provide an estimate of where the model is relative to the teleoperated device and this estimate of location can be updated during a procedure or operation to continuously provide accurate relative model location in case the model is bumped or moved by trainees.
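- Continuously refreshing the model-location estimate could be as simple as blending each new sensor-based observation into the running estimate; a minimal sketch with an assumed smoothing factor, filtering only the translation (a full implementation would also filter orientation):

```python
import numpy as np

def update_model_position(previous_estimate, new_observation, alpha=0.2):
    """Exponentially smooth the model-location estimate so the displayed scene
    recovers if the model is bumped or moved by trainees during an exercise."""
    prev = np.asarray(previous_estimate, dtype=float)
    obs = np.asarray(new_observation, dtype=float)
    return (1.0 - alpha) * prev + alpha * obs

# Example: the model was nudged 5 mm along x between sensing frames.
estimate = update_model_position([0.100, 0.200, 0.000], [0.105, 0.200, 0.000])
```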
- Figs. 13A and 13B are diagrammatic illustrations of an example of the use of an anatomical model 1300 in simulated medical procedures that include the use of manual surgical instruments. Model 1300 can be similar to the models described above.
- Fig. 13A is an exterior view of the model 1300 and an inserted instrument, and Fig. 13B is an interior view of the model 1300.
- Model 1300 includes apertures 1302 in an upper portion of the model.
- Particular apertures 1306 can be designated as remote centers for teleoperated instruments typically inserted through these apertures, but no cannulas for teleoperated instruments need be placed in this implementation.
- One or more cannulas, such as cannula 1304, are inserted in model 1300 during a setup procedure in apertures where manual instruments are to be inserted.
- Manual surgical instruments, such as manual laparoscopic instrument 1310, can be inserted in the cannula 1304.
- Sensors can be provided within the model 1300 to sense the cannulas such as cannula 1304 and manual instruments such as instrument 1310.
- camera system 1312 is positioned on the interior base of the model 1300 to sense the interior of the model 1300.
- Camera system 1312 can capture images showing when cannula 1304 has been inserted in the model, as well as images showing when manual surgical instruments have been inserted in the cannula 1304.
- the apertures 1306 for teleoperated surgical instruments can be located by the simulation processor based on the known geometry of the model and the particular medical procedure being simulated.
- these particular aperture locations can be assumed to be arm remote centers, and no cannulas need to be tracked, nor do any teleoperated surgical instruments need to be docked with the model.
- this implementation can be used for simulations that include the use of manual surgical instruments and do not need use of a patient side cart, e.g., the teleoperated surgical instruments can all be virtual instruments generated in a virtual environment.
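- Looking up assumed remote centers from the known model geometry can be illustrated with a simple table keyed by procedure; all coordinates, labels, and the procedure name below are invented for illustration:

```python
# Hypothetical aperture coordinates (meters, model frame) taken from the known
# geometry of the anatomical model; values are illustrative only.
APERTURE_LOCATIONS = {
    "A1": (0.00, 0.06, 0.12),
    "A2": (0.04, 0.02, 0.12),
    "A3": (-0.04, 0.02, 0.12),
    "A4": (0.08, -0.02, 0.12),
}

# Apertures designated as remote centers for each simulated procedure.
PROCEDURE_REMOTE_CENTERS = {
    "example_procedure": ["A1", "A2", "A3"],
}

def assumed_remote_centers(procedure):
    """Return assumed remote-center positions for virtual teleoperated instruments,
    with no cannulas tracked and no docking to the model required."""
    return [APERTURE_LOCATIONS[a] for a in PROCEDURE_REMOTE_CENTERS[procedure]]

centers = assumed_remote_centers("example_procedure")
```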
- electromagnetic sensors can be used to sense cannulas and manual surgical instruments.
- other optical sensors can be used to sense cannulas and manual surgical instruments.
- Fig. 14 is a flow diagram illustrating an example method 1400 for using manual surgical instruments in a simulated medical procedure, with reference to Figs. 13A-13B.
- blocks 1402 to 1406 can be performed during a simulated setup procedure
- blocks 1408 and 1410 can be performed during a simulated surgical operation (block 1408 can also be performed during a setup procedure).
- the simulation processing component 102 assumes the position and orientation of the model 1300, including assuming the positions of the surgical site and the apertures in the model (remote centers of teleoperated instruments) through which teleoperated surgical instruments would be inserted. To do this, the simulation processing component knows the geometry of the model and its apertures and physical surgical site location, as well as the particular apertures used in the surgical operation being set up.
- the processing component senses the insertion of cannulas 1304 in the model 1300 using sensors of the model 1300, and the simulation processing component estimates the position and orientation of the cannulas 1304.
- the simulation processing component senses insertion of one or more manual instruments in cannulas 1304, such as instrument 1310.
- the simulation processing component generates a virtual environment and generates virtual surgical instruments in the virtual environment corresponding to teleoperated surgical instruments and the manual surgical instruments. The relative position between the manual surgical instruments and the assumed aperture locations for the teleoperated instruments in the model 1300 enables relative positioning of these instruments in the virtual environment.
- the simulation processing component runs the simulated surgical operation and updates the virtual environment based on console signals to move the virtual teleoperated surgical instruments, and based on the sensed manual surgical instruments to move the corresponding virtual manual instruments.
- a simulation processing component can interact with a surgeon console (e.g., with master controllers) and/or a patient side cart (e.g., with slave manipulator arms and instruments).
- the console master can drive the slave arms with or without instruments (or with dummy instruments) on the patient side cart.
- the simulation system can simulate and provide guidance and other feedback on system setup and accurate positioning of the manipulator arms prior to a simulated surgical operation. This can be used to provide standardized and consistent training to surgeons using inanimate training exercises or wet-lab exercises.
- a simulated surgical operation can follow the simulated setup procedure, which can allow the entire medical procedure to be simulated. This allows trainees to see the consequences of improperly-performed tasks. For example, improper or incorrect tasks performed in a setup procedure may have repercussions in a following surgical operation, and the simulation system herein simulates this entire effect to allow trainees to learn and improve.
- the simulation system can display a virtual environment (e.g., ignoring the endoscope feed and instruments if installed), or a combined or augmented environment (e.g., the endoscope feed with generated graphical visual overlays or virtual reality (VR) / augmented reality (AR) elements, such as ghost images overlaid on camera images).
- virtual or augmented images can be output through display systems (such as using TilePro from Intuitive Surgical, Inc.) on the surgeon console during any training exercise to provide instruction or performance metrics.
- system display devices can indicate or highlight system areas of concern or interest, such as patient cart setup joints having incorrect positions highlighted, reachability limits of instruments displayed, and/or internal/external collisions. This can reduce the burden on training assistants to catch mistakes during training procedures.
- the system can record the kinematics and events of the master console(s) and teleoperated slave medical device(s) during completion of inanimate training or wet-lab exercises by a console surgeon to compute training metrics and display such metrics using a similar interface as purely virtual training exercises.
- the system can record the kinematics and events of teleoperated devices during completion of patient-side exercises and setup procedures using actual instruments, and compute training metrics and display such metrics using a similar interface as purely virtual training exercises. Further, the system can record the kinematics and events of the masters and slaves during completion of exercise modules on porcine models (for example, during offsite training) to provide metrics and display such metrics.
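- Computing training metrics from recorded kinematics could, for example, reduce a stream of instrument-tip positions to task time, path length, and economy of motion; the metric set here is a common illustrative example, not one prescribed by the disclosure:

```python
import numpy as np

def training_metrics(timestamps, tip_positions):
    """Basic metrics from recorded instrument-tip kinematics.

    timestamps: sequence of sample times in seconds.
    tip_positions: Nx3 array of tip positions in meters for the same samples.
    """
    t = np.asarray(timestamps, dtype=float)
    p = np.asarray(tip_positions, dtype=float)
    segment_lengths = np.linalg.norm(np.diff(p, axis=0), axis=1)
    path_length = float(segment_lengths.sum())
    task_time = float(t[-1] - t[0])
    straight_line = float(np.linalg.norm(p[-1] - p[0]))
    return {
        "task_time_s": task_time,
        "path_length_m": path_length,
        # Ratio of straight-line distance to actual path; closer to 1 is more economical.
        "economy_of_motion": straight_line / path_length if path_length else 0.0,
    }
```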
- All data from any training environment or configuration can be recorded and stored locally or remotely in the same way to improve accessibility of data, monitoring of surgeon training and performance of trainee personnel during simulated procedures, standardization of exercises, and feedback to surgeon during training (e.g., to improve surgeon training).
- the simulation system can centralize most training content to one software platform separate from the system
- Teleoperated medical device surgery offers an unprecedented ability to record, track, and monitor surgery and surgeon training unlike any pre-existing form of surgery. Implementations described herein can make effective use of this capability and the data that can be harvested, e.g., for simulation and training purposes. Some additional advantages of various implementations of teleoperated and non-teleoperated systems can include the following. Features described herein can centralize user training and evaluation on a single system, e.g., a single teleoperated medical system. Some systems can provide the ability to use a single simulation framework on a teleoperated medical system with a separate surgeon console and patient side cart to monitor and track progress and to display feedback, all under a single software and user interface (UI) framework.
- Some systems can provide the ability to provide augmented reality output and feedback during wet-lab or porcine model exercises and dry-lab exercises using a teleoperated medical system. Some systems can provide the ability to combine training data using a single software and hardware architecture used for various types of training exercises including virtual environment exercises, inanimate exercises, wet-lab or porcine models, etc.
- One or more features can allow any training exercise or offsite lab exercise to be conducted using the single simulation architecture to provide real-time (during a procedure) and end-of-exercise metrics to guide training and learning.
- Features herein can improve accessibility of training data, especially for tasks not normally implemented on a simulator system.
- Features can improve standardization of training since the system can be used for several types of training tasks.
- Features can improve surgeon training at offsite training labs by quantifying and delivering feedback in addition to that provided by training personnel to help trainees learn.
- features can help training personnel to better manage multiple surgeon trainings simultaneously (e.g., dual surgeon console trainings).
- features can improve surgeon training conducted by clinical sales representatives (CSRs) or other instructors by simulating tasks performed during setup procedures, and by providing feedback determined by the system and displayed in real-time for setup exercises.
- features described herein can expand the capability of teleoperated and non-teleoperated medical simulation systems to support inanimate training exercises, wet-lab training scenarios, and VR-based training exercises.
- a single simulation system can administer and record all training performed by surgeons, e.g., with their CSRs, with dedicated training specialists (TSs), or independently.
- the simulation system can be used to simulate all interactions with the system outside of actual surgery.
- blocks described in the various methods herein can be performed in a different order than shown and/or simultaneously (partially or completely) with other blocks in the same method (or other methods), where appropriate.
- blocks can occur multiple times, in a different order, and/or at different times in the methods.
- spatially relative terms such as “beneath”, “below”, “lower”, “above”, “upper”, “proximal”, “distal”, and the like— may be used to describe one element's or feature's relationship to another element or feature as illustrated in the figures.
- These spatially relative terms are intended to encompass different positions (i.e., locations) and orientations (i.e., rotational placements) of a device in use or operation in addition to the position and orientation shown in the figures. For example, if a device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be “above” or “over” the other elements or features.
- the exemplary term “below” can encompass both positions and orientations of above and below.
- a device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
- descriptions of movement along and around various axes includes various special device positions and orientations.
- the singular forms “a”, “an”, and “the” are intended to include the plural forms as well, unless the context indicates otherwise.
- Components described as coupled may be electrically or mechanically directly coupled, or they may be indirectly coupled via one or more intermediate components.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Theoretical Computer Science (AREA)
- Business, Economics & Management (AREA)
- Educational Technology (AREA)
- Educational Administration (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Computational Mathematics (AREA)
- Mathematical Optimization (AREA)
- Mathematical Physics (AREA)
- Pure & Applied Mathematics (AREA)
- Mathematical Analysis (AREA)
- Algebra (AREA)
- Medicinal Chemistry (AREA)
- Chemical & Material Sciences (AREA)
- Pulmonology (AREA)
- Radiology & Medical Imaging (AREA)
- Manipulator (AREA)
- Instructional Devices (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Robotics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
Priority Applications (10)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2016541234A JP6659547B2 (ja) | 2013-12-20 | 2014-12-19 | 医療処置トレーニングのためのシミュレータシステム |
| EP14871282.1A EP3084747B1 (en) | 2013-12-20 | 2014-12-19 | Simulator system for medical procedure training |
| US15/106,254 US10510267B2 (en) | 2013-12-20 | 2014-12-19 | Simulator system for medical procedure training |
| CN201480076076.0A CN106030683B (zh) | 2013-12-20 | 2014-12-19 | 用于医疗程序培训的模拟器系统 |
| CN202011072053.6A CN112201131B (zh) | 2013-12-20 | 2014-12-19 | 用于医疗程序培训的模拟器系统 |
| EP22205491.8A EP4184483B1 (en) | 2013-12-20 | 2014-12-19 | Simulator system for medical procedure training |
| KR1020227005342A KR102405656B1 (ko) | 2013-12-20 | 2014-12-19 | 의료 절차 훈련을 위한 시뮬레이터 시스템 |
| KR1020167019144A KR102366023B1 (ko) | 2013-12-20 | 2014-12-19 | 의료 절차 훈련을 위한 시뮬레이터 시스템 |
| US16/584,564 US11468791B2 (en) | 2013-12-20 | 2019-09-26 | Simulator system for medical procedure training |
| US17/902,678 US12456392B2 (en) | 2013-12-20 | 2022-09-02 | Simulator system for medical procedure training |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201361919631P | 2013-12-20 | 2013-12-20 | |
| US61/919,631 | 2013-12-20 |
Related Child Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/106,254 A-371-Of-International US10510267B2 (en) | 2013-12-20 | 2014-12-19 | Simulator system for medical procedure training |
| US16/584,564 Continuation US11468791B2 (en) | 2013-12-20 | 2019-09-26 | Simulator system for medical procedure training |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2015095715A1 true WO2015095715A1 (en) | 2015-06-25 |
Family
ID=53403746
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2014/071521 Ceased WO2015095715A1 (en) | 2013-12-20 | 2014-12-19 | Simulator system for medical procedure training |
Country Status (6)
| Country | Link |
|---|---|
| US (3) | US10510267B2 (enExample) |
| EP (2) | EP3084747B1 (enExample) |
| JP (3) | JP6659547B2 (enExample) |
| KR (2) | KR102366023B1 (enExample) |
| CN (2) | CN106030683B (enExample) |
| WO (1) | WO2015095715A1 (enExample) |
Cited By (29)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2017030848A1 (en) * | 2015-08-17 | 2017-02-23 | Ethicon Endo-Surgery, Llc | Gathering and analyzing data for robotic surgical systems |
| WO2017083768A1 (en) | 2015-11-12 | 2017-05-18 | Jarc Anthony Michael | Surgical system with training or assist functions |
| JP2017104964A (ja) * | 2015-12-11 | 2017-06-15 | 川崎重工業株式会社 | マスターアーム入力装置 |
| ITUA20161926A1 (it) * | 2016-03-23 | 2017-09-23 | Medvirt Sagl | Metodo per la simulazione di una endoscopia. |
| WO2017176857A1 (en) * | 2016-04-08 | 2017-10-12 | KindHeart, Inc. | Thoracic surgery simulator for training surgeons |
| US9805625B2 (en) | 2010-10-29 | 2017-10-31 | KindHeart, Inc. | Surgical simulation assembly |
| US10013896B2 (en) | 2010-10-29 | 2018-07-03 | The University Of North Carolina At Chapel Hill | Modular staged reality simulator |
| WO2019005983A1 (en) * | 2017-06-29 | 2019-01-03 | Verb Surgical Inc. | LAPAROSCOPIC TOOLS WITH VIRTUAL REALITY |
| US10198969B2 (en) | 2015-09-16 | 2019-02-05 | KindHeart, Inc. | Surgical simulation system and associated methods |
| EP3345478A4 (en) * | 2015-09-02 | 2019-04-10 | Universidad Miguel Hernandez De Elche | CLINICAL KADAVER SIMULATOR |
| JP2019537459A (ja) * | 2016-09-29 | 2019-12-26 | シンバイオニクス リミテッド | 仮想現実環境または拡張現実環境の中の手術室内での医療シミュレーションのための方法およびシステム |
| CN110815215A (zh) * | 2019-10-24 | 2020-02-21 | 上海航天控制技术研究所 | 多模融合的旋转目标接近停靠抓捕地面试验系统及方法 |
| EP3723069A1 (en) * | 2019-04-08 | 2020-10-14 | Covidien LP | Systems and methods for simulating surgical procedures |
| WO2021034694A1 (en) * | 2019-08-16 | 2021-02-25 | Intuitive Surgical Operations, Inc. | Auto-configurable simulation system and method |
| US11011077B2 (en) | 2017-06-29 | 2021-05-18 | Verb Surgical Inc. | Virtual reality training, simulation, and collaboration in a robotic surgical system |
| US11058501B2 (en) | 2015-06-09 | 2021-07-13 | Intuitive Surgical Operations, Inc. | Configuring surgical system with surgical procedures atlas |
| EP3744283A4 (en) * | 2018-02-20 | 2022-02-23 | Hutom Co., Ltd. | Surgery optimization method and device |
| US11270601B2 (en) | 2017-06-29 | 2022-03-08 | Verb Surgical Inc. | Virtual reality system for simulating a robotic surgical environment |
| US11284955B2 (en) | 2017-06-29 | 2022-03-29 | Verb Surgical Inc. | Emulation of robotic arms and control thereof in a virtual reality environment |
| WO2022064059A1 (en) * | 2020-09-28 | 2022-03-31 | Institut Hospitalo-Universitaire De Strasbourg | Device for simulating the movement of an endoscope in an environment |
| US11589937B2 (en) | 2017-04-20 | 2023-02-28 | Intuitive Surgical Operations, Inc. | Systems and methods for constraining a virtual reality surgical system |
| WO2023038424A1 (ko) | 2021-09-07 | 2023-03-16 | 주식회사 로엔서지컬 | 신장 수술 훈련 시스템 |
| WO2023144845A1 (en) * | 2022-01-27 | 2023-08-03 | B2Or Srl | A system for performing practical surgery exercises with particular reference to the cervico-facial area |
| US11727827B2 (en) | 2012-08-17 | 2023-08-15 | Intuitive Surgical Operations, Inc. | Anatomical model and method for surgical training |
| EP4125671A4 (en) * | 2020-04-03 | 2024-04-24 | Verb Surgical Inc. | MOBILE VIRTUAL REALITY SYSTEM FOR SURGICAL ROBOT SYSTEMS |
| US12053254B2 (en) | 2018-07-26 | 2024-08-06 | Sony Corporation | Information processing apparatus and information processing method |
| US12201484B2 (en) | 2017-10-23 | 2025-01-21 | Intuitive Surgical Operations, Inc. | Systems and methods for presenting augmented reality in a display of a teleoperational system |
| US12295662B2 (en) | 2020-03-20 | 2025-05-13 | The Johns Hopkins University | Augmented reality based surgical navigation system |
| US12456392B2 (en) | 2013-12-20 | 2025-10-28 | Intuitive Surgical Operations, Inc. | Simulator system for medical procedure training |
Families Citing this family (294)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8423182B2 (en) | 2009-03-09 | 2013-04-16 | Intuitive Surgical Operations, Inc. | Adaptable integrated energy control system for electrosurgical tools in robotic surgical systems |
| US8672837B2 (en) | 2010-06-24 | 2014-03-18 | Hansen Medical, Inc. | Methods and devices for controlling a shapeable medical device |
| US11871901B2 (en) | 2012-05-20 | 2024-01-16 | Cilag Gmbh International | Method for situational awareness for surgical network or surgical network connected device capable of adjusting function based on a sensed situation or usage |
| KR102188033B1 (ko) | 2012-09-17 | 2020-12-07 | 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 | 원격조작 수술 기구 기능들에 입력 장치들을 할당하는 방법 및 시스템 |
| US10631939B2 (en) | 2012-11-02 | 2020-04-28 | Intuitive Surgical Operations, Inc. | Systems and methods for mapping flux supply paths |
| US9566414B2 (en) | 2013-03-13 | 2017-02-14 | Hansen Medical, Inc. | Integrated catheter and guide wire controller |
| US9057600B2 (en) | 2013-03-13 | 2015-06-16 | Hansen Medical, Inc. | Reducing incremental measurement sensor error |
| US10849702B2 (en) | 2013-03-15 | 2020-12-01 | Auris Health, Inc. | User input devices for controlling manipulation of guidewires and catheters |
| US9283046B2 (en) | 2013-03-15 | 2016-03-15 | Hansen Medical, Inc. | User interface for active drive apparatus with finite range of motion |
| US9014851B2 (en) | 2013-03-15 | 2015-04-21 | Hansen Medical, Inc. | Systems and methods for tracking robotically controlled medical instruments |
| US9629595B2 (en) | 2013-03-15 | 2017-04-25 | Hansen Medical, Inc. | Systems and methods for localizing, tracking and/or controlling medical instruments |
| US9271663B2 (en) | 2013-03-15 | 2016-03-01 | Hansen Medical, Inc. | Flexible instrument localization from both remote and elongation sensors |
| US11020016B2 (en) | 2013-05-30 | 2021-06-01 | Auris Health, Inc. | System and method for displaying anatomy and devices on a movable display |
| EP3689284B1 (en) | 2013-10-24 | 2025-02-26 | Auris Health, Inc. | System for robotic-assisted endolumenal surgery |
| US9937626B2 (en) | 2013-12-11 | 2018-04-10 | Covidien Lp | Wrist and jaw assemblies for robotic surgical systems |
| JP2017505202A (ja) * | 2014-02-12 | 2017-02-16 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | 外科用器具可視性のロボット制御 |
| EP3243476B1 (en) | 2014-03-24 | 2019-11-06 | Auris Health, Inc. | Systems and devices for catheter driving instinctiveness |
| CA2957750C (en) | 2014-08-13 | 2023-04-04 | Covidien Lp | Robotically controlling mechanical advantage gripping |
| AU2015349700B2 (en) * | 2014-11-21 | 2019-11-07 | Think Surgical, Inc. | Visible light communication system for transmitting data between visual tracking systems and tracking markers |
| EP3258874B1 (en) | 2015-02-19 | 2024-01-17 | Covidien LP | Input device for robotic surgical system |
| CA2977413A1 (en) | 2015-03-10 | 2016-09-15 | Covidien Lp | Measuring health of a connector member of a robotic surgical system |
| GB2536650A (en) | 2015-03-24 | 2016-09-28 | Augmedics Ltd | Method and system for combining video-based and optic-based augmented reality in a near eye display |
| US20160314711A1 (en) * | 2015-04-27 | 2016-10-27 | KindHeart, Inc. | Telerobotic surgery system for remote surgeon training using robotic surgery station and remote surgeon station with display of actual animal tissue images and associated methods |
| WO2016191361A1 (en) | 2015-05-22 | 2016-12-01 | The University Of North Carolina At Chapel Hill | Methods, systems, and computer readable media for transoral lung access |
| US10959788B2 (en) | 2015-06-03 | 2021-03-30 | Covidien Lp | Offset instrument drive unit |
| JP6761822B2 (ja) | 2015-06-16 | 2020-09-30 | コヴィディエン リミテッド パートナーシップ | ロボット外科用システムトルク変換検知 |
| US10779897B2 (en) | 2015-06-23 | 2020-09-22 | Covidien Lp | Robotic surgical assemblies |
| EP3349649B1 (en) | 2015-09-18 | 2022-03-09 | Auris Health, Inc. | Navigation of tubular networks |
| AU2016326371B2 (en) | 2015-09-25 | 2020-07-23 | Covidien Lp | Robotic surgical assemblies and instrument drive connectors thereof |
| US10912449B2 (en) | 2015-10-23 | 2021-02-09 | Covidien Lp | Surgical system for detecting gradual changes in perfusion |
| US10660714B2 (en) | 2015-11-19 | 2020-05-26 | Covidien Lp | Optical force sensor for robotic surgical system |
| US10143526B2 (en) | 2015-11-30 | 2018-12-04 | Auris Health, Inc. | Robot-assisted driving systems and methods |
| JP6644530B2 (ja) * | 2015-11-30 | 2020-02-12 | オリンパス株式会社 | 内視鏡業務支援システム |
| WO2017115425A1 (ja) * | 2015-12-28 | 2017-07-06 | オリンパス株式会社 | 医療用マニピュレータシステム |
| WO2017151996A1 (en) | 2016-03-04 | 2017-09-08 | Covidien Lp | Inverse kinematic control systems for robotic surgical system |
| EP3440660A1 (en) * | 2016-04-06 | 2019-02-13 | Koninklijke Philips N.V. | Method, device and system for enabling to analyze a property of a vital sign detector |
| US11576562B2 (en) | 2016-04-07 | 2023-02-14 | Titan Medical Inc. | Camera positioning method and apparatus for capturing images during a medical procedure |
| AU2017272072B2 (en) | 2016-05-26 | 2021-03-25 | Covidien Lp | Robotic surgical assemblies |
| US10736219B2 (en) | 2016-05-26 | 2020-08-04 | Covidien Lp | Instrument drive units |
| EP3463149B1 (en) | 2016-06-03 | 2025-02-19 | Covidien LP | Passive axis system for robotic surgical systems |
| CN113180835A (zh) | 2016-06-03 | 2021-07-30 | 柯惠Lp公司 | 用于机器人手术系统的控制臂 |
| WO2017210497A1 (en) | 2016-06-03 | 2017-12-07 | Covidien Lp | Systems, methods, and computer-readable program products for controlling a robotically delivered manipulator |
| WO2017210500A1 (en) | 2016-06-03 | 2017-12-07 | Covidien Lp | Robotic surgical system with an embedded imager |
| US11037464B2 (en) | 2016-07-21 | 2021-06-15 | Auris Health, Inc. | System with emulator movement tracking for controlling medical devices |
| US11000953B2 (en) * | 2016-08-17 | 2021-05-11 | Locus Robotics Corp. | Robot gamification for improvement of operator performance |
| CN106251752A (zh) * | 2016-10-25 | 2016-12-21 | 深圳市科创数字显示技术有限公司 | Ar和vr相结合的医学培训系统 |
| KR102523945B1 (ko) | 2016-11-11 | 2023-04-21 | 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 | 외과의 숙련도 레벨 기반 기구 제어를 갖는 원격조작 수술 시스템 |
| WO2018118858A1 (en) | 2016-12-19 | 2018-06-28 | National Board Of Medical Examiners | Medical training and performance assessment instruments, methods, and systems |
| US10244926B2 (en) | 2016-12-28 | 2019-04-02 | Auris Health, Inc. | Detecting endolumenal buckling of flexible instruments |
| WO2018132336A1 (en) * | 2017-01-11 | 2018-07-19 | Magic Leap, Inc. | Medical assistant |
| US11690691B2 (en) | 2017-02-15 | 2023-07-04 | Covidien Lp | System and apparatus for crush prevention for medical robot applications |
| US10251709B2 (en) * | 2017-03-05 | 2019-04-09 | Samuel Cho | Architecture, system, and method for developing and robotically performing a medical procedure activity |
| JP6649912B2 (ja) * | 2017-03-15 | 2020-02-19 | 株式会社モリタ | 歯科診療実習装置及び歯科診療実習システム |
| WO2018175971A1 (en) * | 2017-03-24 | 2018-09-27 | Surgical Theater LLC | System and method for training and collaborating in a virtual environment |
| CN106875802A (zh) * | 2017-03-29 | 2017-06-20 | 张小来 | 一种传染病护理模拟操作方法及系统 |
| EP3600031A4 (en) | 2017-03-31 | 2021-01-20 | Auris Health, Inc. | ROBOTIC NAVIGATION SYSTEMS IN LUMINAL NETWORKS COMPENSATION FOR PHYSIOLOGICAL NOISE |
| JP7190448B2 (ja) * | 2017-05-16 | 2022-12-15 | コーニンクレッカ フィリップス エヌ ヴェ | 解剖学的モデルの仮想拡張 |
| CN110650705B (zh) | 2017-05-24 | 2023-04-28 | 柯惠Lp公司 | 机器人系统中的电外科工具的存在检测 |
| EP3629981A4 (en) | 2017-05-25 | 2021-04-07 | Covidien LP | SYSTEMS AND METHODS FOR DETECTION OF OBJECTS WITHIN A FIELD OF VIEW OF AN IMAGE CAPTURING DEVICE |
| CN110662507A (zh) | 2017-05-25 | 2020-01-07 | 柯惠Lp公司 | 具有自动引导的机器人手术系统 |
| JP2020520694A (ja) | 2017-05-25 | 2020-07-16 | コヴィディエン リミテッド パートナーシップ | ロボット手術システムおよびロボット手術システムのコンポーネントを覆うためのドレープ |
| US20180365959A1 (en) * | 2017-06-14 | 2018-12-20 | David R. Hall | Method for Posture and Body Position Correction |
| US10022192B1 (en) | 2017-06-23 | 2018-07-17 | Auris Health, Inc. | Automatically-initialized robotic systems for navigation of luminal networks |
| WO2019008737A1 (ja) * | 2017-07-07 | 2019-01-10 | オリンパス株式会社 | 内視鏡用トレーニングシステム |
| US11043144B2 (en) * | 2017-08-04 | 2021-06-22 | Clarius Mobile Health Corp. | Systems and methods for providing an interactive demonstration of an ultrasound user interface |
| US11406441B2 (en) | 2017-08-16 | 2022-08-09 | Covidien Lp | End effector including wrist assembly and monopolar tool for robotic surgical systems |
| JP7349992B2 (ja) | 2017-09-05 | 2023-09-25 | コヴィディエン リミテッド パートナーシップ | ロボット手術システムのための衝突処理アルゴリズム |
| EP3678573B1 (en) | 2017-09-06 | 2025-01-15 | Covidien LP | Velocity scaling of surgical robots |
| CN111093550B (zh) | 2017-09-08 | 2023-12-12 | 柯惠Lp公司 | 用于机器人手术组件的能量断开 |
| US20190087830A1 (en) | 2017-09-15 | 2019-03-21 | Pearson Education, Inc. | Generating digital credentials with associated sensor data in a sensor-monitored environment |
| JP7069617B2 (ja) * | 2017-09-27 | 2022-05-18 | 富士フイルムビジネスイノベーション株式会社 | 行動情報処理装置 |
| US11058493B2 (en) | 2017-10-13 | 2021-07-13 | Auris Health, Inc. | Robotic system configured for navigation path tracing |
| US10555778B2 (en) | 2017-10-13 | 2020-02-11 | Auris Health, Inc. | Image-based branch detection and mapping for navigation |
| CN107610574B (zh) * | 2017-10-17 | 2023-04-07 | 上海褚信医学科技有限公司 | 一种模拟仿真穿刺类手术的装置及方法 |
| FR3072559B1 (fr) * | 2017-10-24 | 2023-03-24 | Spineguard | Systeme medical comprenant un bras robotise et un dispositif medical destine a penetrer dans une structure anatomique |
| US11911045B2 (en) | 2017-10-30 | 2024-02-27 | Cllag GmbH International | Method for operating a powered articulating multi-clip applier |
| US11510741B2 (en) | 2017-10-30 | 2022-11-29 | Cilag Gmbh International | Method for producing a surgical instrument comprising a smart electrical system |
| WO2019089226A2 (en) * | 2017-10-30 | 2019-05-09 | Intuitive Surgical Operations, Inc. | Systems and methods for guided port placement selection |
| US11291510B2 (en) | 2017-10-30 | 2022-04-05 | Cilag Gmbh International | Method of hub communication with surgical instrument systems |
| US11564756B2 (en) | 2017-10-30 | 2023-01-31 | Cilag Gmbh International | Method of hub communication with surgical instrument systems |
| US11026712B2 (en) | 2017-10-30 | 2021-06-08 | Cilag Gmbh International | Surgical instruments comprising a shifting mechanism |
| US11801098B2 (en) | 2017-10-30 | 2023-10-31 | Cilag Gmbh International | Method of hub communication with surgical instrument systems |
| FR3073657B1 (fr) * | 2017-11-10 | 2023-05-05 | Virtualisurg | Systeme de simulation d'acte chirurgical |
| CN111417356A (zh) | 2017-12-01 | 2020-07-14 | 柯惠Lp公司 | 用于机器人手术系统的帷帘管理组件 |
| US12458411B2 (en) | 2017-12-07 | 2025-11-04 | Augmedics Ltd. | Spinous process clamp |
| WO2019113391A1 (en) | 2017-12-08 | 2019-06-13 | Auris Health, Inc. | System and method for medical instrument navigation and targeting |
| CN110869173B (zh) * | 2017-12-14 | 2023-11-17 | 奥瑞斯健康公司 | 用于估计器械定位的系统与方法 |
| CN110809453B (zh) | 2017-12-18 | 2023-06-06 | 奥瑞斯健康公司 | 用于腔网络内的器械跟踪和导航的方法和系统 |
| GB2569655B (en) | 2017-12-22 | 2022-05-11 | Jemella Ltd | Training system and device |
| US20190201090A1 (en) | 2017-12-28 | 2019-07-04 | Ethicon Llc | Capacitive coupled return path pad with separable array elements |
| US11696760B2 (en) | 2017-12-28 | 2023-07-11 | Cilag Gmbh International | Safety systems for smart powered surgical stapling |
| US10758310B2 (en) | 2017-12-28 | 2020-09-01 | Ethicon Llc | Wireless pairing of a surgical device with another device within a sterile surgical field based on the usage and situational awareness of devices |
| US20190201039A1 (en) | 2017-12-28 | 2019-07-04 | Ethicon Llc | Situational awareness of electrosurgical systems |
| US11257589B2 (en) | 2017-12-28 | 2022-02-22 | Cilag Gmbh International | Real-time analysis of comprehensive cost of all instrumentation used in surgery utilizing data fluidity to track instruments through stocking and in-house processes |
| US20190206569A1 (en) | 2017-12-28 | 2019-07-04 | Ethicon Llc | Method of cloud based data analytics for use with the hub |
| US11969216B2 (en) | 2017-12-28 | 2024-04-30 | Cilag Gmbh International | Surgical network recommendations from real time analysis of procedure variables against a baseline highlighting differences from the optimal solution |
| US11937769B2 (en) | 2017-12-28 | 2024-03-26 | Cilag Gmbh International | Method of hub communication, processing, storage and display |
| US11832899B2 (en) | 2017-12-28 | 2023-12-05 | Cilag Gmbh International | Surgical systems with autonomously adjustable control programs |
| US11179175B2 (en) | 2017-12-28 | 2021-11-23 | Cilag Gmbh International | Controlling an ultrasonic surgical instrument according to tissue location |
| US11896443B2 (en) | 2017-12-28 | 2024-02-13 | Cilag Gmbh International | Control of a surgical system through a surgical barrier |
| US11896322B2 (en) | 2017-12-28 | 2024-02-13 | Cilag Gmbh International | Sensing the patient position and contact utilizing the mono-polar return pad electrode to provide situational awareness to the hub |
| US10892995B2 (en) | 2017-12-28 | 2021-01-12 | Ethicon Llc | Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs |
| US11744604B2 (en) | 2017-12-28 | 2023-09-05 | Cilag Gmbh International | Surgical instrument with a hardware-only control circuit |
| US11969142B2 (en) | 2017-12-28 | 2024-04-30 | Cilag Gmbh International | Method of compressing tissue within a stapling device and simultaneously displaying the location of the tissue within the jaws |
| US12062442B2 (en) | 2017-12-28 | 2024-08-13 | Cilag Gmbh International | Method for operating surgical instrument systems |
| US11410259B2 (en) * | 2017-12-28 | 2022-08-09 | Cilag Gmbh International | Adaptive control program updates for surgical devices |
| US11607811B2 (en) * | 2017-12-28 | 2023-03-21 | Fuji Corporation | Information providing device, information providing method and program |
| US11389164B2 (en) | 2017-12-28 | 2022-07-19 | Cilag Gmbh International | Method of using reinforced flexible circuits with multiple sensors to optimize performance of radio frequency devices |
| US11076921B2 (en) | 2017-12-28 | 2021-08-03 | Cilag Gmbh International | Adaptive control program updates for surgical hubs |
| US11633237B2 (en) | 2017-12-28 | 2023-04-25 | Cilag Gmbh International | Usage and technique analysis of surgeon / staff performance against a baseline to optimize device utilization and performance for both current and future procedures |
| US11464559B2 (en) | 2017-12-28 | 2022-10-11 | Cilag Gmbh International | Estimating state of ultrasonic end effector and control system therefor |
| US10918310B2 (en) | 2018-01-03 | 2021-02-16 | Biosense Webster (Israel) Ltd. | Fast anatomical mapping (FAM) using volume filling |
| US11202570B2 (en) | 2017-12-28 | 2021-12-21 | Cilag Gmbh International | Communication hub and storage device for storing parameters and status of a surgical device to be shared with cloud based analytics systems |
| WO2019133143A1 (en) | 2017-12-28 | 2019-07-04 | Ethicon Llc | Surgical hub and modular device response adjustment based on situational awareness |
| US12096916B2 (en) | 2017-12-28 | 2024-09-24 | Cilag Gmbh International | Method of sensing particulate from smoke evacuated from a patient, adjusting the pump speed based on the sensed information, and communicating the functional parameters of the system to the hub |
| US11864728B2 (en) | 2017-12-28 | 2024-01-09 | Cilag Gmbh International | Characterization of tissue irregularities through the use of mono-chromatic light refractivity |
| US11376002B2 (en) | 2017-12-28 | 2022-07-05 | Cilag Gmbh International | Surgical instrument cartridge sensor assemblies |
| US11132462B2 (en) | 2017-12-28 | 2021-09-28 | Cilag Gmbh International | Data stripping method to interrogate patient records and create anonymized record |
| US20190201112A1 (en) | 2017-12-28 | 2019-07-04 | Ethicon Llc | Computer implemented interactive surgical systems |
| US11559308B2 (en) | 2017-12-28 | 2023-01-24 | Cilag Gmbh International | Method for smart energy device infrastructure |
| US11166772B2 (en) | 2017-12-28 | 2021-11-09 | Cilag Gmbh International | Surgical hub coordination of control and communication of operating room devices |
| US11304763B2 (en) | 2017-12-28 | 2022-04-19 | Cilag Gmbh International | Image capturing of the areas outside the abdomen to improve placement and control of a surgical device in use |
| US12127729B2 (en) | 2017-12-28 | 2024-10-29 | Cilag Gmbh International | Method for smoke evacuation for surgical hub |
| US11998193B2 (en) | 2017-12-28 | 2024-06-04 | Cilag Gmbh International | Method for usage of the shroud as an aspect of sensing or controlling a powered surgical device, and a control algorithm to adjust its default operation |
| US11818052B2 (en) | 2017-12-28 | 2023-11-14 | Cilag Gmbh International | Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs |
| US11659023B2 (en) | 2017-12-28 | 2023-05-23 | Cilag Gmbh International | Method of hub communication |
| US11026751B2 (en) | 2017-12-28 | 2021-06-08 | Cilag Gmbh International | Display of alignment of staple cartridge to prior linear staple line |
| US11857152B2 (en) | 2017-12-28 | 2024-01-02 | Cilag Gmbh International | Surgical hub spatial awareness to determine devices in operating theater |
| US20190201115A1 (en) * | 2017-12-28 | 2019-07-04 | Ethicon Llc | Aggregation and reporting of surgical hub data |
| US20190201139A1 (en) | 2017-12-28 | 2019-07-04 | Ethicon Llc | Communication arrangements for robot-assisted surgical platforms |
| US11903601B2 (en) | 2017-12-28 | 2024-02-20 | Cilag Gmbh International | Surgical instrument comprising a plurality of drive systems |
| US11576677B2 (en) | 2017-12-28 | 2023-02-14 | Cilag Gmbh International | Method of hub communication, processing, display, and cloud analytics |
| US11109866B2 (en) | 2017-12-28 | 2021-09-07 | Cilag Gmbh International | Method for circular stapler control algorithm adjustment based on situational awareness |
| US11324557B2 (en) | 2017-12-28 | 2022-05-10 | Cilag Gmbh International | Surgical instrument with a sensing array |
| CN108122467A (zh) * | 2017-12-28 | 2018-06-05 | 王震坤 | 腹腔镜模拟训练机 |
| US11311306B2 (en) | 2017-12-28 | 2022-04-26 | Cilag Gmbh International | Surgical systems for detecting end effector tissue distribution irregularities |
| US11612444B2 (en) | 2017-12-28 | 2023-03-28 | Cilag Gmbh International | Adjustment of a surgical device function based on situational awareness |
| US11666331B2 (en) | 2017-12-28 | 2023-06-06 | Cilag Gmbh International | Systems for detecting proximity of surgical end effector to cancerous tissue |
| US11786245B2 (en) | 2017-12-28 | 2023-10-17 | Cilag Gmbh International | Surgical systems with prioritized data transmission capabilities |
| US11678881B2 (en) | 2017-12-28 | 2023-06-20 | Cilag Gmbh International | Spatial awareness of surgical hubs in operating rooms |
| US12376855B2 (en) | 2017-12-28 | 2025-08-05 | Cilag Gmbh International | Safety systems for smart powered surgical stapling |
| US11786251B2 (en) | 2017-12-28 | 2023-10-17 | Cilag Gmbh International | Method for adaptive control schemes for surgical network control and interaction |
| US12396806B2 (en) | 2017-12-28 | 2025-08-26 | Cilag Gmbh International | Adjustment of a surgical device function based on situational awareness |
| US11213359B2 (en) | 2017-12-28 | 2022-01-04 | Cilag Gmbh International | Controllers for robot-assisted surgical platforms |
| US11304699B2 (en) | 2017-12-28 | 2022-04-19 | Cilag Gmbh International | Method for adaptive control schemes for surgical network control and interaction |
| EP3735199A4 (en) | 2018-01-04 | 2021-10-13 | Covidien LP | SYSTEMS AND ASSEMBLIES FOR MOUNTING SURGICAL ACCESSORY ON ROBOTIC SURGICAL SYSTEMS, AND PROVIDING ACCESS THROUGH THEM |
| JP2021510327A (ja) | 2018-01-10 | 2021-04-22 | コヴィディエン リミテッド パートナーシップ | コンピュータビジョンを利用したロボット外科システムのツールの位置および状態の判定 |
| US12102403B2 (en) | 2018-02-02 | 2024-10-01 | Coviden Lp | Robotic surgical systems with user engagement monitoring |
| US11189379B2 (en) | 2018-03-06 | 2021-11-30 | Digital Surgery Limited | Methods and systems for using multiple data structures to process surgical data |
| US11337746B2 (en) | 2018-03-08 | 2022-05-24 | Cilag Gmbh International | Smart blade and power pulsing |
| US11298148B2 (en) | 2018-03-08 | 2022-04-12 | Cilag Gmbh International | Live time tissue classification using electrical parameters |
| US11259830B2 (en) | 2018-03-08 | 2022-03-01 | Cilag Gmbh International | Methods for controlling temperature in ultrasonic device |
| EP3761900A4 (en) | 2018-03-08 | 2021-12-08 | Covidien LP | SURGICAL ROBOTIC SYSTEMS |
| US11090047B2 (en) | 2018-03-28 | 2021-08-17 | Cilag Gmbh International | Surgical instrument comprising an adaptive control system |
| WO2019191144A1 (en) | 2018-03-28 | 2019-10-03 | Auris Health, Inc. | Systems and methods for registration of location sensors |
| US11259806B2 (en) | 2018-03-28 | 2022-03-01 | Cilag Gmbh International | Surgical stapling devices with features for blocking advancement of a camming assembly of an incompatible cartridge installed therein |
| US10827913B2 (en) | 2018-03-28 | 2020-11-10 | Auris Health, Inc. | Systems and methods for displaying estimated location of instrument |
| KR102085374B1 (ko) * | 2018-04-03 | 2020-03-06 | (주)다울디엔에스 | Ar 기반 저작도구를 이용한 외과 수술 교육 시스템 |
| EP3781073A4 (en) | 2018-04-20 | 2022-01-26 | Covidien LP | COMPENSATION OF OBSERVER MOTION IN ROBOTIC SURGICAL SYSTEMS WITH STEREOSCOPIC DISPLAYS |
| CN111971150A (zh) | 2018-04-20 | 2020-11-20 | 柯惠Lp公司 | 手术机器人手推车放置的系统和方法 |
| US10933526B2 (en) * | 2018-04-23 | 2021-03-02 | General Electric Company | Method and robotic system for manipulating instruments |
| JP6993927B2 (ja) * | 2018-04-24 | 2022-02-04 | 株式会社日立産機システム | シミュレーション装置およびシミュレーション方法 |
| WO2019211741A1 (en) | 2018-05-02 | 2019-11-07 | Augmedics Ltd. | Registration of a fiducial marker for an augmented reality system |
| US12165536B2 (en) | 2018-05-05 | 2024-12-10 | Mentice, Inc. | Simulation-based training and assessment systems and methods |
| WO2019222495A1 (en) | 2018-05-18 | 2019-11-21 | Auris Health, Inc. | Controllers for robotically-enabled teleoperated systems |
| CN110831486B (zh) | 2018-05-30 | 2022-04-05 | 奥瑞斯健康公司 | 用于基于定位传感器的分支预测的系统和方法 |
| CN112236083B (zh) | 2018-05-31 | 2024-08-13 | 奥瑞斯健康公司 | 用于导航检测生理噪声的管腔网络的机器人系统和方法 |
| US10898275B2 (en) | 2018-05-31 | 2021-01-26 | Auris Health, Inc. | Image-based airway analysis and mapping |
| JP7371026B2 (ja) | 2018-05-31 | 2023-10-30 | オーリス ヘルス インコーポレイテッド | 管状網の経路ベースのナビゲーション |
| JP7271579B2 (ja) | 2018-06-19 | 2023-05-11 | ホウメディカ・オステオニクス・コーポレイション | 整形外科手術における複合現実支援を用いた手術支援 |
| US11576739B2 (en) | 2018-07-03 | 2023-02-14 | Covidien Lp | Systems, methods, and computer-readable media for detecting image degradation during surgical procedures |
| US10410542B1 (en) * | 2018-07-18 | 2019-09-10 | Simulated Inanimate Models, LLC | Surgical training apparatus, methods and systems |
| KR102189334B1 (ko) | 2018-07-24 | 2020-12-09 | 주식회사 라이너스 | 의료용 학습 관리 시스템 및 방법 |
| US10565977B1 (en) * | 2018-08-20 | 2020-02-18 | Verb Surgical Inc. | Surgical tool having integrated microphones |
| US20210304638A1 (en) * | 2018-09-04 | 2021-09-30 | Orsi Academy cvba | Chicken Model for Robotic Basic Skills Training |
| JP2022500163A (ja) | 2018-09-14 | 2022-01-04 | コヴィディエン リミテッド パートナーシップ | 外科用ロボットシステムおよびその外科用器具の使用量を追跡する方法 |
| WO2020060789A1 (en) | 2018-09-17 | 2020-03-26 | Covidien Lp | Surgical robotic systems |
| US11998288B2 (en) | 2018-09-17 | 2024-06-04 | Covidien Lp | Surgical robotic systems |
| US12076100B2 (en) | 2018-09-28 | 2024-09-03 | Auris Health, Inc. | Robotic systems and methods for concomitant endoscopic and percutaneous medical procedures |
| US11109746B2 (en) | 2018-10-10 | 2021-09-07 | Titan Medical Inc. | Instrument insertion system, method, and apparatus for performing medical procedures |
| WO2020075545A1 (en) * | 2018-10-12 | 2020-04-16 | Sony Corporation | Surgical support system, data processing apparatus and method |
| US11766296B2 (en) | 2018-11-26 | 2023-09-26 | Augmedics Ltd. | Tracking system for image-guided surgery |
| KR102665091B1 (ko) * | 2018-12-10 | 2024-05-13 | 한국전자통신연구원 | 의료 정보 처리 장치 및 방법 |
| WO2020138734A1 (ko) * | 2018-12-27 | 2020-07-02 | 경북대학교 산학협력단 | 햅틱기반 이비인후과 및 신경외과 의학실습 시뮬레이터 및 방법 |
| KR102146719B1 (ko) | 2018-12-27 | 2020-08-24 | 주식회사 홀로웍스 | 가상현실 기반의 정형외과 시뮬레이터의 수술 평가 시스템 |
| US11586106B2 (en) | 2018-12-28 | 2023-02-21 | Titan Medical Inc. | Imaging apparatus having configurable stereoscopic perspective |
| US11717355B2 (en) | 2019-01-29 | 2023-08-08 | Covidien Lp | Drive mechanisms for surgical instruments such as for use in robotic surgical systems |
| US11576733B2 (en) | 2019-02-06 | 2023-02-14 | Covidien Lp | Robotic surgical assemblies including electrosurgical instruments having articulatable wrist assemblies |
| EP3696794A1 (en) * | 2019-02-15 | 2020-08-19 | Virtamed AG | Compact haptic mixed reality simulator |
| US11484372B2 (en) | 2019-02-15 | 2022-11-01 | Covidien Lp | Articulation mechanisms for surgical instruments such as for use in robotic surgical systems |
| US11272931B2 (en) | 2019-02-19 | 2022-03-15 | Cilag Gmbh International | Dual cam cartridge based feature for unlocking a surgical stapler lockout |
| US11464511B2 (en) | 2019-02-19 | 2022-10-11 | Cilag Gmbh International | Surgical staple cartridges with movable authentication key arrangements |
| CN119214798A (zh) | 2019-03-07 | 2024-12-31 | 普罗赛普特生物机器人公司 | 用于组织切除和成像的机器人臂和方法 |
| US12478444B2 (en) | 2019-03-21 | 2025-11-25 | The Board Of Trustees Of The Leland Stanford Junior University | Systems and methods for localization based on machine learning |
| EP3946125B1 (en) * | 2019-03-25 | 2025-10-15 | Brainlab SE | Determining a surgical port for a trocar or laparoscope |
| US12048487B2 (en) * | 2019-05-06 | 2024-07-30 | Biosense Webster (Israel) Ltd. | Systems and methods for improving cardiac ablation procedures |
| EP3972518A4 (en) | 2019-05-22 | 2023-10-11 | Covidien LP | SURGICAL ROBOTIC ARMS STORAGE ASSEMBLIES AND METHODS OF REPLACING SURGICAL ROBOTIC ARMS USING THE STORAGE ASSEMBLIES |
| US20220211270A1 (en) * | 2019-05-23 | 2022-07-07 | Intuitive Surgical Operations, Inc. | Systems and methods for generating workspace volumes and identifying reachable workspaces of surgical instruments |
| KR102051309B1 (ko) * | 2019-06-27 | 2019-12-03 | 주식회사 버넥트 | Augmented reality system based on intelligent cognitive technology |
| EP3989793B1 (en) | 2019-06-28 | 2025-11-19 | Auris Health, Inc. | Surgical console interface |
| US12178666B2 (en) | 2019-07-29 | 2024-12-31 | Augmedics Ltd. | Fiducial marker |
| US11980506B2 (en) | 2019-07-29 | 2024-05-14 | Augmedics Ltd. | Fiducial marker |
| KR20220058918 (ko) | 2019-08-30 | 2022-05-10 | 아우리스 헬스, 인코포레이티드 | Instrument image reliability systems and methods |
| US11207141B2 (en) | 2019-08-30 | 2021-12-28 | Auris Health, Inc. | Systems and methods for weight-based registration of location sensors |
| GB201912903D0 (en) * | 2019-09-06 | 2019-10-23 | Inovus Ltd | Laparoscopic simulator |
| US11234779B2 (en) * | 2019-09-10 | 2022-02-01 | Verb Surgical Inc. | Handheld user interface device for a surgical robot |
| WO2021046752A1 (en) | 2019-09-11 | 2021-03-18 | Covidien Lp | Systems and methods for neural-network based color restoration |
| US12223629B2 (en) | 2019-09-11 | 2025-02-11 | Covidien Lp | Systems and methods for smoke-reduction in images |
| CN110638529B (zh) * | 2019-09-20 | 2021-04-27 | 和宇健康科技股份有限公司 | Surgical remote control method and apparatus, storage medium, and terminal device |
| US12324645B2 (en) | 2019-09-26 | 2025-06-10 | Auris Health, Inc. | Systems and methods for collision avoidance using object models |
| CN120788732A (zh) * | 2019-09-30 | 2025-10-17 | 马科外科公司 | Systems and methods for guiding movement of a tool |
| BR112022007849A2 (pt) | 2019-10-29 | 2022-07-05 | Verb Surgical Inc | Sistemas de realidade virtual para simulação de fluxo de trabalho cirúrgico com modelo de paciente e sala de operação personalizável |
| US11071601B2 (en) | 2019-11-11 | 2021-07-27 | Procept Biorobotics Corporation | Surgical probes for tissue resection with robotic arms |
| US20230024362A1 (en) * | 2019-12-09 | 2023-01-26 | Covidien Lp | System for checking instrument state of a surgical robotic arm |
| WO2021126545A1 (en) | 2019-12-16 | 2021-06-24 | Covidien Lp | Surgical robotic systems including surgical instruments with articulation |
| US11382712B2 (en) | 2019-12-22 | 2022-07-12 | Augmedics Ltd. | Mirroring in image guided surgery |
| WO2021133483A1 (en) | 2019-12-23 | 2021-07-01 | Covidien Lp | System for guiding surgical procedures |
| KR102347471 (ko) * | 2019-12-26 | 2022-01-06 | (의료)길의료재단 | Simulated surgery system and method using augmented reality |
| KR20220123273 (ko) | 2019-12-31 | 2022-09-06 | 아우리스 헬스, 인코포레이티드 | Anatomical feature identification and targeting |
| CN114901192A (zh) | 2019-12-31 | 2022-08-12 | 奥瑞斯健康公司 | Alignment techniques for percutaneous access |
| EP4084722A4 (en) | 2019-12-31 | 2024-01-10 | Auris Health, Inc. | ALIGNMENT INTERFACES FOR PERCUTANEOUS ACCESS |
| EP4099915B1 (en) | 2020-02-06 | 2025-09-03 | Covidien LP | System and methods for suturing guidance |
| US20210248922A1 (en) * | 2020-02-11 | 2021-08-12 | Covidien Lp | Systems and methods for simulated product training and/or experience |
| GB2592378B (en) * | 2020-02-25 | 2024-04-03 | Cmr Surgical Ltd | Controlling movement of a surgical robot arm |
| EP4110221A1 (en) | 2020-02-26 | 2023-01-04 | Covidien LP | Robotic surgical instrument including linear encoders for measuring cable displacement |
| CN111276032A (zh) * | 2020-02-29 | 2020-06-12 | 中山大学中山眼科中心 | Virtual surgical training system |
| WO2021194903A1 (en) * | 2020-03-23 | 2021-09-30 | Intuitive Surgical Operations, Inc. | Systems and methods for optimizing configurations of a computer-assisted surgical system for reachability of target objects |
| US11737663B2 (en) | 2020-03-30 | 2023-08-29 | Auris Health, Inc. | Target anatomical feature localization |
| JP7475948B2 (ja) * | 2020-04-24 | 2024-04-30 | 東芝システムテクノロジー株式会社 | Training system, method, and program |
| CN115484858A (zh) | 2020-05-12 | 2022-12-16 | 柯惠Lp公司 | Systems and methods for image mapping and fusion during a surgical procedure |
| US12030195B2 (en) | 2020-05-27 | 2024-07-09 | Covidien Lp | Tensioning mechanisms and methods for articulating surgical instruments such as for use in robotic surgical systems |
| EP3923297A1 (en) * | 2020-06-11 | 2021-12-15 | Koninklijke Philips N.V. | Simulation mode for a medical device |
| US11389252B2 (en) | 2020-06-15 | 2022-07-19 | Augmedics Ltd. | Rotating marker for image guided surgery |
| US11096753B1 (en) * | 2020-06-26 | 2021-08-24 | Procept Biorobotics Corporation | Systems and methods for defining and modifying range of motion of probe used in patient treatment |
| US11877818B2 (en) | 2020-06-26 | 2024-01-23 | Procept Biorobotics Corporation | Integration of robotic arms with surgical probes |
| USD963851S1 (en) | 2020-07-10 | 2022-09-13 | Covidien Lp | Port apparatus |
| FR3112416B1 (fr) * | 2020-07-10 | 2024-08-09 | Univ De Lorraine | Method and system for assisting the learning of endoscopic surgery |
| KR102505016 (ko) * | 2020-08-03 | 2023-03-02 | (주)휴톰 | System and method for generating descriptive information of unit actions in surgical images |
| US12239385B2 (en) | 2020-09-09 | 2025-03-04 | Augmedics Ltd. | Universal tool adapter |
| WO2022094551A1 (en) * | 2020-10-27 | 2022-05-05 | Verily Life Sciences Llc | Detecting events during a surgery |
| KR102396104 (ko) * | 2020-11-16 | 2022-05-11 | 아주통신주식회사 | AR-converged traditional Korean medicine education apparatus |
| EP4231908A4 (en) * | 2020-11-24 | 2024-05-08 | Global Diagnostic Imaging Solutions, LLP | System and method for medical simulation |
| US20240033005A1 (en) * | 2020-12-01 | 2024-02-01 | Intuitive Surgical Operations, Inc. | Systems and methods for generating virtual reality guidance |
| CN112802594B (zh) * | 2021-01-26 | 2023-07-25 | 巴超飞 | Remote diagnosis and treatment system |
| KR102624918 (ko) * | 2021-02-02 | 2024-01-15 | 경북대학교 산학협력단 | Haptic-based otolaryngology and neurosurgery medical training simulator and method capable of providing customized training for each patient type |
| KR102580178 (ko) * | 2021-02-02 | 2023-09-19 | 경북대학교 산학협력단 | Otolaryngology and neurosurgery medical training simulator and method capable of providing customized training for each patient type |
| US12161419B2 (en) * | 2021-03-05 | 2024-12-10 | Verb Surgical Inc. | Robot-assisted setup for a surgical robotic system |
| DE102021109241B4 (de) * | 2021-04-13 | 2023-03-16 | Siemens Healthcare Gmbh | System and method for providing interactive virtual training to multiple medical staff in real time |
| US12178498B2 (en) | 2021-04-20 | 2024-12-31 | Procept Biorobotics Corporation | Surgical probe with independent energy sources |
| US12409003B2 (en) | 2021-05-14 | 2025-09-09 | Covidien Lp | Instrument cassette assemblies for robotic surgical instruments |
| US20220370137A1 (en) | 2021-05-21 | 2022-11-24 | Cilag Gmbh International | Surgical Simulation Object Rectification System |
| US11948226B2 (en) | 2021-05-28 | 2024-04-02 | Covidien Lp | Systems and methods for clinical workspace simulation |
| US12369998B2 (en) | 2021-05-28 | 2025-07-29 | Covidien Lp | Real time monitoring of a robotic drive module |
| US11896445B2 (en) | 2021-07-07 | 2024-02-13 | Augmedics Ltd. | Iliac pin and adapter |
| KR102667464 (ko) * | 2021-07-21 | 2024-05-20 | (주)휴톰 | Apparatus and method for determining trocar insertion positions on a patient's three-dimensional virtual pneumoperitoneum model |
| US12150821B2 (en) | 2021-07-29 | 2024-11-26 | Augmedics Ltd. | Rotating marker and adapter for image-guided surgery |
| US12475662B2 (en) | 2021-08-18 | 2025-11-18 | Augmedics Ltd. | Stereoscopic display and digital loupe for augmented-reality near-eye display |
| WO2023053334A1 (ja) * | 2021-09-30 | 2023-04-06 | オリンパス株式会社 | Processing system and information processing method |
| US12496119B2 (en) | 2021-12-06 | 2025-12-16 | Covidien Lp | Jaw member, end effector assembly, and method of manufacturing a jaw member of an electrosurgical instrument |
| US12390294B2 (en) | 2021-12-14 | 2025-08-19 | Covidien Lp | Robotic surgical assemblies including surgical instruments having articulatable wrist assemblies |
| US12433699B2 (en) | 2022-02-10 | 2025-10-07 | Covidien Lp | Surgical robotic systems and robotic arm carts thereof |
| EP4491142A4 (en) * | 2022-03-08 | 2025-06-25 | Sony Group Corporation | SIMULATION SYSTEM, INFORMATION PROCESSING DEVICE AND INFORMATION PROCESSING METHOD |
| DE112023001322T5 (de) | 2022-03-09 | 2025-01-09 | All India Institute Of Medical Sciences (Aiims) | Three-dimensional tracking and navigation simulator for neuroendoscopy |
| US20250124815A1 (en) * | 2022-03-16 | 2025-04-17 | Intuitive Surgical Operations, Inc. | Systems and methods for generating customized medical simulations |
| EP4511809A1 (en) | 2022-04-21 | 2025-02-26 | Augmedics Ltd. | Systems and methods for medical image visualization |
| US12479098B2 (en) | 2022-08-03 | 2025-11-25 | Covidien Lp | Surgical robotic system with access port storage |
| US12465447B2 (en) | 2022-08-25 | 2025-11-11 | Covidien Lp | Surgical robotic system with instrument detection |
| IL319523A (en) | 2022-09-13 | 2025-05-01 | Augmedics Ltd | Augmented reality glasses for image-guided medical intervention |
| US12496728B2 (en) | 2022-10-25 | 2025-12-16 | Covidien Lp | Surgical robotic system and method for restoring operational state |
| KR20240067173 (ko) | 2022-11-07 | 2024-05-16 | 그리다텍 주식회사 | Multi-platform training environment sharing system |
| CN115670352B (zh) * | 2023-01-05 | 2023-03-31 | 珠海视新医用科技有限公司 | Endoscope anti-collision alarm device |
| USD1066383S1 (en) | 2023-01-13 | 2025-03-11 | Covidien Lp | Display screen with graphical user interface |
| USD1066404S1 (en) | 2023-01-13 | 2025-03-11 | Covidien Lp | Display screen with graphical user interface |
| USD1066405S1 (en) | 2023-01-13 | 2025-03-11 | Covidien Lp | Display screen with graphical user interface |
| USD1066381S1 (en) | 2023-01-13 | 2025-03-11 | Covidien Lp | Display screen with graphical user interface |
| USD1066380S1 (en) | 2023-01-13 | 2025-03-11 | Covidien Lp | Display screen with graphical user interface |
| USD1066382S1 (en) | 2023-01-13 | 2025-03-11 | Covidien Lp | Display screen with graphical user interface |
| USD1066378S1 (en) | 2023-01-13 | 2025-03-11 | Covidien Lp | Display screen with graphical user interface |
| USD1066379S1 (en) | 2023-01-13 | 2025-03-11 | Covidien Lp | Display screen with graphical user interface |
| US20240268890A1 (en) * | 2023-02-14 | 2024-08-15 | Anastasios Papadonikolakis | Surgery simulation system and method |
| WO2024201141A1 (en) * | 2023-03-28 | 2024-10-03 | Fvrvs Limited | Systems and methods for simulating surgical procedures |
| FR3148520A1 (fr) * | 2023-05-12 | 2024-11-15 | Virtualisurg | Surgical training console with constraint |
| USD1087995S1 (en) | 2023-08-02 | 2025-08-12 | Covidien Lp | Surgeon display screen with a transitional graphical user interface having staple firing icon |
| USD1087135S1 (en) | 2023-08-02 | 2025-08-05 | Covidien Lp | Surgeon display screen with a graphical user interface having spent staple icon |
| KR102628586 (ko) * | 2023-09-20 | 2024-01-25 | 그리다텍 주식회사 | Virtual reality simulation method for medical consumable drop training |
| KR102718826 (ko) | 2023-11-16 | 2024-10-17 | (주)현성에프에이 | IoT-based smart factory system for parts manufacturing and assembly |
| US20250177069A1 (en) * | 2023-12-05 | 2025-06-05 | Metal Industries Research & Development Centre | Surgical robot arm control system and surgical robot arm control method |
| WO2025133854A1 (en) * | 2023-12-20 | 2025-06-26 | Covidien Lp | Systems and methods for cooperation between surgeon and assistant in virtual procedure training |
| KR20250120656 (ko) * | 2024-02-02 | 2025-08-11 | 국립암센터 | Single-port surgical training apparatus and surgical training method thereof |
| EP4607495A1 (en) | 2024-02-23 | 2025-08-27 | Virtamed AG | Medical training system and method for medical training |
| JP2025154561A (ja) * | 2024-03-29 | 2025-10-10 | 川崎重工業株式会社 | Support system and support method |
| WO2025212743A1 (en) * | 2024-04-02 | 2025-10-09 | Auburn University | System and method for simulating a medical examination |
| US12417713B1 (en) * | 2024-12-18 | 2025-09-16 | Mammen Thomas | Use of real-time and storable image data stream for generation of an immersive virtual universe in metaverse or a 3-D hologram or image, for teaching and training students |
| CN119863967B (zh) * | 2025-03-25 | 2025-07-11 | 长春理工大学 | Instrument positioning device and positioning method for minimally invasive surgery simulation training |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2006060406A1 (en) | 2004-11-30 | 2006-06-08 | The Regents Of The University Of California | Multimodal medical procedure training system |
| US20090253109A1 (en) * | 2006-04-21 | 2009-10-08 | Mehran Anvari | Haptic Enabled Robotic Training System and Method |
| US20100234857A1 (en) * | 1998-11-20 | 2010-09-16 | Intuitive Surgical Operations, Inc. | Medical robotic system with operatively couplable simulator unit for surgeon training |
| KR20110065388 (ko) * | 2009-12-07 | 2011-06-15 | 광주과학기술원 | Medical training simulation system and method |
| KR20120122542 (ko) * | 2011-04-29 | 2012-11-07 | 주식회사 코어메드 | Method and system for providing image-guided surgery rehearsal, and recording medium therefor |
| US20130295540A1 (en) * | 2010-05-26 | 2013-11-07 | The Research Foundation For The State University Of New York | Method and System for Minimally-Invasive Surgery Training Using Tracking Data |
Family Cites Families (112)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4321047A (en) | 1980-06-05 | 1982-03-23 | Bradley Landis | Simulator and process for teaching surgical knot tying techniques |
| US5403191A (en) | 1991-10-21 | 1995-04-04 | Tuason; Leo B. | Laparoscopic surgery simulator and method of use |
| US5769640A (en) * | 1992-12-02 | 1998-06-23 | Cybernet Systems Corporation | Method and system for simulating medical procedures including virtual reality and control method and system for use therein |
| US5766016A (en) | 1994-11-14 | 1998-06-16 | Georgia Tech Research Corporation | Surgical simulator and method for simulating surgical procedure |
| US5882206A (en) | 1995-03-29 | 1999-03-16 | Gillio; Robert G. | Virtual surgery system |
| US5620326A (en) | 1995-06-09 | 1997-04-15 | Simulab Corporation | Anatomical simulator for videoendoscopic surgical training |
| US6929481B1 (en) * | 1996-09-04 | 2005-08-16 | Immersion Medical, Inc. | Interface device and method for interfacing instruments to medical procedure simulation systems |
| US6024576A (en) * | 1996-09-06 | 2000-02-15 | Immersion Corporation | Hemispherical, high bandwidth mechanical interface for computer systems |
| US6132368A (en) * | 1996-12-12 | 2000-10-17 | Intuitive Surgical, Inc. | Multi-component telepresence system and method |
| US5945056A (en) | 1997-05-28 | 1999-08-31 | Simutech Limited | Method of making a surgical simulator |
| WO1999017265A1 (en) * | 1997-09-26 | 1999-04-08 | Boston Dynamics, Inc. | Method and apparatus for surgical training and simulating surgery |
| WO1999042978A1 (en) * | 1998-02-19 | 1999-08-26 | Boston Dynamics, Inc. | Method and apparatus for surgical training and simulating surgery |
| US6659939B2 (en) | 1998-11-20 | 2003-12-09 | Intuitive Surgical, Inc. | Cooperative minimally invasive telesurgical system |
| US7594912B2 (en) * | 2004-09-30 | 2009-09-29 | Intuitive Surgical, Inc. | Offset remote center manipulator for robotic surgery |
| US6544041B1 (en) | 1999-10-06 | 2003-04-08 | Fonar Corporation | Simulator for surgical procedures |
| JP2001150368A (ja) | 1999-11-24 | 2001-06-05 | Olympus Optical Co Ltd | Manipulator control device |
| US6377011B1 (en) | 2000-01-26 | 2002-04-23 | Massachusetts Institute Of Technology | Force feedback user interface for minimally invasive surgical simulator and teleoperator and other similar apparatus |
| US7857626B2 (en) | 2000-10-23 | 2010-12-28 | Toly Christopher C | Medical physiological simulator including a conductive elastomer layer |
| SE518252C2 (sv) * | 2001-01-24 | 2002-09-17 | Goeteborg University Surgical | Method for simulating a surgical step, method for simulating a surgical operation, and system for simulating a surgical step |
| US7607440B2 (en) * | 2001-06-07 | 2009-10-27 | Intuitive Surgical, Inc. | Methods and apparatus for surgical planning |
| US7831292B2 (en) * | 2002-03-06 | 2010-11-09 | Mako Surgical Corp. | Guidance system and method for surgical procedures with improved feedback |
| US7798815B2 (en) | 2002-04-03 | 2010-09-21 | University Of The West Indies | Computer-controlled tissue-based simulator for training in cardiac surgical techniques |
| DE10217630A1 (de) | 2002-04-19 | 2003-11-13 | Robert Riener | Method and device for learning and training dental treatment methods |
| SE0202864D0 (sv) * | 2002-09-30 | 2002-09-30 | Goeteborgs University Surgical | Device and method for generating a virtual anatomic environment |
| US20050142525A1 (en) * | 2003-03-10 | 2005-06-30 | Stephane Cotin | Surgical training system for laparoscopic procedures |
| US20070275359A1 (en) | 2004-06-22 | 2007-11-29 | Rotnes Jan S | Kit, operating element and haptic device for use in surgical simulation systems |
| GB0420977D0 (en) | 2004-09-21 | 2004-10-20 | Keymed Medicals & Ind Equip | An instrument for use in a medical simulator |
| US8480404B2 (en) * | 2004-11-30 | 2013-07-09 | Eric A. Savitsky | Multimodal ultrasound training system |
| US20070292829A1 (en) | 2004-12-02 | 2007-12-20 | King Lynn R | Intravenous (iv) training system |
| US8073528B2 (en) * | 2007-09-30 | 2011-12-06 | Intuitive Surgical Operations, Inc. | Tool tracking systems, methods and computer products for image guided surgery |
| US9789608B2 (en) | 2006-06-29 | 2017-10-17 | Intuitive Surgical Operations, Inc. | Synthetic representation of a surgical robot |
| JP4152402B2 (ja) * | 2005-06-29 | 2008-09-17 | 株式会社日立メディコ | Surgery support apparatus |
| US8382485B2 (en) | 2005-09-29 | 2013-02-26 | The General Hospital Corporation | Methods and apparatus for providing realistic medical training |
| US8190238B2 (en) | 2005-12-09 | 2012-05-29 | Hansen Medical, Inc. | Robotic catheter system and methods |
| WO2007082313A2 (en) | 2006-01-13 | 2007-07-19 | East Tennessee State University Research Foundation | Surgical simulator system |
| JP2009133878A (ja) | 2006-03-03 | 2009-06-18 | Univ Waseda | Surgical operation training device |
| US20070238981A1 (en) * | 2006-03-13 | 2007-10-11 | Bracco Imaging Spa | Methods and apparatuses for recording and reviewing surgical navigation processes |
| ES2298051B2 (es) * | 2006-07-28 | 2009-03-16 | Universidad De Malaga | Robotic system for assisting minimally invasive surgery, capable of positioning a surgical instrument in response to a surgeon's commands without attachment to the operating table or prior calibration of the insertion point |
| US20080085499A1 (en) * | 2006-10-05 | 2008-04-10 | Christopher Horvath | Surgical console operable to simulate surgical procedures |
| US8460002B2 (en) | 2006-10-18 | 2013-06-11 | Shyh-Jen Wang | Laparoscopic trainer and method of training |
| WO2008099028A1 (es) | 2007-02-14 | 2008-08-21 | Gmv, S.A. | Simulation system for training in arthroscopic surgery |
| US20090017430A1 (en) * | 2007-05-15 | 2009-01-15 | Stryker Trauma Gmbh | Virtual surgical training tool |
| US7706000B2 (en) * | 2007-07-18 | 2010-04-27 | Immersion Medical, Inc. | Orientation sensing of a rod |
| US20090132925A1 (en) | 2007-11-15 | 2009-05-21 | Nli Llc | Adventure learning immersion platform |
| EP2068294A1 (en) | 2007-12-03 | 2009-06-10 | Endosim Limited | Laparoscopic training apparatus |
| US8786675B2 (en) | 2008-01-23 | 2014-07-22 | Michael F. Deering | Systems using eye mounted displays |
| US8956165B2 (en) * | 2008-01-25 | 2015-02-17 | University Of Florida Research Foundation, Inc. | Devices and methods for implementing endoscopic surgical procedures and instruments within a virtual environment |
| JP2009236963A (ja) | 2008-03-25 | 2009-10-15 | Panasonic Electric Works Co Ltd | Training device for endoscopic surgery and skill evaluation method for endoscopic surgery |
| US7843158B2 (en) * | 2008-03-31 | 2010-11-30 | Intuitive Surgical Operations, Inc. | Medical robotic system adapted to inhibit motions resulting in excessive end effector forces |
| US20090263775A1 (en) * | 2008-04-22 | 2009-10-22 | Immersion Medical | Systems and Methods for Surgical Simulation and Training |
| WO2010008846A2 (en) | 2008-06-23 | 2010-01-21 | John Richard Dein | Intra-operative system for identifying and tracking surgical sharp objects, instruments, and sponges |
| JP4565220B2 (ja) | 2008-07-30 | 2010-10-20 | 株式会社モリタ製作所 | Medical training device |
| EP2320990B2 (en) * | 2008-08-29 | 2023-05-31 | Corindus, Inc. | Catheter control system and graphical user interface |
| JP2010082189A (ja) | 2008-09-30 | 2010-04-15 | Olympus Corp | Method of calibrating a manipulator in a surgical manipulator system |
| US20100099066A1 (en) | 2008-10-21 | 2010-04-22 | Warsaw Orthopedics, Inc. | Surgical Training System and Model With Simulated Neural Responses and Elements |
| US20100167248A1 (en) * | 2008-12-31 | 2010-07-01 | Haptica Ltd. | Tracking and training system for medical procedures |
| US20100167249A1 (en) * | 2008-12-31 | 2010-07-01 | Haptica Ltd. | Surgical training simulator having augmented reality |
| US20100167250A1 (en) * | 2008-12-31 | 2010-07-01 | Haptica Ltd. | Surgical training simulator having multiple tracking systems |
| US20100167253A1 (en) | 2008-12-31 | 2010-07-01 | Haptica Ltd. | Surgical training simulator |
| US8480405B2 (en) | 2009-02-24 | 2013-07-09 | Innovative Surgical Designs, Inc. | Surgical simulation device and assembly |
| KR20110136847A (ko) * | 2009-03-12 | 2011-12-21 | 헬스 리서치 인코포레이티드 | Method and system for minimally invasive surgery training |
| KR101914303 (ko) | 2009-03-20 | 2018-11-01 | 더 존스 홉킨스 유니버시티 | Method and system for quantifying technical skill |
| US20110117530A1 (en) * | 2009-05-07 | 2011-05-19 | Technion Research & Development Foundation Ltd. | Method and system of simulating physical object incisions, deformations and interactions therewith |
| US8662900B2 (en) * | 2009-06-04 | 2014-03-04 | Zimmer Dental Inc. | Dental implant surgical training simulation system |
| US20110046935A1 (en) * | 2009-06-09 | 2011-02-24 | Kiminobu Sugaya | Virtual surgical table |
| WO2010148078A2 (en) * | 2009-06-16 | 2010-12-23 | Simquest Llc | Hemorrhage control simulator |
| DE102009048994A1 (de) * | 2009-10-09 | 2011-04-14 | Karl Storz Gmbh & Co. Kg | Simulation system for training endoscopic operations |
| DE102009060522A1 (de) * | 2009-12-23 | 2011-06-30 | Karl Storz GmbH & Co. KG, 78532 | Simulation system for training endoscopic operations |
| US9341704B2 (en) * | 2010-04-13 | 2016-05-17 | Frederic Picard | Methods and systems for object tracking |
| US8469716B2 (en) | 2010-04-19 | 2013-06-25 | Covidien Lp | Laparoscopic surgery simulator |
| JP2012005557A (ja) | 2010-06-23 | 2012-01-12 | Terumo Corp | Medical robot system |
| US20120053406A1 (en) | 2010-09-01 | 2012-03-01 | Conlon Sean P | Minimally invasive surgery |
| EP4002330B1 (en) | 2010-10-01 | 2024-09-04 | Applied Medical Resources Corporation | Portable laparoscopic trainer |
| US20120251987A1 (en) | 2010-10-28 | 2012-10-04 | Ta-Ko Huang | System and method for virtual reality simulation of local dental anesthesiological techniques and skills training |
| CN103299355B (zh) * | 2010-11-04 | 2016-09-14 | 约翰霍普金斯大学 | System and method for the assessment or improvement of minimally invasive surgical skills |
| US20120115118A1 (en) | 2010-11-08 | 2012-05-10 | Marshall M Blair | Suture training device |
| US9283675B2 (en) * | 2010-11-11 | 2016-03-15 | The Johns Hopkins University | Human-machine collaborative robotic systems |
| US20120135387A1 (en) | 2010-11-29 | 2012-05-31 | Stage Front Presentation Systems | Dental educational apparatus and method |
| JP5550050B2 (ja) | 2010-12-14 | 2014-07-16 | 株式会社ティー・エム・シー | Partial model of the human body |
| WO2012082987A1 (en) | 2010-12-15 | 2012-06-21 | Allergan, Inc. | Anatomical model |
| US8932063B2 (en) * | 2011-04-15 | 2015-01-13 | Ams Research Corporation | BPH laser ablation simulation |
| US10354555B2 (en) * | 2011-05-02 | 2019-07-16 | Simbionix Ltd. | System and method for performing a hybrid simulation of a medical procedure |
| CN102254476B (zh) * | 2011-07-18 | 2014-12-10 | 广州赛宝联睿信息科技有限公司 | Endoscopic minimally invasive surgery simulation training method and system |
| KR101963610B1 (ko) | 2011-10-21 | 2019-03-29 | 어플라이드 메디컬 리소시스 코포레이션 | Simulated tissue structures for surgical training |
| CA2859967A1 (en) | 2011-12-20 | 2013-06-27 | Applied Medical Resources Corporation | Advanced surgical simulation |
| US9424761B2 (en) * | 2012-01-23 | 2016-08-23 | Virtamed Ag | Medical simulation system and method with configurable anatomy model manufacturing |
| US8992230B2 (en) * | 2012-01-23 | 2015-03-31 | Virtamed Ag | Medical training systems and methods |
| US9472123B2 (en) | 2012-01-27 | 2016-10-18 | Gaumard Scientific Company, Inc. | Human tissue models, materials, and methods |
| US9123261B2 (en) | 2012-01-28 | 2015-09-01 | Gaumard Scientific Company, Inc. | Surgical simulation models, materials, and methods |
| US9092996B2 (en) * | 2012-03-01 | 2015-07-28 | Simquest Llc | Microsurgery simulator |
| EP4140414A1 (en) * | 2012-03-07 | 2023-03-01 | Ziteo, Inc. | Methods and systems for tracking and guiding sensors and instruments |
| US20140051049A1 (en) | 2012-08-17 | 2014-02-20 | Intuitive Surgical Operations, Inc. | Anatomical model and method for surgical training |
| US9607528B2 (en) * | 2012-08-24 | 2017-03-28 | Simquest International, Llc | Combined soft tissue and bone surgical simulator |
| WO2014052373A1 (en) | 2012-09-26 | 2014-04-03 | Applied Medical Resources Corporation | Surgical training model for laparoscopic procedures |
| WO2014052612A1 (en) | 2012-09-27 | 2014-04-03 | Applied Medical Resources Corporation | Surgical training model for laparoscopic procedures |
| EP4276801A3 (en) | 2012-09-27 | 2024-01-03 | Applied Medical Resources Corporation | Surgical training model for laparoscopic procedures |
| EP3467805B1 (en) | 2012-09-28 | 2020-07-08 | Applied Medical Resources Corporation | Surgical training model for transluminal laparoscopic procedures |
| US20140106328A1 (en) | 2012-10-17 | 2014-04-17 | The Cleveland Clinic Foundation | Surgical training apparatus |
| CN103077633A (zh) * | 2013-01-11 | 2013-05-01 | 深圳超多维光电子有限公司 | Stereoscopic virtual training system and method |
| CA2897439A1 (en) * | 2013-01-23 | 2014-07-31 | Ams Research Corporation | Surgical training system |
| WO2014152668A1 (en) * | 2013-03-15 | 2014-09-25 | Ratcliffe Mark B | System and method for performing virtual surgery |
| US9087458B2 (en) | 2013-03-15 | 2015-07-21 | Smartummy Llc | Dynamically-changeable abdominal simulator system |
| US9117377B2 (en) | 2013-03-15 | 2015-08-25 | SmarTummy, LLC | Dynamically-changeable abdominal simulator system |
| WO2014179556A1 (en) | 2013-05-01 | 2014-11-06 | Northwestern University | Surgical simulators and methods associated with the same |
| US9595208B2 (en) * | 2013-07-31 | 2017-03-14 | The General Hospital Corporation | Trauma training simulator with event-based gesture detection and instrument-motion tracking |
| US9283048B2 (en) * | 2013-10-04 | 2016-03-15 | KB Medical SA | Apparatus and systems for precise guidance of surgical tools |
| KR102366023B1 (ko) | 2013-12-20 | 2022-02-23 | 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 | Simulator system for medical procedure training |
| US20150262511A1 (en) * | 2014-03-17 | 2015-09-17 | Henry Lin | Systems and methods for medical device simulator scoring |
| US20160314711A1 (en) * | 2015-04-27 | 2016-10-27 | KindHeart, Inc. | Telerobotic surgery system for remote surgeon training using robotic surgery station and remote surgeon station with display of actual animal tissue images and associated methods |
| AU2016263585B2 (en) * | 2015-05-19 | 2021-04-29 | Mako Surgical Corp. | System and method for demonstrating planned autonomous manipulation of an anatomy |
| US10528840B2 (en) * | 2015-06-24 | 2020-01-07 | Stryker Corporation | Method and system for surgical instrumentation setup and user preferences |
| US10648790B2 (en) * | 2016-03-02 | 2020-05-12 | Truinject Corp. | System for determining a three-dimensional position of a testing tool |
- 2014
  - 2014-12-19 KR KR1020167019144A patent/KR102366023B1/ko active Active
  - 2014-12-19 JP JP2016541234A patent/JP6659547B2/ja active Active
  - 2014-12-19 WO PCT/US2014/071521 patent/WO2015095715A1/en not_active Ceased
  - 2014-12-19 CN CN201480076076.0A patent/CN106030683B/zh active Active
  - 2014-12-19 KR KR1020227005342A patent/KR102405656B1/ko active Active
  - 2014-12-19 US US15/106,254 patent/US10510267B2/en active Active
  - 2014-12-19 CN CN202011072053.6A patent/CN112201131B/zh active Active
  - 2014-12-19 EP EP14871282.1A patent/EP3084747B1/en active Active
  - 2014-12-19 EP EP22205491.8A patent/EP4184483B1/en active Active
- 2019
  - 2019-09-26 US US16/584,564 patent/US11468791B2/en active Active
- 2020
  - 2020-02-06 JP JP2020018974A patent/JP6916322B2/ja active Active
- 2021
  - 2021-07-15 JP JP2021117360A patent/JP7195385B2/ja active Active
- 2022
  - 2022-09-02 US US17/902,678 patent/US12456392B2/en active Active
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20100234857A1 (en) * | 1998-11-20 | 2010-09-16 | Intuitive Surgical Operations, Inc. | Medical robotic system with operatively couplable simulator unit for surgeon training |
| WO2006060406A1 (en) | 2004-11-30 | 2006-06-08 | The Regents Of The University Of California | Multimodal medical procedure training system |
| US20090253109A1 (en) * | 2006-04-21 | 2009-10-08 | Mehran Anvari | Haptic Enabled Robotic Training System and Method |
| KR20110065388 (ko) * | 2009-12-07 | 2011-06-15 | 광주과학기술원 | Medical training simulation system and method |
| US20130295540A1 (en) * | 2010-05-26 | 2013-11-07 | The Research Foundation For The State University Of New York | Method and System for Minimally-Invasive Surgery Training Using Tracking Data |
| KR20120122542 (ko) * | 2011-04-29 | 2012-11-07 | 주식회사 코어메드 | Method and system for providing image-guided surgery rehearsal, and recording medium therefor |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP3084747A4 |
Cited By (58)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9805625B2 (en) | 2010-10-29 | 2017-10-31 | KindHeart, Inc. | Surgical simulation assembly |
| US10013896B2 (en) | 2010-10-29 | 2018-07-03 | The University Of North Carolina At Chapel Hill | Modular staged reality simulator |
| US11727827B2 (en) | 2012-08-17 | 2023-08-15 | Intuitive Surgical Operations, Inc. | Anatomical model and method for surgical training |
| US12456392B2 (en) | 2013-12-20 | 2025-10-28 | Intuitive Surgical Operations, Inc. | Simulator system for medical procedure training |
| US12383358B2 (en) | 2015-06-09 | 2025-08-12 | Intuitive Surgical Operations, Inc. | Configuring surgical system with surgical procedures atlas |
| US11737841B2 (en) | 2015-06-09 | 2023-08-29 | Intuitive Surgical Operations, Inc. | Configuring surgical system with surgical procedures atlas |
| US11058501B2 (en) | 2015-06-09 | 2021-07-13 | Intuitive Surgical Operations, Inc. | Configuring surgical system with surgical procedures atlas |
| WO2017030848A1 (en) * | 2015-08-17 | 2017-02-23 | Ethicon Endo-Surgery, Llc | Gathering and analyzing data for robotic surgical systems |
| US10136949B2 (en) | 2015-08-17 | 2018-11-27 | Ethicon Llc | Gathering and analyzing data for robotic surgical systems |
| EP3345478A4 (en) * | 2015-09-02 | 2019-04-10 | Universidad Miguel Hernandez De Elche | CLINICAL CADAVER SIMULATOR |
| US10198969B2 (en) | 2015-09-16 | 2019-02-05 | KindHeart, Inc. | Surgical simulation system and associated methods |
| US11751957B2 (en) * | 2015-11-12 | 2023-09-12 | Intuitive Surgical Operations, Inc. | Surgical system with training or assist functions |
| CN113456241A (zh) * | 2015-11-12 | 2021-10-01 | 直观外科手术操作公司 | Surgical system with training or assist functions |
| WO2017083768A1 (en) | 2015-11-12 | 2017-05-18 | Jarc Anthony Michael | Surgical system with training or assist functions |
| EP3373834A4 (en) * | 2015-11-12 | 2019-07-31 | Intuitive Surgical Operations Inc. | SURGICAL SYSTEM WITH TRAINING OR ASSISTANCE FUNCTION |
| CN108472084A (zh) * | 2015-11-12 | 2018-08-31 | 直观外科手术操作公司 | Surgical system with training or assist functions |
| JP2021191519A (ja) | 2015-11-12 | 2021-12-16 | インテュイティブ サージカル オペレーションズ, インコーポレイテッド | Surgical system with training or assist functions |
| CN113456241B (zh) * | 2015-11-12 | 2025-04-25 | 直观外科手术操作公司 | Surgical system with training or assist functions |
| JP7662716 (ja) | 2015-11-12 | 2025-04-15 | インテュイティブ サージカル オペレーションズ, インコーポレイテッド | Surgical system with training or assist functions |
| JP7608305 (ja) | 2015-11-12 | 2025-01-06 | インテュイティブ サージカル オペレーションズ, インコーポレイテッド | Surgical system with training or assist functions |
| US10912619B2 (en) | 2015-11-12 | 2021-02-09 | Intuitive Surgical Operations, Inc. | Surgical system with training or assist functions |
| US12114949B2 (en) | 2015-11-12 | 2024-10-15 | Intuitive Surgical Operations, Inc. | Surgical system with training or assist functions |
| JP2018538037A (ja) | 2015-11-12 | 2018-12-27 | インテュイティブ サージカル オペレーションズ, インコーポレイテッド | Surgical system with training or assist functions |
| JP2023126480A (ja) | 2015-11-12 | 2023-09-07 | インテュイティブ サージカル オペレーションズ, インコーポレイテッド | Surgical system with training or assist functions |
| US20210186634A1 (en) * | 2015-11-12 | 2021-06-24 | Intuitive Surgical Operations, Inc. | Surgical system with training or assist functions |
| JP2017104964A (ja) * | 2015-12-11 | 2017-06-15 | 川崎重工業株式会社 | Master arm input device |
| ITUA20161926A1 (it) * | 2016-03-23 | 2017-09-23 | Medvirt Sagl | Method for the simulation of an endoscopy. |
| WO2017176857A1 (en) * | 2016-04-08 | 2017-10-12 | KindHeart, Inc. | Thoracic surgery simulator for training surgeons |
| JP7055988B2 (ja) | Method and system for medical simulation in an operating room in a virtual reality or augmented reality environment |
| JP2019537459A (ja) | 2016-09-29 | 2019-12-26 | シンバイオニクス リミテッド | Method and system for medical simulation in an operating room in a virtual reality or augmented reality environment |
| US12082897B2 (en) | 2017-04-20 | 2024-09-10 | Intuitive Surgical Operations, Inc. | Systems and methods for constraining a field of view in a virtual reality surgical system |
| US11589937B2 (en) | 2017-04-20 | 2023-02-28 | Intuitive Surgical Operations, Inc. | Systems and methods for constraining a virtual reality surgical system |
| KR102441640 (ko) * | 2017-06-29 | 2022-09-13 | 버브 서지컬 인크. | Virtual reality laparoscopic tools |
| US10610303B2 (en) | 2017-06-29 | 2020-04-07 | Verb Surgical Inc. | Virtual reality laparoscopic tools |
| WO2019005983A1 (en) * | 2017-06-29 | 2019-01-03 | Verb Surgical Inc. | LAPAROSCOPIC TOOLS WITH VIRTUAL REALITY |
| US11580882B2 (en) | 2017-06-29 | 2023-02-14 | Verb Surgical Inc. | Virtual reality training, simulation, and collaboration in a robotic surgical system |
| US11284955B2 (en) | 2017-06-29 | 2022-03-29 | Verb Surgical Inc. | Emulation of robotic arms and control thereof in a virtual reality environment |
| US20190133689A1 (en) * | 2017-06-29 | 2019-05-09 | Verb Surgical Inc. | Virtual reality laparoscopic tools |
| US12419712B2 (en) | 2017-06-29 | 2025-09-23 | Verb Surgical Inc. | Emulation of robotic arms and control thereof in a virtual reality environment |
| US11270601B2 (en) | 2017-06-29 | 2022-03-08 | Verb Surgical Inc. | Virtual reality system for simulating a robotic surgical environment |
| KR20200012926 (ko) * | 2017-06-29 | 2020-02-05 | 버브 서지컬 인크. | Virtual reality laparoscopic tools |
| AU2018292597B2 (en) * | 2017-06-29 | 2021-07-08 | Verb Surgical Inc. | Virtual reality laparoscopic tools |
| US11013559B2 (en) | 2017-06-29 | 2021-05-25 | Verb Surgical Inc. | Virtual reality laparoscopic tools |
| US11944401B2 (en) | 2017-06-29 | 2024-04-02 | Verb Surgical Inc. | Emulation of robotic arms and control thereof in a virtual reality environment |
| US11011077B2 (en) | 2017-06-29 | 2021-05-18 | Verb Surgical Inc. | Virtual reality training, simulation, and collaboration in a robotic surgical system |
| US12201484B2 (en) | 2017-10-23 | 2025-01-21 | Intuitive Surgical Operations, Inc. | Systems and methods for presenting augmented reality in a display of a teleoperational system |
| US11957415B2 (en) | 2018-02-20 | 2024-04-16 | Hutom Co., Ltd. | Method and device for optimizing surgery |
| EP3744283A4 (en) * | 2018-02-20 | 2022-02-23 | Hutom Co., Ltd. | Surgery optimization method and device |
| US12053254B2 (en) | 2018-07-26 | 2024-08-06 | Sony Corporation | Information processing apparatus and information processing method |
| EP3723069A1 (en) * | 2019-04-08 | 2020-10-14 | Covidien LP | Systems and methods for simulating surgical procedures |
| US12349978B2 (en) | 2019-08-16 | 2025-07-08 | Intuitive Surgical Operations, Inc. | Auto-configurable simulation system and method |
| WO2021034694A1 (en) * | 2019-08-16 | 2021-02-25 | Intuitive Surgical Operations, Inc. | Auto-configurable simulation system and method |
| CN110815215A (zh) * | 2019-10-24 | 2020-02-21 | 上海航天控制技术研究所 | Multimode-fusion ground test system and method for approach, docking, and capture of a rotating target |
| US12295662B2 (en) | 2020-03-20 | 2025-05-13 | The Johns Hopkins University | Augmented reality based surgical navigation system |
| EP4125671A4 (en) * | 2020-04-03 | 2024-04-24 | Verb Surgical Inc. | MOBILE VIRTUAL REALITY SYSTEM FOR SURGICAL ROBOT SYSTEMS |
| WO2022064059A1 (en) * | 2020-09-28 | 2022-03-31 | Institut Hospitalo-Universitaire De Strasbourg | Device for simulating the movement of an endoscope in an environment |
| WO2023038424A1 (ko) | 2021-09-07 | 2023-03-16 | 주식회사 로엔서지컬 | Kidney surgery training system |
| WO2023144845A1 (en) * | 2022-01-27 | 2023-08-03 | B2Or Srl | A system for performing practical surgery exercises with particular reference to the cervico-facial area |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2021165860A (ja) | 2021-10-14 |
| US20160314710A1 (en) | 2016-10-27 |
| US11468791B2 (en) | 2022-10-11 |
| KR20220025286A (ko) | 2022-03-03 |
| CN106030683A (zh) | 2016-10-12 |
| US10510267B2 (en) | 2019-12-17 |
| EP4184483B1 (en) | 2024-09-11 |
| US20220415210A1 (en) | 2022-12-29 |
| JP6916322B2 (ja) | 2021-08-11 |
| CN112201131A (zh) | 2021-01-08 |
| KR102366023B1 (ko) | 2022-02-23 |
| EP4184483A1 (en) | 2023-05-24 |
| CN112201131B (zh) | 2022-11-18 |
| KR102405656B1 (ko) | 2022-06-07 |
| JP2017510826A (ja) | 2017-04-13 |
| JP7195385B2 (ja) | 2022-12-23 |
| EP3084747A4 (en) | 2017-07-05 |
| CN106030683B (zh) | 2020-10-30 |
| EP3084747A1 (en) | 2016-10-26 |
| US20200020249A1 (en) | 2020-01-16 |
| US12456392B2 (en) | 2025-10-28 |
| KR20160102464A (ko) | 2016-08-30 |
| EP3084747B1 (en) | 2022-12-14 |
| JP2020106844A (ja) | 2020-07-09 |
| JP6659547B2 (ja) | 2020-03-04 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12456392B2 (en) | | Simulator system for medical procedure training |
| US12419712B2 (en) | | Emulation of robotic arms and control thereof in a virtual reality environment |
| US11580882B2 (en) | | Virtual reality training, simulation, and collaboration in a robotic surgical system |
| EP3646309B1 (en) | | Virtual reality laparoscopic tools |
| US20220101745A1 (en) | | Virtual reality system for simulating a robotic surgical environment |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14871282; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2016541234; Country of ref document: JP; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 15106254; Country of ref document: US |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | REEP | Request for entry into the european phase | Ref document number: 2014871282; Country of ref document: EP |
| | WWE | Wipo information: entry into national phase | Ref document number: 2014871282; Country of ref document: EP |
| | ENP | Entry into the national phase | Ref document number: 20167019144; Country of ref document: KR; Kind code of ref document: A |