US20210233429A1 - Mixed-reality endoscope and surgical tools with haptic feedback for integrated virtual-reality visual and haptic surgical simulation - Google Patents

Info

Publication number
US20210233429A1
Authority
US
United States
Prior art keywords
endoscope, model, simulation system, device representing, physical
Legal status (assumed; not a legal conclusion)
Pending
Application number
US17/048,940
Inventor
Samuel Barber
Saurabh Jain
Young-Jun Son
Eugene Chang
Current Assignee (the listed assignee may be inaccurate)
University of Arizona
Original Assignee
University of Arizona
Application filed by University of Arizona
Priority to US17/048,940
Publication of US20210233429A1
Assigned to ARIZONA BOARD OF REGENTS ON BEHALF OF THE UNIVERSITY OF ARIZONA. Assignment of assignors interest (see document for details). Assignors: JAIN, SAURABH; BARBER, SAMUEL; CHANG, EUGENE; SON, YOUNG-JUN

Classifications

    • G09B 23/285: Models for medicine, for injections, endoscopy, bronchoscopy, sigmoidoscopy, insertion of contraceptive devices or enemas
    • G09B 23/30: Anatomical models
    • A61B 17/24: Surgical instruments, devices or methods for use in the oral cavity, larynx, bronchial passages or nose; tongue scrapers
    • A61B 17/29: Forceps for use in minimally invasive surgery
    • A61B 17/320016: Endoscopic cutting instruments, e.g. arthroscopes, resectoscopes
    • A61B 17/3201: Scissors
    • A61B 18/14: Probes or electrodes for heating tissue by passing a current through it, e.g. high-frequency current
    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37: Surgical systems with images on a monitor during operation
    • B33Y 80/00: Products made by additive manufacturing
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/012: Head tracking input arrangements
    • G06F 3/016: Input arrangements with force or tactile feedback as computer-generated output to the user
    • G06F 3/0346: Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF pointers using gyroscopes, accelerometers or tilt-sensors
    • G06T 7/11: Region-based segmentation
    • G16H 20/40: ICT specially adapted for therapies or health-improving plans relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • G16H 40/63: ICT specially adapted for the local operation of medical equipment or devices
    • G16H 50/50: ICT specially adapted for medical simulation or modelling of medical disorders
    • A61B 2017/00119: Electrical control of surgical instruments with audible or visual output, alarm; indicating an abnormal situation
    • A61B 2017/00216: Electrical control of surgical instruments with eye tracking or head position tracking control
    • A61B 2017/00707: Dummies, phantoms; devices simulating patient or parts of patient
    • A61B 2017/00716: Dummies, phantoms simulating physical properties
    • A61B 2018/00297: Means for providing haptic feedback
    • A61B 2034/101: Computer-aided simulation of surgical operations
    • A61B 2034/105: Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/107: Visualisation of planned trajectories or target regions
    • A61B 2090/365: Correlation of different images or relation of image positions in respect to the body; augmented reality, i.e. correlating a live optical image with another image
    • A61B 2090/372: Details of monitor hardware
    • A61B 2090/502: Headgear, e.g. helmet, spectacles
    • G06T 2207/10068: Endoscopic image
    • G06T 2207/10072: Tomographic images
    • G06T 2207/30004: Biomedical image processing

Definitions

  • the present document relates to co-filed applications entitled System for Integrated Virtual-Reality Visual and Haptic Surgical Simulator and System for Generating 3D Models for 3D Printing, and for Generating Video for an Integrated Virtual-Reality Visual and Haptic Surgical Simulator.
  • the present document describes a training, practice, and enhanced operating environment for surgeons training for, performing, or teaching surgery.
  • a training, practice, and performing tele- or virtual surgical environment is described for Functional Endoscopic Sinus Surgery (FESS).
  • This document also describes segmentation of critical structures, including the orbit, brain, cranial nerves, and vessels, highlighted in augmented reality; the training and practice environment includes visual, auditory, and haptic feedback.
  • Functional endoscopic sinus surgery uses surgical endoscopes that provide visualization, magnification, and lighting of structures in the sinuses and nose, permitting minimally invasive surgery through the nose.
  • image-guided surgery provides the surgeon with intraoperative landmarks to avoid critical structures in the sinonasal cavity, with the goal of reducing complications involving the orbit, brain, cerebrospinal fluid, or major vessels. Although these complications are rare, they can be catastrophic if they occur.
  • FESS is commonly used in the surgical treatment of chronic sinusitis, the removal of sinonasal tumors, or in access to other craniofacial structures such as the orbital or cranial cavities.
  • FESS requires rigorous preoperative planning and careful intraoperative dissection of intricate anatomic structures. Due to each individual's unique anatomy, image-guided surgery is commonly used in complex cases, in which real-time 3-dimensional (3D) tracking systems determine positions of instruments relative to known skull base anatomy shown on visual displays. Although image-guided surgery has been shown to be helpful, several studies have shown that complication rates have not significantly decreased.
  • the endoscopes used in FESS are typically rigid endoscopes, providing image pickup of the surgical field from their distal end.
  • Tools used in FESS are typically rigid, having a handle, a long tubular or rod-shaped shaft, and operative devices at their distal end. These tools are inserted alongside, over, or under the endoscope; once inserted into the surgical field they are manipulated under visual observation from the endoscope until their distal end and operative devices are positioned as needed for the operation being performed. When inserting these tools, it is necessary to avoid undue pressure on, or damage to, structures within the nasal cavity that are not part of the surgical field. Safe manipulation of these tools and the endoscope through the obstacle course of turbinates and other structures within the nasal cavity and into the surgical field, and use of the tools to perform desired functions, requires practice.
  • Our surgical simulation system is a mixed-reality surgical simulator with an immersive user experience that may, in some embodiments, incorporate patient-specific anatomy and may be used for preoperative planning and practice.
  • the system includes a physical head model; a real or dummy endoscope that can be navigated; a tracking system configured to track location and angle of the endoscope with six degrees of freedom in virtual space; and trackable instruments, either real surgical instruments or dummy instruments modeled after real surgical instruments. In some embodiments, new surgical instruments or models thereof may be used.
  • the tracking system also tracks virtual-reality goggles.
  • a tracking, modeling, and display machine is configured to track a tip of the endoscope within the physical head model and identify corresponding locations in a CAD model of the physical head and to generate a video stream corresponding to a view of the CAD model from the corresponding location in the CAD model.
  • This model allows for: 1) surgical simulation on patient-specific models in virtual reality, 2) the development of an operating room environment virtually, 3) the use of augmented reality to highlight critical structures through visual or auditory cues, and 4) the recording of this virtual surgery on a patient-specific model for use as a tracer or guide for trainees performing live surgery on that specific patient.
  • an apparatus has a device representing an endoscope, being either an endoscope or a dummy endoscope having shape and feel resembling an endoscope, having an attached tracker adapted to operate with a three-dimensional tracking system to track location and orientation of the device in three dimensions in a simulated operating-room environment.
  • the apparatus also has a physical head model comprising hard and soft components; the device representing an endoscope is configured to be inserted into the physical head model to provide haptic feedback resembling that of using the same or similar instruments and endoscopes in real endoscopic surgery.
  • FIG. 1 is a block diagram of a system providing integrated virtual-reality and haptic feedback from an FESS endoscope to surgical trainees or to surgeons preparing for specific cases.
  • FIG. 2A is a flowchart of a method of training or surgical preparation using the system of FIG. 1.
  • FIG. 2B is a detail flowchart of preparing CAD models of hard bony, mucosal, and soft tissues from tomographic radiological image stacks.
  • FIG. 3 is a block diagram of a system providing integrated virtual-reality and haptic feedback from an FESS endoscope and surgical tools to surgical trainees or to surgeons preparing for specific cases.
  • FIG. 3A is a schematic illustration of critical structures identified in the radiographic three-dimensional images.
  • FIG. 3B is an illustration of bony, mucosal, and soft tissue portions of the computer-aided design (CAD) model of the head as replicated through 3D printing and assembled into the head physical model.
  • FIGS. 4A-4F illustrate tracker-equipped tools such as may be used with the endoscope: FIG. 4A illustrates an angle-tipped forceps or biter, FIG. 4B a straight-tipped forceps or biter, FIG. 4C an angled-tipped scissors, FIG. 4D a straight-tipped scissors, FIG. 4E a straight-tipped electrocautery, and FIG. 4F a bent-tipped probe or, alternatively, a bent-tipped cutter.
  • FIG. 4G illustrates a clamp-on tracker attachment that may be attached to a 3D printed model of a surgical tool, or to a real surgical tool, to track the tool in real time.
  • FIG. 5A is a photograph illustrating a tracker attached to an endoscope.
  • FIG. 5B illustrates a virtual-reality operating room environment, with draped patient, endoscope and surgical tool, and endoscopic monitor.
  • FIG. 6 is a photograph illustrating a hard-plastic physical model.
  • FIG. 7 is a top view photograph of the hard-plastic physical model representing bone.
  • FIG. 8 is a photographic view of a tracker attached to a dummy endoscope.
  • FIG. 9 is a schematic sketch illustrating a tracker attached through a frame to a patient's head.
  • FIG. 10 is an illustration of a rendered virtual reality view of the CAD model of the head with superimposed images of specific critical structures.
  • FIG. 11 is an illustration of a rendered virtual reality endoscopic view as seen from an end of the endoscope with critical structures highlighted.
  • FIGS. 12 and 13 illustrate in full and cutaway views an endoscope inserted into nasal cavities of a physical model of the head, the physical model having both soft silicone and hard plastic components.
  • Our surgical simulation system is a mixed-reality surgical simulator with an immersive user experience that incorporates patient-specific anatomy.
  • a method 200 (FIG. 2A) of training or preparation for particular surgical cases begins with performing a computed tomography (CT) scan 202 using a CT scanner 102 (FIG. 1) to obtain three-dimensional radiographic imaging of the head 106 of a patient 104 in tomographic image stack form; in a particular embodiment the three-dimensional radiographic imaging of the head 106 includes a stack of tomographic slice images with 0.625-millimeter slices; however, other resolutions may be used.
  • in alternative embodiments, MRI imagers are used in place of CT scanner 102 to image the head and provide a similar stack of tomographic image slices, the stack of tomographic image slices being three-dimensional radiographic imaging.
  • the CT scan or MRI three-dimensional radiographic imaging is of the head of a specific patient for which a surgeon wishes a simulated dry run before performing surgery.
  • the CT scan or MRI three-dimensional radiographic imaging is, in succession, a CT scan or MRI of a training series of increasing difficulty, including radiographic imaging of heads of patients for which FESS surgery has been performed; with this series a beginning trainee surgeon can have a series of VR and physical head models prepared with which to learn by practicing basic, moderate, and difficult surgeries.
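For concreteness, a minimal sketch of assembling such a tomographic slice stack into a single 3D volume follows, assuming the scan is exported as a DICOM series; the directory name and the pydicom-based approach are illustrative assumptions, as the patent does not prescribe a toolchain.

```python
# Sketch: assemble a CT tomographic slice stack into a 3D volume (assumed
# DICOM input; pydicom and numpy required).
import glob
import numpy as np
import pydicom

def load_ct_volume(dicom_dir):
    """Read all DICOM slices in a directory and stack them by z-position."""
    slices = [pydicom.dcmread(path) for path in glob.glob(dicom_dir + "/*.dcm")]
    # Sort along the scan axis so the stack is spatially ordered.
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
    volume = np.stack([s.pixel_array.astype(np.int16) for s in slices])
    # Convert raw detector values to Hounsfield units via the rescale tags.
    return volume * float(slices[0].RescaleSlope) + float(slices[0].RescaleIntercept)

# ct = load_ct_volume("head_ct")   # e.g., a stack of 0.625 mm slices
```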
  • the three-dimensional radiographic imaging for a selected head is used to construct 204, on a model extraction workstation 108, a three-dimensional computer-aided design (CAD) model 110 of the head 106 of patient 104; the CAD model 110 includes, in separate files, a mesh model of the hard bony structures of the skull and a mesh model of soft tissue structures including mucosa as well as the skin and septal cartilage of the nose, as illustrated in FIG. 3B.
  • CT typically shows greater X-ray absorption for calcified bony tissue than for soft tissue
  • MRI images typically show less hydrogen for calcified bony tissue than for soft tissue, yet more hydrogen than for voids.
  • Extracted or segmented imaged bony, mucosal, and soft tissue 3D voxel models are processed into mesh models of bony, mucosal and soft tissue structure; mesh model boundaries are generated with any inconsistencies (or holes) in the mesh models repaired to form CAD model 110 .
  • Extracting and segmenting imaged bony and soft tissues into 3D mesh models is performed as illustrated in FIG. 2B .
  • Segmentation of bony tissues into a bony-tissue voxel model is, in an embodiment, done using a region-growing and threshold process 254, whereas for soft tissue and mucosa voxel models a semi-automatic region-growing method is used 256; the FastGrowCut and Segmentation modules in 3D Slicer are used for both methods. Skin, muscle, and other soft tissues are segmented from mucosal tissues based upon anatomic landmarks.
  • a marching-cube surface-mesh reconstruction 258 is then performed on the voxel models to prepare a mesh model of each of the hard bony, mucosal, and soft tissues.
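The segment-then-mesh step can be sketched as follows. This substitutes a plain Hounsfield threshold and scikit-image's marching cubes for the 3D Slicer FastGrowCut/Segmentation workflow named above; the 300 HU bone threshold and voxel spacing are assumed values.

```python
# Sketch: threshold-segment bone from a CT volume (in Hounsfield units),
# then reconstruct a triangle surface mesh with marching cubes.
import numpy as np
from skimage.measure import marching_cubes

def bone_mesh(ct_volume, threshold_hu=300.0, voxel_mm=(0.625, 0.5, 0.5)):
    bone_mask = (ct_volume >= threshold_hu).astype(np.float32)
    # marching_cubes returns vertices (scaled to mm via `spacing`),
    # triangle faces, per-vertex normals, and interpolated values.
    verts, faces, normals, _ = marching_cubes(bone_mask, level=0.5, spacing=voxel_mm)
    return verts, faces, normals
```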
  • Manual modification of the threshold and algorithm parameters on a regional basis may be done to avoid over-segmentation, under-segmentation and other artifacts, which may occur with volume averaging of bony and soft tissue densities in the head and neck.
  • the hard-bony tissue mesh model, mucosal tissue mesh model, and soft tissue mesh model from the marching-cubes reconstructions are then repaired 260, first with a surface toolbox mesh-repair module of 3D Slicer (http://www.slicer.org), and further with Autodesk 3ds Max.
  • a Laplacian smoothing routine was used to revise mesh models to improve 262 the approximation of curvature without losing volume.
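A minimal form of that smoothing step is sketched below; note the text describes a volume-preserving variant, whereas this plain Laplacian pass relaxes each vertex toward the mean of its neighbors and does not correct for shrinkage.

```python
# Sketch: plain Laplacian mesh smoothing (no volume-preservation term).
import numpy as np

def laplacian_smooth(verts, faces, alpha=0.5, iterations=10):
    neighbors = [set() for _ in range(len(verts))]
    for a, b, c in faces:                  # build vertex adjacency from triangles
        neighbors[a].update((b, c))
        neighbors[b].update((a, c))
        neighbors[c].update((a, b))
    verts = verts.astype(np.float64).copy()
    for _ in range(iterations):
        means = np.array([verts[list(n)].mean(axis=0) if n else verts[i]
                          for i, n in enumerate(neighbors)])
        verts += alpha * (means - verts)   # relax toward the neighbor mean
    return verts
```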
  • Both the hard-bony tissue and soft tissue portions of CAD model 110 are constructed in mesh form using the FastGrowCut Segmentation and Paint (with Editable intensity range for masking) modules in 3-D Slicer and repaired to eliminate holes with the 3D Slicer surface toolbox.
  • the mesh models of CAD model 110 are further repaired using Autodesk 3ds Max to reduce the number of vertices for mesh optimization, and to prepare the model for 3D printing.
  • the generated and repaired mesh models of hard bony tissue, soft tissue, and mucosal tissue form parts of CAD model 110 and are exportable in mesh form for use in 3D printers.
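A sketch of that export step, assuming the `trimesh` package; the tissue names and the choice of STL for printing and OBJ for rendering (matching the Wavefront export mentioned below) are illustrative.

```python
# Sketch: export repaired meshes for 3D printing (STL) and rendering (OBJ).
import trimesh

def export_meshes(parts):
    """parts: dict mapping a tissue name to (vertices, faces) arrays."""
    for name, (verts, faces) in parts.items():
        mesh = trimesh.Trimesh(vertices=verts, faces=faces)
        trimesh.repair.fill_holes(mesh)    # patch any remaining holes
        mesh.export(f"{name}.stl")         # for the 3D printer
        mesh.export(f"{name}.obj")         # for the game engine

# export_meshes({"bone": (bone_v, bone_f), "mucosa": (muc_v, muc_f)})
```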
  • CAD model 110 is annotated 206 to build and tag one or both of a model of a surgical target and models of critical structures at risk during FESS surgery or located near the surgical target, as illustrated in FIG. 3B.
  • Tagging may be in part manual and in embodiments may be in part automated using a classifier based on an appropriate machine-learning algorithm.
  • the annotated critical structures may include one or more of the cranial nerves (CN) including the olfactory and optic nerves (CN-1, CN-2) together with the optic chiasma, anterior and posterior portions of the pituitary gland, the cranial nerves CN-3, CN-4, and CN-6 that innervate the orbital muscles, and the cribriform plate.
  • the critical structures tagged may also include blood vessels, such as the anterior ethmoidal artery and internal carotid artery, that can be at risk during endoscopic surgeries performed through the nares.
  • the critical structures are identified based upon known anatomic landmarks visible in the three-dimensional radiological imaging.
  • soft tissue segmented includes mucosa lining the nasal cavity and paranasal sinuses, including the inferior, middle, and superior turbinates, maxillary sinuses, anterior ethmoid sinuses, agger nasi, ethmoid bullae, posterior ethmoid sinuses, sphenoid sinus, and frontal sinus.
  • Neural and arterial structures at risk for damage during surgery were identified and segmented separately; these are tagged as critical structures so that alarms can be sounded when a virtual surgical tool enters or touches them.
  • Surface meshes were generated within 3D Slicer and exported in Wavefront (OBJ) format.
  • hard tissue is identified based on voxel density, including the bone lining the medial orbit known as the lamina papyracea.
  • Bony structures of the skull identified in this embodiment include the mandible, maxilla, sphenoid, ethmoid, frontal, and temporal bones.
  • Skin and muscle soft tissues are separated from mucosa based on known anatomic landmarks.
  • the bony structures of CAD model 110 are then replicated 208 on a 3D printer 112 to prepare a hard-plastic physical model 114 of those hard-bony structures.
  • a stereolithography (SLA) 3D printer based upon photopolymerization of liquid resin is used to prepare hard plastic physical model 114.
  • a Formlabs Form2 (trademark of Formlabs, Inc., Somerville, Mass.) was used to prepare hard plastic physical model 114 of hard bony parts as defined in CAD model 110 .
  • a fused deposition modeling (FDM) 3D printer such as a Creality Ender 3 (trademark of Shenzhen Creality 3D Technology Co., Ltd, Shenzhen, China) or a Zcorp 650 (3D Systems, Rock Hill, South Carolina) was used to prepare hard plastic physical model 114 from polylactic acid (PLA) filament; in other alternative embodiments, hard plastic physical model 114 may be prepared with an FDM printer using extrudable nylon or polyethylene terephthalate filament, using a dual-extruder printer with high-impact polystyrene (HIPS) temporary supporting structures.
  • 3D printer 112 is also used to prepare 210, by 3D printing, a casting mold 116 configured for casting 212 a soft silicone model 118 of selected soft tissue structures, including skin and septal cartilage of the nose, as described in CAD model 110.
  • a mold is directly printed.
  • a rigid model of the selected soft tissue structures is printed; this is then used as a form to cast a flexible silicone mold that is in turn used to cast soft silicone model 118 of soft tissue structures.
  • soft silicone model 118 is directly printed using a flexible UV-polymerizable resin in an SLA printer such as the Form2 printer.
  • 3D printer 112 is also used to prepare 211 a casting mold 117 configured for casting 213 a soft silicone model 119 of selected mucosal structures, such as those lining the interior of the nasal cavity and sinuses, as described in CAD model 110.
  • the soft silicone mucosal model 119 is mounted 215 to the hard-plastic physical model 114.
  • model 119 of mucosal structures has been directly 3D printed using an SLA-type 3D printer such as a Form2 printer and a flexible, UV-curable resin.
  • the soft silicone model 118 of soft tissue structures is mounted 214 to the hard plastic physical model 114 of bony tissues to create the head physical model 115, the head physical model 115 including hard plastic physical model 114, soft silicone model 119 of mucosal structures, and soft silicone model 118 of soft tissue structures.
  • CAD model 110 is also loaded 216 into a mechanical modeling and tracking machine 122 equipped with tracking receivers 124, 126.
  • Tracking receivers 124, 126 are configured to track 218 location and orientation in three-dimensional space of a tracker 128 that is attached to a dummy endoscope 130. In a particular embodiment, tracking receivers 124, 126 and tracker 128 are HTC Vive (HTC, New Taipei City, Taiwan) trackers and the virtual reality goggles are an HTC Vive headset; other virtual reality goggles and trackers may be used.
  • in some embodiments, head physical model 115 is at a known location; in other embodiments, hard plastic physical model 114 is attached to another tracker 150 through a short steel rod 152 as illustrated in FIG.
  • dummy endoscope 130 has an endoscope handle 132 that may include operative controls.
  • operative controls on endoscope handle 132 include camera angle selection buttons.
  • a real endoscope may be used in place of the dummy endoscope.
  • a device representing an endoscope may be either a dummy endoscope or a real endoscope.
  • the mechanical modeling and tracking machine 122 uses the location and orientation of the tracker 128 on the endoscope 130 to determine 220 a location and orientation of endoscope head 134 in the head physical model, which is in turn aligned and registered to a virtual head as modeled by CAD model 110 executing on modeling and tracking machine 122 , the CAD model 110 being derived from the 3D image stack determined from MRI or CT scans. Since the head physical model is registered to the CAD model 110 , each location of endoscope head 134 in the head physical model corresponds to a location in the CAD model 110 .
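Because the shaft is rigid, locating the endoscope tip from the tracked handle pose reduces to a single rigid-body transform. A sketch follows; the tip offset length and the axis conventions are assumptions, not values from the patent.

```python
# Sketch: derive endoscope tip position and view direction from the
# tracker's 6-DOF pose.
import numpy as np
from scipy.spatial.transform import Rotation

TIP_OFFSET = np.array([0.0, 0.0, -0.18])    # meters, tracker frame (assumed)

def tip_pose(tracker_pos, tracker_quat_xyzw):
    rot = Rotation.from_quat(tracker_quat_xyzw)
    tip = tracker_pos + rot.apply(TIP_OFFSET)
    view_dir = rot.apply([0.0, 0.0, -1.0])  # shaft axis, assumed -Z in tracker frame
    return tip, view_dir
```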
  • Interaction of the device representing an endoscope with the head physical model as the device is inserted into the model provides tactile or haptic feedback to a surgeon or trainee that resembles tactile or haptic feedback as the surgeon or trainee inserts a real endoscope into a patient's real head.
  • An endoscope alone is useful to visually inspect internal surfaces within the nasal cavity but cannot by itself perform FESS surgery.
  • additional surgical tools are inserted into a patient's head along with the endoscope.
  • one or more devices resembling surgical tools are provided (FIGS. 4A-4F). These tools may include any combination of an angle-tipped forceps or biter 460 illustrated in FIG. 4A, a straight-tipped forceps or biter 462 (FIG. 4B), an angled-tipped scissors 464 (FIG. 4C), a straight-tipped scissors 466 (FIG. 4D), a straight-tipped electrocautery 468 (FIG. 4E), a bent-tipped probe or alternatively a bent-tipped cutter 470 (FIG. 4F), a bent-tipped electrocautery (not shown), bent-tipped or straight-tipped drills (not shown), straight or bent suction tubes (not shown), microdebriders (not shown), straight-tipped probes and cutters (not shown), and other tools as known in the art of FESS surgery; the tool may in an embodiment be a new or experimental tool of unique shape.
  • Each device resembling a surgical tool may be a 3-D print of a tool with an embedded 3-D tracker 402, 404, 406, 408, 410, 412, or may be a real surgical tool with a clamp-on 3-D tracker as illustrated in FIG. 4G.
  • the clamp-on 3-D tracker has a 3-D tracking device 420 and clamp 422 and is configured to mount with a setscrew 424 to a shaft or body 426 of a surgical tool or 3-D model of a tool.
  • Each tool has a corresponding 3D mesh model used in the gaming engine of the modeling and display machine to track position of the tool tip and to derive an image of the tool tip when the tool tip is in a field of view of the simulated endoscope tip.
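Before rendering a tool tip, the engine needs a field-of-view test of this kind; a simple cone test against the endoscope view axis suffices, with the half-angle an assumed value.

```python
# Sketch: is the tracked tool tip inside the simulated endoscope's view cone?
import numpy as np

def tool_in_view(scope_tip, view_dir, tool_tip, half_angle_deg=30.0):
    to_tool = tool_tip - scope_tip
    dist = np.linalg.norm(to_tool)
    if dist < 1e-6:
        return True                        # coincident with the lens
    cos_angle = np.dot(to_tool / dist, view_dir / np.linalg.norm(view_dir))
    return cos_angle >= np.cos(np.radians(half_angle_deg))
```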
  • the devices resembling surgical tools may be equipped with short-range radio-activated vibrators to provide haptic feedback resembling that of an operating drill or to provide alarms generated when tool tips approach critical structures.
  • Tools used in FESS, such as forceps, biters, and scissors, often have a long, narrow shaft 450, 452 configured to fit through the nares into the nasal cavity; they also have a handle 440, 442, 444 that allows the user to control their angle of orientation within the nasal cavity.
  • These tools operate when an operating lever 430, 432, 434 is pressed, the operating lever being coupled through an operating rod that is typically disposed within the shaft 450, 452.
  • Simple cutters and probes, as illustrated in FIG. 4F, do, however, lack an operating lever.
  • operating levers of devices resembling a surgical tool may be instrumented with sensors configured to sense pressure on the operating lever, and transmit sensed pressure to the video modeling and display machine 136 .
  • a computer model of each tracker-equipped surgical tool 460, 462, 464, 466, 468, 470 is incorporated into the mechanical model and tracking machine 122 and video model and display machine 136.
  • the mechanical model and tracking machine 122 uses information received through multiple tracking receivers 124, 126 to determine position and orientation of both the tracker 128 on the endoscope 130 (FIG. 3) and tracker 160 on the tracker-equipped tool 162 to determine position and orientation of the tip 134 of the endoscope and operating portion 164 of the tool.
  • a video modeling and display machine 136 executes a video game engine.
  • the video game engine is the Unity game engine; in a particular embodiment, Unity Engine V2017.3 (trademark of Unity Technologies, San Francisco, Calif.) was used.
  • the video modeling and display machine 136 also executes the CAD model 110 of the head.
  • the mechanical modeling and tracking machine and video modeling and display machine form a tracking, modeling, and display machine.
  • modeling and tracking machine 122 and video modeling and display machine 136 are combined within a single tracking, modeling, and display machine executing a plurality of modules.
  • Video modeling and display machine 136 executing a video gaming engine 138 determines objects represented in CAD model 110 that are in view of the endoscope head 134, including anatomy of the head, at one of three selectable endoscope viewing angles, and renders 222 the objects in view of the endoscope head 134 into a video image.
  • the objects represented in CAD model 110 may include models of foreign objects or tumors 166 upon which surgery is to be conducted.
  • the gaming engine 138 also determines whether a tip 164 of any device resembling a surgical tool 162 is in a field of view of the endoscope as oriented, and renders that into the video image.
  • the present system is adapted to render objects in view of straight as well as angled endoscopes with accurate field of view.
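Rendering an angled scope amounts to tilting the virtual camera axis relative to the shaft. The sketch below rotates the shaft direction about a body-fixed perpendicular axis; scope angles such as 30 or 70 degrees are common in FESS, and the axis convention here is an assumption.

```python
# Sketch: viewing direction of an angled endoscope.
import numpy as np
from scipy.spatial.transform import Rotation

def angled_view_dir(shaft_dir, scope_up, angle_deg):
    shaft_dir = shaft_dir / np.linalg.norm(shaft_dir)
    # Body-fixed rotation axis; scope_up must not be parallel to the shaft.
    axis = np.cross(shaft_dir, scope_up)
    axis /= np.linalg.norm(axis)
    return Rotation.from_rotvec(np.radians(angle_deg) * axis).apply(shaft_dir)
```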
  • the game engine includes the capability of photo-realistic rendering in real time with dynamic lighting sources and shadows; in an embodiment the dynamic lighting source is chosen to correspond to a lighting fiber of a real endoscope so rendered images strongly resemble images seen through an endoscope camera during live surgeries.
  • This video image represents a view corresponding to a view through an endoscope at a corresponding position in the patient's head 106 .
  • the video image corresponding to a view through the endoscope tip may then be tagged 224 with indications of critical structures and presented 226 to a trainee or operating surgeon through virtual reality goggles 140 as if on an endoscope monitor with images of other objects in a virtual operating room.
  • Virtual reality goggles 140 are also equipped with a tracker 146 .
  • Mechanical modeling and tracking machine 122 compares computed locations of both the endoscope tip 134 and tool tip 164 to locations of critical structures as flagged in model 110, and provides alerts when either tip 134, 164 is positioned to damage those critical structures.
  • These critical structures include the orbits, cribriform plate, cavernous sinus, and multiple cranial fossae of the skull; when the video model and game machine 136 detects entry of a simulated surgical tool tip into or against one of these critical structures, the video model and game machine sounds an audible alarm or displays a visual alarm; in some embodiments a vibrator is used to provide a haptic alarm.
  • alarms are generated upon a simulated surgical tool tip approaching one of these critical structures that have been tagged in the mucosal mesh model.
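A sketch of such a proximity alarm, using a KD-tree over a tagged structure's mesh vertices as a fast approximate distance query; the 2 mm margin and the alert action are assumptions.

```python
# Sketch: alarm when a tracked tip nears a tagged critical structure.
import numpy as np
from scipy.spatial import cKDTree

class CriticalStructureAlarm:
    def __init__(self, name, mesh_vertices, margin_m=0.002):
        self.name = name
        self.tree = cKDTree(mesh_vertices)  # tagged-structure vertices, meters
        self.margin = margin_m

    def check(self, tip_position):
        dist, _ = self.tree.query(tip_position)
        if dist <= self.margin:
            # Here the simulator would sound the audible cue, flash the
            # on-monitor warning, and/or trigger the tool's vibrator.
            print(f"ALARM: tip within {dist * 1000:.1f} mm of {self.name}")
            return True
        return False
```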
  • the system includes, within video model and game machine 136, a virtual reality model of a virtual operating room, including 3-D models of common operating-room equipment such as an operating table, instrument tray, electrocautery machine, endoscope illuminator/camera controller, and endoscope monitor.
  • a trainee or operating surgeon puts on virtual reality goggles 140, then picks up and manipulates the endoscope 130 to insert endoscope head 134 through nares 142 of head physical model 115 into nasal cavity 144; the trainee or surgeon may also insert one or more tools 162 through the nares 142 into nasal cavity 144.
  • tracker 146 tracks location and angle of virtual reality goggles 140 to permit synthesis in video model and game machine 136 of a video stream incorporating a view of the virtual operating room with a virtual patient having head aligned and registered with a physical location of physical head model 115 , and draped as typical for FESS surgery.
  • the view of the virtual operating room includes an image of an endoscope aligned and positioned according to tracked position and orientation of endoscope 130 .
  • the virtual operating room includes a virtual operating room monitor 502 providing the virtual-reality rendered video image as viewed from the endoscope tip, potentially including an image of the surgical tool tip 504 as well as an image of a tumor to be resected 506, permitting the trainee or operating surgeon to view the rendered video image by aiming his or her head, and virtual reality goggles 140, at the virtual operating room monitor 502, as illustrated in FIG. 5B.
  • Also visible in the VR goggle display may be, depending on VR goggle position and orientation, the simulated head 508 of properly draped patient 510, endoscope 512, and surgical tool 514.
  • the tracking and modeling machine 122 tracks position of the endoscope head 134 in physical model 115 and provides alerts when endoscope head 134 approaches locations corresponding to tagged critical structures in CAD model 110 .
  • these alerts are provided as aural alerts and as visual alerts by superimposing warnings and images of critical structures on the virtual endoscope images presented on the virtual operating room monitor thereby simulating an alternative embodiment that presents visual warnings on actual endoscope images during live surgeries.
  • While the trainee or surgeon manipulates the endoscope and surgical tool or tools, mechanical interactions of endoscope 130 and endoscope head 134 with the head physical model 115 provide tactile, or haptic, feedback to the trainee or operating surgeon, the tactile feedback greatly resembling tactile feedback felt during actual surgeries on sinuses, pituitary, and other organs accessible to endoscope 130 through nares 142.
  • Tactile and haptic feedback is inherent to using 3D printed dummy endoscopes and other tools in the shape of real surgical tools, and having a trackable patient skull with anatomic features with which the endoscope and other surgical tools physically interact.
  • One aspect of tactile feedback is the feel of the endoscope and its controller, and when present the surgical tools, in the trainee's hands each with 6 full degrees of freedom, providing a proprioceptively authentic feel in a room-scale immersive virtual-reality environment.
  • dummy endoscopes and dummy surgical tools are 3D printed with FDM printers.
  • tactile feedback is enhanced with a vibratory mechanism within the dummy endoscope or other dummy tools to simulate a surgical drill, suction probe or suction cautery such as may be used during actual surgeries.
  • the virtual reality rendered video image presented on the virtual operating room monitor with virtual reality goggles 140 provides visual feedback like visual images seen by a trainee or operating surgeon while performing similar operations.
  • the position and angle of the VR goggles are tracked and the simulated OR environment is displayed through the VR goggle with position and size of the simulated endoscope monitor dependent on angle and position of the VR goggle.
  • movement of the trainee or operating surgeon's head provides realistic movement of stationary objects in his field of view like the simulated endoscope monitor while he is wearing the VR goggles.
  • Both the head physical model 115 and virtual reality rendered video based on CAD model 110 are patient-specific since CAD model 110 is derived from the three-dimensional radiographic images of a specific patient's head 106 .
  • dummy endoscope 130 has a lumen, and operative tools can be inserted through that lumen; in particular embodiments these tools may include drills for penetrating through bone into sinuses or through bone to reach a pituitary gland; these tools can also penetrate through hard plastic of physical model 114 during practice procedures.
  • a frame 304 is attached to the patient's head 106 , and a tracker 306 is positioned on the frame.
  • the patient's head is registered to the tracking system with the CAD model 110 aligned to the patient's head 106 .
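Registering the CAD model to the tracked frame can be done with a standard rigid (Kabsch) fit over matched points; a sketch follows, with fiducial correspondences assumed to be established beforehand.

```python
# Sketch: rigid registration mapping CAD-model points onto tracker space.
import numpy as np

def rigid_register(model_pts, tracked_pts):
    """Return rotation R and translation t with tracked ~= R @ model + t."""
    mc, tc = model_pts.mean(axis=0), tracked_pts.mean(axis=0)
    H = (model_pts - mc).T @ (tracked_pts - tc)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tc - R @ mc
    return R, t
```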
  • the tracking and modeling machine 122 tracks position of the endoscope head and provides alerts when tracked endoscope head 134 approaches tagged critical structures as identified in the CAD model 110 , in an embodiment these alerts are provided as aural alerts and as visual alerts by superimposing warnings and images of critical structures on images obtained through an endoscope camera viewing the patient's nasal cavity and sinuses from endoscope head 134 .
  • critical structures may be highlighted and displayed as illustrated in FIG. 10 as structures shown with reference to the head and FIG. 11 as highlighted structures in an endoscope view.
  • FIG. 10 illustrates critical structures viewed in projection superimposed on the CAD model 110 .
  • the entire motion of the endoscope and operative tools is recorded by the operating surgeon and then transmitted to another site to provide a tracing of the surgery to be then mirrored by a second surgeon performing live surgery (tele-surgery), or repeated by trainees to provide repetitive guided training.
  • positions of head physical model or patient head, and endoscope as detected by the trackers are recorded throughout a practice or real surgical procedure.
  • a score is produced based on time to perform a predetermined task with penalties applied for approach of simulated tool tip to simulated critical structures; in embodiments motion tracking of tool and endoscope is used to determine economy of movement and the score is also based on economy of movement.
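One plausible form of such a score, combining elapsed time, tip path length as the economy-of-movement measure, and proximity penalties; all weights here are illustrative assumptions, not values from the patent.

```python
# Sketch: score a simulated run from time, path length, and near-misses.
import numpy as np

def simulation_score(elapsed_s, tip_positions, proximity_events,
                     time_w=1.0, path_w=50.0, penalty=30.0):
    """tip_positions: (N, 3) array of tracked tip samples over the run."""
    steps = np.diff(np.asarray(tip_positions), axis=0)
    path_m = float(np.linalg.norm(steps, axis=1).sum())  # meters traveled
    raw = time_w * elapsed_s + path_w * path_m + penalty * proximity_events
    return max(0.0, 1000.0 - raw)          # higher is better
```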
  • a machine-learning procedure is trained on beginning and experienced surgeons and motion tracking to provide personalized feedback to trainee surgeons and score users on their relative level in performing surgery. This feedback could be used to advance users from a beginner to expert level, or evaluate the level of surgeons in the community.
  • Relative motions of endoscope and instrument to head as recorded are then analyzed using the 3D CAD model and critical structures tagged in the CAD model to provide feedback to the trainee surgeon.
  • Such analysis may include indications of safer or faster ways the procedure could be performed, or be used to evaluate surgeons already performing surgery.
  • derivation of the score and its use in training surgeons by giving real-time feedback to users, whether by altering the level of difficulty of the simulation, providing visual/auditory/haptic feedback and cues to assist in surgery, or providing an objective score on the simulation, is known as the virtual coach. This could be used to evaluate proficiency during training, as well as provide a method of continued certification for practicing surgeons.
  • trackers are coupled to a real endoscope and real surgical tools, and a tracker on a frame is clamped to the same patient's head as used to generate the CT or MRI radiological tomographic image stack from which the CAD model was derived.
  • the physical head model is not used in this alternate embodiment, the CAD model is registered to the patient's head.
  • the modeling and display machine tracks locations of the endoscope and surgical tools tips in the CAD model—corresponding to positions in the patient's head—and generates visual or aural alarms when these tips approach critical structures tagged in the CAD model. These alarms serve to assist surgeons in avoiding damage to those critical structures.
  • resilient polymer shall include rubberlike polymeric materials, including polymerized Formlabs elastic resin, resilient silicones, and some soft carbon-based synthetic rubbers and flexible plastics like molded latex and sorbothane, adapted to being formed into flexible reproductions of human soft tissue such as skin and muscle and having Shore-A hardness no greater than 85.
  • hard plastic shall include polymeric materials significantly harder than resilient polymers as defined above, including most acrylonitrile butadiene styrene (ABS), high impact polystyrene (HIPS), and polylactic acid (PLA) 3D printer filaments, and polymerized Formlabs standard-hardness grey resin.
  • Vive trackers were reliably tracked by Vive lighthouse base stations to within less than a centimeter, updating the position of the tools and user in the virtual environment without detectable latency.
  • the endoscope correctly registered the modeled danger zones, with time-synchronous audio and visual cues.
  • This framework provides a cost-effective methodology for high-fidelity surgical training simulation with haptic feedback.
  • personalized training programs could be developed for trainees that are adaptive and scalable on any range of difficulty and complexity.
  • Proposed approaches to VR can be extended to the telemedicine world, in which surgeons operating in remote locations can be assisted by experts at tertiary care centers.
  • State-of-the-art surgical navigation systems such as the system herein described provide reliable optical and electromagnetic tracking with accuracy potentially within 2 mm. These navigation workstations confirm anatomic location but do not reduce the risk of surgical complications to zero. Additional features from our technology could be translatable to develop AR-based navigation, which can further improve safety in the operating room.
  • a multimode VR apparatus designated A including an endoscope device adapted to represent an endoscope, the endoscope device selected from an endoscope and a dummy endoscope having shape and feel resembling that of an endoscope; a wireless tracker adapted to operate with a three-dimensional tracking system to track location and orientation of the endoscope device in three dimensions in a simulated operating room environment; and a video modeling and display machine configured with a computer-aided design (CAD) model of a head and adapted to provide a simulated head environment, providing a simulated endoscope view.
  • the apparatus also includes a physical head model comprising hard and soft physical components, the endoscope device being configured to be inserted into the physical head model to provide a tactile representation of manipulation of an endoscope in a head to a person handling the endoscope device.
  • An apparatus designated AA including the multimode VR apparatus designated A wherein the video modeling and display machine comprises a gaming engine adapted to simulate an endoscope view of the simulated head environment.
  • An apparatus designated AB including the apparatus designated A or AA wherein the physical head model comprises a wireless tracker, and where the computer-aided design (CAD) model of a head is registered to a tracked position of the physical head model.
  • An apparatus designated AC including the apparatus designated A, AA, or AB wherein the physical head model comprises a hard-plastic portion prepared by 3D printing representative of bony tissue and a resilient polymer portion representative of skin.
  • An apparatus designated AD including the apparatus designated A, AA, AB or AC and including a surgical tool device having shape and feel resembling that of a surgical tool adapted for functional endoscopic sinus surgery (FESS) selected from the group consisting of forceps, biting forceps, scissors, a probe, and an electrocautery, the tool device further comprising a wireless tracker adapted to operate with the three-dimensional tracking system to track location and orientation of the tool device in three dimensions in the simulated head environment.
  • An apparatus designated AE including the apparatus designated AD wherein the simulated endoscope view includes a simulated view of the tool device.
  • An apparatus designated AF including the apparatus designated A, AA, AB, AC, AD or AE wherein the video modeling and display machine is further configured to provide a simulated operating room (OR) environment with the simulated endoscope view displayed on a simulated endoscope monitor.
  • OR operating room
  • An apparatus designated AFA including the apparatus designated A, AA, AB, AC, AD, AE, or AF wherein the tool device resembles a surgical tool selected from the group consisting of forceps, biting forceps, scissors, a probe, a drill, and an electrocautery.
  • An apparatus designated AG including the apparatus designated A, AA, AB, AC, AD, AE, or AF further including a virtual-reality (VR) goggle equipped with a wireless tracker, and wherein the simulated OR environment is displayed through the VR goggle with position and size of the simulated endoscope monitor on the VR goggle display dependent on angle and position of the VR goggle.
  • VR virtual-reality
  • a method designated B of preparing a physical model of a head and endoscope for surgical simulation includes importing into a workstation a radiological tomographic image stack of a head; segmenting the radiological tomographic image stack into hard tissue, soft tissue, and mucosal voxel models based at least in part on voxel intensity; and growing hard tissue, mucosal, and soft tissue regions in the hard tissue, soft tissue, and mucosal voxel models.
  • the method continues with converting the hard tissue, soft tissue, and mucosal models into a hard tissue mesh model, a soft tissue mesh model, and a mucosal mesh model; repairing the mesh models; exporting the mesh models from the workstation and using a 3D printer and the hard tis-sue mesh model to print a physical hard tissue model; preparing a physical mucosal tissue model from the mucosal mesh model; and mounting the physical mucosal tissue model to the physical hard tissue model.
  • the method also includes preparing a physical soft tissue model from the soft tissue mesh model; and mounting the physical soft tissue model to the physical hard tissue model to form a physical head model.
  • the method also includes loading the mesh models into a display system adapted to render images of surfaces of the mesh models as viewed from an endoscope; mounting a tracker to the physical head model; and preparing an endoscope device with a tracker.
  • a method designated BA including the method designated B and including: tracking the endoscope device to determine a tracked endoscope position and orientation; determining a location and orientation of a tip of the endoscope device from the tracked endoscope position and orientation, the position of the tip of the endoscope device being within the physical head model; rendering images of surfaces of the mesh models as viewed from the tip of the endo-scope device; and displaying the images of surfaces of the mesh models.
  • a method designated BB including the method designated B or BA further includes mounting a tracker to a device representing a surgical tool; tracking the device representing a surgical tool to determine a location of a tip of the device representing a surgical tool; determining when the surgical tool is in view of the tip of the endoscope device; and when the surgical tool is in view of the tip of the endoscope device, rendering an image of a surgical tool as viewed from the tip of the endoscope device.
  • a method designated BC including the method designated B, BA, or BB further includes: tracking location and orientation of the physical head model; and registering the mucosal mesh model to the location and orientation of the physical head model.
  • a method designated BD including the method designated B, BA, BB, or BC where the rendering images of surfaces of the mesh models is performed with a 3D gaming engine.
  • a method designated BE including the method designated B, BA, BB, BC, or BD wherein the preparing a physical mucosal model is performed by casting using a mold that has been prepared from the mucosal mesh model by a method comprising 3D printing.
  • a method designated BF including the method designated B, BA, BB, BC, BD, or BE further includes identifying critical anatomic structures imaged in the radiological tomographic image stack and tagging those critical structures in a model of the mesh models.
  • a method designated BG includes the method designated B, BA, BB, BC, BD, BE, or BF and further includes generating alarms upon approach of a tip of the device representing a surgical tool to a critical structure tagged in the mesh model.
  • An endoscopic surgical simulation system designated C includes a physical head model; a tracking system configured to track location and angle of a device representing an endoscope and a device representing a surgical tool; a computer-aided design (CAD) model in a modeling, and display machine, the CAD model registered to a location of the physical head model and comprising CAD representations of structures corresponding to structures of the physical head model; with the modeling, and display machine being configured to track the device representing an endoscope and determine a location of a tip of the device representing an endoscope within a nasal cavity of the physical head model, and to determine a field of view of an endoscope located at the location of the tip of the device representing an endoscope.
  • CAD computer-aided design
  • the modeling, and display machine is configured to track the device representing a surgical tool and determine a location of a tip of the device representing a surgical tool within the nasal cavity of the physical head model; and the modeling and display machine is configured to generate a video stream corresponding to a view of structures represented by the CAD model within the field of view.
  • the modeling and display machine is also configured to superimpose on the video stream an image corresponding to a tip of a surgical tool when the location of a tip of the device representing a surgical tool is in a field of view of view.
  • An endoscopic surgical simulation system designated CA including the endoscopic surgical simulation system designated C wherein the CAD model includes models of a plurality of structures tagged as critical structures.
  • An endoscopic surgical simulation system designated CB including the endoscopic surgical simulation system designated C or CA further including a tracker coupled to the physical head model, and wherein the CAD model is registered to a location of the physical head model.
  • An endoscopic surgical simulation system designated CBA including the endoscopic surgical simulation system designated C, CA, or CB wherein the physical head model and CAD model are derived from computed tomography (CT) or magnetic resonance imaging (MRI) scans of a particular patient, the system configured for preoperative planning and practice for that particular patient.
  • CT computed tomography
  • MRI magnetic resonance imaging
  • An endoscopic surgical simulation system designated CBB including the endoscopic surgical simulation system designated C, CA, CB wherein there is a first physical head model and CAD model configured for a first task, and a second physical head model and CAD model configured for a second task, the second task of greater difficulty than the first task.
  • An endoscopic surgical simulation system designated CC including the endoscopic surgical simulation system designated C, CA, CB, CBA, or CBB wherein the modeling and display machine is configured to generate alarms upon approach of the location of a tip of the device representing a surgical tool to a structure tagged as a critical structure.
  • An endoscopic surgical simulation system designated CD including the endoscopic surgical simulation system designated C, CA, CB, CC, or CBA further including a model extraction workstation configured to extract three-dimensional mesh models from computed tomography (CT) or magnetic resonance imaging (MRI) radiographic images, and wherein the physical head model is generated by a method comprising 3D printing of extracted three-dimensional mesh models.
  • CT computed tomography
  • MRI magnetic resonance imaging
  • An endoscopic surgical simulation system designated CD including the endoscopic surgical simulation system designated C, CA, CB, CBA, CBB, or CC further including virtual reality (VR) goggles, the VR goggles equipped with a tracker.
  • VR virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Medical Informatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Public Health (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Human Computer Interaction (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Chemical & Material Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Educational Administration (AREA)
  • Mathematical Optimization (AREA)
  • Educational Technology (AREA)
  • Medicinal Chemistry (AREA)
  • Algebra (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Robotics (AREA)
  • Pulmonology (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Radiology & Medical Imaging (AREA)
  • Urology & Nephrology (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Materials Engineering (AREA)

Abstract

An apparatus has a device representing an endoscope, the device being either an endoscope or a dummy endoscope having shape and feel resembling an endoscope, and includes a tracker adapted to operate with a three-dimensional tracking system to track location and orientation of the device in three dimensions in a simulated operating-room environment. The apparatus also has a physical head model comprising hard and soft components, the device representing an endoscope being configured to be inserted into the physical head model to provide haptic feedback of endoscopic surgery.

Description

    PRIORITY CLAIM
  • The present application claims the benefit of priority to U.S. Provisional Patent Application Nos. 62/659,680, 62/659,685, and 62/659,672, all of which were filed on 18 Apr. 2018. The entire contents of each of the foregoing provisional applications are incorporated herein by reference.
  • RELATED APPLICATIONS
  • The present document relates to co-filed applications entitled System for Integrated Virtual-Reality Visual and Haptic Surgical Simulator and System for Generating 3D Models for 3D Printing, and for Generating Video for an Integrated Virtual-Reality Visual and Haptic Surgical Simulator.
  • FIELD
  • The present document describes a training, practice, and enhanced operating environment for surgeons training for, performing, or teaching surgery. In particular, a training, practice, and performing tele- or virtual surgical environment is described for Functional Endoscopic Sinus Surgery (FESS). This document also highlights segmentation of critical structures, including the orbit, brain, cranial nerves, and vessels, in augmented reality; the training and practice environment includes visual, auditory, and haptic feedback.
  • BACKGROUND
  • Functional endoscopic sinus surgery (FESS) utilizes surgical endoscopes that allow visualization, magnification, and lighting of structures in the sinuses and nose to perform minimally invasive surgery through the nose. The use of image-guided surgery provides the surgeon with intraoperative landmarks to avoid critical structures in the sinonasal cavity, with the goal of reducing complications involving the orbit, brain, cerebrospinal fluid, or major vessels. Although these complications are rare, they can be catastrophic if they occur. FESS is commonly used in the surgical treatment of chronic sinusitis, the removal of sinonasal tumors, or for access to other craniofacial structures such as the orbital or cranial cavities.
  • FESS requires rigorous preoperative planning and careful intraoperative dissection of intricate anatomic structures. Due to each individual's unique anatomy, image-guided surgery is commonly used in complex cases, in which real-time 3-dimensional (3D) tracking systems determine positions of instruments relative to known skull base anatomy shown on visual displays. Although image-guided surgery has been shown to be helpful, several studies have shown that complication rates have not significantly decreased.
  • The endoscopes used in FESS are typically rigid endoscopes, providing image pickup of the surgical field from their distal end. Tools used in FESS are typically rigid, having a handle, long tubular or rod-shaped shafts, and operative devices at their distal end. These tools are inserted alongside, over, or under the endoscope; once inserted into the surgical field, they are manipulated under visual observation from the endoscope until their distal end and operative devices are positioned as needed for the operation being performed. When inserting these tools, it is necessary to avoid undue pressure on, or damage to, structures within the nasal cavity that are not part of the surgical field. Safe manipulation of these tools and endoscope through the obstacle course of turbinates and other structures within the nasal cavity and into the surgical field, and use of the tools to perform desired functions, requires practice.
  • SUMMARY
  • Our surgical simulation system is a mixed-reality surgical simulator with an immersive user experience that may, in some embodiments, incorporate patient-specific anatomy and may be used for preoperative planning and practice. The system includes a physical head model; a real or dummy endoscope which can be navigated; a tracking system configured to track location and angle of the endoscope with 6 degrees of freedom in virtual space; and trackable instruments, either real surgical instruments or dummy instruments modeled after real surgical instruments. In some embodiments, new surgical instruments or models thereof may be used. The tracking system also tracks virtual-reality goggles. A tracking, modeling, and display machine is configured to track a tip of the endoscope within the physical head model, identify corresponding locations in a CAD model of the physical head, and generate a video stream corresponding to a view of the CAD model from the corresponding location in the CAD model. This model allows for: 1) surgical simulation on patient-specific models in virtual reality; 2) the development of a virtual operating room environment; 3) the use of augmented reality to highlight critical structures through visual or auditory cues; and 4) the recording of this virtual surgery on a patient-specific model to then be used as a tracer or guide for trainees performing live surgery on the specific patient.
  • In an embodiment, an apparatus has a device representing an endoscope, being either an endoscope or a dummy endoscope having shape and feel resembling an endoscope, with an attached tracker adapted to operate with a three-dimensional tracking system to track location and orientation of the device in three dimensions in a simulated operating-room environment. The apparatus also has a physical head model comprising hard and soft components, the device representing an endoscope being configured to be inserted into the physical head model to provide haptic feedback resembling that of using the same or similar instruments and endoscopes in real endoscopic surgery.
  • BRIEF DESCRIPTION OF THE FIGURES
  • FIG. 1 is a block diagram of a system providing integrated virtual-reality and haptic feedback from an FESS endoscope to surgical trainees or to surgeons preparing for specific cases.
  • FIG. 2A is a flowchart of a method of training or surgical preparation using the system of FIG. 1.
  • FIG. 2B is a detail flowchart of preparing CAD models of hard bony, mucosal, and soft tissues from tomographic radiological image stacks.
  • FIG. 3 is a block diagram of a system providing integrated virtual-reality and haptic feedback from an FESS endoscope and surgical tools to surgical trainees or to surgeons preparing for specific cases.
  • FIG. 3A is a schematic illustration of critical structures identified in the radiographic three-dimensional images.
  • FIG. 3B is an illustration of bony, mucosa, and soft tissue portions of the computer-aided design (CAD) model of the head as replicated through 3D printing and assembled into the head physical model.
  • FIGS. 4A-4F illustrate tracker-equipped tools such as may be used with the endoscope. These include FIG. 4A, illustrating an angle-tipped forceps or biter; FIG. 4B, illustrating a straight-tipped forceps or biter; FIG. 4C, illustrating an angled-tipped scissors; FIG. 4D, illustrating a straight-tipped scissors; FIG. 4E, illustrating a straight-tipped electrocautery; and FIG. 4F, illustrating a bent-tipped probe or alternatively a bent-tipped cutter.
  • FIG. 4G illustrates a clamp-on tracker attachment that may be attached to a 3D printed model of a surgical tool, or to a real surgical tool, to track the tool in real time.
  • FIG. 5A is a photograph illustrating a tracker attached to an endoscope.
  • FIG. 5B illustrates a virtual-reality operating room environment, with draped patient, endoscope and surgical tool, and endoscopic monitor.
  • FIG. 6 is a photograph illustrating a hard-plastic physical model representing bone attached to a tracker to permit easy relative movement analysis between the physical model and the endoscope tip.
  • FIG. 7 is a top view photograph of the hard-plastic physical model representing bone.
  • FIG. 8 is a photographic view of a tracker attached to a dummy endoscope.
  • FIG. 9 is a schematic sketch illustrating a tracker attached through a frame to a patient's head.
  • FIG. 10 is an illustration of a rendered virtual reality view of the CAD model of the head with superimposed images of specific critical structures.
  • FIG. 11 is an illustration of a rendered virtual reality endoscopic view as seen from an end of the endoscope with critical structures highlighted.
  • FIGS. 12 and 13 illustrate in full and cutaway views an endoscope inserted into nasal cavities of a physical model of the head, the physical model having both soft silicone and hard plastic components.
  • DETAILED DESCRIPTION OF THE EMBODIMENTS
  • Our surgical simulation system is a mixed-reality surgical simulator with an immersive user experience that incorporates patient-specific anatomy.
  • In an embodiment, a method 200 (FIG. 2A) of training or preparation for particular surgical cases begins with performing a computed tomography (CT) scan 202 using a CT scanner 102 (FIG. 1) to obtain three-dimensional radiographic imaging of the head 106 of a patient 104 in tomographic image stack form; in a particular embodiment the three-dimensional radiographic imaging of the head 106 includes a stack of tomographic slice images with 0.625-millimeter slices; however, other resolutions may be used. In an alternative embodiment, an MRI imager is used in place of CT scanner 102 to image the head and provide a similar stack of tomographic image slices, the stack of tomographic image slices being three-dimensional radiographic imaging.
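  • By way of illustration only, importing such a tomographic image stack can be sketched in Python with the open-source pydicom and NumPy libraries; these libraries, and the presence of standard CT rescale tags, are assumptions made for this sketch rather than a toolchain prescribed by this disclosure:

        import numpy as np
        import pydicom
        from pathlib import Path

        def load_ct_stack(dicom_dir):
            """Read a directory of CT DICOM slices into one volume in
            Hounsfield units (HU). Assumes standard CT rescale tags."""
            slices = [pydicom.dcmread(p) for p in Path(dicom_dir).glob('*.dcm')]
            # Order slices along the scanner's z axis.
            slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
            # Convert stored pixel values to HU with each slice's rescale tags.
            return np.stack([
                s.pixel_array * float(s.RescaleSlope) + float(s.RescaleIntercept)
                for s in slices]).astype(np.float32)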
  • In a particular embodiment, the CT or MRI three-dimensional radiographic imaging is of the head of a specific patient for whom a surgeon wishes a simulated dry run before performing surgery. In alternative embodiments, the CT or MRI three-dimensional radiographic imaging comprises, in succession, scans of a training series of increasing difficulty, including radiographic imaging of the heads of patients on whom FESS surgery has been performed; with this series, a beginning trainee surgeon can have a series of VR and physical head models prepared with which to learn by practicing basic, moderate, and difficult surgeries.
  • The three-dimensional radiographic imaging for a selected head is used to construct 204, on a model extraction workstation 108, a three-dimensional computer-aided design (CAD) model 110 of the head 106 of patient 104; the CAD model 110 includes, in separate files, a mesh model of the hard bony structures of the skull and a mesh model of soft tissue structures including mucosa as well as the skin and septal cartilage of the nose, as illustrated in FIG. 3B. In an embodiment, after importing 250 (FIG. 2B) the three-dimensional radiographic image stack into a voxel-based 3-D model, the hard bony structures and soft tissue, including the mucosa, are initially segmented automatically 252, being distinguished from each other based at least in part on voxel intensity; CT typically shows greater X-ray absorption for calcified bony tissue than for soft tissue, while MRI typically shows a weaker hydrogen signal for calcified bony tissue than for soft tissue, yet a stronger signal than for voids. Extracted or segmented bony, mucosal, and soft tissue 3D voxel models are processed into mesh models of bony, mucosal, and soft tissue structure; mesh model boundaries are generated, and any inconsistencies (or holes) in the mesh models are repaired to form CAD model 110.
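  • As a concrete illustration of intensity-based initial segmentation, a minimal Python sketch follows, assuming a CT volume already loaded as a NumPy array of Hounsfield units; the threshold values are illustrative placeholders, not values specified by this disclosure:

        import numpy as np

        def initial_segmentation(ct_volume_hu):
            """Rough voxel labeling of a CT stack by intensity alone.
            Returns boolean masks for bone, soft tissue, and air voxels;
            thresholds are illustrative and are refined regionally in practice."""
            bone = ct_volume_hu > 300    # calcified bone absorbs X-rays strongly
            air = ct_volume_hu < -400    # nasal cavity, sinuses, background
            soft = ~bone & ~air          # in between: soft tissue and mucosa
            return bone, soft, air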
  • Extracting and segmenting imaged bony and soft tissues into 3D mesh models is performed as illustrated in FIG. 2B. Segmentation of bony tissues into a bony tissue voxel model is, in an embodiment, done using a region growing method and threshold process 254, whereas for the soft tissue and mucosa voxel models a semi-automatic region growing method is used 256; FastGrowCut and the segmentation module in 3-D Slicer are used for both the threshold process and the region growing methods. Skin, muscle, and other soft tissues are segmented from mucosal tissues based upon anatomic landmarks. A marching-cube surface-mesh reconstruction 258 is then performed on the voxel models to prepare a mesh model of each of the hard bony, mucosal, and soft tissues. Manual modification of the threshold and algorithm parameters on a regional basis may be done to avoid over-segmentation, under-segmentation, and other artifacts, which may occur with volume averaging of bony and soft tissue densities in the head and neck.
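  • The marching-cube reconstruction 258 can likewise be sketched outside 3-D Slicer; the following minimal example, assuming the open-source scikit-image library and a boolean voxel mask from the segmentation step (both assumptions for illustration), extracts a triangle surface mesh from one voxel model:

        import numpy as np
        from skimage import measure

        def voxel_model_to_mesh(mask, voxel_spacing=(0.625, 0.5, 0.5)):
            """Run marching cubes on one binary tissue voxel model.
            voxel_spacing gives the voxel size in mm (slice, row, col) so the
            mesh is produced in millimeters; the values here are illustrative."""
            verts, faces, normals, _ = measure.marching_cubes(
                mask.astype(np.uint8), level=0.5, spacing=voxel_spacing)
            return verts, faces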
  • The hard-bony tissue mesh model, mucosal tissue mesh model, and soft tissue mesh model from the marching-cube reconstructions are then repaired 260, first with the surface toolbox mesh-repair module of 3-D Slicer (http://www.slicer.org), and further with Autodesk 3ds Max. In a particular embodiment, a Laplacian smoothing routine was used to revise mesh models to improve 262 the approximation of curvature without losing volume.
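  • The repair and smoothing steps 260, 262 are performed with the tools named above; purely as an illustration of the Laplacian smoothing idea, a minimal sketch using the open-source trimesh library (an assumption for illustration, not the toolchain of this disclosure) might read:

        import trimesh
        from trimesh.smoothing import filter_laplacian

        # Load a mesh exported from segmentation (the path is illustrative).
        mesh = trimesh.load('bony_tissue.obj')

        # Fill small holes left by segmentation, then smooth in place.
        trimesh.repair.fill_holes(mesh)
        filter_laplacian(mesh, lamb=0.5, iterations=10)

        mesh.export('bony_tissue_repaired.obj')  # ready for downstream use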
  • Both the hard-bony tissue and soft tissue portions of CAD model 110 are constructed in mesh form using the FastGrowCut Segmentation and Paint (with Editable intensity range for masking) modules in 3-D Slicer and repaired to eliminate holes with the 3D Slicer surface toolbox. The mesh models of CAD model 110 are further repaired using Autodesk 3ds Max to reduce the number of vertices for mesh optimization, and to prepare the model for 3D printing. The generated and repaired mesh models of hard bony tissue, soft tissue, and mucosal tissue form parts of CAD model 110 and are exportable in mesh form for use in 3D printers.
  • In embodiments, CAD model 110 is annotated 206 to build and tag models identifying one or both of a surgical target and critical structures at risk during FESS surgery or located near the surgical target, as illustrated in FIG. 3B. Tagging may be in part manual and in embodiments may be in part automated using a classifier based on an appropriate machine-learning algorithm. The annotated critical structures may include one or more of the cranial nerves (CN), including the olfactory and optic nerves (CN-1, CN-2) together with the optic chiasma; anterior and posterior portions of the pituitary gland; the cranial nerves CN-3, CN-4, and CN-6 that innervate the orbital muscles; and the cribriform plate. The critical structures tagged may also include blood vessels, such as the anterior ethmoidal artery and internal carotid artery, that can be at risk during endoscopic surgeries performed through the nares. The critical structures are identified based upon known anatomic landmarks visible in the three-dimensional radiological imaging.
  • In an embodiment, during segmentation, soft tissue is identified, including mucosa lining the nasal cavity and paranasal sinuses: the inferior, middle, and superior turbinates, maxillary sinuses, anterior ethmoid sinuses, agger nasi, ethmoid bullae, posterior ethmoid sinuses, sphenoid sinus, and frontal sinus.
  • Neural and arterial structures at risk for damage during surgery were identified and segmented separately; these are tagged as critical structures so that alarms can be sounded when a virtual surgical tool enters or touches them. These include the anterior ethmoidal artery, the internal carotid artery, and cranial nerve II, also known as the optic nerve, together with the optic chiasm. Surface meshes were generated within 3-D Slicer and exported in Wavefront (OBJ) format.
  • Further, hard tissue is identified based on voxel density, including the bone lining the medial orbit known as the lamina papyracea. Bony structures of the skull identified in this embodiment include the mandible, the maxilla, and the sphenoid, ethmoid, frontal, and temporal bones.
  • Skin and muscle soft tissues are separated from mucosa based on known anatomic landmarks.
  • The bony structures of CAD model 110 are then replicated 208 on a 3D printer 112 to prepare a hard-plastic physical model 114 of those hard-bony structures. In an embodiment, a stereolithographic (SLA) 3-D printer based upon photopolymerization of liquid resin is used to prepare hard plastic physical model 114. In a particular embodiment, a Formlabs Form2 (trademark of Formlabs, Inc., Somerville, Mass.) was used to prepare hard plastic physical model 114 of hard bony parts as defined in CAD model 110. In an alternative embodiment, a fused deposition modeling (FDM) 3D printer, such as a Creality Ender 3 (trademark of Shenzhen Creality 3D Technology Co., Ltd, Shenzhen, China), or a Zcorp 650 (3D Systems, Rock Hill, South Carolina), was used to prepare hard plastic physical model 114 from polylactic acid (PLA) filament; in other alternative embodiments, hard plastic physical model 114 may be prepared with an FDM printer using extrudable nylon or polyethylene terephthalate filament, using a dual-extruder printer with high impact polystyrene (HIPS) temporary supporting structures.
  • 3D printer 112 is also used to prepare 210, by 3D printing, a casting mold 116 configured for casting 212 a soft silicone model 118 of selected soft tissue structures, including skin and septal cartilage of the nose, as described in CAD model 110. In an embodiment, a mold is directly printed. In an alternative embodiment, a rigid model of the selected soft tissue structures is printed, this being then used as a form to cast a flexible silicone mold that is in turn used to cast soft silicone model 118 of soft tissue structures. In another alternative embodiment, soft silicone model 118 is directly printed using a flexible UV-polymerizable resin in an SLA printer such as the Form2 printer.
  • 3D printer 112 is also used to prepare 211 a casting mold 117 configured for casting 213 a soft silicone model 119 of selected mucosal structures, such as those lining the interior of the nasal cavity and sinuses, as described in CAD model 110. Once cast 213, the soft silicone mucosal model 119 is mounted 215 to the hard-plastic physical model 114. In an alternative embodiment, model 119 of mucosal structures is directly 3D printed using an SLA-type 3D printer, such as a Form2 printer, and flexible, UV-curable resin.
  • With reference to FIG. 3B, once the soft silicone model 118 of soft tissue structures is cast 212, and after mounting 215 the mucosal model 119 to the hard plastic bony tissue model, the soft silicone model 118 of soft tissue structures is mounted 214 to the hard plastic physical model 114 of bony tissues to create the head physical model 115, the head physical model 115 including hard plastic physical model 114, soft silicone model 119 of mucosal structures, and soft silicone model 118 of soft tissue structures.
  • CAD model 110 is also loaded 216 into a mechanical modeling and tracking machine 122 equipped with tracking receivers 124, 126. Tracking receivers 124, 126 are configured to track 218 location and orientation in three-dimensional space of a tracker 128 that is attached to a dummy endoscope 130; in a particular embodiment, tracking receivers 124, 126 and tracker 128 are HTC Vive (HTC, New Taipei City, Taiwan) trackers, and the virtual reality goggles are an HTC Vive headset; other virtual reality goggles and trackers may be used. In an embodiment, head physical model 115 is at a known location; in other embodiments, hard plastic physical model 114 is attached to another tracker 150 through a short steel rod 152, as illustrated in FIG. 6. Also attached to dummy endoscope 130 is endoscope handle 132, which may include operative controls. In a particular embodiment, operative controls on endoscope handle 132 include camera angle selection buttons. In an alternative embodiment, a real endoscope may be used in place of the dummy endoscope. For purposes of this document, a device representing an endoscope may be either a dummy endoscope or a real endoscope.
  • The mechanical modeling and tracking machine 122 then uses the location and orientation of the tracker 128 on the endoscope 130 to determine 220 a location and orientation of endoscope head 134 in the head physical model, which is in turn aligned and registered to a virtual head as modeled by CAD model 110 executing on modeling and tracking machine 122, the CAD model 110 being derived from the 3D image stack determined from MRI or CT scans. Since the head physical model is registered to the CAD model 110, each location of endoscope head 134 in the head physical model corresponds to a location in the CAD model 110.
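  • Because the tracker is rigidly clamped to the endoscope shaft, the tip location follows from the tracked pose by a fixed rigid-body offset. A minimal sketch of that computation, and of the room-to-model change of frame used for registration, assuming the tracking system reports each pose as a rotation matrix and translation vector in room coordinates (the function names are illustrative):

        import numpy as np

        def endoscope_tip_position(R_tracker, t_tracker, tip_offset):
            """Map a tracked pose to the endoscope tip in room coordinates.
            tip_offset is the fixed vector from the tracker origin to the
            tip, expressed in the tracker's frame (measured at calibration)."""
            return t_tracker + R_tracker @ tip_offset

        def room_to_model(R_head, t_head, p_room):
            """Express a room-coordinate point in the head model's frame,
            given the tracked pose (R_head, t_head) of the head tracker."""
            return R_head.T @ (p_room - t_head)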
  • Interaction of the device representing an endoscope with the head physical model as the device is inserted into the model provides tactile or haptic feedback to a surgeon or trainee that resembles tactile or haptic feedback as the surgeon or trainee inserts a real endoscope into a patient's real head.
  • An endoscope alone, however, is useful to visually inspect internal surfaces within the nasal cavity but cannot by itself perform FESS surgery. To perform surgery, additional surgical tools are inserted into a patient's head along with the endoscope.
  • To provide simulated tactile or haptic feedback to a surgeon or surgical trainee of manipulation of surgical tools in a head, as well as feedback of manipulating the endoscope, in embodiments one or more devices resembling surgical tools are provided (FIGS. 4A-4F). These tools may include any combination of an angle-tipped forceps or biter 460 illustrated in FIG. 4A, a straight-tipped forceps or biter 462 of FIG. 4B, an angled-tipped scissors 464 of FIG. 4C, a straight-tipped scissors 466 of FIG. 4D, a straight-tipped electrocautery 468 of FIG. 4E, a bent-tipped probe or alternatively a bent-tipped cutter 470 of FIG. 4F, a bent-tipped electrocautery (not shown), bent-tipped or straight-tipped drills (not shown), straight or bent suction tubes (not shown), microdebriders (not shown), straight-tipped probes and cutters (not shown), and other tools as known in the art of FESS surgery; the tool may in an embodiment be a new or experimental tool of unique shape. Each device resembling a surgical tool may be a 3-D print of a tool with an embedded 3-D tracker 402, 404, 406, 408, 410, 412, or may be a real surgical tool with a clamp-on 3-D tracker as illustrated in FIG. 4G. The clamp-on 3-D tracker has a 3-D tracking device 420 and clamp 422 and is configured to mount with a setscrew 424 to a shaft or body 426 of a surgical tool or 3-D model of a tool. Each tool has a corresponding 3D mesh model used in the gaming engine of the modeling and display machine to track position of the tool tip and to derive an image of the tool tip when the tool tip is in a field of view of the simulated endoscope tip. In an embodiment, the devices resembling surgical tools may be equipped with short-range radio-activated vibrators to provide haptic feedback resembling that of an operating drill or to provide alarms generated when tool tips approach critical structures.
  • Tools used in FESS, such as forceps, biters, and scissors, often have a long, narrow shaft 450, 452 configured to fit through the nares into the nasal cavity; they also have a handle 440, 442, 444 that allows the user to control their angle of orientation within the nasal cavity. These tools operate when an operating lever 430, 432, 434 is pressed, the operating lever being coupled through an operating rod that is typically disposed within the shaft 450, 452. Simple cutters and probes, as illustrated in FIG. 4F, do, however, lack an operating lever. For increased realism, operating levers of devices resembling a surgical tool may be instrumented with sensors configured to sense pressure on the operating lever and transmit sensed pressure to the video modeling and display machine 136.
  • A computer model of each tracker-equipped surgical tool 460, 462, 464, 466, 468, 470 is incorporated into the mechanical modeling and tracking machine 122 and video modeling and display machine 136. The mechanical modeling and tracking machine 122 uses information received through multiple tracking receivers 124, 126 to determine position and orientation of both the tracker 128 on the endoscope 130 (FIG. 3) and tracker 160 on the tracker-equipped tool 162, and thereby to determine position and orientation of the tip 134 of the endoscope and the operating portion 164 of the tool.
  • Meanwhile, a video modeling and display machine 136 executes a video game engine; in an embodiment the video game engine is the Unity Game Engine, and in a particular embodiment Unity Engine V2017.3 (trademark of Unity Technologies, San Francisco, Calif.) was used. The video modeling and display machine 136 also executes the CAD model 110 of the head. Together, the mechanical modeling and tracking machine and video modeling and display machine form a tracking, modeling, and display machine. In an alternative embodiment, modeling and tracking machine 122 and video modeling and display machine 136 are combined within a single tracking, modeling, and display machine executing a plurality of modules.
  • Video modeling and display machine 136, executing a video gaming engine 138, determines objects represented in CAD model 110 that are in view of the endoscope head 134, including anatomy of the head, at one of three selectable endoscope viewing angles, and renders 222 the objects in view of the endoscope head 134 into a video image. The objects represented in CAD model 110 may include models of foreign objects or tumors 166 upon which surgery is to be conducted. The gaming engine 138 also determines whether a tip 164 of any device resembling a surgical tool 162 is in a field of view of the endoscope as oriented, and renders that into the video image. The present system is adapted to render objects in view of straight as well as angled endoscopes with an accurate field of view; a sketch of the field-of-view test appears below. The game engine includes the capability of photo-realistic rendering in real time with dynamic lighting sources and shadows; in an embodiment the dynamic lighting source is chosen to correspond to a lighting fiber of a real endoscope so that rendered images strongly resemble images seen through an endoscope camera during live surgeries. This video image represents a view corresponding to a view through an endoscope at a corresponding position in the patient's head 106. The video image corresponding to a view through the endoscope tip may then be tagged 224 with indications of critical structures and presented 226 to a trainee or operating surgeon through virtual reality goggles 140, as if on an endoscope monitor, with images of other objects in a virtual operating room. Virtual reality goggles 140 are also equipped with a tracker 146.
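  • Whether a tool tip falls inside the rendered endoscope view reduces to a cone test against the endoscope's optical axis. A minimal sketch, assuming tip positions from the tracking step; the half-angle and depth limits below are illustrative placeholders, not optics parameters specified by this disclosure:

        import numpy as np

        def tip_in_endoscope_view(scope_tip, view_dir, target,
                                  half_angle_deg=45.0, near=0.002, far=0.08):
            """Cone test: is `target` within the endoscope's field of view?
            view_dir is a unit vector along the optical axis, already rotated
            to account for a straight or angled endoscope optic."""
            v = target - scope_tip
            dist = np.linalg.norm(v)
            if not (near <= dist <= far):
                return False
            cos_angle = np.dot(v / dist, view_dir)
            return cos_angle >= np.cos(np.radians(half_angle_deg))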
  • Mechanical modeling and tracking machine 122 compares computed locations of both the endoscope tip 134 and tool tip 164 to locations of critical structures as flagged in CAD model 110, and provides alerts when either tip 134, 164 is positioned to damage those critical structures. These critical structures include the orbits, cribriform plate, cavernous sinus, and multiple cranial fossae of the skull; when the video modeling and display machine 136 detects entry of a simulated surgical tool tip into or against one of these critical structures, it sounds an audible alarm or displays a visual alarm; in some embodiments a vibrator is used to provide a haptic alarm. In an alternative embodiment, alarms are generated upon a simulated surgical tool tip approaching one of these critical structures that have been tagged in the mucosal mesh model. A sketch of the underlying proximity test follows.
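  • A minimal sketch of the proximity test behind such alarms, assuming critical structures are available as triangle meshes in millimeter units and using the open-source trimesh library (an assumption made for illustration; this disclosure does not name a collision library):

        import numpy as np
        import trimesh
        from trimesh.proximity import ProximityQuery

        # One critical-structure mesh tagged during segmentation (path illustrative).
        critical = trimesh.load('optic_nerve.obj')
        query = ProximityQuery(critical)

        def check_alarm(tip_position, warn_mm=3.0):
            """Return an alarm level for one tracked tool tip. signed_distance
            is positive inside the mesh and negative outside, in mesh units."""
            d = query.signed_distance(np.array([tip_position]))[0]
            if d >= 0.0:
                return 'contact'   # tip touching or inside the structure
            if -d <= warn_mm:
                return 'warning'   # tip within the warning margin
            return 'clear'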
  • The system includes, within video modeling and display machine 136, a virtual reality model of a virtual operating room, including 3-D models of common operating-room equipment such as an operating table, instrument tray, electrocautery machine, endoscope illuminator/camera controller, and endoscope monitor.
  • In operation, a trainee or operating surgeon puts on virtual reality goggles 140, then picks up and manipulates the endoscope 130 to insert endoscope head 134 through nares 142 of head physical model 115 into nasal cavity 144 of head physical model 115; the trainee or surgeon may also insert one or more tools 162 through the nares 142 into nasal cavity 144. While the surgeon is inserting the endoscope and tools, tracker 146 tracks location and angle of virtual reality goggles 140 to permit synthesis, in video modeling and display machine 136, of a video stream incorporating a view of the virtual operating room with a virtual patient having a head aligned and registered with the physical location of head physical model 115, and draped as is typical for FESS surgery. In an embodiment, the view of the virtual operating room includes an image of an endoscope aligned and positioned according to tracked position and orientation of endoscope 130. The virtual operating room includes a virtual operating room monitor 502 providing the virtual-reality rendered video image as viewed from the endoscope tip, potentially including an image of the surgical tool tip 504 as well as an image of a tumor to be resected 506, permitting the trainee or operating surgeon to view the rendered video image by aiming his or her head, and virtual reality goggles 140, at the virtual operating room monitor 502, as illustrated in FIG. 5B. Also visible in the VR goggle display may be, depending on VR goggle position and orientation, the simulated head 508 of a properly draped patient 510, endoscope 512, and surgical tool 514.
  • In an embodiment, the tracking and modeling machine 122 tracks position of the endoscope head 134 in physical model 115 and provides alerts when endoscope head 134 approaches locations corresponding to tagged critical structures in CAD model 110. In an embodiment these alerts are provided as aural alerts and as visual alerts by superimposing warnings and images of critical structures on the virtual endoscope images presented on the virtual operating room monitor thereby simulating an alternative embodiment that presents visual warnings on actual endoscope images during live surgeries.
  • While the trainee or surgeon manipulates the endoscope and surgical tool or tools, mechanical interactions of endoscope 130 and endoscope head 134 with the head physical model 115 provide tactile, or haptic, feedback to the trainee or operating surgeon, the tactile feedback greatly resembling tactile feedback felt during actual surgeries on sinuses, pituitary, and other organs accessible to endoscope 130 through nares 142. Tactile and haptic feedback is inherent to using 3D-printed dummy endoscopes and other tools in the shape of real surgical tools, and to having a trackable patient skull with anatomic features with which the endoscope and other surgical tools physically interact. One aspect of tactile feedback is the feel of the endoscope and its controller, and, when present, the surgical tools, in the trainee's hands, each tracked with a full 6 degrees of freedom, providing a proprioceptively authentic feel in a room-scale immersive virtual-reality environment.
  • In embodiments, dummy endoscopes and dummy surgical tools are 3D printed with FDM printers.
  • In an alternative embodiment tactile feedback is enhanced with a vibratory mechanism within the dummy endoscope or other dummy tools to simulate a surgical drill, suction probe or suction cautery such as may be used during actual surgeries.
  • Similarly, the virtual-reality rendered video image presented on the virtual operating room monitor with virtual reality goggles 140 provides visual feedback like the visual images seen by a trainee or operating surgeon while performing similar operations. The position and angle of the VR goggles are tracked, and the simulated OR environment is displayed through the VR goggles with position and size of the simulated endoscope monitor dependent on angle and position of the VR goggles. In this way, movement of the trainee or operating surgeon's head provides realistic movement of stationary objects, like the simulated endoscope monitor, in his or her field of view while wearing the VR goggles. Both the head physical model 115 and the virtual-reality rendered video based on CAD model 110 are patient-specific, since CAD model 110 is derived from the three-dimensional radiographic images of a specific patient's head 106.
  • In an embodiment, dummy endoscope 130 has a lumen, and operative tools can be inserted through that lumen; in particular embodiments these tools may include drills for penetrating through bone into sinuses or through bone to reach a pituitary gland; these tools can also penetrate through the hard plastic of physical model 114 during practice procedures.
  • In an alternative embodiment, for use in live surgeries, a frame 304 is attached to the patient's head 106, and a tracker 306 is positioned on the frame. The patient's head is registered to the tracking system with the CAD model 110 aligned to the patient's head 106. In this embodiment, the tracking and modeling machine 122 tracks position of the endoscope head and provides alerts when tracked endoscope head 134 approaches tagged critical structures as identified in the CAD model 110; in an embodiment these alerts are provided as aural alerts and as visual alerts by superimposing warnings and images of critical structures on images obtained through an endoscope camera viewing the patient's nasal cavity and sinuses from endoscope head 134.
  • In an alternative embodiment, critical structures may be highlighted and displayed as illustrated in FIG. 10 as structures shown with reference to the head and FIG. 11 as highlighted structures in an endoscope view. FIG. 10 illustrates critical structures viewed in projection superimposed on the CAD model 110.
  • In an alternative embodiment, the entire motion of the endoscope and operative tools is recorded by the operating surgeon and then transmitted to another site to provide a tracing of the surgery to be then mirrored by a second surgeon performing live surgery (tele-surgery), or repeated by trainees to provide repetitive guided training.
  • In an alternative embodiment, positions of the head physical model or patient head, and of the endoscope, as detected by the trackers, are recorded throughout a practice or real surgical procedure. In an embodiment, a score is produced based on time to perform a predetermined task, with penalties applied for approach of a simulated tool tip to simulated critical structures; in embodiments, motion tracking of tool and endoscope is used to determine economy of movement, and the score is also based on economy of movement; a sketch of such a score appears below. In a particular embodiment, a machine-learning procedure is trained on motion tracking of beginning and experienced surgeons to provide personalized feedback to trainee surgeons and to score users on their relative level in performing surgery. This feedback could be used to advance users from a beginner to an expert level, or to evaluate the level of surgeons in the community. Relative motions of endoscope and instrument to head as recorded are then analyzed using the 3D CAD model and critical structures tagged in the CAD model to provide feedback to the trainee surgeon. Such analysis may include indications of safer or faster ways the procedure could be performed, or be used to evaluate surgeons already performing surgery. For purposes of this document, derivation of the score and its use in training surgeons by giving real-time feedback to users, whether by altering the level of difficulty of the simulation, by providing visual, auditory, or haptic feedback and cues to assist in surgery, or by providing an objective score on the simulation, is known as the virtual coach. This could be used to evaluate proficiency during training, as well as to provide a method of continued certification for practicing surgeons.
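  • A minimal sketch of such a score, assuming sampled tool-tip positions from the trackers and a count of proximity warnings; the weights are illustrative placeholders rather than values specified in this disclosure:

        import numpy as np

        def performance_score(tip_positions, elapsed_s, warning_count,
                              w_time=1.0, w_path=50.0, w_warn=30.0):
            """Combine task time, economy of movement, and safety penalties.
            tip_positions: (N, 3) tracked tool-tip samples in meters;
            lower scores are better. Weights are illustrative."""
            path_length = np.sum(
                np.linalg.norm(np.diff(tip_positions, axis=0), axis=1))
            return w_time * elapsed_s + w_path * path_length + w_warn * warning_count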
  • In an alternate embodiment, trackers are coupled to a real endoscope and real surgical tools, and a tracker on a frame is clamped to the head of the same patient used to generate the CT or MRI radiological tomographic image stack from which the CAD model was derived. The physical head model is not used in this alternate embodiment; the CAD model is registered to the patient's head. The modeling and display machine tracks locations of the tips of the endoscope and surgical tools in the CAD model, corresponding to positions in the patient's head, and generates visual or aural alarms when these tips approach critical structures tagged in the CAD model. These alarms serve to assist surgeons in avoiding damage to those critical structures.
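  • Registering the CAD model to a tracked head, whether physical model or live patient, can be done with paired anatomic landmarks. A minimal sketch of a standard least-squares rigid fit (the Kabsch method, shown as an illustration and not as the registration method of this disclosure):

        import numpy as np

        def rigid_register(model_pts, tracked_pts):
            """Least-squares rigid transform mapping model to tracked space.
            model_pts, tracked_pts: (N, 3) corresponding landmarks picked in
            the CAD model and touched on the tracked head. Returns (R, t)
            such that tracked ~= R @ model + t."""
            mc, tc = model_pts.mean(axis=0), tracked_pts.mean(axis=0)
            H = (model_pts - mc).T @ (tracked_pts - tc)
            U, _, Vt = np.linalg.svd(H)
            # Guard against a reflection in the fitted rotation.
            D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
            R = Vt.T @ D @ U.T
            return R, tc - R @ mc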
  • For purposes of this document, the term “resilient polymer” shall include rubberlike polymeric materials, including polymerized Formlabs elastic resin, resilient silicones, and some soft carbon-based synthetic rubbers and flexible plastics like molded latex and Sorbothane, adapted to being formed into flexible reproductions of human soft tissue such as skin and muscle and having Shore-A hardness no greater than 85. The term “hard plastic” shall include polymeric materials significantly harder than resilient polymers as defined above, including most acrylonitrile butadiene styrene (ABS), high impact polystyrene (HIPS), and polylactic acid (PLA) 3D printer filaments, and polymerized Formlabs standard-hardness grey resin.
  • Experimentally, Vive trackers were reliably tracked by Vive lighthouse base stations to within less than a centimeter, updating the position of the tools and user in the virtual environment without detectable latency. The system correctly registered the endoscope against the modeled danger zones, issuing audio and visual cues in time synchrony. This framework provides a cost-effective methodology for high-fidelity surgical training simulation with haptic feedback. Through virtual simulation, personalized training programs could be developed for trainees that are adaptive and scalable over any range of difficulty and complexity. The proposed approaches to VR can be extended to telemedicine, in which surgeons operating in remote locations can be assisted by experts at tertiary care centers. State-of-the-art surgical navigation systems such as the system herein described provide reliable optical and electromagnetic tracking with accuracy potentially within 2 mm. These navigation workstations confirm anatomic location but do not reduce the risk of surgical complications to zero. Additional features from our technology could be translatable to AR-based navigation, which can further improve safety in the operating room.
  • Combinations of Features
  • The features herein described may be combined into a functional surgical simulation system and environment in many ways. Among ways anticipated by the inventors that these features can be combined in various embodiments are:
  • A multimode VR apparatus designated A including an endoscope device adapted to represent an endoscope, the endoscope device selected from an endoscope and a dummy endoscope having shape and feel resembling that of an endoscope; a wireless tracker adapted to operate with a three-dimensional tracking system to track location and orientation of the endoscope device in three dimensions in a simulated operating room environment; and a video modeling and display machine configured with a computer-aided design (CAD) model of a head and adapted to provide a simulated head environment, providing a simulated endoscope view. The apparatus also includes a physical head model comprising hard and soft physical components, the endoscope device being configured to be inserted into the physical head model to provide a tactile representation of manipulation of an endoscope in a head to a person handling the endoscope device.
  • An apparatus designated AA including the multimode VR apparatus designated A wherein the video modeling and display machine comprises a gaming engine adapted to simulate an endoscope view of the simulated head environment.
  • An apparatus designated AB including the apparatus designated A or AA wherein the physical head model comprises a wireless tracker, and where the computer-aided design (CAD) model of a head is registered to a tracked position of the physical head model.
  • An apparatus designated AC including the apparatus designated A, AA, or AB wherein the physical head model comprises a hard-plastic portion prepared by 3D printing representative of bony tissue and a resilient polymer portion representative of skin.
  • An apparatus designated AD including the apparatus designated A, AA, AB or AC and including a surgical tool device having shape and feel resembling that of a surgical tool adapted for functional endoscopic sinus surgery (FESS) selected from the group consisting of forceps, biting forceps, scissors, a probe, and an electrocautery, the tool device further comprising a wireless tracker adapted to operate with the three-dimensional tracking system to track location and orientation of the tool device in three dimensions in the simulated head environment.
  • An apparatus designated AE including the apparatus designated AD wherein the simulated endoscope view includes a simulated view of the tool device.
  • An apparatus designated AF including the apparatus designated A, AA, AB, AC, AD or AE wherein the video modeling and display machine is further configured to provide a simulated operating room (OR) environment with the simulated endoscope view displayed on a simulated endoscope monitor.
  • An apparatus designated AFA including the apparatus designated A, AA, AB, AC, AD, AE, or AF wherein the tool device resembles a surgical tool selected from the group consisting of forceps, biting forceps, scissors, a probe, a drill, and an electrocautery.
  • An apparatus designated AG including the apparatus designated A, AA, AB, AC, AD, AE, or AF further including a virtual-reality (VR) goggle equipped with a wireless tracker, and wherein the simulated OR environment is displayed through the VR goggle with position and size of the simulated endoscope monitor on the VR goggle display dependent on angle and position of the VR goggle.
  • A method designated B of preparing a physical model of a head and endoscope for surgical simulation includes importing into a workstation a radiological tomographic image stack of a head; segmenting the radiological tomographic image stack into hard tissue, soft tissue, and mucosal voxel models based at least in part on voxel intensity; and growing hard tissue, mucosal, and soft tissue regions in the hard tissue, soft tissue, and mucosal voxel models. The method continues with converting the hard tissue, soft tissue, and mucosal models into a hard tissue mesh model, a soft tissue mesh model, and a mucosal mesh model; repairing the mesh models; exporting the mesh models from the workstation and using a 3D printer and the hard tissue mesh model to print a physical hard tissue model; preparing a physical mucosal tissue model from the mucosal mesh model; and mounting the physical mucosal tissue model to the physical hard tissue model. The method also includes preparing a physical soft tissue model from the soft tissue mesh model; and mounting the physical soft tissue model to the physical hard tissue model to form a physical head model. The method also includes loading the mesh models into a display system adapted to render images of surfaces of the mesh models as viewed from an endoscope; mounting a tracker to the physical head model; and preparing an endoscope device with a tracker.
  • A method designated BA including the method designated B and including: tracking the endoscope device to determine a tracked endoscope position and orientation; determining a location and orientation of a tip of the endoscope device from the tracked endoscope position and orientation, the position of the tip of the endoscope device being within the physical head model; rendering images of surfaces of the mesh models as viewed from the tip of the endoscope device; and displaying the images of surfaces of the mesh models.
  • A method designated BB including the method designated B or BA further includes mounting a tracker to a device representing a surgical tool; tracking the device representing a surgical tool to determine a location of a tip of the device representing a surgical tool; determining when the surgical tool is in view of the tip of the endoscope device; and when the surgical tool is in view of the tip of the endoscope device, rendering an image of a surgical tool as viewed from the tip of the endoscope device.
  • A method designated BC including the method designated B, BA, or BB further includes: tracking location and orientation of the physical head model; and registering the mucosal mesh model to the location and orientation of the physical head model.
  • A method designated BD including the method designated B, BA, BB, or BC where the rendering images of surfaces of the mesh models is performed with a 3D gaming engine.
  • A method designated BE including the method designated B, BA, BB, BC, or BD wherein the preparing a physical mucosal model is performed by casting using a mold that has been prepared from the mucosal mesh model by a method comprising 3D printing.
  • A method designated BF including the method designated B, BA, BB, BC, BD, or BE further includes identifying critical anatomic structures imaged in the radiological tomographic image stack and tagging those critical structures in a model of the mesh models.
  • A method designated BG includes the method designated B, BA, BB, BC, BD, BE, or BF and further includes generating alarms upon approach of a tip of the device representing a surgical tool to a critical structure tagged in the mesh model.
  • An endoscopic surgical simulation system designated C includes a physical head model; a tracking system configured to track location and angle of a device representing an endoscope and a device representing a surgical tool; a computer-aided design (CAD) model in a modeling and display machine, the CAD model registered to a location of the physical head model and comprising CAD representations of structures corresponding to structures of the physical head model; with the modeling and display machine being configured to track the device representing an endoscope and determine a location of a tip of the device representing an endoscope within a nasal cavity of the physical head model, and to determine a field of view of an endoscope located at the location of the tip of the device representing an endoscope. The modeling and display machine is configured to track the device representing a surgical tool and determine a location of a tip of the device representing a surgical tool within the nasal cavity of the physical head model; and the modeling and display machine is configured to generate a video stream corresponding to a view of structures represented by the CAD model within the field of view. The modeling and display machine is also configured to superimpose on the video stream an image corresponding to a tip of a surgical tool when the location of the tip of the device representing a surgical tool is in the field of view.
  • An endoscopic surgical simulation system designated CA including the endoscopic surgical simulation system designated C wherein the CAD model includes models of a plurality of structures tagged as critical structures.
  • An endoscopic surgical simulation system designated CB including the endoscopic surgical simulation system designated C or CA further including a tracker coupled to the physical head model, and wherein the CAD model is registered to a location of the physical head model.
  • An endoscopic surgical simulation system designated CBA including the endoscopic surgical simulation system designated C, CA, or CB wherein the physical head model and CAD model are derived from computed tomography (CT) or magnetic resonance imaging (MRI) scans of a particular patient, the system configured for preoperative planning and practice for that particular patient.
  • An endoscopic surgical simulation system designated CBB including the endoscopic surgical simulation system designated C, CA, or CB wherein there is a first physical head model and CAD model configured for a first task, and a second physical head model and CAD model configured for a second task, the second task of greater difficulty than the first task.
  • An endoscopic surgical simulation system designated CC including the endoscopic surgical simulation system designated C, CA, CB, CBA, or CBB wherein the modeling and display machine is configured to generate alarms upon approach of the location of a tip of the device representing a surgical tool to a structure tagged as a critical structure.
  • An endoscopic surgical simulation system designated CG including the endoscopic surgical simulation system designated C, CA, CB, CBA, CBB, or CC further including a model extraction workstation configured to extract three-dimensional mesh models from computed tomography (CT) or magnetic resonance imaging (MRI) radiographic images, and wherein the physical head model is generated by a method comprising 3D printing of extracted three-dimensional mesh models.
  • An endoscopic surgical simulation system designated CD including the endoscopic surgical simulation system designated C, CA, CB, CBA, CBB, or CC further including virtual reality (VR) goggles, the VR goggles equipped with a tracker.
  • An endoscopic surgical simulation system designated CE including the endoscopic surgical simulation system designated CD wherein the video stream corresponding to a view of structures represented by the CAD model within the field of view is displayed upon a display of the VR goggles.
  • An endoscopic surgical simulation system designated CF including the endoscopic surgical simulation system designated CE where the video stream corresponding to a view of structures represented by the CAD model is displayed on the VR goggles at a position dependent on location and orientation of the VR goggles. (See the placement sketch following this list.)
  • It should thus be noted that the matter contained in the above description or shown in the accompanying drawings should be interpreted as illustrative and not in a limiting sense. The following claims are intended to cover all generic and specific features described herein, as well as all statements of the scope of the present method and system, which, as a matter of language, might be said to fall therebetween.
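The model-extraction pipeline of method B (segmenting a tomographic stack by voxel intensity, growing tissue regions, converting the voxel models to meshes, repairing them, and exporting them for 3D printing) can be summarized in code. The following Python sketch is illustrative only: the intensity cutoffs, the use of a morphological closing as the region-growing step, and the SciPy/scikit-image/trimesh libraries are assumptions of this sketch, not the implementation specified in the disclosure.

```python
# Illustrative sketch of the method-B pipeline; thresholds and libraries are
# assumptions, not the disclosed implementation.
import numpy as np
from scipy import ndimage
from skimage import measure
import trimesh

def extract_tissue_meshes(ct_volume, spacing=(1.0, 1.0, 1.0)):
    """ct_volume: 3D NumPy array of CT intensities (e.g., Hounsfield units)."""
    # Segment by voxel intensity (hypothetical cutoffs; real values differ).
    masks = {
        "hard":   ct_volume > 300,                           # bone
        "soft":   (ct_volume > -100) & (ct_volume <= 300),   # soft tissue
        "mucosa": (ct_volume > -500) & (ct_volume <= -100),  # placeholder band
    }
    meshes = {}
    for name, mask in masks.items():
        # "Grow" the region: morphological closing fills small gaps and holes.
        grown = ndimage.binary_closing(mask, iterations=2)
        # Convert the voxel model to a triangle mesh (marching cubes).
        verts, faces, _normals, _values = measure.marching_cubes(
            grown.astype(np.uint8), level=0.5, spacing=spacing)
        mesh = trimesh.Trimesh(vertices=verts, faces=faces)
        mesh.fill_holes()  # a minimal stand-in for the "repairing" step
        meshes[name] = mesh
    return meshes

# meshes = extract_tissue_meshes(ct)
# meshes["hard"].export("hard_tissue.stl")  # file sent to the 3D printer
```

In this sketch the repair step is only a hole fill; a production pipeline would also remove disconnected islands and enforce a minimum wall thickness before printing.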
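Methods BA and BB derive a tip pose from the tracked handle pose and gate the tool overlay on whether the tool tip lies within the endoscope's view. A minimal sketch follows, assuming 4x4 homogeneous pose matrices from the tracker, a tip offset of 18 cm along the handle's local +Z axis, and a 45-degree viewing half-angle; all three values are hypothetical.

```python
# Sketch only: pose conventions, tip offset, and field of view are assumed.
import numpy as np

TIP_OFFSET = np.array([0.0, 0.0, 0.18, 1.0])  # tip 18 cm along local +Z (assumed)
HALF_FOV = np.deg2rad(45.0)                   # assumed viewing half-angle

def endoscope_tip(handle_pose):
    """handle_pose: 4x4 tracker-to-world transform of the endoscope handle."""
    tip = (handle_pose @ TIP_OFFSET)[:3]      # tip position in the world frame
    axis = handle_pose[:3, 2]                 # viewing axis = local +Z (assumed)
    return tip, axis / np.linalg.norm(axis)

def tool_in_view(tip, axis, tool_tip):
    """True when the tracked tool tip lies inside the scope's viewing cone."""
    offset = tool_tip - tip
    dist = np.linalg.norm(offset)
    if dist < 1e-6:
        return True                           # tool tip coincides with scope tip
    return float(np.dot(offset / dist, axis)) > np.cos(HALF_FOV)
```

When tool_in_view() returns true, the renderer superimposes the tool-tip image on the endoscope video stream, per method BB.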
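Method BC registers the mesh models to the tracked location and orientation of the physical head model. One common way to compute such a rigid registration, offered here as an assumption rather than as the disclosed method, is a Kabsch (SVD) fit between fiducial points on the CAD model and the same points digitized on the tracked physical model.

```python
# Kabsch rigid-registration sketch; the fiducial correspondence is assumed.
import numpy as np

def rigid_register(model_pts, tracked_pts):
    """Return the 4x4 rigid transform mapping model_pts onto tracked_pts.
    Both arguments are (N, 3) arrays of corresponding points, N >= 3."""
    mc, tc = model_pts.mean(axis=0), tracked_pts.mean(axis=0)
    H = (model_pts - mc).T @ (tracked_pts - tc)        # cross-covariance
    U, _S, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so the result is a proper rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = tc - R @ mc
    return T
```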
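Methods BF and BG tag critical anatomic structures in the mesh models and generate alarms when the tool tip approaches one. One plausible realization is sketched below under stated assumptions: a KD-tree over each tagged structure's vertices, a 3 mm threshold, and a per-structure dictionary, none of which are specified by the disclosure.

```python
# Proximity-alarm sketch; threshold and data layout are assumptions.
from scipy.spatial import cKDTree

ALARM_DISTANCE = 0.003  # metres; assumed warning threshold

class CriticalStructureMonitor:
    def __init__(self, tagged_structures):
        """tagged_structures: dict mapping structure name -> (N, 3) vertex array."""
        self._trees = {name: cKDTree(verts)
                       for name, verts in tagged_structures.items()}

    def check(self, tool_tip):
        """Return names of tagged structures within the alarm distance of the tip."""
        alarms = []
        for name, tree in self._trees.items():
            dist, _idx = tree.query(tool_tip)
            if dist < ALARM_DISTANCE:
                alarms.append(name)  # caller raises the audible or visual alarm
        return alarms
```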
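Designations CD through CF call for displaying the endoscope video on tracked VR goggles at a position dependent on the goggles' location and orientation. One placement strategy, sketched with an assumed 4x4 goggles pose and an assumed half-metre offset, holds the video quad at a fixed offset in the goggles' own frame so that it follows head motion:

```python
# Head-locked video-quad placement sketch; pose convention and offset assumed.
import numpy as np

QUAD_OFFSET = np.array([0.0, 0.0, -0.5, 1.0])  # 0.5 m ahead along local -Z

def video_quad_pose(goggles_pose):
    """goggles_pose: 4x4 tracker-to-world transform of the VR goggles.
    Returns the world-space centre of the video quad and its facing direction."""
    centre = (goggles_pose @ QUAD_OFFSET)[:3]
    facing = -goggles_pose[:3, 2]              # quad faces back toward the user
    return centre, facing
```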

Claims (14)

What is claimed is:
1. An endoscopic surgical simulation system comprising:
a physical head model;
a tracking system configured to track location and angle of a device representing an endoscope and a device representing a surgical tool;
a computer-aided design (CAD) model in a modeling and display machine, the CAD model registered to a location of the physical head model and comprising CAD representations of structures corresponding to structures of the physical head model;
the modeling and display machine being configured to track the device representing an endoscope and determine a location of a tip of the device representing an endoscope within a nasal cavity of the physical head model, and to determine a field of view of an endoscope located at the location of the tip of the device representing an endoscope;
the modeling and display machine being configured to track the device representing a surgical tool and determine a location of a tip of the device representing a surgical tool within the nasal cavity of the physical head model;
the modeling and display machine being configured to generate a video stream corresponding to a view of structures represented by the CAD model within the field of view; and
the modeling and display machine being configured to superimpose on the video stream an image corresponding to a tip of a surgical tool when the location of a tip of the device representing a surgical tool is in the field of view.
2. The endoscopic surgical simulation system of claim 1 wherein the CAD model comprises models of a plurality of structures tagged as critical structures.
3. The endoscopic surgical simulation system of claim 2, further comprising a tracker coupled to the physical head model, and wherein the CAD model is registered to a location of the physical head model.
4. The endoscopic surgical simulation system of claim 3 wherein the physical head model and CAD model are derived from computed tomography (CT) or magnetic resonance imaging (MRI) scans of a particular patient, the system configured for preoperative planning and practice for that particular patient.
5. The endoscopic surgical simulation system of claim 3 wherein there is a first physical head model and CAD model configured for a first task, and a second physical head model and CAD model configured for a second task, the second task of greater difficulty than the first task.
6. The endoscopic surgical simulation system of claim 2 wherein the modeling and display machine is configured to generate alarms upon approach of the location of a tip of the device representing a surgical tool to a structure tagged as a critical structure.
7. The endoscopic surgical simulation system of claim 1 wherein the device representing an endoscope and the physical head model are configured to mechanically interact upon insertion of the device representing an endoscope into the nasal cavity of the physical head model by a user, the mechanical interaction providing haptic feedback to the user, the haptic feedback to the user approximating haptic feedback obtained when a user inserts a real endoscope into a real human head.
8. The endoscopic surgical simulation system of claim 7 wherein the device representing a surgical tool and the physical head model are configured to mechanically interact upon insertion of the device representing a surgical tool into the nasal cavity of the physical head model by a user, the mechanical interaction providing haptic feedback to the user, the haptic feedback to the user approximating haptic feedback obtained when a user inserts a real surgical tool into a real human head.
9. The endoscopic surgical simulation system of claim 8 wherein the CAD model comprises models of a plurality of structures tagged as critical structures, and wherein the modeling and display machine is configured to generate alarms upon approach of the location of a tip of the device representing a surgical tool to a structure tagged as a critical structure.
10. The endoscopic surgical simulation system of claim 9 further comprising a scoring module configured to provide a score for a user based on at least time taken by the user to complete a task involving insertion of the device representing the endoscope into the physical head model and alarms generated while the user performs the task.
11. The endoscopic surgical simulation system of claim 1, 2, 3, 4, 5, 6, 7, 8, 9, or 10 further comprising a model extraction workstation configured to extract three-dimensional mesh models from computed tomography (CT) or magnetic resonance imaging (MRI) radiographic images, and wherein the physical head model is generated by a method comprising 3D printing of extracted three-dimensional mesh models.
12. The endoscopic surgical simulation system of claim 1, 2, 3, 4, 5, 6, 7, 8, 9, or 10 further comprising virtual reality (VR) goggles, the VR goggles equipped with a tracker.
13. The endoscopic surgical simulation system of claim 12 wherein the video stream corresponding to a view of structures represented by the CAD model within the field of view is displayed upon a display of the VR goggles.
14. The endoscopic surgical simulation system of claim 13 where the video stream corresponding to a view of structures represented by the CAD model is displayed on the VR goggles at a position dependent on location and orientation of the VR goggles.
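
Claim 10 recites a scoring module based on at least the time taken to complete a task and the alarms generated during it. Purely as an illustration of such a module, with an assumed 100-point scale, 10-minute time budget, and 10-point alarm penalty:

```python
# Scoring sketch; the scale, budget, and penalty weights are assumptions.
def task_score(elapsed_seconds, alarm_count,
               time_budget=600.0, alarm_penalty=10.0):
    """Return a 0-100 score; faster completion and fewer alarms score higher."""
    time_component = max(0.0, 1.0 - elapsed_seconds / time_budget) * 100.0
    return max(0.0, time_component - alarm_penalty * alarm_count)

# Example: a 4-minute run with two alarms scores task_score(240, 2) == 40.0
```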
US17/048,940 2018-04-18 2019-04-18 Mixed-reality endoscope and surgical tools with haptic feedback for integrated virtual-reality visual and haptic surgical simulation Pending US20210233429A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/048,940 US20210233429A1 (en) 2018-04-18 2019-04-18 Mixed-reality endoscope and surgical tools with haptic feedback for integrated virtual-reality visual and haptic surgical simulation

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201862659672P 2018-04-18 2018-04-18
US201862659680P 2018-04-18 2018-04-18
US201862659685P 2018-04-18 2018-04-18
PCT/US2019/028136 WO2019204615A1 (en) 2018-04-18 2019-04-18 Mixed-reality endoscope and surgical tools with haptic feedback for integrated virtual-reality visual and haptic surgical simulation
US17/048,940 US20210233429A1 (en) 2018-04-18 2019-04-18 Mixed-reality endoscope and surgical tools with haptic feedback for integrated virtual-reality visual and haptic surgical simulation

Publications (1)

Publication Number Publication Date
US20210233429A1 true US20210233429A1 (en) 2021-07-29

Family ID=68239214

Family Applications (3)

Application Number Title Priority Date Filing Date
US17/048,991 Pending US20210244474A1 (en) 2018-04-18 2019-04-18 Mixed-reality endoscope and surgical tools with haptic feedback for integrated virtual-reality visual and haptic surgical simulation
US17/048,940 Pending US20210233429A1 (en) 2018-04-18 2019-04-18 Mixed-reality endoscope and surgical tools with haptic feedback for integrated virtual-reality visual and haptic surgical simulation
US17/048,962 Pending US20210241656A1 (en) 2018-04-18 2019-04-18 Mixed-reality endoscope and surgical tools with haptic feedback for integrated virtual-reality visual and haptic surgical simulation

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US17/048,991 Pending US20210244474A1 (en) 2018-04-18 2019-04-18 Mixed-reality endoscope and surgical tools with haptic feedback for integrated virtual-reality visual and haptic surgical simulation

Family Applications After (1)

Application Number Title Priority Date Filing Date
US17/048,962 Pending US20210241656A1 (en) 2018-04-18 2019-04-18 Mixed-reality endoscope and surgical tools with haptic feedback for integrated virtual-reality visual and haptic surgical simulation

Country Status (2)

Country Link
US (3) US20210244474A1 (en)
WO (3) WO2019204611A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220139260A1 (en) * 2019-02-15 2022-05-05 Virtamed Ag Compact haptic mixed reality simulator
US20240153408A1 (en) * 2021-07-30 2024-05-09 Anne Marie LARIVIERE Training station for surgical procedures

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3109237B1 (en) 2020-04-10 2022-08-05 Virtualisurg SYSTEM FOR CONNECTING A SURGICAL TRAINER TO A VIRTUAL DEVICE
FR3109078B1 (en) 2020-04-10 2022-05-06 Virtualisurg SURGICAL SIMULATION DEVICE
CN111583221B (en) * 2020-04-30 2021-06-29 赤峰学院附属医院 Analysis method and device for craniomaxillofacial soft and hard tissues and electronic equipment
US20220061922A1 (en) * 2020-08-25 2022-03-03 Acclarent, Inc. Apparatus and method for posterior nasal nerve ablation
NL2026875B1 (en) 2020-11-11 2022-06-30 Elitac B V Device, method and system for aiding a surgeon while operating
CN112509410A (en) * 2020-12-08 2021-03-16 中日友好医院(中日友好临床医学研究所) Virtual reality-based auxiliary teaching system for hip arthroscopy operation
WO2022251649A1 (en) * 2021-05-28 2022-12-01 University Of South Florida 3d-printed medical simulator and method
WO2023062231A1 (en) * 2021-10-15 2023-04-20 Hightech Simulations Gmbh Surgical system comprising haptics

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5275166A (en) * 1992-11-16 1994-01-04 Ethicon, Inc. Method and apparatus for performing ultrasonic assisted surgical procedures

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070020598A1 (en) * 2003-03-26 2007-01-25 National Institute Of Advanced Industrial Science And Technology Manikin and method of manufacturing the same
US20100248200A1 (en) * 2008-09-26 2010-09-30 Ladak Hanif M System, Method and Computer Program for Virtual Reality Simulation for Medical Procedure Skills Training
US20170035517A1 (en) * 2014-04-04 2017-02-09 Surgical Theater LLC Dynamic and interactive navigation in a surgical environment
US20170372640A1 (en) * 2015-01-10 2017-12-28 University Of Florida Research Foundation, Inc. Simulation features combining mixed reality and modular tracking
US20160332388A1 (en) * 2015-05-12 2016-11-17 Seoul National University R&Db Foundation Method of forming transparent 3d object and transparent 3d object formed thereby
US20170312031A1 (en) * 2016-04-27 2017-11-02 Arthrology Consulting, Llc Method for augmenting a surgical field with virtual guidance content

Also Published As

Publication number Publication date
WO2019204615A1 (en) 2019-10-24
WO2019204607A1 (en) 2019-10-24
US20210244474A1 (en) 2021-08-12
WO2019204611A1 (en) 2019-10-24
US20210241656A1 (en) 2021-08-05

Similar Documents

Publication Publication Date Title
US20210233429A1 (en) Mixed-reality endoscope and surgical tools with haptic feedback for integrated virtual-reality visual and haptic surgical simulation
US10951872B2 (en) Augmented reality guidance for spinal procedures using stereoscopic optical see-through head mounted displays with real time visualization of tracked instruments
Morris et al. Visuohaptic simulation of bone surgery for training and evaluation
US7121832B2 (en) Three-dimensional surgery simulation system
JP2019523049A (en) A system for robot-assisted re-replacement procedures
KR20170034393A (en) Guidewire navigation for sinuplasty
KR20200118255A (en) Systems and methods for generating customized haptic boundaries
WO2013163800A2 (en) Oral surgery auxiliary guidance method
WO2012123943A1 (en) Training, skill assessment and monitoring users in ultrasound guided procedures
KR102536732B1 (en) Device and method for the computer-assisted simulation of surgical interventions
CN106806021A (en) A kind of VR surgery simulation systems and method based on human organ 3D models
Tai et al. A high-immersive medical training platform using direct intraoperative data
CN113554912A (en) Planting operation training system based on mixed reality technology
JP6803239B2 (en) Surgical training system
JP2023505956A (en) Anatomical feature extraction and presentation using augmented reality
Bluteau et al. Vibrotactile guidance for trajectory following in computer aided surgery
Briner et al. Evaluation of an anatomic model of the paranasal sinuses for endonasal surgical training
Müller-Wittig Virtual reality in medicine
US20220354579A1 (en) Systems and methods for planning and simulation of minimally invasive therapy
Cameron et al. Virtual-reality-assisted interventional procedures.
Marcacci et al. A navigation system for computer assisted unicompartmental arthroplasty
Monahan et al. Verifying the effectiveness of a computer-aided navigation system for arthroscopic hip surgery
Zachow et al. 3D osteotomy planning in cranio-maxillofacial surgery: experiences and results of surgery planning and volumetric finite-element soft tissue prediction in three clinical cases
JP2004348091A (en) Entity model and operation support system using the same
Neumann et al. Using virtual reality techniques in maxillofacial surgery planning

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

AS Assignment

Owner name: ARIZONA BOARD OF REGENTS ON BEHALF OF THE UNIVERSITY OF ARIZONA, ARIZONA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BARBER, SAMUEL;JAIN, SAURABH;SON, YOUNG-JUN;AND OTHERS;SIGNING DATES FROM 20231223 TO 20240125;REEL/FRAME:066672/0206

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER