US20220346889A1 - Graphical user interface for use in a surgical navigation system with a robot arm - Google Patents
- Publication number: US20220346889A1 (U.S. application Ser. No. 17/698,779)
- Authority: US (United States)
- Prior art keywords: robot arm, virtual representation, surgical, orientation, surgical navigation
- Prior art date
- Legal status: Pending (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Classifications
- A61B34/25 — User interfaces for surgical systems
- A61B34/10 — Computer-aided planning, simulation or modelling of surgical operations
- A61B34/20 — Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B34/30 — Surgical robots
- A61B5/7267 — Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems, involving training the classification device
- A61B90/36 — Image-producing devices or illumination devices not otherwise provided for
- A61B90/37 — Surgical systems with images on a monitor during operation
- G02B27/017 — Head-up displays, head mounted
- G02B27/0172 — Head mounted, characterised by optical features
- G06F3/012 — Head tracking input arrangements
- G06F3/013 — Eye tracking input arrangements
- G06T19/006 — Mixed reality
- G06T5/70 (formerly G06T5/002) — Denoising; smoothing
- G06T7/11 — Region-based segmentation
- G06T7/20 — Analysis of motion
- A61B2017/00216 — Electrical control of surgical instruments with eye tracking or head position tracking control
- A61B2034/102 — Modelling of surgical devices, implants or prosthesis
- A61B2034/105 — Modelling of the patient, e.g. for ligaments or bones
- A61B2034/107 — Visualisation of planned trajectories or target regions
- A61B2034/2055 — Optical tracking systems
- A61B2034/2063 — Acoustic tracking systems, e.g. using ultrasound
- A61B2034/2068 — Tracking using pointers, e.g. pointers having reference marks for determining coordinates of body points
- A61B2090/3618 — Image-producing devices, e.g. surgical cameras, with a mirror
- A61B2090/363 — Use of fiducial points
- A61B2090/365 — Correlation of different images or relation of image positions in respect to the body: augmented reality, i.e. correlating a live optical image with another image
- A61B2090/367 — Correlation of different images: creating a 3D dataset from 2D images using position information
- A61B2090/368 — Correlation of different images: changing the image on a display according to the operator's position
- A61B2090/372 — Details of monitor hardware
- A61B2090/3762 — Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy, using computed tomography systems [CT]
- A61B2090/3983 — Reference marker arrangements for use with image guided surgery
- A61B2090/502 — Supports for surgical instruments: headgear, e.g. helmet, spectacles
- G02B2027/0134 — Head-up displays characterised by optical features comprising binocular systems of stereoscopic type
- G02B2027/0136 — Binocular systems with a single image source for both eyes
- G02B2027/0138 — Comprising image capture systems, e.g. camera
- G02B2027/0178 — Head mounted, eyeglass type
- G02B2027/0187 — Display position adjusting means slaved to motion of at least a part of the body of the user, e.g. head, eye
- G02B2027/0196 — Supplementary details having transparent supporting structure for display mounting, e.g. to a window or a windshield
- G06F18/2148 — Generating training patterns; bootstrap methods, e.g. bagging or boosting, characterised by the process organisation or structure, e.g. boosting cascade
- G06T2207/10081 — Computed x-ray tomography [CT]
- G06T2207/20081 — Training; learning
- G06T2207/20084 — Artificial neural networks [ANN]
- G06T2207/30012 — Spine; backbone
- G06T2207/30204 — Marker
- G06T2207/30208 — Marker matrix
- G06T2219/004 — Annotating, labelling
- G06V2201/033 — Recognition of patterns in medical or anatomical images of skeletal patterns
Definitions
- The present disclosure relates to graphical user interfaces for use in surgical navigation systems with a robot arm, in particular to a system and method for operative planning and real-time execution of a surgical procedure including the use of the robot arm.
- Some of the typical functions of a computer-assisted surgery (CAS) system with navigation include presurgical planning of a procedure and presenting preoperative diagnostic information and images in useful formats. The CAS system presents status information about a procedure as it takes place in real time, displaying the preoperative plan along with intraoperative data. The CAS system may be used for procedures in traditional operating rooms, interventional radiology suites, mobile operating rooms or outpatient clinics. The procedure may be any medical procedure, whether surgical or non-surgical.
- Surgical navigation systems are used to display the position and orientation of surgical instruments and medical implants with respect to presurgical or intraoperative medical imagery datasets of a patient. These images include pre- and intraoperative images, such as two-dimensional (2D) fluoroscopic images and three-dimensional (3D) magnetic resonance imaging (MRI) or computed tomography (CT) images.
- Navigation systems locate markers attached or fixed to an object, such as surgical instruments or the patient. Most commonly, these tracking systems are optical or electromagnetic. Optical tracking systems have one or more stationary cameras that observe passive reflective markers or active infrared LEDs attached to the tracked instruments or the patient. Eye-tracking solutions are specialized optical tracking systems that measure gaze and eye motion relative to a user's head. Electromagnetic systems have a stationary field generator that emits an electromagnetic field that is sensed by coils integrated into tracked medical tools and surgical instruments.
- Incorporating image segmentation processes that automatically identify various bone landmarks, based on their density, can increase planning accuracy. One such bone landmark is the spinal pedicle, which is made up of dense cortical bone, making its identification through image segmentation easier. The pedicle is used as an anchor point for various types of medical implants. Achieving proper implant placement in the pedicle is heavily dependent on the trajectory selected for implant placement. The ideal trajectory is identified by the surgeon based on a review of advanced imaging (e.g., CT or MRI), the goals of the surgical procedure, bone density, the presence or absence of deformity, anomaly or prior surgery, and other factors. The surgeon then selects the appropriate trajectory for each spinal level. A proper trajectory generally involves placing an appropriately sized implant in the center of a pedicle. Ideal trajectories are also critical for the placement of inter-vertebral biomechanical devices.
- Another example is the placement of electrodes in the thalamus for the treatment of functional disorders, such as Parkinson's disease. The most important determinant of success in patients undergoing deep brain stimulation surgery is the optimal placement of the electrode. A proper trajectory is defined based on preoperative imaging (such as MRI or CT) and allows for proper electrode positioning.
- Another example is the minimally invasive replacement of a prosthetic/biologic mitral valve for the treatment of mitral valve disorders, such as mitral valve stenosis or regurgitation. The most important determinant of success in patients undergoing minimally invasive mitral valve surgery is the optimal placement of the three-dimensional valve.
- Typically, one or several computer monitors are placed at some distance away from the surgical field. They require the surgeon to shift visual attention away from the surgical field to see the monitors across the operating room, which results in a disruption of the surgical workflow.
- the monitors of current navigation systems are limited to displaying multiple slices through three-dimensional diagnostic image datasets, which are difficult to interpret for complex 3D anatomy.
- When defining and later executing an operative plan, the surgeon interacts with the navigation system via a keyboard and mouse, touchscreen, voice commands, control pendant, foot pedals, haptic devices, and tracked surgical instruments. Given the complexity of 3D anatomy, it can be difficult to simultaneously position and orient the instrument in the 3D surgical field based only on the information displayed on the monitors of the navigation system. Similarly, when aligning a tracked instrument with an operative plan, it is difficult to control the 3D position and orientation of the instrument with respect to the patient anatomy. This can result in an unacceptable degree of error in the preoperative plan that will translate to a poor surgical outcome.
- The navigation system may be complemented by a robot arm, which may operate some of the surgical instruments used during the operation. However, a robot arm is a relatively large structure and may obstruct the operative field.
- There is disclosed a surgical navigation system comprising: a tracker for real-time tracking of a position and orientation of a robot arm, a surgeon's head, a 3D display system and a patient anatomy to provide current position and orientation data; a source of patient anatomical data and a robot arm virtual image; a surgical navigation image generator configured to generate a surgical navigation image comprising the patient anatomy and the robot arm virtual image in accordance with the current position and orientation data provided by the tracker; and a 3D display system configured to show the surgical navigation image.
- the display of the robot arm virtual image may be configurable such that it can be selectively visible or hidden.
- the display of the robot arm virtual image may be configurable such that its opacity can be adjusted.
- the patient anatomical data may comprise a three-dimensional reconstruction of a segmented model comprising at least two sections representing parts of the anatomy; and wherein the display of the patient anatomy is configurable such that at least one section of the anatomy is displayed and at least one other section of the anatomy is not displayed.
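- The configurable display options described in the three items above (selective visibility of the robot arm, adjustable opacity, and per-section anatomy display) can be summarized in a small data structure. The following Python sketch is purely illustrative; the names, section labels and defaults are assumptions, not definitions from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class RobotArmDisplayConfig:
    visible: bool = True      # selectively show or hide the virtual robot arm
    opacity: float = 1.0      # 0.0 = fully transparent, 1.0 = fully opaque

@dataclass
class AnatomyDisplayConfig:
    # segmented section names mapped to whether each one is displayed
    section_visibility: dict = field(default_factory=dict)

    def shown_sections(self):
        return [name for name, on in self.section_visibility.items() if on]

# Example: display only the pedicles while hiding the rest of the vertebra,
# with a semi-transparent robot arm overlay
anatomy = AnatomyDisplayConfig(section_visibility={
    "spinous_process": False, "lamina": False, "articular_process": False,
    "transverse_process": False, "pedicles": True, "vertebral_body": False,
})
print(anatomy.shown_sections())                          # ['pedicles']
arm = RobotArmDisplayConfig(visible=True, opacity=0.3)   # does not obstruct anatomy
```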
- the system may further comprise a source of at least one of: an operative plan and a virtual surgical instrument model; wherein the tracker is further configured for real-time tracking of surgical instruments; wherein the surgical navigation image further comprises a three-dimensional image representing a virtual image of the surgical instruments.
- the system may further comprise a source of information about suggested positions and/or orientations of the surgical instruments, and the virtual image of the surgical instruments may be configured to indicate the suggested positions and/or orientations of the surgical instruments according to the operative plan data.
- the three-dimensional image of the surgical navigation image may further comprise a graphical cue indicating the required change of position and orientation of the surgical instrument to match the suggested position and orientation according to the preoperative plan data.
- The surgical navigation image may further comprise a set of orthogonal (axial, sagittal, and coronal) and/or arbitrary planes of the patient anatomical data.
- the 3D display system may be configured to show the surgical navigation image at a see-through device, and wherein the tracker may be configured for real-time tracking of the position and orientation of the see-through device such that an augmented reality image collocated with the patient anatomy in the surgical field underneath the see-through device is visible to a viewer looking from above the see-through device towards the surgical field.
- the patient anatomical data may comprise output data of a semantic segmentation process of an anatomy scan image.
- the system may further comprise a convolutional neural network system configured to perform the semantic segmentation process to generate the patient anatomical data.
- A method for providing an augmented reality image during an operation comprises: providing a source of a patient anatomical data and a robot arm virtual image; real-time tracking, by means of a tracker, of a position and orientation of a robot arm, a surgeon's head, a 3D display system and a patient anatomy to provide current position and orientation data; generating, by a surgical navigation image generator, a surgical navigation image comprising the patient anatomy and the robot arm virtual image in accordance with the current position and orientation data provided by the tracker; and showing the surgical navigation image at a 3D display system, as illustrated by the sketch below.
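- As a rough illustration of how the claimed steps (tracking, image generation, display) chain together per frame, a minimal event loop might look like the following Python sketch; `tracker`, `image_generator` and `display` are assumed interfaces for this sketch, not APIs defined by the patent.

```python
def navigation_loop(tracker, image_generator, display):
    """One iteration per displayed frame: track, generate, show."""
    while display.is_active():
        # 1. Real-time tracking of the robot arm, the surgeon's head,
        #    the 3D display system and the patient anatomy.
        poses = tracker.read_poses()   # e.g. {"robot_arm": ..., "head": ...}
        # 2. Generate the surgical navigation image comprising the patient
        #    anatomy and the robot arm virtual image, per the tracked poses.
        image = image_generator.render(
            anatomy_pose=poses["anatomy"],
            robot_arm_pose=poses["robot_arm"],
            head_pose=poses["head"],
            display_pose=poses["display"],
        )
        # 3. Show the surgical navigation image at the 3D display system.
        display.show(image)
```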
- FIG. 1A shows a layout of a surgical room employing the surgical navigation system in accordance with an embodiment of the invention
- FIG. 1B shows a layout of a surgical room employing the surgical navigation system in accordance with an embodiment of the invention
- FIG. 1C shows a layout of a surgical room employing the surgical navigation system in accordance with an embodiment of the invention
- FIG. 2A shows components of the surgical navigation system in accordance with an embodiment of the invention
- FIG. 2B shows components of the surgical navigation system in accordance with an embodiment of the invention
- FIG. 3A shows an example of an augmented reality display in accordance with an embodiment of the invention
- FIG. 3B shows an example of an augmented reality display in accordance with an embodiment of the invention
- FIG. 3C shows an example of an augmented reality display in accordance with an embodiment of the invention
- FIG. 3D shows an example of an augmented reality display in accordance with an embodiment of the invention
- FIG. 3E shows an example of an augmented reality display in accordance with an embodiment of the invention
- FIG. 3F shows an example of an augmented reality display in accordance with an embodiment of the invention
- FIG. 3G shows an example of an augmented reality display in accordance with an embodiment of the invention
- FIG. 3H shows an example of an augmented reality display in accordance with an embodiment of the invention
- FIG. 3I shows an example of an augmented reality display in accordance with an embodiment of the invention
- FIG. 3J shows an example of an augmented reality display in accordance with an embodiment of the invention
- FIG. 4A shows an embodiment of a 3D display system for use in an embodiment of the invention.
- FIG. 4B shows another embodiment of a 3D display system for use in an embodiment of the invention.
- FIG. 4C shows another embodiment of a 3D display system for use in an embodiment of the invention.
- FIG. 4D shows another embodiment of a 3D display system for use in an embodiment of the invention.
- FIG. 4E shows another embodiment of a 3D display system for use in an embodiment of the invention.
- FIG. 5A shows eye tracking in accordance with an embodiment of the invention.
- FIG. 5B shows eye tracking in accordance with an embodiment of the invention.
- FIG. 6 shows a 3D representation of the results of the semantic segmentation on one vertebra for use in an embodiment of the invention.
- The system presented herein comprises a 3D display system 140 to be implemented directly for real surgical applications in a surgical room, as shown in FIGS. 1A-1C.
- the 3D display system 140 as shown in the example embodiment comprises a 3D display 142 for emitting a surgical navigation image 142 A towards a see-through mirror 141 that is partially transparent and partially reflective, such that an augmented reality image 141 A collocated with the patient anatomy in the surgical field 108 underneath the see-through mirror 141 is visible to a viewer looking from above the see-through mirror 141 towards the surgical field 108 .
- the surgical room typically comprises a floor 101 on which an operating table 104 is positioned.
- a patient 105 lies on the operating table 104 while being operated by a surgeon 106 with the use of various surgical instruments 107 .
- The surgical navigation system as described in detail below can have its components, in particular the 3D display system 140, mounted to a ceiling 102, or alternatively to the floor 101 or a side wall 103 of the operating room.
- the components, in particular the 3D display system 140 can be mounted to an adjustable and/or movable floor-supported structure (such as a tripod).
- Components other than the 3D display system 140, such as the surgical image generator 131, can be implemented in a dedicated computing device 109, such as a stand-alone PC computer, which may have its own input controllers and display(s) 110.
- The system is designed for use in a configuration wherein the distance d1 between the surgeon's eyes and the see-through mirror 141 is shorter than the distance d2 between the see-through mirror 141 and the operative field at the patient anatomy 105 being operated on.
- the system comprises a robot arm 191 for handling some of the surgical tools.
- The robot arm 191 may have two closed-loop control systems: its own position system and one used with the optical tracker as presented herein. Both control systems may work together to ensure that the robot arm is in the correct position.
- the robot arm's position system may comprise encoders placed at each joint to determine the angle or position of each element of the arm.
- The second system may comprise a robot arm marker array 126 attached to the robot arm to be tracked by the tracker 125, as described below. Any kind of surgical robotic system can be used, preferably one that follows the standards of the U.S. Food & Drug Administration.
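- A minimal sketch of how the two control loops could cross-check each other is shown below. It assumes a `forward_kinematics` function supplied by the robot vendor and a millimeter-level tolerance; both are assumptions of this sketch rather than details given in the patent.

```python
import numpy as np

def end_effector_consistent(joint_angles, tracked_position,
                            forward_kinematics, tol_mm=1.0):
    """True when the encoder-derived and tracker-derived positions agree."""
    # 3-vector in mm, computed from the joint encoder readings
    encoder_position = forward_kinematics(joint_angles)
    # Compare against the optically tracked marker-array position
    error = np.linalg.norm(encoder_position - np.asarray(tracked_position))
    return error <= tol_mm   # within tolerance?
```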
- FIG. 2A shows a functional schematic presenting connections between the components of the surgical navigation system and FIG. 2B shows examples of physical embodiments of various components.
- the surgical navigation system comprises a tracking system for tracking in real time the position and/or orientation of various entities to provide current position and/or orientation data.
- the system may comprise a plurality of arranged fiducial markers, which are trackable by a fiducial marker tracker 125 .
- Any known type of tracking system can be used.
- For example, 4-point marker arrays are tracked by a three-camera sensor to track movement along six degrees of freedom, as sketched below.
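- Recovering a six-degree-of-freedom pose from a tracked marker array is commonly done with a least-squares rigid alignment such as the Kabsch algorithm; the patent does not specify the algorithm, so the following Python sketch is one standard choice. It assumes the tracker has already matched the observed 3D marker positions to the known array geometry.

```python
import numpy as np

def rigid_pose(model_pts, observed_pts):
    """Least-squares rotation R and translation t with observed ≈ R @ model + t.

    model_pts, observed_pts: (N, 3) arrays of matched marker positions.
    """
    cm, co = model_pts.mean(axis=0), observed_pts.mean(axis=0)
    H = (model_pts - cm).T @ (observed_pts - co)    # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = co - R @ cm
    return R, t
```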
- a head position marker array 121 can be attached to the surgeon's head for tracking of the position and orientation of the surgeon and the direction of gaze of the surgeon—for example, the head position marker array 121 can be integrated with the wearable 3D glasses 151 or can be attached to a strip worn over the surgeon's head.
- a display marker array 122 can be attached to the see-through mirror 141 of the 3D display system 140 for tracking its position and orientation, as the see-through mirror 141 is movable and can be placed according to the current needs of the operative setup.
- a patient anatomy marker array 123 can be attached at a particular position and orientation of the anatomy of the patient.
- a surgical instrument marker array 124 can be attached to the instrument whose position and orientation shall be tracked.
- a robot arm marker array 126 can be attached to at least one robot arm 191 to track its position.
- the markers in at least one of the marker arrays 121 - 124 are not coplanar, which helps to improve the accuracy of the tracking system.
- the tracking system comprises means for real-time tracking of the position and orientation of at least one of: a surgeon's head 106 , a 3D display 142 , a patient anatomy 105 , and surgical instruments 107 .
- Preferably, all of these elements are tracked by a fiducial marker tracker 125.
- A surgical navigation image generator 131 is configured to generate an image to be viewed via the see-through mirror 141 of the 3D display system. It generates a surgical navigation image 142A comprising data of at least one of: the pre-operative plan 161 (which is generated and stored in a database before the operation), the intra-operative plan 162 (which can be generated live during the operation), the patient anatomy scan 163 (which can be generated before the operation or live during the operation) and virtual images 164 of surgical instruments used during the operation (which are stored as 3D models in a database), as well as a virtual image 166 of the robot arm 191.
- The surgical navigation image generator 131 can be controlled by a user (i.e., a surgeon or support staff) via one or more user interfaces 132, such as foot-operable pedals (which are convenient for the surgeon to operate), a keyboard, a mouse, a joystick, a button, a switch, an audio interface (such as a microphone), a gesture interface, a gaze-detecting interface, etc.
- the input interface(s) are for inputting instructions and/or commands.
- All system components are controlled by one or more computers, which are controlled by an operating system and one or more software applications.
- the computer may be equipped with a suitable memory which may store computer program or programs executed by the computer in order to execute steps of the methods utilized in the system.
- Computer programs are preferably stored on a non-transitory medium.
- An example of a non-transitory medium is a non-volatile memory, for example a flash memory, while an example of a volatile memory is RAM.
- The computer instructions are executed by a processor.
- These memories are exemplary recording media for storing computer programs comprising computer-executable instructions performing all the steps of the computer-implemented method according to the technical concept presented herein.
- The computer(s) can be placed within the operating room or outside the operating room. Communication between the computer(s) and the components of the system may be performed by wire or wirelessly, according to known communication means.
- The aim of the system in some embodiments is to generate, via the 3D display system 140, an augmented reality image such as shown in FIG. 3J, and also possibly in FIGS. 3A-3I.
- When the surgeon looks via the 3D display system 140, the surgeon sees the augmented reality image 141A which comprises:
- the real world image: the patient anatomy, the surgeon's hands and the instrument currently in use (which may be partially inserted into the patient's body and hidden under the skin); and
- a computer-generated surgical navigation image 142A comprising the patient anatomy 163 and a virtual image 166 of the robot arm.
- the augmented reality image comprises a virtual image 166 of the robot arm collocated with the real physical anatomy of the patient, as shown in FIG. 3B .
- the augmented reality image may comprise a guidance image 166 A that indicates, according to the preoperative plan data, the suggested position and orientation of the robot arm 191 .
- the virtual image 166 of the robot arm may be configurable such that it can be selectively displayed or hidden, in full or in part (for example, some parts of the robot arm can be hidden (such as the forearm) and some (such as the surgical tool holder) can be visible). Moreover, the opacity of the robot arm virtual image 166 can be selectively changed, such that it does not obstruct the patient anatomy.
- The display of the patient anatomy 163 can be configurable, such that at least one section of the anatomy 163A-163F is displayed and at least one other section of the anatomy 163A-163F is not displayed, as shown in FIGS. 3F-3I.
- The surgical navigation image may further comprise: a 3D image 171 representing at least one of the virtual image of the instrument 164 or surgical guidance indicating the suggested (ideal) trajectory and placement of surgical instruments 107 according to the pre-operative plans 161 (as shown in FIG. 3C); preferably, three different orthogonal planes of the patient anatomical data 163: coronal 174, sagittal 173, axial 172; and preferably, a menu 175 for controlling the system operation.
- the surgeon shall use a pair of 3D glasses 151 to view the augmented reality image 141 A.
- If the 3D display 142 is autostereoscopic, it may not be necessary for the surgeon to use the 3D glasses 151 to view the augmented reality image 141A.
- the virtual image of the patient anatomy 163 is generated based on data representing a three-dimensional segmented model comprising at least two sections representing parts of the anatomy.
- the anatomy can be for example a bone structure, such as a spine, skull, pelvis, long bones, shoulder joint, hip joint, knee joint etc. This description presents examples related particularly to a spine, but a skilled person will realize how to adapt the embodiments to be applicable to the other bony structures or other anatomy parts as well.
- The model can represent a spine, as shown in FIG. 6, with the following sections: spinous process 163A, lamina 163B, articular process 163C, transverse process 163D, pedicles 163E, and vertebral body 163F.
- the model can be generated based on a pre-operative scan of the patient and then segmented manually by a user or automatically by a computer, using dedicated algorithms and/or neural networks, or in a hybrid approach including a computer-assisted manual segmentation.
- For the automatic segmentation, a convolutional neural network can be employed, as sketched below.
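- The following is a minimal, purely illustrative PyTorch example of a convolutional network that assigns one of the anatomy-section labels of FIG. 6 (plus background) to each pixel of a CT slice. The architecture, class count and input size are assumptions of this sketch; the actual system would use a more capable network and training pipeline.

```python
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    """Toy per-pixel classifier: 6 spine sections (cf. FIG. 6) + background."""
    def __init__(self, n_classes=7):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                              # halve resolution
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="bilinear",  # restore resolution
                        align_corners=False),
            nn.Conv2d(32, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, n_classes, 1),                  # per-pixel class scores
        )

    def forward(self, x):                                 # x: (batch, 1, H, W)
        return self.decoder(self.encoder(x))

logits = TinySegNet()(torch.randn(1, 1, 128, 128))
print(logits.shape)                                       # torch.Size([1, 7, 128, 128])
```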
- The images of the orthogonal planes 172, 173, 174 are displayed in an area next to (preferably above) the area of the 3D image 171, as shown in FIG. 3A, wherein the 3D image 171 occupies more than 50% of the area of the see-through device 141.
- The location of the images of the orthogonal planes 172, 173, 174 may be adjusted in real time depending on the location of the 3D image 171, when the surgeon changes the position of the head during the operation, so as not to interfere with the 3D image 171; one possible layout rule is sketched below.
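- One simple way to implement this real-time repositioning is to recompute the plane-view anchors from the current bounding rectangle of the 3D image. The rectangle representation, coordinate convention and the preference for placing the views above the 3D image are assumptions of this sketch.

```python
def place_plane_views(image_3d_rect, view_w, view_h, screen_h):
    """Return anchor points for the axial, sagittal and coronal views.

    image_3d_rect: (x, y, w, h) bounds of the 3D image 171, origin bottom-left.
    """
    x, y, w, h = image_3d_rect
    # Prefer the strip directly above the 3D image (cf. FIG. 3A); fall back
    # to below it when the 3D image sits too close to the top of the display.
    vy = y + h if y + h + view_h <= screen_h else max(0.0, y - view_h)
    return [(x + i * view_w, vy) for i in range(3)]   # three plane views
```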
- The anatomical information of the patient is shown in two different layouts that merge for an augmented and mixed reality feature.
- the first layout is the anatomical information that is projected in 3D in the surgical field.
- the second layout is in the orthogonal planes.
- the surgical navigation image 142 A is generated by the image generator 131 in accordance with the tracking data provided by the fiducial marker tracker 125 , in order to superimpose the anatomy images and the instrument images exactly over the real objects, in accordance with the position and orientation of the surgeon's head.
- The markers are tracked in real time and the image is generated in real time. Therefore, the surgical navigation image generator 131 provides graphics rendering of the virtual objects (patient anatomy, surgical plan and instruments) collocated with the real objects according to the surgeon's perspective.
- Surgical guidance may relate to suggestions (virtual guidance cues 164) for placement of a pedicle screw in spine surgery or the ideal orientation of an acetabular component in hip arthroplasty surgery.
- These suggestions may take a form of animations that show the surgeon whether the placement is correct.
- the suggestions may be displayed both on the 3D holographic display and the orthogonal planes. The surgeon may use the system to plan these orientations before or during the surgical procedure.
- the 3D image 171 is adapted in real time to the position and orientation of the surgeon's head.
- the display of the different orthogonal planes 172 , 173 , 174 may be adapted according to the current position and orientation of the surgical instruments used.
- Aligning the surgeon's line of sight onto the see-through mirror with the patient anatomy underneath the see-through mirror, including the scaling and orientation of the image, can be realized based on known solutions in the field of computer graphics processing, in particular for virtual reality, including virtual scene generation, using well-known mathematical formulas and algorithms related to viewer-centered perspective.
- Such solutions are known from various tutorials and textbooks (such as "The Future of the CAVE" by T. A. DeFanti et al., Central European Journal of Engineering, 2010, DOI: 10.2478/s13531-010-0002-5).
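- The viewer-centered perspective referenced above is typically implemented as a generalized (off-axis) perspective projection, with the tracked see-through device acting as the projection plane and the tracked head position as the eye point. Below is a NumPy sketch following the formulation popularized for CAVE-type displays; the corner naming and units are assumptions of this sketch.

```python
import numpy as np

def off_axis_projection(pa, pb, pc, eye, near, far):
    """pa, pb, pc: lower-left, lower-right, upper-left screen corners (3-vectors)."""
    vr = (pb - pa) / np.linalg.norm(pb - pa)             # screen right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)             # screen up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)      # screen normal, toward eye
    va, vb, vc = pa - eye, pb - eye, pc - eye
    d = -va @ vn                                         # eye-to-screen distance
    l, r = (vr @ va) * near / d, (vr @ vb) * near / d    # frustum extents at the
    b, t = (vu @ va) * near / d, (vu @ vc) * near / d    # near plane
    P = np.array([[2*near/(r-l), 0, (r+l)/(r-l), 0],
                  [0, 2*near/(t-b), (t+b)/(t-b), 0],
                  [0, 0, -(far+near)/(far-near), -2*far*near/(far-near)],
                  [0, 0, -1, 0]])
    M = np.eye(4); M[0, :3], M[1, :3], M[2, :3] = vr, vu, vn  # align with screen
    T = np.eye(4); T[:3, 3] = -eye                            # move eye to origin
    return P @ M @ T
```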
- FIG. 3B shows an example indicating collocation of the virtual image of the patient anatomy 163 and the real anatomy 105 .
- The 3D image 171 may demonstrate a mismatch between a supposed/suggested position of the instrument according to the pre-operative plan 161, displayed as a first virtual image of the instrument 164A located at its supposed/suggested position, and an actual position of the instrument, visible as the real instrument via the see-through display and/or as a second virtual image of the instrument 164B overlaid on the current position of the instrument.
- graphical guiding cues such as arrows 165 indicating the direction of the supposed change of position, can be displayed.
- FIG. 3D shows a situation wherein the tip of the supposed position of the instrument, displayed as the first virtual image 164A according to the pre-operative plan 161, matches the tip of the real surgical instrument, visible or displayed as the second virtual image 164B. However, the remaining parts of the objects do not match, therefore the graphical cues 165 still indicate the need to change position.
- In this situation, the surgical instrument is close to the correct position and the system may provide information on how close the surgical instrument is to the planned position.
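- The closeness information and the arrow cues 165 can be derived directly from the planned and tracked poses. In this Python sketch, instrument direction vectors are assumed to be unit length and the tolerance values are illustrative, not taken from the patent.

```python
import numpy as np

def guidance_cue(tip_actual, dir_actual, tip_planned, dir_planned,
                 tip_tol_mm=1.0, angle_tol_deg=2.0):
    """Compare tracked and planned instrument poses; directions are unit vectors."""
    tip_actual, tip_planned = np.asarray(tip_actual), np.asarray(tip_planned)
    tip_error = np.linalg.norm(tip_planned - tip_actual)           # mm
    cos_a = np.clip(np.dot(dir_actual, dir_planned), -1.0, 1.0)
    angle_error = np.degrees(np.arccos(cos_a))                     # degrees
    aligned = tip_error <= tip_tol_mm and angle_error <= angle_tol_deg
    # Direction along which the tip should move, usable for drawing the
    # arrows 165; None once the tips coincide.
    arrow = (tip_planned - tip_actual) / tip_error if tip_error > 0 else None
    return aligned, tip_error, angle_error, arrow
```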
- FIG. 3E shows a situation wherein the position of the real surgical instrument matches the supposed position of the instrument according to the pre-operative plan 161, i.e., the correct position for surgery.
- the graphical cues 165 are no longer displayed, but the virtual images 164 A, 164 B may be changed to indicate the correct position, e.g. by highlighting it or blinking.
- the image of the full patient anatomy 163 may be obstructive.
- the system allows a selective display of the parts of the anatomy 163 , such that at least one part of the anatomy is shown and at least one other part of the anatomy is not shown.
- The surgeon may only want to see isolated parts of the spinal anatomy during spine surgery (only the vertebral body or only the pedicle). Each part of the spinal anatomy is displayed at the request of the surgeon. For example, the surgeon may only want to see the virtual representation of the pedicle during placement of bony anchors. This would be advantageous, as there would be no visual interference from the surrounding anatomical structures.
- a single part of the anatomy may be displayed, for example only the vertebral body 163 F ( FIG. 3F ) or only the pedicles 163 E ( FIG. 3G ).
- two parts of the anatomy may be displayed, for example the vertebral body 163 F and the pedicles 163 E ( FIG. 3H ); or a larger group of anatomy parts may be displayed, such as the top parts of 163 A-D of the spine ( FIG. 3I ).
- the user may select the parts that are to be displayed via the input interface 132 .
- the GUI may comprise a set of predefined display templates, each template defining a particular part of the anatomy to be displayed (such as FIG. 3F, 3G ) or a plurality of parts of the anatomy to be displayed (such as FIG. 3H, 3I ).
- the user may then use a dedicated touch-screen button, keyboard key, pedal or other user interface navigation element to select a particular template to be displayed or to switch between consecutive templates.
- the GUI may display a list of available parts of anatomy to be displayed and the user may select the parts to be displayed.
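- The template mechanism described above could be as simple as a cyclic list of section sets, one per predefined view (cf. FIGS. 3F-3I), advanced by a pedal or key press. The section names below follow the segmentation of FIG. 6, while the class itself is an assumption of this sketch.

```python
TEMPLATES = [
    {"vertebral_body"},                                   # cf. FIG. 3F
    {"pedicles"},                                         # cf. FIG. 3G
    {"vertebral_body", "pedicles"},                       # cf. FIG. 3H
    {"spinous_process", "lamina",
     "articular_process", "transverse_process"},          # cf. FIG. 3I
]

class TemplateSelector:
    """Cycle through predefined display templates, e.g. on a pedal press."""
    def __init__(self, templates):
        self.templates, self.index = templates, 0

    def next_template(self):
        self.index = (self.index + 1) % len(self.templates)
        return self.templates[self.index]

selector = TemplateSelector(TEMPLATES)
print(selector.next_template())   # {'pedicles'}
```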
- the GUI interface for configuring the parts that are to be displayed can be configured to be operated directly by the surgeon or by an assistant person.
- The embodiments described above employ a 3D display 142 with a see-through mirror 141, which is particularly effective for providing the surgical navigation data. However, other 3D display systems can be used as well to show the automatically segmented parts of anatomy, such as 3D head-mounted displays.
- the see-through mirror (also called a half-silvered mirror) 141 is at least partially transparent and partially reflective, such that the viewer can see the real world behind the mirror but the mirror also reflects the surgical navigation image generated by the display apparatus located above it.
- a see-through mirror as commonly used in teleprompters can be used.
- The see-through mirror 141 can have a reflection/transmission ratio of 50R/50T, but other ratios can be used as well.
- the surgical navigation image is emitted from above the see-through mirror 141 by the 3D display 142 .
- a special design of the 3D display 142 is provided that is compact in size to facilitate its mounting within a limited space at the operating room. That design allows generating images of relatively large size, taking into account the small distance between the 3D display 142 and the see-through mirror 141 , without the need to use wide-angle lens that could distort the image.
- the 3D display 142 comprises a 3D projector 143 , such as a DLP projector, that is configured to generate an image, as shown in FIG. 4B (by the dashed lines showing image projection and solid lines showing images generated on particular reflective planes).
- The image from the 3D projector 143 is first reflected by an opaque top mirror 144, then reflected by an opaque vertical mirror 145, and subsequently projected at the correct dimensions onto a projection screen 146 (which can be simply a glass panel).
- the projection screen 146 works as a rear-projection screen or a small bright 3D display.
- the image displayed at the projection screen 146 is reflected by the see-through mirror 141 which works as an augmented reality device.
- Such configuration of the mirrors 144 , 145 allows the image generated by the 3D projector 143 to be shown with an appropriate size at the projection screen 146 .
- the fact that the projection screen 146 emits an enlarged image generated by the 3D projector 143 makes the emitted surgical navigation image bright, and therefore well visible when reflected at the see-through mirror 141 .
- Reference 141 A indicates the augmented reality image as perceived by the surgeon when looking at the see-through mirror 141 .
- The see-through mirror 141 is held at a predefined position with respect to the 3D display 142, in particular with respect to the 3D projector 143, by an arm 147, which may have a first portion 147A fixed to the casing of the 3D display 142 and a second portion 147B detachably fixed to the first portion 147A.
- the first portion 147 A may have a protective sleeve overlaid on it.
- the second portion 147 B, together with the see-through mirror 141 may be disposable in order to maintain the sterility of the operating room, as it is relatively close to the operating field and may be contaminated during the operation.
- The arm can also be foldable upwards to free up the work space when the arm and augmented reality are not needed.
- alternative devices may be used in the 3D display system 140 in place of the see-through mirror 141 and the 3D display 142 .
- a 3D monitor 146 A can be used directly in place of the projection screen 146 .
- a 3D projector 143 can be used instead of the 3D display 142 of FIG. 4A , to project the surgical navigation image onto a see-through projection screen 141 B, which is partially transparent and partially reflective, for showing the surgical navigation image 142 A and allowing the surgical field 108 to be seen.
- a lens 141 C can be used to provide appropriate focal position of the surgical navigation image.
- the surgical navigation image can be displayed at a three-dimensional see-through display 141 D and viewed by the user via a lens 141 C used to provide appropriate focal position of the surgical navigation image.
- each of the see-through projection screen 141 B, the see-through display 141 D and the see-through mirror 141 can be commonly called a see-through device.
- The position of the whole 3D display system 140 can be changed, for example by manipulating an adjustable holder (a surgical boom) 149 in FIG. 1A, by which the 3D display 142 is attachable to an operating room structure, such as a ceiling, a wall or a floor.
- An eye tracker 148 module can be installed at the casing of the 3D display 142 or at the see-through device 141 or at the wearable glasses 151 , to track the position and orientation of the eyes of the surgeon and input that as commands via the gaze input interface to control the display parameters at the surgical navigation image generator 131 , for example to activate different functions based on the location that is being looked at, as shown in FIGS. 5A and 5B .
- The eye tracker 148 may use infrared light to illuminate the eyes of the user without affecting the user's vision, wherein the reflection and refraction of the patterns on the eyes are utilized to determine the gaze vector (i.e., the direction in which the eye is pointing).
- The gaze vector, along with the position and orientation of the user's head, is used to interact with the graphical user interface, as sketched below.
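- Once the gaze vector and head pose are known in tracker coordinates, mapping the gaze onto the display reduces to a ray-plane intersection, whose result the GUI can hit-test against its menu items. The plane parameterization below is an assumption of this sketch.

```python
import numpy as np

def gaze_point_on_plane(eye_pos, gaze_dir, plane_point, plane_normal):
    """Intersect the gaze ray with the display plane; None if looking away."""
    eye_pos, gaze_dir = np.asarray(eye_pos), np.asarray(gaze_dir)
    denom = np.dot(gaze_dir, plane_normal)
    if abs(denom) < 1e-9:
        return None                      # gaze is parallel to the plane
    s = np.dot(np.asarray(plane_point) - eye_pos, plane_normal) / denom
    return eye_pos + s * gaze_dir if s > 0 else None   # point to hit-test
```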
- Other eye-tracking techniques can be used as well.
Abstract
Description
- This application is a continuation of U.S. patent application Ser. No. 16/217,073, filed Dec. 12, 2018, entitled “A Graphical User Interface for Use in a Surgical Navigation System with a Robot Arm,” the disclosure of which is hereby incorporated by reference.
- U.S. patent application Ser. No. 16/217,073 claims priority to and the benefit of European Application No. 17206558.3, filed Dec. 12, 2017, entitled “A Graphical User Interface for Use in a Surgical Navigation System with a Robot Arm.”
- The present disclosure relates to graphical user interfaces for use in surgical navigation systems with a robot arm, in particular to a system and method for operative planning and real time execution of a surgical procedure including the use of the robot arm.
- Some of the typical functions of a computer-assisted surgery (CAS) system with navigation include presurgical planning of a procedure and presenting preoperative diagnostic information and images in useful formats. The CAS system presents status information about a procedure as it takes place in real time, displaying the preoperative plan along with intraoperative data. The CAS system may be used for procedures in traditional operating rooms, interventional radiology suites, mobile operating rooms or outpatient clinics. The procedure may be any medical procedure, whether surgical or non-surgical.
- Surgical navigation systems are used to display the position and orientation of surgical instruments and medical implants with respect to presurgical or intraoperative medical imagery datasets of a patient. These images include pre- and intraoperative images, such as two-dimensional (2D) fluoroscopic images and three-dimensional (3D) magnetic resonance imaging (MRI) or computed tomography (CT) scans.
- Navigation systems locate markers attached or fixed to an object, such as surgical instruments or the patient. The most common tracking systems are optical and electromagnetic. Optical tracking systems have one or more stationary cameras that observe passive reflective markers or active infrared LEDs attached to the tracked instruments or the patient. Eye-tracking solutions are specialized optical tracking systems that measure gaze and eye motion relative to a user's head. Electromagnetic systems have a stationary field generator that emits an electromagnetic field that is sensed by coils integrated into tracked medical tools and surgical instruments.
- Incorporating image segmentation processes that automatically identify various bone landmarks, based on their density, can increase planning accuracy. One such bone landmark is the spinal pedicle, which is made up of dense cortical bone, making its identification via image segmentation easier. The pedicle is used as an anchor point for various types of medical implants. Achieving proper implant placement in the pedicle is heavily dependent on the trajectory selected for implant placement. The ideal trajectory is identified by the surgeon based on review of advanced imaging (e.g., CT or MRI), goals of the surgical procedure, bone density, presence or absence of deformity, anomaly, prior surgery, and other factors. The surgeon then selects the appropriate trajectory for each spinal level. Proper trajectory generally involves placing an appropriately sized implant in the center of a pedicle. Ideal trajectories are also critical for placement of inter-vertebral biomechanical devices.
- Another example is placement of electrodes in the thalamus for the treatment of functional disorders, such as Parkinson's disease. The most important determinant of success in patients undergoing deep brain stimulation surgery is the optimal placement of the electrode. Proper trajectory is defined based on preoperative imaging (such as MRI or CT) and allows for proper electrode positioning.
- Another example is minimally invasive replacement of a prosthetic/biologic mitral valve for the treatment of mitral valve disorders, such as mitral valve stenosis or regurgitation. The most important determinant of success in patients undergoing minimally invasive mitral valve surgery is the optimal three-dimensional placement of the valve.
- The fundamental limitation of surgical navigation systems is that they provide restricted means of communicating with the surgeon. Currently available navigation systems present several drawbacks.
- Typically, one or several computer monitors are placed at some distance away from the surgical field. They require the surgeon to focus the visual attention away from the surgical field to see the monitors across the operating room. This results in a disruption of surgical workflow. Moreover, the monitors of current navigation systems are limited to displaying multiple slices through three-dimensional diagnostic image datasets, which are difficult to interpret for complex 3D anatomy.
- The fact that the screen of the surgical navigation system is located away from the region of interest (ROI) of the surgical field requires the surgeon to continuously look back and forth between the screen and the ROI. This task is not intuitive and results in a disruption to surgical workflow and decreases planning accuracy.
- When defining and later executing an operative plan, the surgeon interacts with the navigation system via a keyboard and mouse, touchscreen, voice commands, control pendant, foot pedals, haptic devices, and tracked surgical instruments. Based on the complexity of the 3D anatomy, it can be difficult to simultaneously position and orient the instrument in the 3D surgical field only based on the information displayed on the monitors of the navigation system. Similarly, when aligning a tracked instrument with an operative plan, it is difficult to control the 3D position and orientation of the instrument with respect to the patient anatomy. This can result in an unacceptable degree of error in the preoperative plan that will translate to poor surgical outcome.
- There are known surgical robot arms which may operate some of the surgical instruments used during the operation. However, a robot arm is a relatively large structure and may obstruct the operative field.
- There is disclosed a surgical navigation system comprising: a tracker for real-time tracking of a position and orientation of a robot arm, a surgeon's head, a 3D display system and a patient anatomy to provide current position and orientation data; a source of a patient anatomical data and a robot arm virtual image; a surgical navigation image generator configured to generate a surgical navigation image comprising the patient anatomy and the robot arm virtual image in accordance with the current position and orientation data provided by the tracker; and a 3D display system configured to show the surgical navigation image.
- The display of the robot arm virtual image may be configurable such that it can be selectively visible or hidden.
- The display of the robot arm virtual image may be configurable such that its opacity can be adjusted.
- The patient anatomical data may comprise a three-dimensional reconstruction of a segmented model comprising at least two sections representing parts of the anatomy; and wherein the display of the patient anatomy is configurable such that at least one section of the anatomy is displayed and at least one other section of the anatomy is not displayed.
- The system may further comprise a source of at least one of: an operative plan and a virtual surgical instrument model; wherein the tracker is further configured for real-time tracking of surgical instruments; wherein the surgical navigation image further comprises a three-dimensional image representing a virtual image of the surgical instruments.
- The system may further comprise a source of information about suggested positions and/or orientations of the surgical instruments, and the virtual image of the surgical instruments may be configured to indicate the suggested positions and/or orientations of the surgical instruments according to the operative plan data.
- The three-dimensional image of the surgical navigation image may further comprise a graphical cue indicating the required change of position and orientation of the surgical instrument to match the suggested position and orientation according to the preoperative plan data.
- The surgical navigation image may further comprise a set of orthogonal (axial, sagittal, and coronal) and/or arbitrary planes of the patient anatomical data.
- The 3D display system may be configured to show the surgical navigation image at a see-through device, and wherein the tracker may be configured for real-time tracking of the position and orientation of the see-through device such that an augmented reality image collocated with the patient anatomy in the surgical field underneath the see-through device is visible to a viewer looking from above the see-through device towards the surgical field.
- The patient anatomical data may comprise output data of a semantic segmentation process of an anatomy scan image.
- The system may further comprise a convolutional neural network system configured to perform the semantic segmentation process to generate the patient anatomical data.
- There is also disclosed a method for providing an augmented reality image during an operation, comprising: providing a source of a patient anatomical data and a robot arm virtual image; real-time tracking, by means of a tracker, a position and orientation of a robot arm, a surgeon's head, a 3D display system and a patient anatomy to provide current position and orientation data; generating, by a surgical navigation image generator, a surgical navigation image comprising the patient anatomy and the robot arm virtual image in accordance with the current position and orientation data provided by the tracker; and showing the surgical navigation image at a 3D display system.
- These and other features, aspects and advantages of the invention will become better understood with reference to the following drawings, descriptions and claims.
- The surgical navigation system and method are presented herein by means of non-limiting example embodiments shown in a drawing, wherein:
- FIG. 1A shows a layout of a surgical room employing the surgical navigation system in accordance with an embodiment of the invention;
- FIG. 1B shows a layout of a surgical room employing the surgical navigation system in accordance with an embodiment of the invention;
- FIG. 1C shows a layout of a surgical room employing the surgical navigation system in accordance with an embodiment of the invention;
- FIG. 2A shows components of the surgical navigation system in accordance with an embodiment of the invention;
- FIG. 2B shows components of the surgical navigation system in accordance with an embodiment of the invention;
- FIG. 3A shows an example of an augmented reality display in accordance with an embodiment of the invention;
- FIG. 3B shows an example of an augmented reality display in accordance with an embodiment of the invention;
- FIG. 3C shows an example of an augmented reality display in accordance with an embodiment of the invention;
- FIG. 3D shows an example of an augmented reality display in accordance with an embodiment of the invention;
- FIG. 3E shows an example of an augmented reality display in accordance with an embodiment of the invention;
- FIG. 3F shows an example of an augmented reality display in accordance with an embodiment of the invention;
- FIG. 3G shows an example of an augmented reality display in accordance with an embodiment of the invention;
- FIG. 3H shows an example of an augmented reality display in accordance with an embodiment of the invention;
- FIG. 3I shows an example of an augmented reality display in accordance with an embodiment of the invention;
- FIG. 3J shows an example of an augmented reality display in accordance with an embodiment of the invention;
- FIG. 4A shows an embodiment of a 3D display system for use in an embodiment of the invention;
- FIG. 4B shows another embodiment of a 3D display system for use in an embodiment of the invention;
- FIG. 4C shows another embodiment of a 3D display system for use in an embodiment of the invention;
- FIG. 4D shows another embodiment of a 3D display system for use in an embodiment of the invention;
- FIG. 4E shows another embodiment of a 3D display system for use in an embodiment of the invention;
- FIG. 5A shows eye tracking in accordance with an embodiment of the invention;
- FIG. 5B shows eye tracking in accordance with an embodiment of the invention;
- FIG. 6 shows a 3D representation of the results of the semantic segmentation on one vertebra for use in an embodiment of the invention.
- The following detailed description is of the best currently contemplated modes of carrying out the invention. The description is not to be taken in a limiting sense, but is made merely for the purpose of illustrating the general principles of the invention.
- The system presented herein comprises a 3D display system 140 to be implemented directly in real surgical applications in a surgical room, as shown in FIGS. 1A-1C. The 3D display system 140 as shown in the example embodiment comprises a 3D display 142 for emitting a surgical navigation image 142A towards a see-through mirror 141 that is partially transparent and partially reflective, such that an augmented reality image 141A collocated with the patient anatomy in the surgical field 108 underneath the see-through mirror 141 is visible to a viewer looking from above the see-through mirror 141 towards the surgical field 108.
- The surgical room typically comprises a floor 101 on which an operating table 104 is positioned. A patient 105 lies on the operating table 104 while being operated on by a surgeon 106 with the use of various surgical instruments 107. The surgical navigation system as described in detail below can have its components, in particular the 3D display system 140, mounted to a ceiling 102, or alternatively to the floor 101 or a side wall 103 of the operating room. Furthermore, the components, in particular the 3D display system 140, can be mounted to an adjustable and/or movable floor-supported structure (such as a tripod). Components other than the 3D display system 140, such as the surgical image generator 131, can be implemented in a dedicated computing device 109, such as a stand-alone PC computer, which may have its own input controllers and display(s) 110.
- In general, the system is designed for use in such a configuration wherein the distance d1 between the surgeon's eyes and the see-through mirror 141 is shorter than the distance d2 between the see-through mirror 141 and the operative field at the patient anatomy 105 being operated on.
- In addition, the system comprises a robot arm 191 for handling some of the surgical tools. The robot arm 191 may have two closed-loop control systems: its own position system and one used with the optical tracker as presented herein. Both control systems may work together to ensure that the robot arm is in the right position. The robot arm's position system may comprise encoders placed at each joint to determine the angle or position of each element of the arm. The second system may comprise a robot arm marker array 126 attached to the robot arm to be tracked by the tracker 125, as described below. Any kind of surgical robotic system can be used, preferably one that follows the standards of the U.S. Food & Drug Administration.
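- As an illustrative sketch only (the disclosure does not prescribe an implementation), the following Python fragment shows how the two closed-loop sources described above might be cross-checked: a tool position computed from the joint encoders via forward kinematics is compared against the position reported by the optical tracker, and motion is trusted only when both agree. The planar kinematics, link lengths and 1 mm tolerance are hypothetical.

    import numpy as np

    def forward_kinematics(joint_angles, link_lengths):
        # Planar forward kinematics: (x, y) tool position from encoders alone.
        x = y = theta = 0.0
        for angle, length in zip(joint_angles, link_lengths):
            theta += angle
            x += length * np.cos(theta)
            y += length * np.sin(theta)
        return np.array([x, y])

    def poses_agree(encoder_pose, tracker_pose, tol_mm=1.0):
        # Both control loops must agree before the arm's pose is trusted.
        return np.linalg.norm(encoder_pose - tracker_pose) <= tol_mm

    # Three revolute joints; encoder readings in radians, lengths in mm.
    enc = forward_kinematics([0.10, -0.20, 0.05], [300.0, 250.0, 100.0])
    trk = enc + np.array([0.3, -0.2])   # simulated tracker measurement
    print(poses_agree(enc, trk))        # True: discrepancy is within 1 mm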
- FIG. 2A shows a functional schematic presenting connections between the components of the surgical navigation system, and FIG. 2B shows examples of physical embodiments of various components.
- The surgical navigation system comprises a tracking system for tracking in real time the position and/or orientation of various entities to provide current position and/or orientation data. For example, the system may comprise a plurality of arranged fiducial markers, which are trackable by a fiducial marker tracker 125. Any known type of tracking system can be used; for example, in the case of a marker tracking system, 4-point marker arrays are tracked by a three-camera sensor to provide movement along six degrees of freedom. A head position marker array 121 can be attached to the surgeon's head for tracking of the position and orientation of the surgeon and the direction of the surgeon's gaze; for example, the head position marker array 121 can be integrated with the wearable 3D glasses 151 or can be attached to a strip worn over the surgeon's head.
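- As an illustrative sketch only, the following Python fragment shows a standard way the rigid pose of such a 4-point marker array can be recovered from its known geometry and the triangulated camera observations (the Kabsch algorithm); the marker coordinates are hypothetical. A non-coplanar array, as noted below, improves the conditioning of this estimate.

    import numpy as np

    def rigid_pose(model_pts, observed_pts):
        # Returns R, t such that observed ~ R @ model + t (Kabsch algorithm).
        cm, co = model_pts.mean(axis=0), observed_pts.mean(axis=0)
        H = (model_pts - cm).T @ (observed_pts - co)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T               # D guards against reflections
        return R, co - R @ cm

    # A non-coplanar 4-point array (positions in mm, model coordinates).
    model = np.array([[0, 0, 0], [50, 0, 0], [0, 50, 0], [0, 0, 30]], float)
    true_R = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
    observed = model @ true_R.T + np.array([10.0, 20.0, 5.0])
    R, t = rigid_pose(model, observed)
    print(np.allclose(R, true_R), t.round(3))   # True [10. 20.  5.]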
- A display marker array 122 can be attached to the see-through mirror 141 of the 3D display system 140 for tracking its position and orientation, as the see-through mirror 141 is movable and can be placed according to the current needs of the operative setup.
- A patient anatomy marker array 123 can be attached at a particular position and orientation of the anatomy of the patient.
- A surgical instrument marker array 124 can be attached to the instrument whose position and orientation shall be tracked.
- A robot arm marker array 126 can be attached to at least one robot arm 191 to track its position.
- Preferably, the markers in at least one of the marker arrays 121-124 are not coplanar, which helps to improve the accuracy of the tracking system.
- Therefore, the tracking system comprises means for real-time tracking of the position and orientation of at least one of: a surgeon's head 106, a 3D display 142, a patient anatomy 105, and surgical instruments 107. Preferably, all of these elements are tracked by a fiducial marker tracker 125.
navigation image generator 131 is configured to generate an image to be viewed via the see-throughmirror 141 of the 3D display system, It generates asurgical navigation image 142A comprising data of at least one of: the pre-operative plan 161 (which are generated and stored in a database before the operation), data of the intra-operative plan 162 (which can be generated live during the operation), data of the patient anatomy scan 163 (which can be generated before the operation or live during the operation) andvirtual images 164 of surgical instruments used during the operation (which are stored as 3D models in a database), as well asvirtual image 166 of therobot arm 191. - The surgical
navigation image generator 131, as well as other components of the system, can be controlled by a user (i.e. a surgeon or support staff) by one ormore user interfaces 132, such as foot-operable pedals (which are convenient to be operated by the surgeon), a keyboard, a mouse, a joystick, a button, a switch, an audio interface (such as a microphone), a gesture interface, a gaze detecting interface etc. The input interface(s) are for inputting instructions and/or commands. - All system components are controlled by one or more computers which are/is controlled by an operating system and one or more software applications. The computer may be equipped with a suitable memory which may store computer program or programs executed by the computer in order to execute steps of the methods utilized in the system. Computer programs are preferably stored on a non-transitory medium. An example of a non-transitory medium is a non-volatile memory, for example a flash memory while an example of a. volatile memory is RAM, The computer instructions are executed by a processor, These memories are exemplary recording media for storing computer programs comprising computer-executable instructions performing all the steps of the computer-implemented method according the technical concept presented herein. The computer(s) can he placed within the operating room or outside the operating room. Communication between the computer(s) and the components of the system may be performed by wire or wirelessly, according to known communication means.
- The aim of the system in some embodiments is to generate, via the
3D display system 140, an augmented reality image such as shown inFIG. 3J , and also possibly 3A-31. When the surgeon looks via the3D display system 140, the surgeon sees the augmentedreality image 141A which comprises: - the real world image: the patient anatomy, surgeon's hands and the instrument currently in use (which may be partially inserted into the patient's body and hidden under the skin);
- and a computer-generated
surgical navigation image 142A comprising thepatient anatomy 163 and avirtual image 166 of the robot arm. - As a result, the augmented reality image comprises a
virtual image 166 of the robot arm collocated with the real physical anatomy of the patient, as shown inFIG. 3B . Furthermore, the augmented reality image may comprise aguidance image 166A that indicates, according to the preoperative plan data, the suggested position and orientation of therobot arm 191. - The
virtual image 166 of the robot arm may be configurable such that it can be selectively displayed or hidden, in full or in part (for example, some parts of the robot arm can be hidden (such as the forearm) and some (such as the surgical tool holder) can be visible). Moreover, the opacity of the robot armvirtual image 166 can be selectively changed, such that it does not obstruct the patient anatomy. - The display of the
patient anatomy 163 can be configurable, such that at least one section of theanatomy 163A-163F is displayed and at least one other section of theanatomy 163A-163F is not displayed, as shown inFIGS. 3F-31 . - Furthermore, the surgical navigation image may further comprise a
3D image 171 representing at least one of: the virtual image of theinstrument 164 or surgical guidance indicating suggested (ideal) trajectory and placement ofsurgical instruments 107, according to thepre-operative plans 161. (as shown inFIG. 3C ); preferably, three different orthogonal planes of the patient anatomical data 163: coronal 174, sagittal 173, axial 172; preferably, amenu 175 for controlling the system operation. - If the
3D display 142 is stereoscopic, the surgeon shall use a pair of3D glasses 151 to view theaugmented reality image 141A. However, if the3D display 142 is autostereoscopic, it may be not necessary for the surgeon to use the3D glasses 151 to view theaugmented reality image 141A. - The virtual image of the
patient anatomy 163 is generated based on data representing a three-dimensional segmented model comprising at least two sections representing parts of the anatomy. The anatomy can be for example a bone structure, such as a spine, skull, pelvis, long bones, shoulder joint, hip joint, knee joint etc. This description presents examples related particularly to a spine, but a skilled person will realize how to adapt the embodiments to be applicable to the other bony structures or other anatomy parts as well. - For example, the model can represent a spine, as shown in
FIG. 6 , with the following section:spinous process 163A, lamina 163B,articular process 163C,transverse process 163D, pedicles 163E,vertebral body 163F. - The model can be generated based on a pre-operative scan of the patient and then segmented manually by a user or automatically by a computer, using dedicated algorithms and/or neural networks, or in a hybrid approach including a computer-assisted manual segmentation.
- For example, a convolutional neural network can be employed.
- Preferably, the images of the
orthogonal planes 3D image 171, as shown inFIG. 3A , wherein the3D image 171 occupies more than 50% of the area of the see-throughdevice 141. - The location of the images of the
orthogonal planes 3D image 171, when the surgeon changes the position of the head during operation, such as not to interfere with the3D image 171. - Therefore, in general, the anatomical information of the user is shown in two different layouts that merge for an augmented and mixed reality feature. The first layout is the anatomical information that is projected in 3D in the surgical field. The second layout is in the orthogonal planes.
- The
surgical navigation image 142A is generated by theimage generator 131 in accordance with the tracking data provided by thefiducial marker tracker 125, in order to superimpose the anatomy images and the instrument images exactly over the real objects, in accordance with the position and orientation of the surgeon's head. The markers are tracked in real time and the image is generated in real time. Therefore, the surgicalnavigation image generator 131 provides graphics rendering of the virtual objects (patient anatomy, surgical plan and instruments) collocated to the real objects according to the perspective of the surgeon's perspective. - For example, surgical guidance may relate to suggestions (virtual guidance clues 164) for placement of a pedicle screw in spine surgery or the ideal orientation of an acetabular component in hip arthroplasty surgery. These suggestions may take a form of animations that show the surgeon whether the placement is correct. The suggestions may be displayed both on the 3D holographic display and the orthogonal planes. The surgeon may use the system to plan these orientations before or during the surgical procedure.
- In particular, the
3D image 171 is adapted in real time to the position and orientation of the surgeon's head. The display of the differentorthogonal planes - The aligning the line of sight of the surgeon onto the see-through mirror with the patient anatomy underneath the see-through mirror, involving the scaling and orientation of the image, can be realized based on known solutions in the field of computer graphics processing, in particular for virtual reality, including virtual scene generation, using well-known mathematical formulas and algorithms related to viewer centered perspective. For example, such solutions are known from various tutorials and textbooks (such as The Future of the CAVE″ by T. A. DeFanti et al, Central European Journal of Engineering, 2010, DOI: 10.2478/s13531-010-0002-5).
-
FIG. 3B shows an example indicating collocation of the virtual image of thepatient anatomy 163 and thereal anatomy 105. - For example, as shown in
FIG. 3C, the 3D image 171 may demonstrate a mismatch between a supposed/suggested position of the instrument according to the pre-operative plan 161, displayed as a first virtual image of the instrument 164A located at its supposed/suggested position, and an actual position of the instrument, visible either as the real instrument via the see-through display and/or as a second virtual image of the instrument 164B overlaid on the current position of the instrument. Additionally, graphical guiding cues, such as arrows 165 indicating the direction of the supposed change of position, can be displayed.
- FIG. 3D shows a situation wherein the tip of the supposed position of the instrument, displayed as the first virtual image 164A according to the pre-operative plan 161, matches the tip of the real surgical instrument, visible or displayed as the second virtual image 164B. However, the remaining parts of the objects do not match; therefore, the graphical cues 165 still indicate the need to change position. The surgical instrument is close to the correct position and the system may provide information on how close the surgical instrument is to the planned position.
- FIG. 3E shows a situation wherein the position of the real surgical instrument matches the position of the instrument according to the pre-operative plan 161, i.e. the correct position for surgery. In this situation the graphical cues 165 are no longer displayed, and the virtual images 164A, 164B are displayed at the same position.
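- As an illustrative sketch only, the following Python fragment shows how the system could decide whether the guiding cues are still needed, by comparing the planned tip position and instrument axis with the tracked ones; the tolerance values are hypothetical. The example reproduces a FIG. 3D-like case, where the tips coincide but the orientation is still off.

    import numpy as np

    def guidance_cues(tip_plan, axis_plan, tip_real, axis_real,
                      tip_tol_mm=1.0, angle_tol_deg=2.0):
        # Axes are assumed to be unit vectors.
        tip_err = np.linalg.norm(tip_plan - tip_real)
        angle_err = np.degrees(np.arccos(np.clip(axis_plan @ axis_real, -1, 1)))
        matched = tip_err <= tip_tol_mm and angle_err <= angle_tol_deg
        return {
            "show_arrows": not matched,          # cues disappear once matched
            "tip_error_mm": round(tip_err, 2),
            "angle_error_deg": round(angle_err, 2),
        }

    axis = np.array([0.0, 0.0, 1.0])
    tilted = np.array([np.sin(np.radians(5)), 0.0, np.cos(np.radians(5))])
    print(guidance_cues(np.zeros(3), axis, np.zeros(3), tilted))
    # {'show_arrows': True, 'tip_error_mm': 0.0, 'angle_error_deg': 5.0}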
full patient anatomy 163, as shown inFIG. 3A , may be obstructive. To solve this problem, the system allows a selective display of the parts of theanatomy 163, such that at least one part of the anatomy is shown and at least one other part of the anatomy is not shown. - For example, the surgeon may only want to see isolated parts of the spinal anatomy during spine surgery (only vertebral body or only the pedicle). Each part of the spinal anatomy is displayed at the request of the surgeon. For example the surgeon may only want to see the virtual representation of the pedicle during placement of bony anchors. This would be advantageous, as it would not have any visual interference from the surrounding anatomical structures.
- Therefore, a single part of the anatomy may be displayed, for example only the
vertebral body 163F (FIG. 3F ) or only thepedicles 163E (FIG. 3G ). Alternatively, two parts of the anatomy may be displayed, for example thevertebral body 163F and thepedicles 163E (FIG. 3H ); or a larger group of anatomy parts may be displayed, such as the top parts of 163A-D of the spine (FIG. 3I ). - The user may select the parts that are to be displayed via the
input interface 132. - For example, the GUI may comprise a set of predefined display templates, each template defining a particular part of the anatomy to be displayed (such as
FIG. 3F, 3G ) or a plurality of parts of the anatomy to be displayed (such asFIG. 3H, 3I ). The user may then use a dedicated touch-screen button, keyboard key, pedal or other user interface navigation element to select a particular template to be displayed or to switch between consecutive templates. - Alternatively, the GUI may display a list of available parts of anatomy to be displayed and the user may select the parts to be displayed.
- The GUI interface for configuring the parts that are to be displayed can be configured to be operated directly by the surgeon or by an assistant person.
- The foregoing description will provide examples of a
3D display 142 with a see-throughmirror 141, which is particularly effective to provide the surgical navigation data. However, other 3D display systems can be used as well to show the automatically segmented parts of anatomy, such as 3D head-mounted displays. - The see-through mirror (also called a half-silvered mirror) 141 is at least partially transparent and partially reflective, such that the viewer can see the real world behind the mirror but the mirror also reflects the surgical navigation image generated by the display apparatus located above it.
- For example, a see-through mirror as commonly used in teleprompters can be used. For example, the see-through
mirror 141 can have a reflective and transparent rate of 50R/50T, but other rates can be used as well. - The surgical navigation image is emitted from above the see-through
mirror 141 by the3D display 142. - In an example embodiment as shown in
FIGS. 4A and 4B , a special design of the3D display 142 is provided that is compact in size to facilitate its mounting within a limited space at the operating room. That design allows generating images of relatively large size, taking into account the small distance between the3D display 142 and the see-throughmirror 141, without the need to use wide-angle lens that could distort the image. - The
3D display 142 comprises a3D projector 143, such as a DLP projector, that is configured to generate an image, as shown inFIG. 4B (by the dashed lines showing image projection and solid lines showing images generated on particular reflective planes). The image from the3D projector 143 is firstly refracted by an opaquetop mirror 144, then it is refracted by an opaquevertical mirror 145 and subsequently placed on the correct dimensions on a projection screen 146 (which can be simply a glass panel). Theprojection screen 146 works as a rear-projection screen or a small bright 3D display. The image displayed at theprojection screen 146 is reflected by the see-throughmirror 141 which works as an augmented reality device. Such configuration of themirrors 3D projector 143 to be shown with an appropriate size at theprojection screen 146. The fact that theprojection screen 146 emits an enlarged image generated by the3D projector 143 makes the emitted surgical navigation image bright, and therefore well visible when reflected at the see-throughmirror 141.Reference 141A indicates the augmented reality image as perceived by the surgeon when looking at the see-throughmirror 141. - The see-through
mirror 141 is held at a predefined position with respect to the3D projector 143, in particular with respect to the3D projector 143, by anarm 147, which may have afirst portion 147A fixed to the casing of the3D display 142 and asecond portion 147B detachably fixed to thefirst portion 147A. Thefirst portion 147A may have a protective sleeve overlaid on it. Thesecond portion 147B, together with the see-throughmirror 141, may be disposable in order to maintain the sterility of the operating room, as it is relatively close to the operating field and may be contaminated during the operation. The arm can also be foldable upwards to leave free space of the work space when the arm and augmented reality are not needed. - In alternative embodiments, as shown for example in
FIGS. 4C, 4D, 4E , alternative devices may be used in the3D display system 140 in place of the see-throughmirror 141 and the3D display 142. - As shown in
FIG. 4C , a3D monitor 146A can be used directly in place of theprojection screen 146. - As shown in
FIG. 4D , a3D projector 143 can be used instead of the3D display 142 ofFIG. 4A , to project the surgical navigation image onto a see-throughprojection screen 141B, which is partially transparent and partially reflective, for showing thesurgical navigation image 142A and allowing thesurgical field 108 to be seen. Alens 141C can be used to provide appropriate focal position of the surgical navigation image. - As shown in
FIG. 4F , the surgical navigation image can be displayed at a three-dimensional see-throughdisplay 141D and viewed by the user via alens 141C used to provide appropriate focal position of the surgical navigation image. - Therefore, each of the see-through
projection screen 141B, the see-throughdisplay 141D and the see-throughmirror 141 can be commonly called a see-through device. - If a need arises to adapt the position of the augmented reality screen with respect to the surgeon's head (for example, to accommodate the position depending on the height of the particular surgeon), the position of the whole
3D display system 140 can be changed, for example by manipulating an adjustable holder (a surgical boom) 149 onFIG. 1A , by which the3D display 142 is attachable to an operating room structure, such as a ceiling, a wall or a floor. - An
eye tracker 148 module can be installed at the casing of the3D display 142 or at the see-throughdevice 141 or at thewearable glasses 151, to track the position and orientation of the eyes of the surgeon and input that as commands via the gaze input interface to control the display parameters at the surgicalnavigation image generator 131, for example to activate different functions based on the location that is being looked at, as shown inFIGS. 5A and 5B . - For example, the
eye tracker 148 may use infrared light to illuminate the eyes of the user without affecting the visibility of the user, wherein the reflection and refraction of the patterns on the eyes are utilized to determine the gaze vector (i.e. the direction at which the eye is pointing out). The gaze vector along with the position and orientation of the user's head is used to interact with the graphical user interface. However, other eye tracking algorithms techniques can be used as well. - It is particularly useful to use the
eye tracker 148 along with thepedals 132 as the input interface, wherein the surgeon may navigate the system by moving a cursor by eyesight and inputting commands (such as select or cancel) by pedals. - While the invention has been described with respect to a limited number of embodiments, it will be appreciated that many variations, modifications and other applications of the invention may be made. Therefore, the claimed invention as recited in the claims that follow is not limited to the embodiments described herein.
Claims (21)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/698,779 US20220346889A1 (en) | 2017-08-15 | 2022-03-18 | Graphical user interface for use in a surgical navigation system with a robot arm |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP17186307.9A EP3445048A1 (en) | 2017-08-15 | 2017-08-15 | A graphical user interface for a surgical navigation system for providing an augmented reality image during operation |
EP17206558.3 | 2017-12-12 | ||
EP17206558.3A EP3443924B8 (en) | 2017-08-15 | 2017-12-12 | A graphical user interface for use in a surgical navigation system with a robot arm |
US16/217,073 US11278359B2 (en) | 2017-08-15 | 2018-12-12 | Graphical user interface for use in a surgical navigation system with a robot arm |
US17/698,779 US20220346889A1 (en) | 2017-08-15 | 2022-03-18 | Graphical user interface for use in a surgical navigation system with a robot arm |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/217,073 Continuation US11278359B2 (en) | 2017-08-15 | 2018-12-12 | Graphical user interface for use in a surgical navigation system with a robot arm |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220346889A1 true US20220346889A1 (en) | 2022-11-03 |
Family
ID=59649522
Family Applications (7)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/059,061 Active US10646285B2 (en) | 2017-08-15 | 2018-08-09 | Graphical user interface for a surgical navigation system and method for providing an augmented reality image during operation |
US16/186,549 Active 2038-11-14 US11622818B2 (en) | 2017-08-15 | 2018-11-11 | Graphical user interface for displaying automatically segmented individual parts of anatomy in a surgical navigation system |
US16/217,073 Active 2040-01-24 US11278359B2 (en) | 2017-08-15 | 2018-12-12 | Graphical user interface for use in a surgical navigation system with a robot arm |
US16/842,793 Abandoned US20200229877A1 (en) | 2017-08-15 | 2020-04-08 | Graphical user interface for a surgical navigation system and method for providing an augmented reality image during operation |
US17/145,178 Granted US20210267698A1 (en) | 2017-08-15 | 2021-01-08 | Graphical user interface for a surgical navigation system and method for providing an augmented reality image during operation |
US17/698,779 Pending US20220346889A1 (en) | 2017-08-15 | 2022-03-18 | Graphical user interface for use in a surgical navigation system with a robot arm |
US18/298,235 Pending US20240074822A1 (en) | 2017-08-15 | 2023-04-10 | Graphical user interface for displaying automatically segmented individual parts of anatomy in a surgical navigation system |
Family Applications Before (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/059,061 Active US10646285B2 (en) | 2017-08-15 | 2018-08-09 | Graphical user interface for a surgical navigation system and method for providing an augmented reality image during operation |
US16/186,549 Active 2038-11-14 US11622818B2 (en) | 2017-08-15 | 2018-11-11 | Graphical user interface for displaying automatically segmented individual parts of anatomy in a surgical navigation system |
US16/217,073 Active 2040-01-24 US11278359B2 (en) | 2017-08-15 | 2018-12-12 | Graphical user interface for use in a surgical navigation system with a robot arm |
US16/842,793 Abandoned US20200229877A1 (en) | 2017-08-15 | 2020-04-08 | Graphical user interface for a surgical navigation system and method for providing an augmented reality image during operation |
US17/145,178 Granted US20210267698A1 (en) | 2017-08-15 | 2021-01-08 | Graphical user interface for a surgical navigation system and method for providing an augmented reality image during operation |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/298,235 Pending US20240074822A1 (en) | 2017-08-15 | 2023-04-10 | Graphical user interface for displaying automatically segmented individual parts of anatomy in a surgical navigation system |
Country Status (2)
Country | Link |
---|---|
US (7) | US10646285B2 (en) |
EP (3) | EP3445048A1 (en) |
Families Citing this family (88)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2955481B1 (en) | 2010-01-27 | 2013-06-14 | Tornier Sa | DEVICE AND METHOD FOR GLENOIDAL CHARACTERIZATION OF PROSTHETIC OR RESURFACING OMOPLATE |
US8672837B2 (en) | 2010-06-24 | 2014-03-18 | Hansen Medical, Inc. | Methods and devices for controlling a shapeable medical device |
US20140067869A1 (en) | 2012-08-30 | 2014-03-06 | Atheer, Inc. | Method and apparatus for content association and history tracking in virtual and augmented reality |
US9057600B2 (en) | 2013-03-13 | 2015-06-16 | Hansen Medical, Inc. | Reducing incremental measurement sensor error |
US9014851B2 (en) | 2013-03-15 | 2015-04-21 | Hansen Medical, Inc. | Systems and methods for tracking robotically controlled medical instruments |
JP6138566B2 (en) * | 2013-04-24 | 2017-05-31 | 川崎重工業株式会社 | Component mounting work support system and component mounting method |
US11020016B2 (en) | 2013-05-30 | 2021-06-01 | Auris Health, Inc. | System and method for displaying anatomy and devices on a movable display |
US10013808B2 (en) | 2015-02-03 | 2018-07-03 | Globus Medical, Inc. | Surgeon head-mounted display apparatuses |
GB2536650A (en) | 2015-03-24 | 2016-09-28 | Augmedics Ltd | Method and system for combining video-based and optic-based augmented reality in a near eye display |
EP3376987B1 (en) * | 2015-11-19 | 2020-10-28 | EOS Imaging | Method of preoperative planning to correct spine misalignment of a patient |
GB201617507D0 (en) | 2016-10-14 | 2016-11-30 | Axial3D Limited | Axial3D UK |
US11138790B2 (en) | 2016-10-14 | 2021-10-05 | Axial Medical Printing Limited | Method for generating a 3D physical model of a patient specific anatomic feature from 2D medical images |
EP3585254B1 (en) | 2017-02-24 | 2024-03-20 | Masimo Corporation | Medical device cable and method of sharing data between connected medical devices |
WO2018156809A1 (en) * | 2017-02-24 | 2018-08-30 | Masimo Corporation | Augmented reality system for displaying patient data |
EP4278956A3 (en) | 2017-03-10 | 2024-02-21 | Biomet Manufacturing, LLC | Augmented reality supported knee surgery |
WO2018208616A1 (en) | 2017-05-08 | 2018-11-15 | Masimo Corporation | System for pairing a medical system to a network controller by use of a dongle |
US10140421B1 (en) | 2017-05-25 | 2018-11-27 | Enlitic, Inc. | Medical scan annotator system |
US10022192B1 (en) | 2017-06-23 | 2018-07-17 | Auris Health, Inc. | Automatically-initialized robotic systems for navigation of luminal networks |
EP3445048A1 (en) | 2017-08-15 | 2019-02-20 | Holo Surgical Inc. | A graphical user interface for a surgical navigation system for providing an augmented reality image during operation |
EP3470006B1 (en) | 2017-10-10 | 2020-06-10 | Holo Surgical Inc. | Automated segmentation of three dimensional bony structure images |
US20190254753A1 (en) | 2018-02-19 | 2019-08-22 | Globus Medical, Inc. | Augmented reality navigation systems for use with robotic surgical systems and methods of their use |
KR102500422B1 (en) | 2018-03-28 | 2023-02-20 | 아우리스 헬스, 인코포레이티드 | System and method for displaying the estimated position of an instrument |
US10383692B1 (en) * | 2018-04-13 | 2019-08-20 | Taiwan Main Orthopaedic Biotechnology Co., Ltd. | Surgical instrument guidance system |
EP3787543A4 (en) | 2018-05-02 | 2022-01-19 | Augmedics Ltd. | Registration of a fiducial marker for an augmented reality system |
JP7146949B2 (en) | 2018-05-31 | 2022-10-04 | オーリス ヘルス インコーポレイテッド | Image-based airway analysis and mapping |
WO2019245857A1 (en) | 2018-06-19 | 2019-12-26 | Tornier, Inc. | Neural network for diagnosis of shoulder condition |
EP3608870A1 (en) | 2018-08-10 | 2020-02-12 | Holo Surgical Inc. | Computer assisted identification of appropriate anatomical structure for medical device placement during a surgical procedure |
US11164067B2 (en) * | 2018-08-29 | 2021-11-02 | Arizona Board Of Regents On Behalf Of Arizona State University | Systems, methods, and apparatuses for implementing a multi-resolution neural network for use with imaging intensive applications including medical imaging |
EP3629340A1 (en) * | 2018-09-28 | 2020-04-01 | Siemens Healthcare GmbH | Medical imaging device comprising a medical scanner unit and at least one display, and method for controlling at least one display of a medical imaging device |
US11287874B2 (en) | 2018-11-17 | 2022-03-29 | Novarad Corporation | Using optical codes with augmented reality displays |
US11457871B2 (en) | 2018-11-21 | 2022-10-04 | Enlitic, Inc. | Medical scan artifact detection system and methods for use therewith |
US11282198B2 (en) | 2018-11-21 | 2022-03-22 | Enlitic, Inc. | Heat map generating system and methods for use therewith |
US11145059B2 (en) | 2018-11-21 | 2021-10-12 | Enlitic, Inc. | Medical scan viewing system with enhanced training and methods for use therewith |
US11011257B2 (en) | 2018-11-21 | 2021-05-18 | Enlitic, Inc. | Multi-label heat map display system |
US11766296B2 (en) | 2018-11-26 | 2023-09-26 | Augmedics Ltd. | Tracking system for image-guided surgery |
US11995854B2 (en) * | 2018-12-19 | 2024-05-28 | Nvidia Corporation | Mesh reconstruction using data-driven priors |
US11475565B2 (en) * | 2018-12-21 | 2022-10-18 | GE Precision Healthcare LLC | Systems and methods for whole-body spine labeling |
GB201900437D0 (en) | 2019-01-11 | 2019-02-27 | Axial Medical Printing Ltd | Axial3d big book 2 |
US11532132B2 (en) * | 2019-03-08 | 2022-12-20 | Mubayiwa Cornelious MUSARA | Adaptive interactive medical training program with virtual patients |
EP3937825A4 (en) * | 2019-03-13 | 2022-12-28 | Smith&Nephew, Inc. | Augmented reality assisted surgical tool alignment |
EP3948800A4 (en) * | 2019-04-04 | 2023-05-10 | Centerline Biomedical, Inc. | Registration of spatial tracking system with augmented reality display |
CN110215284B (en) * | 2019-06-06 | 2021-04-02 | 上海木木聚枞机器人科技有限公司 | Visualization system and method |
US11980506B2 (en) | 2019-07-29 | 2024-05-14 | Augmedics Ltd. | Fiducial marker |
JP7451686B2 (en) | 2019-08-30 | 2024-03-18 | オーリス ヘルス インコーポレイテッド | Instrument image reliability system and method |
EP4021331A4 (en) | 2019-08-30 | 2023-08-30 | Auris Health, Inc. | Systems and methods for weight-based registration of location sensors |
US11791044B2 (en) * | 2019-09-06 | 2023-10-17 | RedNova Innovations, Inc. | System for generating medical reports for imaging studies |
US11426178B2 (en) | 2019-09-27 | 2022-08-30 | Globus Medical Inc. | Systems and methods for navigating a pin guide driver |
US11890066B2 (en) | 2019-09-30 | 2024-02-06 | Globus Medical, Inc | Surgical robot with passive end effector |
US11864857B2 (en) | 2019-09-27 | 2024-01-09 | Globus Medical, Inc. | Surgical robot with passive end effector |
EP3808304A1 (en) * | 2019-10-16 | 2021-04-21 | DePuy Ireland Unlimited Company | Method and system for guiding position and orientation of a robotic device holding a surgical tool |
US11253324B1 (en) * | 2019-11-06 | 2022-02-22 | Cognistic, LLC | Determination of appendix position using a two stage deep neural network |
US11462315B2 (en) | 2019-11-26 | 2022-10-04 | Enlitic, Inc. | Medical scan co-registration and methods for use therewith |
US11992373B2 (en) | 2019-12-10 | 2024-05-28 | Globus Medical, Inc | Augmented reality headset with varied opacity for navigated robotic surgery |
US11382712B2 (en) | 2019-12-22 | 2022-07-12 | Augmedics Ltd. | Mirroring in image guided surgery |
TWI793390B (en) * | 2019-12-25 | 2023-02-21 | 財團法人工業技術研究院 | Method, processing device, and display system for information display |
US11237627B2 (en) | 2020-01-16 | 2022-02-01 | Novarad Corporation | Alignment of medical images in augmented reality displays |
US11464581B2 (en) | 2020-01-28 | 2022-10-11 | Globus Medical, Inc. | Pose measurement chaining for extended reality surgical navigation in visible and near infrared spectrums |
US11382699B2 (en) | 2020-02-10 | 2022-07-12 | Globus Medical Inc. | Extended reality visualization of optical tool tracking volume for computer assisted navigation in surgery |
US11207150B2 (en) | 2020-02-19 | 2021-12-28 | Globus Medical, Inc. | Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment |
US11607277B2 (en) | 2020-04-29 | 2023-03-21 | Globus Medical, Inc. | Registration of surgical tool with reference array tracked by cameras of an extended reality headset for assisted navigation during surgery |
US11636628B2 (en) | 2020-05-01 | 2023-04-25 | International Business Machines Corporation | Composite imagery rendering in diminished reality environment for medical diagnosis |
US20210346093A1 (en) * | 2020-05-06 | 2021-11-11 | Warsaw Orthopedic, Inc. | Spinal surgery system and methods of use |
US20210349534A1 (en) * | 2020-05-07 | 2021-11-11 | Alcon Inc. | Eye-tracking system for entering commands |
US11153555B1 (en) | 2020-05-08 | 2021-10-19 | Globus Medical Inc. | Extended reality headset camera system for computer assisted navigation in surgery |
US11510750B2 (en) | 2020-05-08 | 2022-11-29 | Globus Medical, Inc. | Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications |
US11382700B2 (en) | 2020-05-08 | 2022-07-12 | Globus Medical Inc. | Extended reality headset tool tracking and control |
EP3944254A1 (en) * | 2020-07-21 | 2022-01-26 | Siemens Healthcare GmbH | System for displaying an augmented reality and method for generating an augmented reality |
US11737831B2 (en) | 2020-09-02 | 2023-08-29 | Globus Medical Inc. | Surgical object tracking template generation for computer assisted navigation during surgical procedure |
CN112190331A (en) * | 2020-10-15 | 2021-01-08 | 北京爱康宜诚医疗器材有限公司 | Method, device and system for determining surgical navigation information and electronic device |
CN112263331B (en) * | 2020-10-30 | 2022-04-05 | 上海初云开锐管理咨询有限公司 | System and method for presenting medical instrument vision in vivo |
US11786309B2 (en) * | 2020-12-28 | 2023-10-17 | Advanced Neuromodulation Systems, Inc. | System and method for facilitating DBS electrode trajectory planning |
US11669678B2 (en) | 2021-02-11 | 2023-06-06 | Enlitic, Inc. | System with report analysis and methods for use therewith |
GB202101908D0 (en) | 2021-02-11 | 2021-03-31 | Axial Medical Printing Ltd | Axial3D pathology |
NL2027671B1 (en) * | 2021-02-26 | 2022-09-26 | Eindhoven Medical Robotics B V | Augmented reality system to simulate an operation on a patient |
CN113509265A (en) * | 2021-04-01 | 2021-10-19 | 上海复拓知达医疗科技有限公司 | Dynamic position identification prompting system and method thereof |
US11967066B2 (en) * | 2021-04-12 | 2024-04-23 | Daegu Gyeongbuk Institute Of Science And Technology | Method and apparatus for processing image |
EP4329660A1 (en) * | 2021-04-30 | 2024-03-06 | Augmedics, Inc. | Graphical user interface for a surgical navigation system |
US11896445B2 (en) | 2021-07-07 | 2024-02-13 | Augmedics Ltd. | Iliac pin and adapter |
CN113786228B (en) * | 2021-09-15 | 2024-04-12 | 苏州朗润医疗系统有限公司 | Auxiliary puncture navigation system based on AR augmented reality |
WO2023047355A1 (en) * | 2021-09-26 | 2023-03-30 | Augmedics Ltd. | Surgical planning and display |
CA3236128A1 (en) * | 2021-10-23 | 2023-04-27 | Nelson STONE | Procedure guidance and training apparatus, methods and systems |
BE1029880B1 (en) * | 2021-10-26 | 2023-05-30 | Rods&Cones Holding Bv | Automated user preferences |
US11995853B2 (en) * | 2021-11-04 | 2024-05-28 | Honeywell Federal Manufacturing & Technologies, Llc | System and method for transparent augmented reality |
WO2023129934A1 (en) * | 2021-12-31 | 2023-07-06 | Intuitive Surgical Operations, Inc. | Systems and methods for integrating intra-operative image data with minimally invasive medical techniques |
WO2023156608A1 (en) * | 2022-02-21 | 2023-08-24 | Universität Zürich, Prorektorat Forschung | Method, computing device, system, and computer program product for assisting positioning of a tool with respect to a specific body part of a patient |
WO2023175588A1 (en) * | 2022-03-18 | 2023-09-21 | DePuy Synthes Products, Inc. | Surgical systems, methods, and devices employing augmented reality (ar) guidance |
CN115778544B (en) * | 2022-12-05 | 2024-02-27 | 方田医创(成都)科技有限公司 | Surgical navigation precision indicating system, method and storage medium based on mixed reality |
CN117137450B (en) * | 2023-08-30 | 2024-05-10 | 哈尔滨海鸿基业科技发展有限公司 | Flap implantation imaging method and system based on flap blood transport assessment |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020082498A1 (en) * | 2000-10-05 | 2002-06-27 | Siemens Corporate Research, Inc. | Intra-operative image-guided neurosurgery with augmented reality visualization |
US10292768B2 (en) * | 2016-03-12 | 2019-05-21 | Philipp K. Lang | Augmented reality guidance for articular procedures |
Family Cites Families (103)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6405072B1 (en) | 1991-01-28 | 2002-06-11 | Sherwood Services Ag | Apparatus and method for determining a location of an anatomical target with reference to a medical apparatus |
AU2003246906A1 (en) | 2002-06-25 | 2004-01-06 | Michael Nicholas Dalton | Apparatus and method for superimposing images over an object |
US20050190446A1 (en) | 2002-06-25 | 2005-09-01 | Carl Zeiss Amt Ag | Catadioptric reduction objective |
US7376903B2 (en) * | 2004-06-29 | 2008-05-20 | Ge Medical Systems Information Technologies | 3D display system and method |
WO2006086223A2 (en) * | 2005-02-08 | 2006-08-17 | Blue Belt Technologies, Inc. | Augmented reality device and method |
WO2006111965A2 (en) | 2005-04-20 | 2006-10-26 | Visionsense Ltd. | System and method for producing an augmented image of an organ of a patient |
US9289267B2 (en) | 2005-06-14 | 2016-03-22 | Siemens Medical Solutions Usa, Inc. | Method and apparatus for minimally invasive surgery using endoscopes |
EP1919390B1 (en) | 2005-08-05 | 2012-12-19 | DePuy Orthopädie GmbH | Computer assisted surgery system |
US10653497B2 (en) | 2006-02-16 | 2020-05-19 | Globus Medical, Inc. | Surgical tool systems and methods |
ITTO20060223A1 (en) | 2006-03-24 | 2007-09-25 | I Med S R L | PROCEDURE AND SYSTEM FOR THE AUTOMATIC RECOGNITION OF PRENEOPLASTIC ANOMALIES IN ANATOMICAL STRUCTURES, AND RELATIVE PROGRAM FOR PROCESSOR |
WO2007115826A2 (en) * | 2006-04-12 | 2007-10-18 | Nassir Navab | Virtual penetrating mirror device for visualizing of virtual objects within an augmented reality environment |
US9532848B2 (en) | 2007-06-15 | 2017-01-03 | Othosoft, Inc. | Computer-assisted surgery system and method |
CN101226325B (en) | 2008-02-03 | 2010-06-02 | 李志扬 | Three-dimensional display method and apparatus based on accidental constructive interference |
EP2194486A1 (en) | 2008-12-04 | 2010-06-09 | Koninklijke Philips Electronics N.V. | A method, apparatus, and computer program product for acquiring medical image data |
DE102010009554A1 (en) | 2010-02-26 | 2011-09-01 | Lüllau Engineering Gmbh | Method and irradiation apparatus for irradiating curved surfaces with non-ionizing radiation |
US8693755B2 (en) * | 2010-06-17 | 2014-04-08 | Siemens Medical Solutions Usa, Inc. | System for adjustment of image data acquired using a contrast agent to enhance vessel visualization for angiography |
US8675939B2 (en) | 2010-07-13 | 2014-03-18 | Stryker Leibinger Gmbh & Co. Kg | Registration of anatomical data sets |
EP2598034B1 (en) | 2010-07-26 | 2018-07-18 | Kjaya, LLC | Adaptive visualization for direct physician use |
CN103153239B (en) | 2010-08-13 | 2017-11-21 | 史密夫和内修有限公司 | System and method for optimizing orthopaedics process parameter |
US9358114B2 (en) | 2010-08-25 | 2016-06-07 | Smith & Nephew, Inc. | Intraoperative scanning for implant optimization |
US9785246B2 (en) | 2010-10-06 | 2017-10-10 | Nuvasive, Inc. | Imaging system and method for use in surgical and interventional medical procedures |
US9510771B1 (en) | 2011-10-28 | 2016-12-06 | Nuvasive, Inc. | Systems and methods for performing spine surgery |
CA2794898C (en) | 2011-11-10 | 2019-10-29 | Victor Yang | Method of rendering and manipulating anatomical images on mobile computing device |
EP2797542B1 (en) | 2011-12-30 | 2019-08-28 | MAKO Surgical Corp. | Systems and methods for customizing interactive haptic boundaries |
EP2890300B1 (en) | 2012-08-31 | 2019-01-02 | Kenji Suzuki | Supervised machine learning technique for reduction of radiation dose in computed tomography imaging |
WO2014160342A1 (en) | 2013-03-13 | 2014-10-02 | The University Of North Carolina At Chapel Hill | Low latency stabilization for head-worn displays |
US9782159B2 (en) | 2013-03-13 | 2017-10-10 | Camplex, Inc. | Surgical visualization systems |
WO2015001806A1 (en) | 2013-07-05 | 2015-01-08 | Panasonic Corporation | Projection system
WO2015058816A1 (en) | 2013-10-25 | 2015-04-30 | Brainlab Ag | Hybrid medical marker |
US9715739B2 (en) | 2013-11-07 | 2017-07-25 | The Johns Hopkins University | Bone fragment tracking |
US20170329402A1 (en) * | 2014-03-17 | 2017-11-16 | Spatial Intelligence Llc | Stereoscopic display |
US9723300B2 (en) | 2014-03-17 | 2017-08-01 | Spatial Intelligence Llc | Stereoscopic display |
KR20150108701A (en) | 2014-03-18 | 2015-09-30 | 삼성전자주식회사 | System and method for visualizing anatomic elements in a medical image |
EP3125759B1 (en) | 2014-03-27 | 2021-01-06 | Bresmedical Pty Limited | Computer aided surgical navigation and planning in implantology |
WO2015164402A1 (en) | 2014-04-22 | 2015-10-29 | Surgerati, Llc | Intra-operative medical image viewing system and method |
EP3151736A2 (en) | 2014-07-15 | 2017-04-12 | Sony Corporation | Computer assisted surgical system with position registration mechanism and method of operation thereof |
US20160015469A1 (en) | 2014-07-17 | 2016-01-21 | Kyphon Sarl | Surgical tissue recognition and navigation apparatus and method |
EP3221809A1 (en) | 2014-11-18 | 2017-09-27 | Koninklijke Philips N.V. | User guidance system and method, use of an augmented reality device |
JP6553354B2 (en) | 2014-12-22 | 2019-07-31 | Toyo Tire Corporation | Pneumatic radial tire
US10073516B2 (en) * | 2014-12-29 | 2018-09-11 | Sony Interactive Entertainment Inc. | Methods and systems for user interaction within virtual reality scene using head mounted display |
US10154239B2 (en) | 2014-12-30 | 2018-12-11 | Onpoint Medical, Inc. | Image-guided surgery with surface reconstruction and augmented reality visualization |
US10013808B2 (en) | 2015-02-03 | 2018-07-03 | Globus Medical, Inc. | Surgeon head-mounted display apparatuses |
US20160324580A1 (en) | 2015-03-23 | 2016-11-10 | Justin Esterberg | Systems and methods for assisted surgical navigation |
GB2536650A (en) | 2015-03-24 | 2016-09-28 | Augmedics Ltd | Method and system for combining video-based and optic-based augmented reality in a near eye display |
WO2016157260A1 (en) | 2015-03-31 | 2016-10-06 | Panasonic IP Management Co., Ltd. | Visible light projection device
WO2016162789A2 (en) | 2015-04-07 | 2016-10-13 | King Abdullah University Of Science And Technology | Method, apparatus, and system for utilizing augmented reality to improve surgery |
US10835322B2 (en) | 2015-04-24 | 2020-11-17 | Medtronic Navigation, Inc. | Direct visualization of a device location |
US9940539B2 (en) | 2015-05-08 | 2018-04-10 | Samsung Electronics Co., Ltd. | Object recognition apparatus and method |
US20180174311A1 (en) | 2015-06-05 | 2018-06-21 | Siemens Aktiengesellschaft | Method and system for simultaneous scene parsing and model fusion for endoscopic and laparoscopic navigation |
CN107708568B (en) | 2015-06-30 | 2020-11-20 | Canon U.S.A., Inc. | Registered fiducial markers, systems, and methods
US10070928B2 (en) | 2015-07-01 | 2018-09-11 | Mako Surgical Corp. | Implant placement planning |
US9949700B2 (en) * | 2015-07-22 | 2018-04-24 | Inneroptic Technology, Inc. | Medical device approaches |
CA2976516C (en) | 2015-07-27 | 2022-11-22 | Synaptive Medical (Barbados) Inc. | Navigational feedback for intraoperative waypoint |
US10105187B2 (en) | 2015-08-27 | 2018-10-23 | Medtronic, Inc. | Systems, apparatus, methods and computer-readable storage media facilitating surgical procedures utilizing augmented reality |
US20170084036A1 (en) | 2015-09-21 | 2017-03-23 | Siemens Aktiengesellschaft | Registration of video camera with medical imaging |
BR112018007473A2 (en) | 2015-10-14 | 2018-10-23 | Surgical Theater LLC | Augmented reality surgical navigation
US10390886B2 (en) * | 2015-10-26 | 2019-08-27 | Siemens Healthcare Gmbh | Image-based pedicle screw positioning |
EP3373815A4 (en) | 2015-11-13 | 2019-07-17 | Stryker European Holdings I, LLC | Adaptive positioning technology |
CN108603922A (en) | 2015-11-29 | 2018-09-28 | Arterys Inc. | Automated cardiac volume segmentation
US9675319B1 (en) * | 2016-02-17 | 2017-06-13 | Inneroptic Technology, Inc. | Loupe display |
US10788672B2 (en) * | 2016-03-01 | 2020-09-29 | Mirus Llc | Augmented visualization during surgery |
US10191615B2 (en) | 2016-04-28 | 2019-01-29 | Medtronic Navigation, Inc. | Method and apparatus for image-based navigation |
US11515030B2 (en) | 2016-06-23 | 2022-11-29 | Siemens Healthcare Gmbh | System and method for artificial agent based cognitive operating rooms |
EP3478211A4 (en) | 2016-07-04 | 2020-02-19 | 7D Surgical Inc. | Systems and methods for determining intraoperative spinal orientation |
US20180049622A1 (en) | 2016-08-16 | 2018-02-22 | Insight Medical Systems, Inc. | Systems and methods for sensory augmentation in medical procedures |
GB2568426B (en) | 2016-08-17 | 2021-12-15 | Synaptive Medical Inc | Methods and systems for registration of virtual space with real space in an augmented reality system |
EP3509697B1 (en) | 2016-09-07 | 2024-04-17 | Elekta, Inc. | System and method for learning models of radiotherapy treatment plans to predict radiotherapy dose distributions |
CN110248618B (en) | 2016-09-09 | 2024-01-09 | Mobius Imaging, LLC | Method and system for displaying patient data in computer-assisted surgery
WO2018052966A1 (en) | 2016-09-16 | 2018-03-22 | Zimmer, Inc. | Augmented reality surgical technique guidance |
US11839433B2 (en) | 2016-09-22 | 2023-12-12 | Medtronic Navigation, Inc. | System for guided procedures |
EP3375399B1 (en) | 2016-10-05 | 2022-05-25 | NuVasive, Inc. | Surgical navigation system |
CN106600568B (en) | 2017-01-19 | 2019-10-11 | Neusoft Medical Systems Co., Ltd. | Low-dose CT image de-noising method and device
EP3574504A1 (en) | 2017-01-24 | 2019-12-04 | Tietronix Software, Inc. | System and method for three-dimensional augmented reality guidance for use of medical equipment |
US20180271484A1 (en) | 2017-03-21 | 2018-09-27 | General Electric Company | Method and systems for a hand-held automated breast ultrasound device |
EP4131165A1 (en) | 2017-03-22 | 2023-02-08 | Brainlab AG | Augmented reality patient positioning using an atlas |
US10169873B2 (en) | 2017-03-23 | 2019-01-01 | International Business Machines Corporation | Weakly supervised probabilistic atlas generation through multi-atlas label fusion |
US10667864B2 (en) | 2017-04-19 | 2020-06-02 | Brainlab Ag | Inline-view determination |
US10624702B2 (en) | 2017-04-28 | 2020-04-21 | Medtronic Navigation, Inc. | Automatic identification of instruments |
CA3056260C (en) | 2017-05-09 | 2022-04-12 | Brainlab Ag | Generation of augmented reality image of a medical device |
JP2020525127A (en) | 2017-06-26 | 2020-08-27 | The Research Foundation for the State University of New York | System, method, and computer-accessible medium for virtual pancreatography
EP3432263B1 (en) | 2017-07-17 | 2020-09-16 | Siemens Healthcare GmbH | Semantic segmentation for cancer detection in digital breast tomosynthesis |
US11135015B2 (en) | 2017-07-21 | 2021-10-05 | Globus Medical, Inc. | Robot surgical platform |
EP3658978A4 (en) | 2017-07-27 | 2021-04-21 | Invuity, Inc. | Projection scanning system |
US11166764B2 (en) | 2017-07-27 | 2021-11-09 | Carlsmed, Inc. | Systems and methods for assisting and augmenting surgical procedures |
EP3470006B1 (en) | 2017-10-10 | 2020-06-10 | Holo Surgical Inc. | Automated segmentation of three dimensional bony structure images |
EP3445048A1 (en) | 2017-08-15 | 2019-02-20 | Holo Surgical Inc. | A graphical user interface for a surgical navigation system for providing an augmented reality image during operation |
ES2945711T3 (en) | 2017-08-15 | 2023-07-06 | Holo Surgical Inc | Surgical navigation system to provide an augmented reality image during the operation |
US10783640B2 (en) | 2017-10-30 | 2020-09-22 | Beijing Keya Medical Technology Co., Ltd. | Systems and methods for image segmentation using a scalable and compact convolutional neural network |
US20190192230A1 (en) | 2017-12-12 | 2019-06-27 | Holo Surgical Inc. | Method for patient registration, calibration, and real-time augmented reality image display during surgery |
US11179200B2 (en) | 2017-12-15 | 2021-11-23 | Medtronic, Inc. | Augmented reality solution to disrupt, transform and enhance cardiovascular surgical and/or procedural mapping navigation and diagnostics |
EP3509013A1 (en) | 2018-01-04 | 2019-07-10 | Holo Surgical Inc. | Identification of a predefined object in a set of images from a medical image scanner during a surgical procedure |
US20190254753A1 (en) | 2018-02-19 | 2019-08-22 | Globus Medical, Inc. | Augmented reality navigation systems for use with robotic surgical systems and methods of their use |
WO2019195926A1 (en) | 2018-04-09 | 2019-10-17 | 7D Surgical Inc. | Systems and methods for performing intraoperative guidance |
US10736699B2 (en) | 2018-04-27 | 2020-08-11 | Medtronic Navigation, Inc. | System and method for a tracked procedure |
EP3608870A1 (en) | 2018-08-10 | 2020-02-12 | Holo Surgical Inc. | Computer assisted identification of appropriate anatomical structure for medical device placement during a surgical procedure |
EP4095797B1 (en) | 2018-11-08 | 2024-01-24 | Augmedics Inc. | Autonomous segmentation of three-dimensional nervous system structures from medical images |
US10939977B2 (en) | 2018-11-26 | 2021-03-09 | Augmedics Ltd. | Positioning marker |
US11766296B2 (en) | 2018-11-26 | 2023-09-26 | Augmedics Ltd. | Tracking system for image-guided surgery |
US11406472B2 (en) | 2018-12-13 | 2022-08-09 | DePuy Synthes Products, Inc. | Surgical instrument mounted display system |
EP3893793A4 (en) | 2018-12-14 | 2022-08-31 | MAKO Surgical Corp. | Systems and methods for preoperative planning and postoperative analysis of surgical procedures |
EP3726466A1 (en) | 2019-04-15 | 2020-10-21 | Holo Surgical Inc. | Autonomous level identification of anatomical bony structures on 3d medical imagery |
US11974819B2 (en) | 2019-05-10 | 2024-05-07 | NuVasive, Inc. | Three-dimensional visualization during surgery
EP3751516B1 (en) | 2019-06-11 | 2023-06-28 | Holo Surgical Inc. | Autonomous multidimensional segmentation of anatomical structures on three-dimensional medical imaging |
2017
- 2017-08-15 EP EP17186307.9A patent/EP3445048A1/en active Pending
- 2017-12-12 EP EP17206558.3A patent/EP3443924B8/en active Active
- 2017-12-12 EP EP23214121.8A patent/EP4353177A2/en active Pending
2018
- 2018-08-09 US US16/059,061 patent/US10646285B2/en active Active
- 2018-11-11 US US16/186,549 patent/US11622818B2/en active Active
- 2018-12-12 US US16/217,073 patent/US11278359B2/en active Active
2020
- 2020-04-08 US US16/842,793 patent/US20200229877A1/en not_active Abandoned
2021
- 2021-01-08 US US17/145,178 patent/US20210267698A1/en active Granted
2022
- 2022-03-18 US US17/698,779 patent/US20220346889A1/en active Pending
2023
- 2023-04-10 US US18/298,235 patent/US20240074822A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US20240074822A1 (en) | 2024-03-07 |
US20200229877A1 (en) | 2020-07-23 |
EP3443924A1 (en) | 2019-02-20 |
US11278359B2 (en) | 2022-03-22 |
EP3443924B1 (en) | 2023-12-06 |
US20190142519A1 (en) | 2019-05-16 |
US10646285B2 (en) | 2020-05-12 |
EP4353177A2 (en) | 2024-04-17 |
EP3443924B8 (en) | 2024-01-17 |
US11622818B2 (en) | 2023-04-11 |
EP3445048A1 (en) | 2019-02-20 |
US20190053855A1 (en) | 2019-02-21 |
US20190175285A1 (en) | 2019-06-13 |
US20210267698A1 (en) | 2021-09-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220346889A1 (en) | | Graphical user interface for use in a surgical navigation system with a robot arm
EP3443923B1 (en) | | Surgical navigation system for providing an augmented reality image during operation
EP3498212A1 (en) | | A method for patient registration, calibration, and real-time augmented reality image display during surgery
US11750788B1 (en) | | Augmented reality guidance for spinal surgery with stereoscopic display of images and tracked instruments
JP6751456B2 (en) | 2020-09-02 | Augmented reality navigation system for use in a robotic surgical system and method of use thereof
Iqbal | | Augmented Reality for Computer Assisted Orthopaedic Surgery
EP4329660A1 (en) | | Graphical user interface for a surgical navigation system
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment
Owner name: HOLO SURGICAL INC., ILLINOIS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: SIEMIONOW, KRZYSZTOF B.; REEL/FRAME: 059385/0128
Effective date: 20210630
Owner name: HOLO SURGICAL INC., ILLINOIS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: SIEMIONOW, KRZYSZTOF B.; LUCIANO, CRISTIAN J.; MEJIA OROZCO, EDWING ISSAC; REEL/FRAME: 059385/0065
Effective date: 20181212

STPP | Information on status: patent application and granting procedure in general
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP | Information on status: patent application and granting procedure in general
Free format text: NON FINAL ACTION MAILED

AS | Assignment
Owner name: AUGMEDICS, INC., ILLINOIS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: HOLO SURGICAL INC.; REEL/FRAME: 064851/0521
Effective date: 20230811

STPP | Information on status: patent application and granting procedure in general
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP | Information on status: patent application and granting procedure in general
Free format text: FINAL REJECTION MAILED