WO2015100310A1 - Surgical visualization systems - Google Patents

Surgical visualization systems

Info

Publication number
WO2015100310A1
Authority
WO
WIPO (PCT)
Prior art keywords
medical apparatus
display
image
assistant
primary
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2014/072121
Other languages
English (en)
French (fr)
Inventor
John Tesar
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Camplex Inc
Original Assignee
Camplex Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Camplex Inc filed Critical Camplex Inc
Priority to EP14873324.9A priority Critical patent/EP3087424A4/en
Priority to JP2016542194A priority patent/JP2017507680A/ja
Publication of WO2015100310A1 publication Critical patent/WO2015100310A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/0004Microscopes specially adapted for specific applications
    • G02B21/0012Surgical microscopes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/13Tomography
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/05Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by the image sensor, e.g. camera, being in the distal end portion
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B17/02Surgical instruments, devices or methods for holding wounds open, e.g. retractors; Tractors
    • A61B17/0206Surgical instruments, devices or methods for holding wounds open, e.g. retractors; Tractors with antagonistic arms as supports for retractor elements
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B17/16Instruments for performing osteoclasis; Drills or chisels for bones; Trepans
    • A61B17/1604Chisels; Rongeurs; Punches; Stamps
    • A61B17/1606Chisels; Rongeurs; Punches; Stamps of forceps type, i.e. having two jaw elements moving relative to each other
    • A61B17/1608Chisels; Rongeurs; Punches; Stamps of forceps type, i.e. having two jaw elements moving relative to each other the two jaw elements being linked to two elongated shaft elements moving longitudinally relative to each other
    • A61B17/1611Chisels; Rongeurs; Punches; Stamps of forceps type, i.e. having two jaw elements moving relative to each other the two jaw elements being linked to two elongated shaft elements moving longitudinally relative to each other the two jaw elements being integral with respective elongate shaft elements
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/05Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B5/055Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/02Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B6/03Computed tomography [CT]
    • A61B6/032Transmission computed tomography [CT]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/20Surgical microscopes characterised by non-optical aspects
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361Image-producing devices, e.g. surgical cameras
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/16Microscopes adapted for ultraviolet illumination ; Fluorescence microscopes
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/18Arrangements with more than one light path, e.g. for comparing two specimens
    • G02B21/20Binocular arrangements
    • G02B21/22Stereoscopic arrangements
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B2017/00535Surgical instruments, devices or methods pneumatically or hydraulically operated
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2048Tracking techniques using an accelerometer or inertia sensor
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2051Electromagnetic tracking systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2055Optical tracking systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/30Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
    • A61B2090/306Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure using optical fibres
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364Correlation of different images or relation of image positions in respect to the body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/371Surgical systems with images on a monitor during operation with simultaneous use of two cameras
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10064Fluorescence image
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10068Endoscopic image
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10072Tomographic images
    • G06T2207/10081Computed x-ray tomography [CT]
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10132Ultrasound image
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging

Definitions

  • Some surgical operations involve the use of large incisions. These open surgical procedures provide ready access for surgical instruments and the hand or hands of the surgeon, allowing the user to visually observe and work in the surgical site, either directly or through an operating microscope or with the aid of loupes. Open surgery is associated with significant drawbacks, however, as the relatively large incisions result in pain, scarring, and the risk of infection as well as extended recovery time. To reduce these deleterious effects, techniques have been developed to provide for minimally invasive surgery. Minimally invasive surgical techniques, such as endoscopy, laparoscopy, arthroscopy, pharyngo-laryngoscopy, as well as small incision procedures utilizing an operating microscope for visualization, utilize a significantly smaller incision than typical open surgical procedures.
  • Specialized tools may then be used to access the surgical site through the small incision.
  • the surgeon's view of the surgical site and the available workspace are limited.
  • visualization devices such as endoscopes, laparoscopes, and the like can be inserted percutaneously through the incision to allow the user to view the surgical site.
  • In a first aspect, a medical apparatus includes a display housing and an opening in the display housing.
  • the medical apparatus also includes an electronic display disposed within the display housing, the electronic display comprising a plurality of pixels configured to produce a two-dimensional image.
  • the medical apparatus also includes a display optical system disposed within the display housing, the display optical system comprising a plurality of lens elements disposed along an optical path.
  • the display optical system is configured to receive the two-dimensional image from the electronic display, produce a beam with a cross-section that remains substantially constant along the optical path, and produce a collimated beam exiting the opening in the display housing.
  • the display optical system further comprises a baffle configured to reduce stray light.
  • the display optical system comprises less than or equal to four baffles.
  • the display optical system comprises less than or equal to four mirrors.
  • a first baffle is positioned between the electronic display and a first mirror along the optical path, the first mirror positioned prior to the plurality of lens elements along the optical path from the display to the opening.
  • at least three baffles are positioned prior to the plurality of lens elements along the optical path from the display to the opening.
  • at least two mirrors are positioned prior to the plurality of lens elements along the optical path from the display to the opening.
  • the display optical system has an exit pupil and the electronic display is not parallel to the exit pupil.
  • the opening in the display housing comprises a mounting interface configured to mate with a binocular assembly for a surgical microscope.
  • an exit pupil of the display optical system is of a same size or smaller than an entrance pupil of oculars in the binocular assembly.
  • the optical path is less than or equal to 16.2 inches and a light-emitting portion of the electronic display has a diagonal measurement that is greater than or equal to 5 inches. In some embodiments of the first aspect, the optical path is less than or equal to 18.7 inches and a light-emitting portion of the electronic display has a diagonal measurement that is greater than or equal to 8 inches.
  • the display optical system further comprises a converging mirror. In some embodiments of the first aspect, the medical apparatus further comprises a viewing assembly comprising an objective lens, beam positioning optics, and an ocular, the viewing assembly configured to receive the collimated beam exiting the opening in the display housing.
  • the electronic display has a diagonal light-emitting portion between 4 inches and 9 inches.
  • an optical path length from the electronic display to a last element of the display optical system is at least 9 inches. In a further embodiment, the optical path length from the electronic display to the last element of the display optical system is less than 20 inches.
  • In a second aspect, a medical apparatus includes a viewing assembly comprising a housing and an ocular, the ocular configured to provide a view of an electronic display disposed in the housing.
  • the medical assembly includes an optical assembly disposed on the viewing assembly, the optical assembly configured to provide a surgical microscope view of a surgical site.
  • the optical assembly includes an auxiliary video camera and a gimbal configured to couple the auxiliary video camera to the viewing assembly and configured to change an orientation of the auxiliary video camera relative to the viewing assembly.
  • the medical apparatus includes an image processing system in communication with the optical assembly and the electronic display, the image processing system comprising processing electronics.
  • the image processing system is configured to receive video images acquired by the auxiliary video camera, provide output video images based on the received video images, and present the output video images on the electronic display so that the output video images are viewable through the ocular.
  • the gimbal is configured to adjust a pitch of the auxiliary video camera between a first position and a second position, wherein the auxiliary video camera has a first viewing angle perpendicular to a floor in the first position and a second viewing angle that is within about 10 degrees of parallel to the floor in the second position.
  • the gimbal comprises two pivots.
  • a first pivot is configured to adjust a pitch of the auxiliary video camera and a second pivot is configured to rotate the auxiliary video camera around an axis perpendicular to the floor.
  • the gimbal is configured to adjust a pitch of the auxiliary video camera between the first position and a third position, wherein the auxiliary video camera has a third viewing angle in the third position that is less than or equal to 180 degrees from the first viewing angle.
  • the gimbal is electronically controlled.
  • the optical assembly is configured to provide an oblique view of a portion of a patient.
  • an orientation of the ocular of the viewing assembly is configured to remain stationary when an orientation of the auxiliary video camera changes to provide the oblique view of the portion of the patient.
  • the gimbal is configured to smoothly adjust the viewing angle of the auxiliary video camera between the first position and the second position.
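To make the smooth pitch adjustment described above concrete, the following Python sketch (not part of the disclosure; the function names, angles, and step size are illustrative assumptions) slews a commanded gimbal pitch in bounded increments between the downward first position and a position within about 10 degrees of parallel to the floor.

```python
# Illustrative sketch only: slew-rate-limited pitch command for an electronically
# controlled camera gimbal. Pitch is measured from horizontal, so 90 degrees looks
# straight down at the floor and 10 degrees is within about 10 degrees of parallel
# to the floor. Constants and names are assumptions, not values from the patent.

PITCH_DOWN_DEG = 90.0             # first position: viewing angle perpendicular to the floor
PITCH_NEAR_HORIZONTAL_DEG = 10.0  # second position: within ~10 degrees of parallel to the floor
MAX_STEP_DEG = 1.5                # maximum change per control tick, keeping motion smooth


def slew_pitch(current_deg: float, target_deg: float) -> float:
    """Move the commanded pitch one bounded step toward the (clamped) target position."""
    target_deg = max(PITCH_NEAR_HORIZONTAL_DEG, min(PITCH_DOWN_DEG, target_deg))
    delta = target_deg - current_deg
    if abs(delta) <= MAX_STEP_DEG:
        return target_deg
    return current_deg + MAX_STEP_DEG * (1.0 if delta > 0 else -1.0)


if __name__ == "__main__":
    pitch = PITCH_DOWN_DEG
    while pitch != PITCH_NEAR_HORIZONTAL_DEG:
        pitch = slew_pitch(pitch, PITCH_NEAR_HORIZONTAL_DEG)
    print("reached near-horizontal view:", pitch)
```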
  • the auxiliary video camera comprises a stereo video camera and the ocular comprises a pair of oculars.
  • the medical apparatus further comprises a camera arm attached to the viewing assembly.
  • In a third aspect, a medical apparatus includes a display housing.
  • the medical apparatus includes a plurality of electronic displays disposed within the display housing, each of the plurality of electronic displays comprising a plurality of pixels configured to produce a two-dimensional image.
  • the plurality of electronic displays is configured to present superimposed images in a field of view of a person's eye.
  • the medical apparatus further comprises a binocular viewing assembly coupled to the display housing.
  • at least one of the plurality of electronic displays comprises a transmissive display panel.
  • the superimposed images comprise a video of a first portion of a surgery site that is superimposed on a video of a second portion of the surgery site, the first portion contained within the second portion.
  • the video of the first portion is magnified relative to the video of the second portion.
  • a medical apparatus can include a camera having a field of view that can be designed to include a surgical site, wherein the camera is designed to provide a surgical microscope view of the surgical site.
  • the medical apparatus can include a binocular viewing assembly having a housing and a plurality of oculars, the plurality of oculars designed to provide views of at least one display disposed in the housing.
  • the medical apparatus can include an image processing system designed to receive images acquired by the camera and present the output video images on the at least one display.
  • the medical apparatus can include a movement control system designed to move the camera relative to the binocular viewing assembly, the movement control system having a control member operatively coupled to the movement control system to translate the camera relative to the binocular viewing assembly along at least a first axis and a second axis and to rotate the camera relative to the binocular viewing assembly.
  • the movement control system can include a pitch-yaw adjustment system having an electromechanical device to which the camera can be attached, the pitch-yaw adjustment system designed to rotate the camera relative to the binocular viewing assembly around an axis parallel to the first axis and rotate the camera around an axis parallel to the second axis.
  • the control member is operatively coupled to the movement control system via sensors designed to detect movement of the control member, the sensors in communication with components of the movement control system.
  • the control member can be operatively coupled to the movement control system via a gimbal having one or more sensors designed to detect movement of the control member, the sensors in communication with one or more components of the movement control system.
  • the movement control system can be attached to the binocular viewing assembly.
  • the movement control system can be attached to an articulated arm.
  • the camera can be attached to the movement control system via an arm.
  • the medical apparatus can include a control system for controlling one or more electromechanical devices operatively coupled to the movement control system.
  • the control system can include one or more pre-set positions for the movement control system.
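As an informal illustration of how a control member's sensor readings might be turned into camera moves by the movement control system, the sketch below (an assumption-laden stand-in, not the patent's implementation; the axis names, gains, command structure, and preset values are invented for illustration) maps deflections to translations along two axes and rotations about parallel axes, and shows a table of pre-set positions.

```python
# Illustrative sketch only: mapping control-member sensor deflections to relative
# camera motion commands, plus a table of pre-set positions. All names, gains, and
# values are assumptions made for this example.
from dataclasses import dataclass


@dataclass
class CameraCommand:
    dx_mm: float      # translation along the first axis
    dy_mm: float      # translation along the second axis
    pitch_deg: float  # rotation about an axis parallel to the first axis
    yaw_deg: float    # rotation about an axis parallel to the second axis


TRANSLATION_GAIN_MM = 0.5  # mm of camera travel per unit of control-member deflection
ROTATION_GAIN_DEG = 0.2    # degrees of camera rotation per unit of deflection

# Example pre-set positions the control system could recall (values are placeholders).
PRESET_POSITIONS = {
    "centered": CameraCommand(0.0, 0.0, 0.0, 0.0),
    "oblique_view": CameraCommand(30.0, 0.0, 25.0, 0.0),
}


def control_member_to_command(sensor: dict) -> CameraCommand:
    """Convert control-member sensor deflections into a relative camera move."""
    return CameraCommand(
        dx_mm=TRANSLATION_GAIN_MM * sensor.get("x", 0.0),
        dy_mm=TRANSLATION_GAIN_MM * sensor.get("y", 0.0),
        pitch_deg=ROTATION_GAIN_DEG * sensor.get("pitch", 0.0),
        yaw_deg=ROTATION_GAIN_DEG * sensor.get("yaw", 0.0),
    )
```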
  • In a fifth aspect, a medical apparatus includes a display, a plurality of cameras, and a processor, at least one of said cameras providing a surgical microscope view, said plurality of cameras comprising a first camera configured to image fluorescence in a surgical field and a second camera configured to produce a non-fluorescence image of said surgical field, the processor configured to receive video from said plurality of cameras and to display on said display a first fluorescence video from said first camera and a second non-fluorescence video from said second camera.
  • a medical apparatus includes elements for stereo viewing positioned at or over a patient, the displays providing multiple views from various video sources within a surgical site and views from additional video sources above or obliquely viewing the surgical opening within a compact housing.
  • the medical apparatus can include a switching module configured to switch between alternative sources of images, for example, different cameras on different surgical devices, such as cameras on retractors, cameras on surgical tools, a camera providing surgical microscope view, etc. and to present one or more of those images on one or more displays.
  • Such images can be tiled or presented as picture-in-picture (PIP), with certain images larger and/or more central than others. Thumbnail images may also be included, such as for selection by a user.
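A software analogue of the tiling/PIP presentation described above is sketched below in Python (assumed, for illustration only: frames are NumPy arrays, and the layout constants, scaling method, and function names are not from the disclosure). A large primary view, such as the surgical microscope view, is kept central while secondary sources are inset as smaller PIP tiles.

```python
# Illustrative sketch only: tile secondary camera frames as picture-in-picture (PIP)
# insets over a large primary frame. Uses plain NumPy indexing for a crude resize.
import numpy as np


def compose_frame(primary, pip_sources, pip_scale=0.25, margin=10):
    """Return the primary frame with each secondary source inset as a PIP tile."""
    out = primary.copy()
    h, w = out.shape[:2]
    ph, pw = int(h * pip_scale), int(w * pip_scale)
    y = margin
    for src in pip_sources:
        # Nearest-neighbour resize of the secondary frame down to PIP/thumbnail size.
        rows = np.linspace(0, src.shape[0] - 1, ph).astype(int)
        cols = np.linspace(0, src.shape[1] - 1, pw).astype(int)
        tile = src[rows][:, cols]
        out[y:y + ph, w - margin - pw:w - margin] = tile
        y += ph + margin
    return out
```

In this sketch the switching module would simply change which source is passed as `primary` and which appear in `pip_sources`.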
  • Various embodiments enable viewing 3D or stereo images at, near, or over the patient with ergonomic characteristics for both the primary and assisting surgeon where a plurality of images from various sources may be viewed in real time with little or no latency.
  • various embodiments provide a method for a surgical procedure using a stereo image acquisition system for viewing the beginning or entrance to the surgical site from above, horizontally or obliquely to the surgical site. Similar views can be provided at the close of the case when the surgeon(s) are closing the wound.
  • Various embodiments are configured to accommodate the stereo image acquisition system that attaches mechanically and electrically to the ergonomic display.
  • the stereo image acquisition system is coupled to the ergonomic display so that the line of sight of the stereo image acquisition system is decoupled from the line of sight of oculars used to view the images acquired by the stereo image acquisition system. Without a line of sight requirement, the display system can retain a favorable ergonomic position and the stereo image acquisition system can be positioned independently.
  • Various embodiments comprise a method and system of stereo viewing positioned at or over the patient that displays multiple views from various sources within a surgical site and views from additional sources above or obliquely viewing the surgical opening within a compact housing.
  • Certain embodiments include the above system for a primary surgeon and an equivalent or similar system for an assistant surgeon.
  • the two imaging systems allow the surgeons to be positioned within about 10 degrees of 180 degrees apart (e.g., across the table), or positioned within about 10 degrees of about 90 degrees apart. Other angular separations are also possible, such as any separation from about 20 degrees to about 180 degrees in either direction (e.g., positive or negative angles).
  • This allows the pair of surgeons (e.g., primary and assistant) to be adjacent, shoulder-to-shoulder, across the table, or some other configuration.
  • the positioning of the assistant surgeon can be accomplished without interrupting the primary surgeon by the use of a pivot component in the displays around which the assistant surgeon's display rotates.
  • the dual displays attach to an arm and stand that can be positioned at, near, or over the patient.
  • Various embodiments include attachment points for a configurable stereo imaging acquisition assembly, which has 6 degree-of- freedom positioning, while maintaining the display systems in an ergonomic position favorable to both the primary and the assistant surgeon.
  • a display system comprises a first part comprising a binocular display assembly comprising an ergonomic binocular section containing an ocular, folding prism, and objective for each eye path.
  • the second part comprises an electronic display assembly that includes separate eye paths which are folded in a space-saving manner, one or more electronic displays, and optics that receive light from said one or more electronic displays and form a beam of near constant diameter directed toward the side facing the surgeon.
  • Such an optical layout may include small folding mirrors to keep the overall housing size compact enough to place over the patient and combine with a second stereo display system.
  • the difference in distance between eye paths at a point in the system where the beams are collimated can be narrower than the inter-pupillary distance of the user. In this manner the overall size of the enclosure can be reduced or minimized.
  • the optical paths for the left and right eyes can be separated at a collimated position, allowing the system to be made of first and second parts, the first part comprising an ergonomic binocular section containing an ocular, folding prism, and objective for each eye path.
  • the second part includes an electronic display to be positioned at, over, or near the patient and includes separate eye paths which are folded in a space saving manner, and in particular maintain a near constant diameter from the side facing the surgeon towards the electronic display, with the last airspace between optics and display being a divergent path.
  • Such an optical layout facilitates using small folding mirrors to keep the overall housing size compact enough to possibly place over the patient and combine with a second stereo display system.
  • the eye path difference in distance at this collimated point in the system (where the two parts can separate) can be narrower than the inter-pupillary distance of the user. In this manner the overall size of the enclosure can be reduced or minimized.
  • a medical apparatus can include a first display portion configured to display a first image and a second display portion configured to display a second image.
  • the medical apparatus can also include electronics configured to receive one or more signals corresponding to images from a plurality of sources and to drive the first and second display portions to produce the first and second images based at least in part on the images from the plurality of sources.
  • the medical apparatus can further include a first beam combiner configured to receive the first and second images from the first and second display portions and to combine the first and second images for viewing.
  • the first and second display portions can include first and second displays.
  • the medical apparatus can further include imaging optics disposed to collect light from both the first and second display portions.
  • the imaging optics can be configured to form images at infinity.
  • the medical apparatus can further include a housing and a first ocular for viewing the combined first and second images within the housing.
  • the medical apparatus can also further include a second ocular for viewing an additional image within the housing.
  • the plurality of sources can include at least one camera providing a surgical microscope view.
  • the medical apparatus can further include the at least one camera providing the surgical microscope view.
  • the plurality of sources can include at least one camera disposed on a surgical tool.
  • the medical apparatus can further include the at least one camera disposed on the surgical tool.
  • the plurality of sources can include at least one source providing data, a computed tomography scan, a computer aided tomography scan, magnetic resonance imaging, an x-ray, or ultrasound imaging.
  • the medical apparatus can further include the at least one source providing the data, computed tomography scan, computer aided tomography scan, magnetic resonance imaging, x-ray, or ultrasound imaging.
  • the first image can include a fluorescence image and the second image can include a non- fluorescence image.
  • the medical apparatus can further comprise a third display portion configured to display a third image and a fourth display portion configured to display a fourth image.
  • the medical apparatus can further include a second beam combiner configured to receive the third and fourth images from the third and fourth display portions and to combine the third and fourth images for viewing.
  • the third and fourth display portions can comprise third and fourth displays.
  • the medical apparatus can further include additional electronics configured to receive one or more signals corresponding to images from another plurality of sources and to drive the third and fourth display portions to produce the third and fourth images based at least in part on the images from the another plurality of sources.
  • Some embodiments of the medical apparatus can further include imaging optics disposed to collect light from both the third and fourth display portions.
  • the imaging optics can be configured to form images at infinity.
  • the medical apparatus can further include a housing, a first ocular for viewing the combined first and second images within the housing, and a second ocular for viewing the combined third and fourth images within the housing.
  • the another plurality of sources can include at least one camera providing a surgical microscope view.
  • the medical apparatus can further include the at least one camera providing the surgical microscope view.
  • the another plurality of sources can include at least one camera disposed on a surgical tool.
  • the medical apparatus can further include the at least one camera disposed on the surgical tool.
  • the another plurality of sources can include at least one source providing data, a computed tomography scan, a computer aided tomography scan, magnetic resonance imaging, an x-ray, or ultrasound imaging.
  • the medical apparatus can further include the at least one source providing the data, computed tomography scan, computer aided tomography scan, magnetic resonance imaging, x-ray, or ultrasound imaging.
  • the third image can include a fluorescence image and the fourth image can include a non-fluorescence image.
  • the medical apparatus can provide 3D viewing of a surgical field.
  • the combined first and second images for viewing can include a composite image of the first and second images.
  • the first beam combiner can be configured to produce the first image as a background image of the composite image, and to produce the second image as a picture-in-picture (PIP) of the composite image.
  • the combined third and fourth images for viewing can include a composite image of the third and fourth images.
  • the second beam combiner can be configured to produce the third image as a background image of the composite image, and to produce the fourth image as a picture-in-picture (PIP) of the composite image.
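Because an optical beam combiner effectively sums the light from the two display portions, the superimposition can be mimicked digitally as an additive blend, as in the short sketch below (an assumed illustration, not the patent's method; the gain, 8-bit clipping, and function name are arbitrary choices). For example, a fluorescence image could be laid over a non-fluorescence background this way.

```python
# Illustrative sketch only: a digital analogue of an optical beam combiner, which
# adds the light from two image paths. Assumes same-sized 8-bit frames.
import numpy as np


def combine(background, overlay, overlay_gain=1.0):
    """Additively combine two same-sized frames, as a beam combiner combines light."""
    combined = background.astype(np.float32) + overlay_gain * overlay.astype(np.float32)
    return np.clip(combined, 0, 255).astype(np.uint8)
```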
  • In another aspect, a binocular display for viewing a surgical field is provided.
  • the binocular display can comprise one or more cameras configured to produce images of the surgical field, a left-eye view channel, and a right-eye view channel.
  • the left-eye view channel can include a first display configured to display a left-eye view image of the surgical field and one or more first processing electronics.
  • the right-eye view channel can include a second display configured to display a right-eye view image of the surgical field and one or more second processing electronics.
  • Each of the first and second processing electronics can be configured to receive one or more user inputs, receive one or more input signals corresponding to the images from the one or more cameras, select which image of the images from the one or more cameras to display, resize, rotate, or reposition the selected image based at least in part on the one or more user inputs, and produce one or more output signals to drive the first or second display to produce the left-eye or right-eye image.
  • each of the first and second processing electronics can include a microprocessor, a field programmable gate array (FPGA), or an application specific integrated circuit (ASIC).
  • the one or more cameras can comprise at least one camera providing a surgical microscope view. In some embodiments, the one or more cameras can comprise at least one camera disposed on a surgical tool. In some embodiments, the one or more cameras can comprise a camera configured to produce a fluorescence image and a camera configured to produce a non-fluorescence image. In some embodiments, the binocular display can further include one or more sources providing data, a computed tomography scan, a computer aided tomography scan, magnetic resonance imaging, an x-ray, or ultrasound imaging. The binocular display can, in some embodiments, provide 3D viewing of the surgical field.
  • the one or more first processing electronics can include separate processing electronics for each of the one or more cameras configured to produce images on the first display.
  • the one or more second processing electronics can include separate processing electronics for each of the one or more cameras configured to produce images on the second display.
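The per-eye processing described for the left- and right-eye channels (select a source, then resize, rotate, and reposition it from user inputs) can be pictured with the Python sketch below; it is an assumed software stand-in for the microprocessor/FPGA/ASIC electronics, using OpenCV only as a convenient library, and the parameter names are invented for the example.

```python
# Illustrative sketch only: one eye channel's processing - select a camera frame,
# then resize (zoom), rotate, and reposition it according to user inputs.
import cv2
import numpy as np


def process_eye_channel(frames: dict, user: dict) -> np.ndarray:
    """Produce one eye's display image from the selected camera frame and user inputs."""
    frame = frames[user["selected_camera"]]
    # Electronic zoom (resize).
    zoom = user.get("zoom", 1.0)
    frame = cv2.resize(frame, None, fx=zoom, fy=zoom)
    h, w = frame.shape[:2]
    # Rotate about the image centre.
    rot = cv2.getRotationMatrix2D((w / 2, h / 2), user.get("rotate_deg", 0.0), 1.0)
    frame = cv2.warpAffine(frame, rot, (w, h))
    # Reposition by translating within the output canvas.
    shift = np.float32([[1, 0, user.get("shift_x", 0)], [0, 1, user.get("shift_y", 0)]])
    return cv2.warpAffine(frame, shift, (w, h))
```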
  • a medical apparatus including a retractor, a plurality of cameras, and a hydraulic system.
  • the retractor can be configured to hold open an incision and thereby provide a pathway for access of surgical tools to a surgical site.
  • the plurality of cameras can be configured to acquire video images of the surgical site. At least some of the plurality of cameras can be disposed on the retractor and can be configured to acquire video images within the opening provided by the retractor.
  • the hydraulic system can be configured to deliver pressurized fluid pulses to the plurality of cameras to remove obstructions therefrom while the cameras are disposed in the surgical site.
  • FIG. 35 illustrates an example camera supported on a platform configured to attach to a retractor having an elastic cover over the camera that includes hydraulic and/or pneumatic pathways for cleaning camera optics.
  • FIG. 36 illustrates a housing for the camera fluidics used for cleaning the camera that is disposed on the camera optics.
  • FIG. 40B shows a cross-section of a portion of the hydraulic turbine of FIG. 40A.
  • FIG. 41 shows one embodiment of an impeller.
  • the second arm 5 has mounted to its distal end an input and display device 13.
  • the input and display device 13 comprises a touchscreen display having various menu and control options available to a user.
  • the touchscreen can be configured to receive multi-touch input from ten fingers simultaneously, allowing for a user to interact with virtual objects on the display.
  • an operator may use the input device 13 to adjust various aspects of the displayed image.
  • the surgeon display incorporating a video camera providing a surgical microscope view may be mounted on a free standing arm, from the ceiling, on a post, or the like.
  • the flat panel display touch screen 13 may be positioned on a tilt/rotate device on top of the electronics console.
  • a surgical tool 17 can be connected to the console 3 by an electrical cable 19.
  • the surgical tool 17 includes, for example, a cutting tool, a cleaning tool, a device used to cut patients, or other such devices.
  • the surgical tool 17 may be in wireless communication with the console 3, for example via WiFi (e.g., IEEE 802.11a/b/g/n), Bluetooth, NFC, WiGig (e.g., IEEE 802.11ad), etc.
  • the surgical tool 17 may include one or more cameras configured to provide imagery, e.g., image and/or video data.
  • video data can be transmitted to a video switcher, camera control unit (CCU), video processor, or image processing module positioned, for example, within the console 3.
  • an operator may use the surgical tool 17 to perform open and/or minimally invasive surgery.
  • the operator may view the surgical site by virtue of the displayed video in the viewing platform 9.
  • the viewing platform (surgeon display system) 9 may be used in a manner similar to a standard surgical microscope although, as discussed above, the viewing platform 9 need not be a direct view device wherein the user sees directly through the platform 9 to the surgical site via an optical path from the ocular through an aperture at the bottom of the viewing platform 9.
  • the viewing platform 9 can provide a view similar to a standard surgical microscope using cameras and displays and can be used in addition to or in conjunction with a standard surgical microscope optical pathway in the viewing platform.
  • the viewing platform 9 can provide a surgical microscope view wherein changes in the viewing angle, viewing distance, work distance, zoom setting, focal setting, or the like are decoupled from movement of the viewing platform 9.
  • changes in the position, pitch, yaw, and/or roll of the imaging system 18 are decoupled from the viewing platform 9 such that the imaging system 18 can move and/or re-orient while the surgeon can remain stationary while viewing video through the oculars 11.
  • the third arm 7b can include an imaging system 18 that can be configured to provide video similar to a direct-view surgery microscope.
  • the imaging system 18 can be configured, then, to provide a surgical imaging system configured to provide an electronic microscope- like view that can comprise video of the work site or operational site from a position above the site (e.g., about 15 - 45 cm above the surgical site) or from another desired angle.
  • the surgeon can manipulate the surgical imaging system to provide a desired or selected viewpoint without having to adjust the viewing oculars. This can advantageously provide an increased level of comfort, capability, and consistency to the surgeon compared to traditional direct-view operating microscope systems.
  • the viewing platform 9 can be equipped with wide field-of-view oculars 11 that are adjustable for refractive error and presbyopia.
  • the oculars 11, or eyepieces, may additionally include polarizers in order to provide for stereoscopic vision.
  • the viewing platform 9 can be supported by the arm 7 or 7b, such that it may be positioned for the user to comfortably view the display 13 through the oculars 11 while in position to perform surgery. For example, the user can pivot and move the arm 7 or 7b to re-orient and/or re-position the viewing platform 9.
  • the image processing system and the display system are configured to display imagery placed roughly at infinity to reduce or eliminate accommodation and/or convergence when viewing the display.
  • a surgical retractor can be included that is configured to hold open an incision and thereby provide a pathway for access of surgical tools to a surgical site, said retractor comprising portions configured to be disposed about an open central region centrally located between said retractor portions so as to permit access of surgical tools to the surgical site through said open central region.
  • the retractor can include at least two cameras directed inward toward the central open region and at least one of the at least two cameras directed downward into the surgical field.
  • a display optical system can include one or more lenses and one or more redirection elements (e.g., mirrors, prisms) and can be configured to provide light from the display that can be imaged by a binocular viewing assembly comprising a pair of oculars, objectives, and/or turning prisms or mirrors.
  • the display devices such as liquid crystal displays can be imaged with the objective and the pair of oculars and display optical system within the viewing platform 9.
  • the binocular assembly and display optical system can be configured to produce an image of the displays at infinity. Such arrangements may potentially reduce the amount of accommodation by the surgeon.
  • the oculars can also have adjustments (e.g., of focus or power) to address myopia, hyperopia, and/or presbyopia of the surgeon.
  • each ocular can have a variable adjustable power to provide optical correction that the surgeon or other user may desire. Accordingly, the oculars may provide optical correction allowing the surgeon or other users to view the displays through the oculars without wearing glasses even if ordinarily prescription glasses were worn for other activities.
  • FIGS. 2A-C show one embodiment of a surgical retractor device that includes an integrated imaging assembly.
  • the imaging assembly includes a plurality of integrated cameras.
  • the retractor 100 includes three blades 101; however, more or fewer may be included depending on the design.
  • Each of the blades may be attached to an articulable arm 103 that allows for the position of the blades to be adjusted during the operation. For example, following a small incision, the three blades 101 can be arranged in a closed position where each is positioned close to one another. In this closed configuration, the three blades can be introduced through the incision, and then expanded to provide for an operating pathway or working space.
  • the surgical area may be at least 400 mm², for example, having an opening with an area between 400 and 2100 mm².
  • the working space may be an area centrally located between retractor blades (or within the lumen of a tubular retractor) that allows for surgical tools or other instruments to pass through.
  • the illumination sources may not be disposed directly adjacent any particular camera.
  • the illumination sources can be omitted, and the camera module can rely on ambient supplementary or overhead light or light directed from a light source located elsewhere.
  • the orientation of an integrated camera 107 may be substantially fixed with respect to the retractor blade 101 or other surgical tool.
  • the camera 107 and/or the camera module 105 may be adjustable with respect to the retractor blade 101.
  • the retractor blades 101 are substantially rigid; in various embodiments, the retractor blades may be malleable, and may have a wide range of different structural features such as width, tension, etc. For example, stronger, larger retractor blades may be desired for spinal and trans-oral surgery, while weaker, smaller retractor blades may be desired for neurosurgery.
  • the retractor can be configured such that different blades can be arranged as desired.
  • cables connecting the camera modules 105 with the aggregator 104 may be adhered (either permanently or non-permanently, e.g., releasably) to the exterior surface of the retractor 100.
  • the hub or aggregator 104 is affixed to an upper surface of the retractor 100.
  • the aggregator may be positioned at any locations relative to the retractor 100, or may be disconnected from the retractor 100 altogether.
  • the aggregator may contain camera interface electronics, tracker interface electronics and SERDES to produce a high speed serial cable supporting all cameras in use.
  • the retractor camera output is coupled to a console which causes video from the retractor camera to be presented on display.
  • the camera modules 105 can include sensors or markers for, e.g., electromagnetic or optical tracking or use encoders, accelerometers, gyroscopes, or inertial measurement units (IMUs) or combinations thereof or any other orientation and/or position sensors, as described in more detail below.
  • Tracking can provide location and/or orientation of the cameras.
  • the images obtained by the cameras may be stitched together or tiled using image processing techniques. Tracking or otherwise knowing the relative locations of the sensor can assist in image processing and display formatting.
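The stitching or tiling of overlapping retractor-camera images mentioned above could, for instance, be prototyped with OpenCV's high-level stitcher, as in the sketch below (an assumption for illustration; the disclosure does not specify a library, and tracking data could instead seed the relative placement of the frames).

```python
# Illustrative sketch only: stitch overlapping frames from the retractor cameras into
# a single wider view using OpenCV's Stitcher in planar "scans" mode.
import cv2


def stitch_retractor_views(frames):
    """Return a stitched composite of overlapping camera frames, or None on failure."""
    stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)  # planar mode suits nearby views
    status, composite = stitcher.stitch(frames)
    return composite if status == cv2.Stitcher_OK else None
```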
  • pairs of cameras together provide information for creating a stereo effect or 3-dimensional (3D) image. Pairs of cameras, for example, may be included on each of the blades 101 of the retractor 100.
  • the retractor is configured to hold open tissue so as to produce an open region or cavity centrally located between the blades.
  • this open central region is unobstructed by the retractor.
  • the central portions of the open region would be unobstructed by features of the retractor such that the surgeon would have clear access to the surgical site. The surgeon could thus more freely introduce and utilize his or her tools on locations within the surgical site. Additionally, this may enable the surgeon to use tools with both hands without the need to hold an endoscope.
  • the cameras are disposed on the blades of the retractor such that the cameras face inward with respect to each other (and possibly downward and/or into the surgical site), toward the opening or surgical site held open by the retractor blades.
  • the cameras in this example would be disposed about the central open region held open by the retractor blades so as to provide views from locations surrounding the surgical site. The camera thus would face objects within the surgical site such as structures on which tools would be used by the surgeon to operate, as stated above possibly downward and/or into the surgical site.
  • At least two cameras can be directed inward toward a central open region of the surgical site, e.g., an opening created by an incision and/or held open by a retractor.
  • at least one of the at least two cameras can be directed downward into the surgical site or field.
  • the cameras on two of the blades face each other such that the leftmost blade and the cameras thereon would be in the fieid-of-view of the cameras on the rightmost blade and vice versa.
  • the cameras on the leftmost blade may be anti-parallel to the cameras on the rightmost blade and have optical axes oriented at an angle, θ, of 180° with respect to each other.
  • the cameras on the remaining blade may be directed orthogonally to the other two blades and thus have optical axes directed at an angle, θ, of 90° with respect to each other.
  • Retractors with cameras can be reaffixed to a frame or mounting structure during a procedure and the cameras can reorient themselves with respect to relative position within an array of the cameras through their communication protocol with the aggregator and video switching unit.
  • the fields-of-view of the different cameras, and hence the images produced by the different cameras, may overlap.
  • Image processing may be employed to yield increased resolution at the regions of overlap.
  • the number of sensors used may be increased to provide increased field-of-view and/or resolution.
  • cameras with overlapping images can be electronically magnified thereby making their images adjacent rather than overlapping.
  • a minimally invasive spine surgery can use a tubular retractor having a circular working space with a diameter of approximately 25 mm.
  • the retractor contains blades, fingers, or at least one barrier such as e.g., a tube that holds tissue back to maintain open the surgical site.
  • the cameras located on the retractor at locations within the surgical field or in very close proximity thereto, e.g., within 75 mm of the surgical opening, can provide a useful viewpoint for the surgeon.
  • the cameras may for example be located on the blades, fingers, tubular barrier, or other portion of the retractor close to the surgical field or within the patient and the surgical field.
  • the cameras may include pairs of cameras arranged and/or oriented to provide stereo and thus 3D imaging or single CMOS camera chips with dual optics to provide stereo.
  • the cameras may be located at various locations in relation to surgical devices, for example, the cameras can be located proximally and distally along or near a retractor, wherein the location of the cameras can be configured to facilitate both the progression of surgery and an enhanced view or view selection of an area of interest.
  • the cameras can be faced downward into the surgical site and inward toward each other.
  • the retractor can be used to hold open an incision to provide access to a surgical site.
  • the viewing platform 9 can include one or more imagers configured to provide electronic microscope-like imaging capabilities.
  • FIG. 3 illustrates an example surgical imaging system 51 attached to an arm 7, the system 51 including one or more cameras 18 mounted on a viewing platform 9.
  • the cameras 18 can be configured to provide imagery of a worksite.
  • the image data can be presented on a display that the user can view using oculars 11 mounted on the viewing platform 9.
  • This design can be used to mimic other direct-view microscopes, but it can also be configured to provide additional capabilities.
  • the surgical imaging system 51 can be configured to have a variable working distance without adjusting the viewing platform 9 or the articulating arm 7.
  • the surgical imaging system 51 can be configured to provide image processing capabilities such as electronic zooming and/or magnification, image rotation, image enhancement, stereoscopic imagery, and the like. Furthermore, the imagery from the cameras 18 can be combined with imagery from cameras on the surgical device 17. In some embodiments, the surgical imaging system 51 can provide fluorescence images.
  • Some embodiments may involve at least one auxiliary video camera 18 and one or more other cameras that are not disposed on surgical tools but are disposed on other medical devices.
  • These medical devices may include devices introduced into the body such as endoscopes, laparoscopes, arthroscopes, etc.
  • one or more displays such as the at least one display 13 included in the viewing platform 9 may be used to provide a surgical microscope view using one or more cameras such as the auxiliary video camera(s) 18 as well as to display views from one or more cameras located on such medical devices other than surgical tools.
  • a surgical microscope camera can be on a different platform other than the viewing platform 9 and includes features described herein with respect to cameras or imagers on the viewing platform, including but not limited to, isocentered motion, variable work distance, and movement control system.
  • a switching module can be included to switch between views or combinations of views.
  • cameras from a variety of sources, e.g., surgical tools and other medical devices, in any combination, may be viewed on the display(s) on the surgical platform together with the surgical microscope view from the auxiliary video cameras 18.
  • the displays may provide 3D; thus any of the images and graphics may be provided in 3D.
  • a virtual touchscreen may be provided by the auxiliary video cameras 18 or other virtual touchscreen cameras mounted to the viewing platform 9. Accordingly, in some embodiments a user may provide a gesture in the field of view of the auxiliary video cameras and/or virtual touchscreen cameras and the processing module can be configured to recognize the gesture as an input.
  • While the virtual display has been described in the context of the auxiliary video cameras 18, other cameras, e.g., virtual reality input cameras, possibly in addition to the auxiliary video cameras 18, may be used. These cameras may be disposed on the viewing platform 9 or elsewhere, such as the third arm 7b. As described herein, the displays may provide 3D; thus the virtual reality interface may appear in 3D. This may increase the immersive quality of the viewing experience, enhancing the detail and/or realistic presentation of video information on the display.
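One way to picture the virtual touchscreen input is the hit-test sketch below: a hypothetical fingertip detector (not specified in the disclosure) yields a point in the camera's image coordinates, and the processing module checks it against virtual button regions; the button names and coordinates are invented for the example.

```python
# Illustrative sketch only: map a camera-tracked fingertip position onto virtual
# button regions defined in image coordinates. The detector itself is hypothetical.
from typing import Optional, Tuple

# Virtual buttons: name -> (x0, y0, x1, y1) in the gesture camera's image coordinates.
VIRTUAL_BUTTONS = {
    "zoom_in": (20, 20, 120, 80),
    "zoom_out": (20, 100, 120, 160),
    "switch_view": (20, 180, 120, 240),
}


def hit_test(fingertip: Optional[Tuple[int, int]]) -> Optional[str]:
    """Return the name of the virtual button under the fingertip, if any."""
    if fingertip is None:
        return None
    x, y = fingertip
    for name, (x0, y0, x1, y1) in VIRTUAL_BUTTONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None
```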
  • the surgical imaging system 51 includes an isocenter positioning system 52 attached to the viewing platform 9.
  • the isocenter positioning system 52 can include a single track or guide configured to move and orient the cameras 18 such that they are substantially pointed at a single point 53, the isocenter.
  • a second track or guide can be attached to the first guide in an orthogonal manner to provide movement along two dimensions while substantially maintaining the pointing angle towards the isocenter 53.
  • Other configurations can be used to provide isocenter pointing capabilities, such as articulating arms, electro-mechanical elements, curved friction plates, etc.
  • the imaging system is configured to move in an isocenter manner.
  • the horizons of the acquisition systems are configured to be horizontal to match the horizon of the display system and the user.
  • a stereo imaging system may be maintained in a horizontal configuration as it is moved across a range of locations to avoid confusion for the user viewing the video from the stereo camera.
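The geometry behind isocenter motion is simple to sketch: wherever the camera sits on its guide, its pan and tilt must aim the optical axis back at the fixed isocenter. The Python below is an assumed illustration (coordinate conventions and names are not from the disclosure) of that calculation.

```python
# Illustrative sketch only: pan/tilt angles that keep a camera's optical axis pointed
# at the isocenter, given both positions in a shared x/y/z frame (z up, metres).
import math


def point_at_isocenter(camera_xyz, isocenter_xyz):
    """Return (pan_deg, tilt_deg) aiming the camera's optical axis at the isocenter."""
    dx = isocenter_xyz[0] - camera_xyz[0]
    dy = isocenter_xyz[1] - camera_xyz[1]
    dz = isocenter_xyz[2] - camera_xyz[2]  # negative when the isocenter is below the camera
    pan_deg = math.degrees(math.atan2(dy, dx))                    # about the vertical axis
    tilt_deg = math.degrees(math.atan2(-dz, math.hypot(dx, dy)))  # downward tilt from horizontal
    return pan_deg, tilt_deg


# Example: camera 0.3 m to the side of and 0.3 m above the isocenter -> 45 degree downward tilt.
print(point_at_isocenter((0.30, 0.0, 0.30), (0.0, 0.0, 0.0)))  # (180.0, 45.0)
```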
  • the isocenter assemblies can be a part of the display system or a separate, independent system.
  • the viewing platform 9 can be mounted on a separate arm from the cameras 18.
  • the display and the image acquisition of the surgical imaging system can be decoupled, similar to the embodiment illustrated in FIG. 1.
  • ergonomic benefits are provided such as, for example, the surgeon does not need to be looking through binoculars for an extended period of time or at an uncomfortable position or angle.
  • a common relative horizon for both the display and the acquisition system may also be employed.
  • the distance between the surgical site of interest and the imagers can be at least about 20 cm and/or less than or equal to about 450 cm, at least about 10 cm and/or less than or equal to about 50 cm, or at least about 5 cm and/or less than or equal to about 1 m, although values outside this range are possible.
  • the user can interact with the surgical imaging system 51 to select a working distance, which can be fixed throughout the procedure or which can be adjusted at any point in time. Changing the working distance can be accomplished using elements on a user interface, such as a graphical user interface, or using physical elements such as rotatable rings, knobs, pedals, levers, buttons, etc.
  • the working distance is selected by the system based at least in part on the cables and/or tubing being used in the surgical visualization system.
  • the cables and/or tubing can include an RFID chip or an EEPROM or other memory storage that is configured to communicate information to the surgical imaging system 51 about the kind of procedure to be performed.
  • the typical working distance can be set to about 40 cm.
  • the user's past preferences are remembered and used, at least in part, to select a working distance.
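  • A minimal sketch of how such working-distance selection logic might be arranged, assuming a hypothetical lookup table keyed by the procedure type reported by the cable's RFID/EEPROM, with the user's stored preference and a 40 cm default as fallbacks (all names and values are illustrative):

```python
# Minimal sketch (illustrative assumptions only): choosing a default working
# distance from the procedure type reported by a cable's RFID/EEPROM, falling
# back to the user's stored preference and then to a generic 40 cm default.
PROCEDURE_DEFAULTS_CM = {   # hypothetical table; values are placeholders
    "spinal": 40,
    "cranial": 30,
    "ent": 25,
}

def select_working_distance(procedure_type=None, user_preference_cm=None,
                            default_cm=40):
    if procedure_type in PROCEDURE_DEFAULTS_CM:
        return PROCEDURE_DEFAULTS_CM[procedure_type]
    if user_preference_cm is not None:
        return user_preference_cm
    return default_cm

print(select_working_distance("spinal"))                # 40
print(select_working_distance(user_preference_cm=35))   # 35
print(select_working_distance())                        # 40
```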
  • gross focus adjustment can be accomplished manually by positioning the cameras 18 and arm 7.
  • the fine focus adjustment can be done using other physical elements, such as a fine focusing ring, or it can be accomplished electronically.
  • the electronic displays can be configured to be focused at varying levels of magnification allowing the user to view the displays without adjusting the oculars between magnification adjustments.
  • the oculars can be configured to provide continuous views at infinity.
  • the principal user of the surgical imaging system may select an accommodation level for the oculars, rather than using a relaxed view provided by the electronic displays.
  • the electronic displays in various embodiments, however, can remain in focus and the ocular adjustments do not affect the focus of the various video acquisition systems. Thus, adjustments by the principal user do not affect the views of the other users of the system viewing, for example, other displays showing the video, as the cameras/acquisition systems can remain focused.
  • the surgical imaging system 51 can be focused at a relatively close working distance (e.g., a distance with a relatively narrow depth of field) such that the image remains focused when moving to larger working distances (e.g., distances with broader depth of field).
  • the surgical imaging system 51 can be focused over an entire working range, reducing or eliminating the need to refocus the system after magnification or zoom adjustments are made.
  • FIGS. 5A and 5B illustrate an embodiment of the surgical imaging system 51 having an optical system 53 mounted under the viewing platform 9.
  • the optical components are shown as free-standing to show the structure of the components, but in practice the optical components 53 will be mounted within or on a structure attached to the viewing platform.
  • the optical system 53 and/or the cameras 18 can be modular and can be selected and swapped for use with the surgical imaging system 51.
  • Paragraph [0489] from each of U.S. Prov. App. No. 61/880,808, U.S. Prov. App. No. 61/920,451, U.S. Prov. App. No. 61/921,051, U.S. Prov. App. No. 61/921,389, U.S. Prov. App. No. 61/922,068, and U.S. Prov. App. No. 61/923,188 is incorporated by reference herein.
  • the optical system 53 is configured to provide stereo image data to the imaging system 51.
  • the optical system 53 includes a turning prism 54 to fold the optical path underneath the viewing platform 9 to decrease the physical extent (e.g., length) of the imaging system under the viewing platform 9.
  • the optical system 53 comprises a Greenough-style system wherein the optical paths for each eye have separate optical components.
  • the optical system 53 comprises a Galilean-style system wherein the optical paths for each eye pass through a common objective.
  • the Greenough-style system may be preferable where imaging sensors are being used to capture and convey the image data as compared to the Galilean-style system.
  • the Galilean system can introduce aberrations into the imagery by virtue of the rays for each eye's optical path passing through a periphery of the objective lens. This does not happen in the Greenough-style system as each optical path has its own optics.
  • the Galilean system can be more expensive as the objective used can be relatively expensive based at least in part on the desired optical quality of the lens and its size.
  • the optical system 53 can include two right-angle prisms 54, two zoom systems 55, and two image sensors 56. This folding is different from a traditional operating room microscope because the optical path leads to image sensors rather than to a direct-view optical system.
  • the optical system 53 can have a relatively constant F-number. This can be accomplished, for example, by varying the focal length and/or aperture of the system based on working distance and/or magnification. In one embodiment, as the focal length changes, the eye paths can move laterally apart (or together), the prisms 54 can rotate to provide an appropriate convergence angle, and the apertures can change their diameters to maintain the ratio of the focal length to the diameter at a relatively constant value. This can produce a relatively constant brightness at the image sensor 56, which can result in a relatively constant brightness being displayed to the user.
  • the illumination can change to compensate for changes in the focal length and/or the aperture so as to provide a relatively constant brightness at the image sensors 56.
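  • A minimal sketch of the relationship behind such constant-F-number operation, assuming N = f/D and an illumination boost proportional to N² when the aperture reaches a mechanical limit (limits and values are illustrative):

```python
# Minimal sketch: holding the F-number N = f/D roughly constant by resizing
# the aperture as the focal length changes, and scaling illumination when the
# aperture hits a mechanical limit. Limits and values are illustrative.
def aperture_for_constant_f_number(focal_length_mm, target_f_number,
                                   min_d_mm=1.0, max_d_mm=20.0):
    d = focal_length_mm / target_f_number          # D = f / N
    return max(min_d_mm, min(max_d_mm, d))         # clamp to mechanical limits

def illumination_scale(focal_length_mm, target_f_number, aperture_d_mm):
    # Image-plane irradiance varies roughly as 1/N^2, so if the achieved
    # N differs from the target, scale the light source to compensate.
    achieved_n = focal_length_mm / aperture_d_mm
    return (achieved_n / target_f_number) ** 2

f_mm, n_target = 90.0, 8.0
d = aperture_for_constant_f_number(f_mm, n_target)
print(d, illumination_scale(f_mm, n_target, d))    # 11.25 mm, 1.0 (no boost)

f_mm = 200.0                                       # longer focal length
d = aperture_for_constant_f_number(f_mm, n_target) # clamps at 20 mm -> N = 10
print(d, illumination_scale(f_mm, n_target, d))    # 20.0 mm, 1.5625x illumination
```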
  • the optical assembly 53 can include a zoom system 55 configured to provide a variable focal distance and/or zoom capabilities.
  • a Galilean-style stereoscopic system generally includes a common objective for the two eye paths. When this optical system is imaged with image sensors 56, it can create aberrations, wedge effects, etc., that can be difficult to compensate for.
  • the surgical imaging system 51 can include a Galilean-style optical system configured to re-center at least one of the stereo paths to a central location through the objective lens, which can be advantageous in some applications.
  • the real-time visualization system utilizes a Greenough-style system. This can have separate optical components for each stereo path.
  • the optical assembly 53 can be configured to provide variable magnification and/or afocal zoom and can be configured to operate in a magnification range from about 1x to about 6x, or from about 1x to about 4x, or from about 1x to about 2.5x.
  • the distal-most portion of the Greenough assembly 53 can be similar in functionality to an objective lens of a typical, direct-view operating room microscope with the working distance set approximately to that of the focal length.
  • the working distance, and in some implementations the focal length can be between about 20 cm and about 40 cm, for example. In some embodiments the work distance may be adjustable from 15 cm to 40 cm or to 45 cm. Other values outside these ranges are also possible.
  • the surgical imaging system 51 includes an opto-mechanical focus element configured to vary the focal length of a part of the optical assembly 53 or the whole optical assembly 53.
  • FIGS. 6A-6E illustrate embodiments of optical assemblies 53 for use in a stereoscopic surgical imaging system, such as those described herein with reference to FIGS. 5A-5B.
  • FIG. 6A illustrates a side view of an example optical assembly 53 configured to use a turning prism 54 to fold an optical path from a tissue 57 to a sensor 56 along a lens train 55 that is situated near or adjacent to a viewing platform 9. This can advantageously provide a relatively long optical path in a relatively compact distance.
  • FIG. 6B illustrates a front view of an embodiment of an optical assembly configured to change a convergence angle in a stereoscopic imaging system.
  • the prisms 54 can be the turning prism 54 illustrated in FIG. 6A.
  • the prisms 54 can be configured to rotate to change a convergence angle, and as a result, a convergence point and/or a working distance.
  • the working distance, which can be a distance from the prisms 54 to the target 57 (e.g., tissue), can be user-selectable or adjustable.
  • as the working distance increases, the convergence angle can decrease.
  • as the working distance decreases, the convergence angle can increase (e.g., θ1 > θ2). This can be advantageous where the lens path 55 is fixed and the working distance is adjustable.
  • the stereo imagery can then be viewed on the display 59 by a user.
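  • The geometry underlying these convergence adjustments can be sketched as follows, assuming a fixed lateral separation s between the left and right paths and the relation θ = 2·atan(s/(2·W)) (symbols and values are illustrative, not the disclosure's notation):

```python
# Minimal sketch of the geometry behind FIG. 6B: with a fixed lateral
# separation s between left and right paths, the full convergence angle for a
# working distance W is theta = 2*atan(s/(2*W)), so each prism turns roughly
# theta/2 inward. The 24 mm separation is an assumed value.
import math

def convergence_angle_deg(separation_mm, working_distance_mm):
    return math.degrees(2 * math.atan(separation_mm / (2 * working_distance_mm)))

s = 24.0  # assumed lateral separation between the two optical paths, mm
for wd in (200.0, 300.0, 400.0):
    theta = convergence_angle_deg(s, wd)
    print(f"W = {wd:5.0f} mm -> convergence ~ {theta:4.2f} deg, "
          f"prism rotation ~ {theta/2:4.2f} deg per side")
# Shorter working distances give larger convergence angles (theta1 > theta2).
```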
  • FIG. 6C illustrates a front view of an embodiment of an optical assembly 53 that is configured to maintain a substantially constant convergence angle.
  • the optical assembly 53 can include two prisms 54a and 54b for each optical path, wherein the prisms 54a, 54b can move and/or rotate. For example, when the working distance decreases the first set of prisms 54a can rotate towards one another to decrease an effective distance between the second set of prisms 54b. The second set of prisms 54b can, in turn, rotate to compensate for the changed angle so as to converge on the common target.
  • the second set of prisms 54b can direct the light to the first set of prisms 54a which can then direct the light down the fixed lens paths 55 (e.g., fixed in their position relative to the viewfinder).
  • FIG. 6D illustrates a front view of an embodiment of an optical assembly 53 configured to provide a substantially narrow convergence angle to be able to view stereoscopic imagery through a narrow insertion tube 60 (e.g., a tube partially inserted into a body during a procedure).
  • a similar assembly 53 can be used as described with reference to FIG. 6C, and the convergence angle can be maintained substantially constant or at least sufficiently narrow to view through the insertion tube 60.
  • the embodiments of the optical assembly 53 which are configured to maintain a sufficiently narrow convergence angle can be advantageous as they allow stereo access to narrow surgical entries by allowing the angle to decrease and avoid clipping one of the stereo paths.
  • the left and right lens paths can move closer to one another and the prisms can adjust to the proper convergence angle for that distance.
  • the left and right lens paths can remain fixed and there can be sets of prisms for each path configured to direct the light along the lens paths while maintaining a substantially constant convergence angle.
  • maintaining a constant convergence angle can be visually helpful to the user when zoom changes, e.g., because the changing depth cues do not confuse the user's eye and/or brain.
  • constant convergence may induce less stress on the user.
  • movement control system 10100 can be used for other types of visualization and imaging systems. Movement of the imagers 18 can be performed prior to and/or during the activity, such as surgical procedures, dental procedures, and the like. Movement of the imagers 18 can advantageously allow a medical professional or other operator to alter the view through oculars 11, for example, to provide different surgical microscope-like electronic visualizations which might be beneficial during the course of a medical procedure or for different surgical procedures.
  • one or more control members of the movement control system 10100 can be attached to a component of the movement control system 10100 using various types of joints and/or can be remote from the movement control system 10100 such as a remote joystick or toggle.
  • the control member 10110 can include a joint for attachment to the movement control system 10100.
  • control member 10110 can include joint 10111.
  • one or more of the joints can include components for detecting movement of the control member and/or an imager arm.
  • rotation about joints, such as joint 10111, around the x-axis is hereinafter termed "pitch" or "tilt" and rotation about joints, such as joint 10111, around the y-axis is hereinafter termed "yaw" or "pan."
  • the joint 10111 can be a spherical joint received in a socket formed in the member 10220, thereby forming a ball-and-socket attachment.
  • other types of mounting mechanisms may be used for attaching control member 10110 as well as an imager arm to components of the movement control system 10100.
  • joints such as gimbals can be used which limit the rotational degrees of freedom about the gimbal.
  • Other types of joint can be used depending on the types of movement the movement control system is designed to allow. For example, if only pitch is needed without yaw, one can use a joint having a single rotational degree of freedom.
  • the control member 10110 can be positioned remotely from the movement control system 10100.
  • the movement control system 10100 can be attached to an attachment structure, such as binocular display unit 9, and support one or more imagers 18. As shown in the illustrated embodiment, the movement control system 10100 can be oriented generally underneath the binocular display unit 9 and in some embodiments can be sized such that the movement control system 10100 does not extend significantly beyond the outer housing of the binocular display unit 9. This can advantageously provide a smaller form factor thereby reducing the likelihood that the movement control system 10100 will interfere with the medical professionals and assistants during a medical procedure.
  • the attachment structure can be other components of the surgical visualization system 1 such as, but not limited to, a dedicated articulating arm or a display arm.
  • the movement control system 10100 can extend significantly beyond the outer housing of the binocular display unit 9 or any other platform to which it is attached. This can be advantageous in situations where a greater degree of movement of the imagers 18 is desired or in embodiments where the control member 10110 is located above the attachment point between the movement control system 10100 and binocular display unit 9.
  • the movement control system 10100 can be configured to allow translation of one or more attached imagers 18 along a plane relative to the binocular display unit 9.
  • the binocular display unit 9 can be immobile while the one or more imagers 18 are translated.
  • the one or more imagers 18 can be translated along a plane parallel to the operating table 10101.
  • the movement control system 10100 can be translated along both the x-axis and the y-axis (which projects perpendicularly through the sheet).
  • the movement control system 10100 can have a range of translation relative to the binocular display unit 9, of approximately ⁇ 500mm along the x-axis and y-axis at full extension, approximately ⁇ 400mm along the x-axis and y-axis at full extension, approximately ⁇ 300mm along the x-axis and y-axis at full extension, approximately ⁇ 200mm along the x-axis and y-axis at full extension, or approximately ⁇ 100mm along the x-axis and y-axis at full extension.
  • full extension along one axis can be greater than full extension along the other axis.
  • full extension along the x-axis may be approximately ⁇ 175mm whereas the y-axis extension can be three-quarters full extension of the x-axis, one-half full extension of the x-axis, one-quarter full extension of the x-axis, or any other ratio between unity and zero.
  • the range of translation relative to the binocular display unit 9 along the y-axis can be approximately ±87.5mm. This can be advantageous in cases where allowing the y-axis to have a full range of motion may interfere with the medical professional and/or assistants.
  • the range of translation of the x-axis can be three-quarters full extension of the y-axis, one-half full extension of the y-axis, one-quarter full extension of the y-axis, or any ratio between unity and zero.
  • the imager 18 can translate further in the "positive" direction than the "negative" direction. For example, along the x-axis, the imager 18 may move from -100 mm to 500 mm. Ranges of motion outside these ranges are also possible.
  • the maximum translation relative to the binocular display unit 9 along the x-axis and y-axis can be chosen to provide a balance between greater maneuverability, the yaw and/or pitch angles, working distances, size constraints, and other such factors.
  • translation of the imagers 18 can be performed by translating one or more control members, such as control member 10110, in the desired direction.
  • the control member 10110 can be electrically coupled to the movement control system 10100 to provide translation via an electromechanical system utilizing stepper motors, linear motors, or the like.
  • a joint of the control member 10110 can include components for detecting translation of the control member 10110. The signals from these sensors can be used to control other components of the movement control system, such as one or more electromechanical components such as stepper motors, linear motors, or the like to translate the imager 18.
  • the electromechanical components can be coupled to a moveable platform to which the imager 18 can be attached. In some embodiments, the control member 10110 can be physically connected to the movement control system 10100 without any electromechanical assistance.
  • the movement control system 10100 need not translate solely along a plane parallel to the operating table 10101 or the x-y plane as set forth in the illustrated embodiment.
  • the plane of translation can be defined by the orientation of the mount to which the movement control system 10100 is connected.
  • the movement control system 10100 can be configured for non-planar translation and/or translation along more than one plane.
  • a tip and tilt stage provides angular motion.
  • a rotary stage can also be used to provide rotary motion.
  • the movement control system 10100 can be configured to allow rotation of the one or more attached imagers 18 about a joint which can be attached to components of the movement control system 10100 and/or remotely from the movement control system 10100.
  • the movement control system 10100 can be designed to allow the control member, such as control member 10110, as well as the imager 18 and/or imager arm, to "pitch" or "tilt" and "yaw" or "pan" relative to the binocular display unit 9.
  • the binocular display unit 9 can be immobile while the "tilt" and "yaw” or “pan” of the one or more imagers 18 are adjusted.
  • Pitch or yaw can allow the imager 18 to have a line of sight that is centered (e.g., focused) on the surgical site after the imager 18 is translated. This can advantageously allow the medical professional or assistant to adjust the viewing angle during a medical procedure. This can be beneficial in circumstances where a medical professional is unable to adequately view an object due to another element obstructing the view. Under such circumstances, a medical professional can translate the imager 18 and adjust the viewing angle of the imager 18 such that the same general area is viewed from a different angle.
  • the movement control system 10100 can allow both pitch and yaw adjustments relative to the binocular display unit 9 within a range of approximately ±60 degrees each, approximately ±50 degrees each, approximately ±40 degrees each, approximately ±30 degrees each, approximately ±20 degrees each, or approximately ±10 degrees each.
  • the pitch and yaw can have different adjustment ranges.
  • the yaw can have an adjustment range of approximately ⁇ 40 degrees whereas the pitch can have an adjustment range of approximately three-quarters that of the yaw, one-half that of the yaw, one-quarter that of the yaw, or any other ratio between unity and zero.
  • the pitch can have an adjustment range of approximately ⁇ 20 degrees.
  • the adjustment range of yaw and pitch can correspond to the distance at full extension along both the x-axis and the y-axis.
  • the pitch and yaw can be chosen such that the imager 18 can remain centered on the surgical site when the movement control system 10100 is fully extended in any direction.
  • the working distance between the imager 18 and the surgical site can be approximately 200mm, with a range of translation along the x-axis of approximately ⁇ 175mm, and a range of translation along the y-axis of approximately ⁇ 87.5mm.
  • the pitch adjustment range can be ⁇ 20 degrees and the yaw adjustment range can be ⁇ 40 degrees.
  • the pitch and yaw adjustment ranges can also be different to match the differences in extension.
  • the pitch and yaw adjustment range can be chosen such that the imager 18 can remain centered on the surgical site when the movement control system 10100 is fully extended in any direction at at least one working distance.
  • the pitch and yaw adjustment range can be approximately ⁇ 20 degrees and approximately ⁇ 10 degrees respectively to allow centering at a working distance of 400mm.
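  • The approximate geometry relating translation range, working distance, and the pitch/yaw needed to stay centered can be sketched as follows (illustrative only):

```python
# Minimal sketch of the geometry relating translation range, working distance,
# and the pitch/yaw needed to keep the imager centered on the surgical site:
# angle ~= atan(lateral extension / working distance).
import math

def centering_angle_deg(extension_mm, working_distance_mm):
    return math.degrees(math.atan(extension_mm / working_distance_mm))

for wd in (200.0, 400.0):
    yaw = centering_angle_deg(175.0, wd)    # x-axis full extension +/-175 mm
    pitch = centering_angle_deg(87.5, wd)   # y-axis full extension +/-87.5 mm
    print(f"working distance {wd:.0f} mm: yaw ~ +/-{yaw:.0f} deg, "
          f"pitch ~ +/-{pitch:.0f} deg")
# ~ +/-41 and +/-24 deg at 200 mm; ~ +/-24 and +/-12 deg at 400 mm, which is
# consistent with the approximate +/-40/+/-20 and +/-20/+/-10 ranges above.
```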
  • the imager 18 can adjust further in a "positive" angle than a "negative" angle.
  • the yaw may range from -5 degrees to 15 degrees.
  • increasing or decreasing the pitch and/or yaw of the imagers 18 relative to the binocular display unit 9 can be achieved by increasing or decreasing the pitch and/or yaw of the one or more control members, such as control member 10110.
  • the control member 10110 can be electrically coupled to the movement control system 10100 to provide pitch and yaw via an electromechanical system utilizing stepper motors, linear motors, or the like.
  • a joint of the control member 10110 can include components for detecting pitch and/or yaw of the control member 10110.
  • the joint of the control member 10110 can be gimbals which can detect pitch and/or yaw of the control member 10110.
  • the signals from these sensors can be used to control other components of the movement control system, such as one or more electromechanical components such as stepper motors, linear motors, or the like to adjust the pitch and/or yaw of the imager 18.
  • the movement control system 10100 can be configured to allow rotation along other axes such as the z-axis.
  • the control member 10110 can be physically connected to the movement control system 10100 without any electromechanical assistance.
  • the movement control system 10100 can be configured to adjust the working distance between the imagers 18 and the surgical site.
  • the binocular display unit 9 can remain immobile while the working distance of the imagers 18 is adjusted.
  • the working distance can range from between approximately 1m to approximately 10mm, from between approximately 800mm to approximately 50mm, from between approximately 600mm to approximately 100mm, or from between approximately 400mm to approximately 200mm.
  • the control member 10110 can be electrically coupled to the movement control system 10100 to provide working distance adjustment via an electromechanical system utilizing stepper motors, linear motors, or the like.
  • a joint of the control member 10110 can include components for detecting rotation of the control member 10110 about the longitudinal axis.
  • the signals from these sensors can be used to control other components of the movement control system, such as one or more electromechanical components such as stepper motors, linear motors, or the like to adjust the pitch and/or yaw of the imager 18.
  • the control member 10110 can be physically connected to the movement control system 10100 without any electromechanical assistance.
  • the movement control system 10100 can include a translation system for translating an imager 18 and/or an imager arm, a pitch-yaw adjustment system for adjusting the pitch and/or yaw of the imager 18 and/or an imager arm, a control member, such as control member 10110, and one or more imager arms to which the imager 18 can be attached.
  • a working distance adjustment system can be included which can allow adjustments in working distance of the imager 18 and/or an imager arm. It should be appreciated by one of ordinary skill in the art that the translation system, the pitch-yaw adjustment system, and/or the working distance adjustment system can be used separately or in any combination.
  • control member 10110 can be operatively coupled to the translation, pitch-yaw adjustment, and/or working distance adjustment systems.
  • control member can be coupled to an electromechanical system for controlling the translation, pitch-yaw adjustment, and/or working distance adjustment systems.
  • the control member can be directly attached to a component of the movement control system 10100 or can be remotely positioned (e.g., a toggle or joystick on a separate module).
  • control member can be coupled directly to the translation, pitch-yaw adjustment, and/or working distance adjustment systems such that no electromechanical devices are used.
  • the operator can be given the option of controlling the translation, pitch-yaw adjustment, and/or working distance adjustment systems with or without electromechanical devices.
  • the operator can control the translation, pitch-yaw adjustment, and/or working distance adjustment systems without electromechanical devices for certain portions of a procedure and use such electromechanical devices for controlling the translation, pitch-yaw adjustment, and/or working distance adjustment systems during other portions of a procedure.
  • coarse control of the movement control system 10100 can be achieved without use of electromechanical devices whereas fine control of the movement control system 10100 can be achieved with use of electromechanical devices, vice-versa, or a combination of the two.
  • the movement control system 10100 can include a control system which controls functions of the electromechanical devices.
  • the electromechanical components can be programmed such that the electromechanical components can orient the translation, pitch-yaw adjustment, and/or working distance adjustment systems in certain positions based on the operator's input.
  • the electromechanical components can be programmed such that they revert back to a pre-set or previous position upon receiving a command from the operator.
  • the electromechanical components can be programmed such that an operator can specify a desired position for the imager 18 and the control system can control the electromechanical devices coupled to the translation, pitch-yaw adjustment, and/or working distance adjustment systems to orient the imager 18 in the desired position.
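  • A minimal sketch of such preset handling, using hypothetical names and a simplified pose representation rather than the disclosure's control software:

```python
# Minimal sketch: a control layer that stores imager poses and reverts to a
# pre-set or previous position on command. In a real system move_to() would
# drive stepper/linear motors; here it only records the commanded pose.
from dataclasses import dataclass

@dataclass
class ImagerPose:
    x_mm: float
    y_mm: float
    pitch_deg: float
    yaw_deg: float
    working_distance_mm: float

class MovementController:
    def __init__(self, initial_pose):
        self.current = initial_pose
        self.previous = initial_pose
        self.presets = {}

    def save_preset(self, name):
        self.presets[name] = self.current

    def move_to(self, pose):
        self.previous, self.current = self.current, pose

    def recall(self, name=None):
        # Revert to a named preset, or to the previous position if no name given.
        self.move_to(self.presets[name] if name else self.previous)

ctrl = MovementController(ImagerPose(0, 0, 0, 0, 300))
ctrl.save_preset("incision")
ctrl.move_to(ImagerPose(50, -20, 10, -15, 250))
ctrl.recall("incision")
print(ctrl.current)   # back at the saved "incision" pose
```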
  • the imager arm 10120 and the imager 18 can be attached such that the imager 18 can be directed towards the side of the head of a patient.
  • the imager 18 can be attached to the imager arm 10120 using a yoke 10125 which can be designed to allow for coarse and/or fine control of pitch, yaw, and/or roll of the imager 18.
  • the yoke 10125 can have one or more pivots which can be configured to allow the imager 18 to have a viewing angle parallel to the operating room floor such that an operator can view the side of the head.
  • the yoke 10125 can be configured to allow the imager 18 to rotate such that the imager can be directed to a portion of the back of the head.
  • the movement control system 10130 can include one or more control members, such as control member 10145.
  • Control member 10145 can be positioned such that the longitudinal axis of the control member 10145 is parallel with and/or collinear with axis 10135. This can advantageously allow the imager 18 to be rotated about axis 10135 by rotating the control member 10145.
  • the control member 10145 can be mechanically coupled to the imager 18.
  • the control member 10145 can be coupled to the imager 18 via an electromechanical system.
  • the control member 10145 can include sensors for detecting rotation of the control member 10145 and use data received from the sensors to rotate the imager 18 via electromechanical components such as stepper motors, linear motors, or the like.
  • the movement control system 10130 can include a first plate element 10150 and a second plate element 10155 which can be rotatably coupled.
  • the second plate element 10155 can include first and second supports 10160, 10165 to which the imager 18 can be attached.
  • the first and second plate elements 10150, 10155 can be rotatably coupled such that the axis of rotation of the two plate elements 10150, 10155 is parallel and/or collinear with axis 10140.
  • control member 10145 can include one or more switches and/or actuators 10170 for controlling movement of the device.
  • the actuator 10170 can be coupled to mechanisms which can unlock the apparatus 10130 such that the movement control system 10130 can be manipulated to rotate and/or translate the imager 18.
  • the switches and/or actuators can be coupled to an electromechanical system to rotate and/or translate the movement control system 10130.
  • the movement control system 10130 may comprise a plurality of handles, for example, a left hand handle and a right hand handle to accommodate surgeons who are left handed and surgeons who are right handed.
  • the movement control system can be placed on a platform other than the binocular display unit to further decouple the line of sight of the microscope view camera and the surgeon.
  • various embodiments may include at least two arms, one for a binocular display unit and one for a camera or imager that provides stereo surgical microscope views, to allow the surgical microscope camera to be placed at a location away from the binocular display unit and to decouple the line of sight of the surgical microscope from the line of sight of the ocular.
  • FIGS. 10A-10D illustrate example display optical systems 11005 configured to provide a view of displays 11010 through oculars (not shown) that receive light from the last lens 11015 in the display optical system 11005.
  • the display optical system 11005 forms an exit pupil at or near the entrance pupil of the surgeon binoculars. These pupils are closely matched, for example, in size and shape.
  • the exit pupil of the display optical system 11005 can be the same size or smaller than the entrance pupil of oculars used to view the display.
  • the oculars form an exit pupil that is matched (e.g., in size and shape) to the entrance pupil of the surgeon's eye(s).
  • the display optical system 11005 is configured to produce a beam that has a relatively constant cross-section between the first lens element 11012 and the last lens element 11015, where the cross-section is relatively small.
  • this allows the display optical system 11005 to be included in a relatively small or compact package and use relatively small optical elements.
  • the last lens 11015 collimates the beam leaving the display optical system 11005.
  • the termination of the rays shown in FIG. 10A to the left of lens 11015 is the exit pupil of the display optical system 11005.
  • the exit pupil of the display optical system 11005 is configured to be the same size as or smaller than, and positioned at the same location as, an entrance pupil of a binocular viewing assembly configured to allow a user to view the display 11010.
  • the lenses in the display optical system 11005 form a highly color-corrected view of the display by forming the exit pupil in a position favorably disposed for the user and the binoculars.
  • a combination of singlets and bonded lenses provide such correction.
  • the display optical system 1 1005 may be designed to provide such correction while keeping a small beam column or ray bundle, which permits adding mirrors and obtaining a compact package.
  • producing an undistorted image can be difficult without such a group of lenses designed properly to provide such correction.
  • This correction includes both color correction as well as distortion correction.
  • the display optical system 11005 advantageously allows a relatively small, compact lens assembly to provide a view of a relatively large display 11010.
  • the display optical system 11005 can be configured to work with displays 11010 of varying sizes, including, without limitation, displays with a diagonal that is less than or equal to about 0.86 in. (22 mm), at least about 0.86 in. (22 mm) and/or less than or equal to about 10 in., at least about 1 in. and/or less than or equal to about 9 in., at least about 2 in. and/or less than or equal to about 8 in., or at least about 4 in. and/or less than or equal to about 6 in.
  • the display may, for example, have a diagonal of about 5 inches or about 8 inches in some embodiments.
  • the total optical path length of the display optical system 11005 can be less than or equal to about 9 in., at least about 9 in. and/or less than or equal to about 20 in., at least about 10 in. and/or less than or equal to about 19 in., at least about 14 in. and/or less than or equal to about 18 in.
  • the display optical system 11005 can include lenses, mirrors, prisms, and other optical elements configured to direct and manipulate light along an optical path.
  • the display optical system 11005 can be used in conjunction with a primary display, a surgeon display, an assistant display, possibly other displays, or any combination of these.
  • the example display optical system 11005 illustrated in FIG. 10A has a total optical path length of about 16.2 in. (412 mm). It is configured to provide an image of a 5 in. display 11010.
  • the display optical system 11005 can include a lens 11012 configured to direct the light from the display 11010 along a path wherein light from the display 11010 is directed along a path with a relatively narrow cross-section.
  • the light received from the display is initially substantially reduced in beam size for example by the lens 11012 or lenses closest to the display and a more narrow beam is produced.
  • the lens 11012 or lenses closest to the display collect light at an angle (half angle) in excess of 20°, 25°, 30° and reduce the beam size of the light.
  • This design is advantageous because it allows for the elements in the display optical system 1 1005 to be relatively small and compact.
  • the cross-section of the optical beam after the lens 11012 in the display optical system 11005 can be configured to be relatively constant. This configuration allows folding or redirecting mirrors present in the optical path to remain small.
  • FIG. 10B illustrates a binocular display optical system 11005 configured to provide a view of stereo displays 11010a, 11010b through a pair of oculars.
  • the binocular display optical system 11005 can be based on the optical design illustrated in FIG. 10A, and can include one or more elements 11014 in the optical path before the lens 11012 to reduce the physical size of the optical system while maintaining the length of the optical path. These elements can include mirrors, prisms, and/or other optical elements configured to redirect the light from the displays 11010a, 11010b to the lens 11012.
  • the elements 11014 include curved mirrors which redirect the optical path and converge the rays from the displays 11010a, 11010b.
  • the elements 11014 include mirrors or prisms (for example, that may have a planar reflecting surface) that do not substantially affect the convergence of the light rays, but redirect the optical path.
  • the reflective surface or cross-section of the mirror is non-circular, and is, for example, elliptical. Accordingly, in various embodiments the cross-section of the mirror or other reflective surface may be longer in one direction than in another, for example, orthogonal direction.
  • These elements may fold the optical path to provide for a more compact system. Such a system may therefore have an optical path length from display to ocular that is longer than the length and/or width of the viewing platform or the combination thereof.
  • the display optical system 11005 can include at least four mirrors, or less than or equal to four mirrors. In certain implementations, two mirrors can be used to fold the optical path from the display 11010 to the exit pupil, the two mirrors positioned between the first lens 11012 and the display 11010. In some embodiments, the display optical system 11005 includes at least four lenses or less than or equal to four lenses.
  • the example display optical system 11005 illustrated in FIG. 10C has a total optical path length of about 18.7 in. (475 mm). It is configured to provide an image of an 8 in. display 11010.
  • the display optical system 11005 can include a lens 11012 configured to direct the light from the display 11010 along a path wherein light from the display 11010 is directed along a path with a relatively narrow cross-section, allowing for the display optical system 11005 to be relatively small and compact.
  • the cross-section of the optical beam after the lens 11012 in the display optical system 11005 can be configured to be relatively constant. This configuration allows folding or redirecting mirrors present in the optical path to remain small.
  • the display optical system 11005 can be configured to be used in conjunction with a display 11010 with a relatively high resolution.
  • the example display optical system 11005 illustrated in FIG. 10D has a total optical path length of about 9.3 in. (237 mm). It is configured to provide an image of a smaller display, in this case a 0.9 in. (22 mm) display 11010. Because the display is much smaller than the display in the embodiments described in connection with FIGS. 10A-10C, the optical path can be much shorter and may fit into a smaller space.
  • the display optical system 11005 can include a lens 11012 configured to direct the light from the display 11010 along a path wherein light from the display 11010 is directed along a path with a relatively narrow cross-section, allowing for the display optical system 11005 to be relatively small and compact.
  • the cross-section of the optical path after the lens 11012 in the display optical system 11005 can be configured to be relatively constant. This configuration allows folding or redirecting mirrors present in the optical path to remain small. Based at least in part on the relatively short optical path length, the display optical system 11005 can be configured to be used in conjunction with a secondary display or an assistant display.
  • FIGS. 11A-11G illustrate example display optical systems 11300 configured to provide a view of a display 11310, the display optical system 11300 having an exit pupil 11305 wherein light paths that would intersect a viewing assembly housing 11315 are reduced or eliminated through baffles or apertures 11320, where a baffle includes a panel with an aperture.
  • FIG. 11A illustrates an example embodiment of a display optical system 11300 comprising a display 11310, with other optical components configured to direct the light from the display 11310 to the exit pupil 11305. The light paths are traced with black lines to show the periphery of the bundle of light paths from the display 11310 to the exit pupil 11305.
  • FIG. 12 shows a front view of a viewing platform comprising the binocular assembly and display enclosure assembly.
  • Various embodiments optionally separate the two into self-contained components at a point in the optical path where the beam diameters of each eye path are approximately 18 to 20 mm in diameter where the beams may be collimated, and the separation between the 2 eye paths is approximately 22 to 25 mm.
  • these separate optical portions or sections of the optical display system need not be on separate platforms or housings or otherwise be separately contained.
  • the separation between eye paths is substantially less than that of the viewer, which is approximately 65 mm with a range of, for example, 52 to 78 mm.
  • the viewing platform can include binoculars for a primary surgeon and an assistant surgeon, as illustrated in FIGS.
  • the collimated eye paths have a near constant diameter through a series of fold mirrors shown in FIGS. 14 and 15 and a first optical element.
  • This compact folding with corresponding baffles within the section of the optical path comprises 40-50% of the total track of the optical path in the enclosure.
  • the eye paths are mirror images of each other and are substantially parallel in direction, that is they have little or no convergence, and the display panels are orthogonal to the viewer in some embodiments, in plane with each other or parallel in other embodiments.
  • FIG. 15 shows a path from the exit pupil, which is outside of the enclosure, and matches within 25% the corresponding entrance pupil of the binocular assembly located in this same collimated section where the two components meet and attach.
  • the overall optical path length of each eye from collimated pupil matching attachment point to electronic display for images is approximately 300 mm to 400 mm for the primary surgeon display.
  • the ratio of display diagonal to path length is within a range of 1:2.2 to 1:3.5.
  • Various embodiments have display diagonals of 100 mm to 150 mm, one for each eye path.
  • the assistant surgeon display and display pathway are reduced in various embodiments to permit a more compact dual assembly.
  • the assistant surgeon display is smaller than the primary surgeon display configuration in both the diagonal of the viewed display and the path length.
  • for the assistant display, the ratio of display diagonal to path length is within a range of 1:5 to 1:7.
  • Various embodiments have assistant display diagonals of 25 mm to 30 mm, one for each eye path.
  • With a near constant optical beam diameter in each path, multiple mirrors are positioned between the first element group and the last element group, with a single mirror in the converging beam path between the last element group and the display. Between the last optical element and the display are 2, 3, 4, or more baffles for stray light rejection. The aperture of each baffle is proportional to the display ratio, 3:4, or 16:9, or other.
  • the assistant surgeon display path may rotate around a rotation point through 270 degrees or thereabouts, the center of rotation taking the form of a column or bearing. Examples of primary and assistant displays with primary and assistant binoculars are illustrated in FIGS. 13A and 13B.
  • the rotation point of the assistant surgeon display is defined as a vertical column allowing the assistant surgeon eye pair to be horizontal through the range of rotation.
  • the input field of view (for example, of the electronic display, e.g., LCD or LED display) is 6 degrees, but could be 3 degrees or as much as 10 degrees.
  • the single display can be controlled at the CPU (central processing unit) level of overall system or by other processing and/or control electronics to provide a multitude of images optimized or configured for each (e.g., left and right) eye.
  • Such optimizations/configurations can include real or calculated parallax.
  • fluorescence or different wavelength images or other images are superimposed on different images electronically.
  • the CPU or processing and/or control electronics can also select appropriate views from within the surgical site from any one of a number of sources based on the position of the assistant display using an encoder located, for example, on the rotation point or other known point. This can allow the assistant surgeon an appropriate view based on his or her position with respect to the primary surgeon.
  • a two-part (e.g., left and right) channel display or Wheatstone-like stereo display configured to provide left and right images with parallax provided by the inter-pupillary distance between human eyes, comprising a binocular assembly for the user and an electronic display assembly.
  • a two-part Wheatstone-like stereo display comprising a binocular assembly for a second user and an electronic display assembly.
  • Embodiment 13 where the second stereo display is used by an assisting surgeon.
  • electronic displays that receive video feed from stereo cameras can be provided for both the primary surgeon and assistant.
  • Two separate binocular displays, one for the primary surgeon and one for the assistant can be used.
  • Each display can have left and right electronic displays that present left and right channels of a stereo camera. Viewing the two displays together, which present images with the proper parallax for stereo imaging, the viewer can see 3D images (video) when peering through the oculars of the binocular display.
  • the two binocular displays can be attached and supported together on an articulating arm.
  • a binocular display unit may comprise a separate primary surgeon display assembly and an assistant display assembly.
  • Each can include a housing that includes therein a pair of electronic displays and imaging optics and possibly some folding and/or redirection mirrors.
  • each can also include a binocular assembly comprising a pair of objectives, prisms, and oculars for the left and right eye of the viewer.
  • the primary surgeon display assembly and an assistant display assembly can be coupled together, for example, using a post or rotatable joint, and the assistant display assembly can rotate about the post or joint with respect to the primary surgeon display assembly. In this manner, an assistant standing opposite to the primary surgeon or standing to the left or right of the surgeon can look through the assistant display without moving the surgeon display assembly.
  • the binocular display unit includes primary surgeon oculars and displays on a single unit possibly in separate housings associated with the primary surgeon display assembly. This applies for the assistant display as well.
  • the displays and oculars need not be separately housed and can be attached to a common base housing.
  • the primary surgeon oculars and assistant oculars can be separable from the binocular display unit.
  • the oculars and binocular display unit can form a single assembly.
  • the oculars and binocular viewing assembly can be a substantially unitary assembly, wherein the oculars are substantially permanently affixed to the binocular display unit.
  • the binocular displays can receive images from a plurality of cameras, including possibly left and right cameras of a stereo image pair.
  • the selection of cameras may differ, and the presentation (e.g., orientation of the video images) may also vary.
  • processing electronics selects certain cameras to provide video images to the respective electronic displays in the assistant display assembly and orients or presents those images appropriately.
  • the images provided to the primary surgeon may be mirror images of the images provided to the assistant if the assistant is standing opposite (facing 180° with respect to) the primary surgeon.
  • manipulation (e.g., rotation) and/or selection of images may be varied depending on the location of the assistant.
  • the location of the assistant can be determined based on the rotation of the assistant display assembly.
  • Sensors such as encoders in the binocular display unit may provide such information.
  • Other approaches to determining and/or monitoring the assistant's location including tracking sensors on the assistant (and/or primary surgeon) may be employed.
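  • A minimal sketch of how the assistant's view might be reoriented from the rotation-point encoder reading, assuming the angle is quantized to 90° steps and that left/right channels are swapped at 180° (the quantization and channel handling are assumptions, not the disclosure's method):

```python
# Minimal sketch: selecting/orienting the assistant's view from the encoder on
# the assistant display's rotation point. The angle is quantized to the nearest
# 90 degrees, each frame is rotated to match, and at 180 degrees the left/right
# channels are swapped so the stereo layout stays consistent.
import numpy as np

def orient_for_assistant(left_frame, right_frame, encoder_deg):
    quadrant = int(round(encoder_deg / 90.0)) % 4         # 0, 1, 2, 3
    # np.rot90 rotates counter-clockwise in 90-degree steps.
    left = np.rot90(left_frame, k=quadrant)
    right = np.rot90(right_frame, k=quadrant)
    if quadrant == 2:                                      # assistant facing the surgeon
        left, right = right, left                          # mirror the channel layout
    return left, right

# Example: 8x8 synthetic frames, assistant standing opposite the surgeon.
l = np.arange(64).reshape(8, 8)
r = l + 100
out_l, out_r = orient_for_assistant(l, r, encoder_deg=178.0)
print(out_l[0, 0], out_r[0, 0])   # rotated 180 deg and channel-swapped
```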
  • images with parallax can be provided to left and right electronic displays that are viewed through respective left and right imaging optics and oculars. As a result, a surgeon or assistant viewing these displays through the pair of oculars will see a three dimensional image.
  • images without parallax may be displayed on both the left and right electronic displays to produce a two-dimensional image.
  • the left and right displays can include both image content that includes parallax and image content without parallax. This image content could come from stereo and mono cameras, etc. (Stereo cameras have two channels, left and right, oriented at appropriate angles with respect to each other as the left and right eyes of a human to provide the appropriate parallax for three-dimensional rendition and perception.)
  • when both parallax image content and images without parallax content (corresponding to 3D image content and 2D image content, respectively) are provided, the images with parallax (3D image content) may be emphasized over the images without parallax (2D image content).
  • the 3D content may have brighter intensity and/or higher saturation than the 2D image content.
  • Other parameters may also be used to emphasize the 3D content.
  • calibrating the 3D space being imaged may be useful.
  • a calibration pattern may be projected onto the surgical site.
  • a light source and imaging optics may be employed to project an image of a reticle or other calibration pattern having known features and dimensions onto the surgical site being imaged, for example, by a camera configured to produce a surgical microscope view or a camera on a retractor or surgical tool.
  • One or more of these cameras may image the projected calibration pattern together with the surgical site.
  • Knowledge about the reticle pattern and how that pattern appears when projected onto a three dimensional surface will provide depth information about that three-dimensional surface. This information can be determined by processing electronics.
  • a 3D CAD rendition of the surgical site may be generated based on the image of the surgical site and the information from the projected calibration pattern. Alternatively or in addition, measurements such as distance from one feature in the surgical site to another or the size of features in the surgical site may be determined. Volumes may also potentially be calculated. The information may have other uses as well.
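  • One standard triangulation relation that could underlie such depth recovery (not necessarily the disclosure's method) treats a projected pattern feature like a stereo correspondence: for a rectified projector/camera pair with baseline B and focal length f in pixels, depth is Z = f·B/d, where d is the feature's disparity. All values below are illustrative:

```python
# Minimal sketch (standard structured-light triangulation; values are
# hypothetical): depth of a projected reticle feature from its disparity
# between the projector and camera image planes, Z = f * B / d.
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_mm):
    disparity_px = np.asarray(disparity_px, dtype=np.float64)
    return focal_px * baseline_mm / disparity_px        # depth in mm

focal_px = 1400.0      # assumed camera focal length in pixels
baseline_mm = 60.0     # assumed projector-to-camera baseline

# Disparities measured for a few reticle intersections (hypothetical values).
disparities = [420.0, 400.0, 380.0]
depths = depth_from_disparity(disparities, focal_px, baseline_mm)
print(depths)                                # ~[200, 210, 221] mm

# A feature size on the tissue can then be estimated from its pixel extent:
pixel_extent = 70.0                          # feature width in pixels
print(pixel_extent * depths[0] / focal_px)   # ~10 mm across at ~200 mm depth
```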
  • Stereo cameras can be included on various medical devices (for example, retractors, surgical tools, etc.). Such cameras can include a single sensor or detector array (e.g., CMOS, CCD, etc.) or a pair of sensors to obtain left and right eye views.
  • a single sensor for example, may be employed to obtain left and right images of a stereo camera pair.
  • the sensor may be partitioned into areas to receive light from left and right imaging optics that produces left and right images on the active area of the sensor.
  • a mask can be employed to partition the active area of the sensor into these left and right areas for receiving the left and right images.
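  • A minimal sketch of partitioning a single sensor's active area into left and right image regions (the frame size and gap are assumptions):

```python
# Minimal sketch: splitting a single-sensor frame into the left and right
# sub-images of a stereo pair. A strip of width gap_px in the middle models
# the area blocked by a physical or software mask between the two regions.
import numpy as np

def split_stereo_frame(frame, gap_px=0):
    h, w = frame.shape[:2]
    half = (w - gap_px) // 2
    left = frame[:, :half]
    right = frame[:, w - half:]
    return left, right

frame = np.zeros((1080, 1920), dtype=np.uint16)   # single detector array
left, right = split_stereo_frame(frame, gap_px=64)
print(left.shape, right.shape)                    # (1080, 928) each
```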
  • the surgeon can depress the foot pedal to indicate which video image feed or camera the surgeon is interested in using, for example, from the various cameras on the retractors and/or surgical tools or one or more cameras that provide a surgical microscope field of view.
  • the surgeon depresses a foot pedal to cycle through different video feeds (e.g., thumbnails).
  • the surgeon could depress another foot pedal to select one of these for enlarging etc.
  • a single pedal may be employed. Depressing the pedal quickly will enable the surgeon to cycle through the various video feeds. Holding the foot pedal down a longer period of time may be used to select one of the video feeds for enlarging and/or placing the image at a particular location such as more central. Two or more foot pedals could also be used.
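  • A minimal sketch of such single-pedal behavior, with an assumed press-duration threshold separating "cycle" from "select" (names and timings are illustrative):

```python
# Minimal sketch: a short press cycles to the next video feed; holding the
# pedal longer selects the current feed for enlargement/repositioning.
class PedalFeedSelector:
    LONG_PRESS_S = 0.8          # assumed threshold between "cycle" and "select"

    def __init__(self, feeds):
        self.feeds = feeds
        self.index = 0
        self._pressed_at = None

    def pedal_down(self, t):
        self._pressed_at = t

    def pedal_up(self, t):
        held = t - self._pressed_at
        self._pressed_at = None
        if held >= self.LONG_PRESS_S:
            return ("select", self.feeds[self.index])     # enlarge / recenter
        self.index = (self.index + 1) % len(self.feeds)   # cycle to next feed
        return ("cycle", self.feeds[self.index])

sel = PedalFeedSelector(["microscope", "proximal retractor",
                         "distal retractor", "tool"])
sel.pedal_down(0.0); print(sel.pedal_up(0.2))   # ('cycle', 'proximal retractor')
sel.pedal_down(1.0); print(sel.pedal_up(2.1))   # ('select', 'proximal retractor')
```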
  • a display shows a single central image plus a plurality (e.g., 4) of thumbnails disposed about the central image. Selection of which thumbnail replaces the image at center is specified by clicking on the input device 1, 2, 3, or 4 times (e.g., if 4 thumbnails).
  • the thumbnails can be arranged about the central image(s) like the numbers on a clock and the number of clicks can similarly be assigned to the thumbnails. For example, two clicks can designate the second thumbnail, which is located in the lower right hand corner of the central image.
  • the central image can be enlarged in some embodiments. More than a single image may be included in the center.
  • the selected image may be viewed at locations other than the center of the display. For example, two images, one main and another PIP, could be used. Other arrangements are possible.
  • Surgical microscope camera views from the temporal direction can be provided using the surgical microscope camera as shown in FIG. 21.
  • a movement control system comprising a gimbal, for example, attached to the binocular display unit can support a camera that provides a surgical microscope view and that can be moved and reoriented to provide such oblique or temporal views.
  • the surgeon need not tilt his or her head or re-position to obtain this perspective. Configurations that provide isocentered positioning, as shown, can reduce disorientation in some cases.
  • the assistant may be located opposite to the surgeon, or on the left or right of the surgeon, or in other locations.
  • the location of the assistant is sensed, and the appropriate camera views are provided to the assistant viewing the assistant electronic displays in the assistant binocular viewing assembly.
  • FIG. 22 illustrates how the assistant opposite the surgeon (with the assistant display assembly directed 180° with respect to the surgeon display assembly) is to see the images reoriented (e.g., upside down) and with the locations of the images reversed.
  • Selection of cameras for different assistant positions e.g., 180° position, 90° position, can apply to mono and/or stereo cameras on the retractor and/or on a surgical tool as well as to surgical microscope view cameras. Additionally, such camera selection based on sensing the assistant's location can be used for side/temporal (e.g., through ear) approaches discussed above.
  • one of the sensor pairs (or a sensor with left and right portions for respective left and right eye views) can be a 4K sensor with higher resolution than the other sensor pair(s).
  • a camera for the surgeon may be configured to move in an isocentered manner.
  • the assistant optics are counter-rotated to maintain the assistant display camera view horizontal.
  • FIG. 24 shows a common objective for the different sensors.
  • a common objective may be employed for the sensors used for the primary surgeon as well as the sensors used for the assistant surgeon.
  • a common objective may also be employed for left and right channels and left and right sensors (even if not used as a common objective for both the assistant and primary display).
  • FIG. 23 shows a common objective employed for three sensor pairs, each having left and right sensors for respective left and right eye views.
  • a surgeon will use the camera that provides the surgical microscope view at the early stages of the surgery, for example, to make the incision for access into the body and to introduce tools initially into the body.
  • the surgeon may additionally use the cameras on the retractor.
  • the surgeon will use the proximal retractor cameras initially and the distal retractor cameras thereafter as the surgical tool(s) passes deeper into the surgical site, for example, passing through proximal regions of the opening in the body into more distal regions into the surgical site.
  • the various cameras can be employed to guide advancement of the tool into the desired depth in the body and into the surgical site.
  • this process may be reversed (for example, the distal camera may be used more after relying on the proximal camera, and the surgical microscope camera may be used after the distal camera).
  • Various embodiments of the system may additionally be configured to provide for the same convergence angle for each of the stereo cameras, for example, the stereo camera that provides the surgical microscope view as well as stereo cameras on the retractor, including possibly both proximal and distal stereo cameras.
  • this tool camera too may have the same convergence angle.
  • cameras having a similar convergence angle can be selected as the tool progresses into (or out of) the surgical site and/or at different stages of the surgery. Having a similar convergence angle from one stereo camera to another should provide a more comfortable viewing experience for the surgeon.
  • the convergence angle is determined by the separation of the left and right cameras of a stereo camera pair that make up the stereo camera. These cameras obtain images of the object from different perspectives akin to the human's eyes separated by an inter-pupillary distance.
  • the convergence angle is also determined by the disiance to the object, for example, the working distance of the camera. In particular, the convergence angle depends on the ratio of the distance separating the left and right cameras and the working distance of the camera pair to the object.
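As a rough numerical illustration of that ratio (the separation and working-distance values below are placeholders, not specifications of the apparatus), the full convergence angle of a stereo pair can be estimated as follows:

```python
import math

def convergence_angle_deg(camera_separation_mm: float, working_distance_mm: float) -> float:
    """Full convergence angle (degrees) for a stereo pair.

    Each camera is toed in by atan((separation/2) / working_distance),
    so the full angle between the two lines of sight is twice that.
    """
    half_angle = math.atan((camera_separation_mm / 2.0) / working_distance_mm)
    return math.degrees(2.0 * half_angle)

# Example: a 24 mm separation at a 250 mm working distance gives ~5.5 degrees;
# the same separation at a 400 mm working distance gives ~3.4 degrees, so a
# retractor camera with a shorter working distance would need a smaller
# separation (or a shifted mask) to match the microscope-view convergence.
print(convergence_angle_deg(24.0, 250.0))
print(convergence_angle_deg(24.0, 400.0))
```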
  • the camera that provides the surgical microscope view has a variable working distance.
  • the surgeon may select a working distance for this camera that is suitable for the type of procedure to be performed. This working distance may establish a convergence angle, if for example the separation between the left and right cameras in the stereo camera pair is fixed. (In other embodiments, with variable convergence angle for the stereo camera that provides surgical microscope views, the surgeon may select a convergence angle.)
  • other stereo cameras may be configured to be adjusted to also provide this same convergence angle.
  • the stereo camera or cameras on the retractor and/or surgical tool may be adjustable to provide the same convergence as is provided by the camera configured to provide surgical microscope views.
  • Such cameras may include proximal and/or distal cameras on the retractor. See, for example, U.S. Patent Application No. 14/491 ,935 filed September 19, 2014 which is incorporated herein by reference in its entirety, which shows stereo camera designs including single sensor and multiple sensor camera designs.
  • the mask can be moved dynamically (increasing or decreasing this separation) to accommodate variable optical parameters of the camera optics, for example, convergence, as well as variable focus and working distance.
  • the mask may be implemented via software and corresponds to which pixels of the sensor to exclude from image formation. Conversely, the software implemented mask determines what pixels are used to collect image data. Accordingly, separate left and right open portions of the mask where light to form the image is collected can be spaced farther apart or closer together depending on the desired convergence angle.
  • Such a mask need not be limited to embodiments such as those with a single sensor.
  • Embodiments that employ two detector array chips can also have one or more masks that can be moved to accommodate different optical parameters including convergence, working distance, focal length, etc. See, for example, U.S. Patent Application No. 14/491,935 filed September 19, 2014, which is incorporated herein by reference in its entirety, and which shows stereo camera designs including single sensor and multiple sensor camera designs.
  • One or both of the two-dimensional detector arrays can have masks having open regions that are laterally translated to change the distance separating the locations where light is collected, thus changing, for example, the convergence angle.
  • the mask can be adjusted, for example, one or more openings therein can be translated, to provide for the same convergence between stereo cameras on the retractor and/or surgical tool as on the stereo camera that provides surgical microscope views.
  • a convergence angle may be initially established by selecting a working distance for the stereo camera providing the surgical microscope view depending on the type of procedure to be performed. This selection of working distance may establish a convergence angle between the left and right channels of the stereo camera pair that provides surgical microscope views.
  • stereo mask on one or more other stereo cameras may be changed or reconfigured, for example, by moving one or more openings therein, to provide the same convergence angle as provided by said one or more stereo camera pairs. Consequently, using the reconfigurable mask with movable aperture(s), stereo camera pairs on retractors or surgical tools may be provided with a similar convergence as the stereo camera pair providing the surgical microscope view.
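A minimal sketch of how such a software-implemented mask might be represented is shown below; the single-sensor geometry, window sizes, and helper name are illustrative assumptions rather than the actual mask implementation:

```python
import numpy as np

def stereo_mask(sensor_shape, window_size, center_separation_px):
    """Boolean mask with left and right open windows on a single sensor.

    Only pixels inside the two windows contribute to image formation;
    translating the windows farther apart or closer together changes the
    effective baseline and hence the convergence angle.
    """
    rows, cols = sensor_shape
    win_h, win_w = window_size
    mask = np.zeros(sensor_shape, dtype=bool)

    row0 = (rows - win_h) // 2
    mid = cols // 2
    left_center = mid - center_separation_px // 2
    right_center = mid + center_separation_px // 2
    for center in (left_center, right_center):
        c0 = center - win_w // 2
        mask[row0:row0 + win_h, c0:c0 + win_w] = True
    return mask

# Example: widen the separation from 600 px to 900 px to increase convergence.
narrow = stereo_mask((2160, 3840), (1080, 480), 600)
wide = stereo_mask((2160, 3840), (1080, 480), 900)
print(narrow.sum(), wide.sum())  # same open area, different window spacing
```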
  • the stereo cameras (e.g., the surgical microscope view camera, proximal retractor camera, distal retractor camera, surgical tool camera, etc.) may additionally provide adjustable focus.
  • One or more actuators may be included that are configured to translate one or more lenses in the camera optics that images the surgical site onto the two-dimensional detector array to change the focus of the camera. These actuators may be driven electrically in some embodiments, although different types of actuators could be employed. These actuators can be included in the package that supports the camera and is disposed on the retractor.
  • cameras on retractors, in contrast for example to endoscopes, have available space lateral to the imaging lenses (e.g., in the radial direction) in which such actuation devices can be located. The result may be that the lateral dimensions (e.g., in x and y) exceed the longitudinal dimension (z); however, surgical access to the surgical site would not be impeded by utilization of the space surrounding the lenses in the lateral or radial directions.
  • when the focus is changed using the actuator, the mask may be reconfigured or changed as discussed above. For example, one or more open regions or apertures in the mask through which light is directed to the left and/or right channel can be shifted laterally to increase or decrease the convergence angle. In this manner, the convergence angle of the stereo camera with the adjustable focus disposed on the retractor or surgical tool can be altered to be the same as the convergence angle of the stereo camera providing the surgical microscope views. A constant convergence angle for different stereo cameras can be provided even if such cameras include an adjustable focus. Both the focus and the mask can be changed as needed to provide the desired focus and convergence angle.
  • one or more lenses may be translated laterally to alter the convergence of the stereo camera.
  • one or more lenses included in the imaging optics for the left and/or right channel may be translated orthogonal to the optical axis or optical path to the sensor to alter the convergence angle, for example, to provide different or the same convergence angles at different working distances or focuses, similar to the use of the laterally displaced mask discussed above.
  • FIG. 25 is a schematic illustration of a surgical visualization system including an assistant display.
  • a separate assistant display may be provided for use by a surgical assistant or observer.
  • the assistant display 10035 comprises a binocular viewing platform 10036 for the assistant that includes oculars 10039 mounted on a lockable articulated arm 10037, which extends from a support post 10041.
  • the assistant binocular viewing platform can include one or more displays such as LCD or organic LED displays as described with regard to the surgeon's viewing platform.
  • optics may also be included in the binocular viewing platform to provide a view of the display through the oculars.
  • a Wheatstone setup may for example be used in some embodiments.
  • the assistant display 10035 can include, for example, one or more eMagin NTE AMOLED displays.
  • the display can provide a three-dimensional view, as described in more detail above with respect to the surgeon binocular display.
  • the viewing platform may be disposed above and/or over the patient, similar to a surgical microscope.
  • the viewing platform may be disposed on an articulated arm so as to be arranged above and/or over the patient, similar to a surgical microscope. Providing the viewing platform above or over the patient permits the surgeon to be sufficiently close to the patient (e.g., at the patient's side) to perform the surgery while looking through the oculars.
  • the viewing platform is compact, thus allowing the surgeon to be in close proximity to the patient without being separated from the patient by a bulky system.
  • the oculars can thus be disposed sufficiently over the patient so that the surgeon's hands can reach the patient to perform surgery.
  • the surgeon's close proximity to the patient can allow the surgeon to more closely monitor the patient during the surgery.
  • a surgical visualization system includes a viewing platform, for surgeons and/or assistants etc., comprising a housing containing one or more displays therein.
  • the displays provide video from a camera viewing the surgical site.
  • the viewing platform does not provide a direct view through the housing.
  • the surgeon or assistant does not see through the housing to directly view the surgical site using light passing from the surgical site through the housing; instead, the viewer peers into the housing, via oculars, at displays.
  • the displays present images obtained from camera sensors viewing the surgical site.
  • Such a configuration provides ergonomic benefits as the line of sight of the stereo camera is decoupled from the displays and/or oculars so that the surgeon or assistant need not have his or her line of sight aligned with the line of sight of the stereo camera providing the surgical microscope view.
  • This configuration and benefit may apply to both the surgeon and assistant displays.
  • Mirrors may be employed to direct left and right images from one or more displays to left and right oculars.
  • left and right displays or left and right eye portions of a display provide images with parallax consistent with that of human eyes, or a Wheatstone configuration is used to provide stereo.
  • Mirrors may fold the optical path from the display(s) to the oculars, thereby providing for a more compact, smaller footprint.
  • lenses may be employed in the optical path from the display(s) to the oculars.
  • the lenses may collimate the light from the display and form a collimated or substantially collimated beam that is received by the oculars.
  • the lenses may also reduce the beam size.
  • one or more rectangular displays having a diagonal between 3-8 inches, 4-6 inches, e.g., 5 inches may be used.
  • the lens may collect and collimate light from such a large object and reduce the beam to a smaller size for the oculars which may have an aperture size, for example, between about 0.3-2.0 inches or 0.5 to 1.0 or 1.5 inches.
  • Producing a beam having a reduced cross section enables smaller folding mirrors to be employed.
  • the housing is compact and has a small footprint.
  • the viewing platform may provide three-dimensional images via the left and right pair of oculars. Accordingly, in various embodiments, the viewing platform has a similar feel to a surgical microscope, that is, a compact stereo binocular microscope, so as to be familiar to surgeons.
  • the viewing platform may be disposed above and/or over the patient, similar to a surgical microscope.
  • the viewing platform may be disposed on an articulated arm so as to be arranged above and/or over the patient, similar to a surgical microscope. Providing the viewing platform above or over the patient permits the surgeon to be sufficiently close to the patient (e.g., at the patient's side) to perform the surgery while looking through the oculars.
  • a compact viewing platform that can be disposed proximal to, above and/or over the patient allows the surgeon to be in close proximity to the patient without being separated from the patient by a bulky system.
  • the oculars can thus be disposed sufficiently over the patient so that the surgeon's hands can reach the patient to perform surgery.
  • the surgeon's close proximity to the patient can allow the surgeon to more closely monitor the patient during the surgery.
  • An image of object 12102 is formed along the left eye optical path at the left eye image plane 12106.
  • An image sensor here can then produce video for a left-eye view in a binocular display system, as described herein.
  • a counterpart right-eye view image (not shown) could also be produced on the opposite side of the opto-mechanical axis of the objective at a similar longitudinal position along the length of the triplet opto-mechanical axis as the left eye image.
  • the air-spaced triplet 12105 can be a super achromat.
  • FIG. 27 illustrates an example imaging system 12200 comprising a common objective lens 12205 for both optical paths of stereo imagers.
  • the imaging system 12200 can be configured to provide a relatively high zoom factor.
  • the common objective lens 12205 can be a doublet used at its focal length. The doublet can be an achromat.
  • the common objective lens 12205 can comprise more than two lens elements, e.g., three, four, five, or more than five lens elements.
  • the afocal lens group may comprise first and second lenses or lens groups separated by a distance. An additional lens or lenses may be included, for example, between these two lenses.
  • a central lens group is shown disposed between first and second power lens groups.
  • the first and second lens or lens groups are negative and the central lens is positive.
  • the first and second lens or lens groups may be positive and the central lens is negative.
  • these first and second lens groups and the central lens group may comprise positive and negative lenses or just a single positive or negative lens. A variety of other configurations are possible.
  • the lenses may be in different locations and separated by different distances. One or more lenses may be moved to change magnification.
  • the lens group is afocal such that collimated light input to the afocal zoom will be output as a collimated beam. Magnification may also be provided. In various embodiments, an afocal zoom such as shown in FIG. 27 is disposed in each of the right and left optical paths (e.g., corresponding to right and left eye views for a stereo display).
  • the light from an object can be collimated by the objective lens 12205 to produce collimated light output 12207.
  • the collimated light 12207 can then enter the afocal zoom lens group 12210.
  • the afocal lens group 12210 is configured to receive collimated light and output collimated light.
  • the afocal lens group 12210 can include one or more moving lens elements and/or variable power optical elements to change a magnification of the collimated light output to result in a zoom lens system.
  • the collimated space 12207 between the objective lens 12205 and the zoom lens group 12210 can be of any arbitrary size (e.g., length along the optical path).
  • the zoom systems 12210 shown in FIGS. 27 and 28 have different designs.
  • the video coupler optical systems 12220 also have different designs.
  • the video coupler optical system shown in FIG. 28 shows an unfolded prism between a first lens element and a second lens element. Such a prism may permit the optical path to be redirected, for example, to provide a more compact or desirable shape.
  • the common objective 12205 collimates light that passes through the prism, fold element, or beam splitter 12235 to redirect the optical path.
  • the collimated light enters the afocal zoom lens group 12210, which provides collimated light output.
  • the aperture 12209 restricts the aperture of the imaging system 12200.
  • the collimated light then passes through beam splitters and optical redirection elements 12240 to generate a plurality of optical paths.
  • Each optical path comprises a video coupler optical system 12220a, 12220b, 12220c configured to generate an image at image planes 12230a, 12230b, 12230c.
  • the fluorescence or other wavelength of interest may be detected by the one or more cameras imaging the surgical field such as one or more camera providing a surgical microscope view.
  • an optical detector that is sensitive to the wavelength of the fluorescent emission may be employed to view the fluorescent image.
  • the wavelength of fluorescent emission is in the infrared.
  • sensors sensitive to different wavelengths may be employed.
  • one or more sensors sensitive to the fluorescing wavelength (e.g., IR) may be used in conjunction with one or more sensors not sensitive or less sensitive to the fluorescing wavelength but sensitive or more sensitive to other useful wavelengths (e.g., visible light).
  • Light can be collected and distributed to both types of detectors for example using a beam splitter such as a wavelength dependent beam splitter that reflects one wavelength and passes another.
  • the fluorescent and non-fluorescent images can be recorded by the respective sensors.
  • the fluorescent and non-fluorescent images can be superimposed when displayed on electronic displays that receive image data from both types of sensors.
  • images produced by fluorescence or other wavelengths of interest are superimposed on one or more images from other camera(s).
  • Filtering could be provided to remove unwanted wavelengths and possibly increase contrast.
  • the filter can be used to remove excitation illumination.
  • emission image content (e.g., fluorescing tissue) can be parsed and superimposed on image content that is not emitting (e.g., tissue that is not fluorescing), or vice versa.
  • IR fluorescence images are superimposed over non-IR (e.g. visible) images. Other wavelengths such as other fluorescence wavelengths may be employed.
  • an artificial color rendition of the fluorescing content can be used in place of the actual fluorescing color so as to enable the fluorescing tissue to be visible.
  • FIG. 31 schematically illustrates an example medical apparatus in accordance with certain embodiments described herein.
  • the medical apparatus 2100 can comprise a display (or display portion) 2110, a plurality of cameras 2120, and one or more processors 2130.
  • the plurality of cameras 2120 can include at least one first camera 2121 configured to image fluorescence in a surgical field, and at least one second camera 2122a configured to produce a non-fluorescence image of the surgical field.
  • the processor 2130 can be configured to receive images from the plurality of cameras 2121a, 2122a, and to display on the display 2110 a fluorescence image from the at least one first camera 2121a and to display on the display 2110 the non-fluorescence image from the at least one second camera 2122a.
  • the processor 2130 can advantageously include a plurality of processors 2131a, 2132a, e.g., a separate processor for each camera within the plurality of cameras 2120.
  • a first processor 2131a can be configured to receive an image from at least one first camera 2121a and to display on the display 2110 a fluorescence image.
  • at least one second processor 2132a can be configured to receive an image from at least one second camera 2122a and to display on the display 2110 the non-fluorescence image.
  • the display 2110 can be a primary display, a surgeon display, an assistant display, possibly other displays, or any combination of these.
  • the display 2110 can include a display portion, a display, or display device as described herein.
  • the display 2110 can include a display (or display portion) to be viewed through one or more oculars, e.g., a display within the viewing platform 9 of the surgical viewing system 1 shown in FIGS. 1, 3, 4A, 5A and 5B.
  • the display (or display portion) could be within a housing.
  • the display 2110 can include a display mounted on a display arm from the ceiling or on a post, e.g., a display device 13 on display arm 5 of the surgical viewing system 1 shown in FIG. 1.
  • the plurality of cameras 2120 can include a camera to provide a surgical microscope view of the surgical field.
  • the plurality of cameras 2120 can include a camera disposed on a surgical tool or on another medical device.
  • the plurality of cameras 2120 can include at least one first camera 2121a and at least one second camera 2122a configured to form a left-eye view of the surgical field.
  • the plurality of cameras 2120 can also include at least one first camera 2121b and at least one second camera 2122b configured to form a right-eye view of the surgical field.
  • the left and right-eye views are for stereoscopic viewing of the surgical field and the cameras can be angled to provide desired convergence mimicking the human eye.
  • One or more cameras 2121a, 2121b, 2122a, and/or 2122b of the plurality of cameras 2120 can include optical assemblies as described herein.
  • one or more cameras 2121a, 2121b, 2122a, and/or 2122b can include a turning prism 54, a lens train 55, and/or a sensor 56 as shown in FIG. 6A.
  • the at least one first camera 2121a can be configured to image fluorescence in a surgical field, and the at least one second camera 2122a can be configured to produce a non-fluorescence image of the surgical field.
  • the at least one first camera 2121b can be configured to image fluorescence in a surgical field, and the at least one second camera 2122b can be configured to produce a non-fluorescence image of the surgical field.
  • the first camera 2121a and/or 2121b can be sensitive to infrared wavelengths, ultraviolet wavelengths, or other fluorescence wavelengths.
  • an optical detector, e.g., sensor 56 or an array of sensors, of the first camera 2121a and/or 2121b can be sensitive to fluorescence wavelengths.
  • the first camera 2121a and/or 2121b sensitive to fluorescence wavelengths can include an infrared, ultraviolet, or other fluorescence light source.
  • illumination using an optical fiber can be used to provide pump radiation to induce fluorescence.
  • the processor 2130 can be configured to superimpose the fluorescence image over the non-fluorescence image. In other embodiments, the processor 2130 can be configured to superimpose the non-fluorescence image over the fluorescence image. In various embodiments, the processor 2130 can electronically process and synchronize the fluorescence and non-fluorescence images together. For example, the processor 2130 can read, align, and combine together the images. The processor 2130 can include a general all-purpose computer and, in some embodiments, a single processor may be used with both the left and right display portions 2110. However, various embodiments of the medical apparatus 2100 can include separate processing electronics for the left-eye and right-eye views.
  • Such separate processing for the left and right channels can be advantageous over a processor with single processing electronics or the general all-purpose computer since time is critical in surgical procedures. For example, in some embodiments, having separate dedicated processing electronics for each channel can provide pure parallel processing, which results in faster processing of images, thereby reducing latency. In addition, addressing a failure of a general all-purpose computer may entail rebooting of the computer and involve some downtime. Furthermore, with separate processing electronics in the left-eye and right-eye view channels, if one of the processing electronics were to fail, the processing electronics in the other channel can continue to provide images to the surgeon. Such redundancy can also be incorporated into a monocular viewing system. For example, in some embodiments of a monocular viewing system, two channels similar to a binocular viewing system can be provided. Images for the monocular viewing system can be split into each channel, with each channel having its own processing electronics.
  • the medical apparatus 2100 can include separate processing for each camera within each channel to further increase processing of images and reduce latency.
  • processor 2131a can be configured to receive an image from camera 2121a and to display on the display 2110 a fluorescence image from camera 2121a.
  • Processor 2132a can be configured to receive an image from camera 2122a and to display on the display 2110 the non-fluorescence image from camera 2122a. The fluorescence and non-fluorescence images can be superimposed optically on the display 2110.
  • processor 2131b can be configured to receive images from camera 2121b and to display on the display 2110 a fluorescence image from camera 2121b.
  • Processor 2132b can be configured to receive images from camera 2122b and to display on the display 2110 the non-fluorescence image from camera 2122b. The fluorescence and non-fluorescence images can be superimposed optically on the display 2110.
  • each of the separate processing electronics can be configured for image manipulation, e.g., to receive image data, process the image data, and output the images for display.
  • each of the processing electronics can be configured to receive one or more user inputs, receive one or more input signals corresponding to images from one or more cameras, and/or select which image to display.
  • Each of the processing electronics can also resize, rotate, or reposition the selected image based at least in part on one or more user inputs.
  • the processing electronics can also produce one or more output signals to drive one or more displays to produce one or more images.
  • each processing electronics can include a microprocessor, a field programmable gate array (FPGA), or an application specific integrated circuit (ASIC).
  • Each processing electronics can also include a graphics processing unit (GPU) and random access memory (RAM). The processing electronics can also control the color balance, brightness, contrast, etc. of the one or more images.
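To make the per-channel division of labor concrete, the following Python sketch shows what one channel's processing electronics might do with a pair of frames; the alpha-blend superimposition and artificial green rendering are illustrative choices under assumed frame formats, not a description of the actual firmware:

```python
import numpy as np

def process_channel(non_fluor: np.ndarray, fluor: np.ndarray,
                    rotate_quarter_turns: int = 0, alpha: float = 0.6) -> np.ndarray:
    """Combine one channel's fluorescence and non-fluorescence frames.

    Both inputs are HxWx3 uint8 frames from the two cameras feeding this
    (left- or right-eye) channel.  The fluorescence content is rendered in
    an artificial green so faint emission remains visible, then blended
    over the visible-light frame.
    """
    # Reorient if the user has rotated the view.
    non_fluor = np.rot90(non_fluor, rotate_quarter_turns)
    fluor = np.rot90(fluor, rotate_quarter_turns)

    # False-color the fluorescence signal (use its brightness as green).
    intensity = fluor.mean(axis=2, keepdims=True)
    overlay = np.zeros_like(non_fluor, dtype=np.float32)
    overlay[..., 1] = intensity[..., 0]

    blended = (1.0 - alpha) * non_fluor.astype(np.float32) + alpha * overlay
    return np.clip(blended, 0, 255).astype(np.uint8)

if __name__ == "__main__":
    visible = np.full((1080, 1920, 3), 120, dtype=np.uint8)
    emission = np.zeros((1080, 1920, 3), dtype=np.uint8)
    emission[400:600, 800:1000] = 255  # hypothetical fluorescing region
    frame = process_channel(visible, emission)
    print(frame.shape, frame.dtype)
```

Running one such function per eye channel (and per camera, as described above) is what allows each channel to proceed in parallel and independently of the other.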
  • an image at a first wavelength range can be superimposed with an image at a second wavelength range.
  • one or more sensors can capture a first image at a first wavelength range
  • one or more sensors can capture a second image at a second wavelength range.
  • the first and second images can be superimposed optically as disclosed herein.
  • the image at a first wavelength can be provided by narrow band imaging instead of fluorescence imaging.
  • a filter in some embodiments can allow imaging with the use of ambient light at blue (about 440 to about 460 nm) and/or green (about 540 to about 560 nm) wavelengths for the image at the first wavelength. Imaging at or near these wavelengths can improve visibility of features since the peak light absorption of hemoglobin occurs at these wavelengths.
  • the image at the second wavelength can be provided without narrow band imaging (e.g., use of ambient light without a filter).
  • One or more images could also be from other sources, e.g., a data file, a computed tomography (CT) scan, a computer aided tomography (CAT) scan, magnetic resonance imaging (MRI), an x-ray, an ultrasound imaging instrument, etc.
  • the cameras 2221a, 2222a can produce the images onto the plurality of displays 2211a, 2212a, e.g., with a processor.
  • an electronic processor need not perform the combining of images.
  • a beam combiner 2230a can be configured to receive the fluorescence and non-fluorescence images from the first 2211a and second 2212a displays and to combine or superimpose optically the fluorescence and non-fluorescence images for left-eye viewing, e.g., within a housing through an ocular or on a display device.
  • the plurality of cameras can also include another first camera 2221b configured to produce a fluorescence image onto another first display 2211b and another second camera 2222b configured to produce a non-fluorescence image onto another second display 2212b.
  • the cameras 2221b, 2222b can obtain images that can be viewed on the plurality of displays 2211b, 2212b, for example, using processing electronics.
  • an electronic processor need not perform the combining of images.
  • the beam combiner 2230 can include a beam splitter (e.g., a 45 degree or other angle splitter used in reverse), a dichroic beam splitter, a prism, or other optical structure to combine the beams.
  • a beam combiner 2230a can be placed within the left-eye optical path to receive the fluorescence and non-fluorescence images from the first 2211a and second 2212a displays and to superimpose the fluorescence and non-fluorescence images for left-eye viewing, e.g., within a housing through an ocular or on a display device.
  • another beam combiner 2230b can be placed in the right-eye optical path to receive the fluorescence and non-fluorescence images from the first 2211b and second 2212b displays and to superimpose the fluorescence and non-fluorescence images for right-eye viewing.
  • Some embodiments can further include imaging optics (e.g., an optics assembly) disposed to collect light from the displays to enable the images to overlap. The imaging optics can be configured to form images at infinity.
  • FIG. 32C schematically illustrates a top view of an embodiment of a medical apparatus incorporating the example left and right assemblies from FIGS. 32A and 32B.
  • an image at a first wavelength range can be superimposed with an image at a second wavelength range.
  • a first camera 2221a can produce a first image at a first wavelength range onto a first display 2211a
  • a second camera 2222a can produce a second image at a second wavelength range onto a second display 2212a.
  • the beam combiner 2230a can optically superimpose the first and second images.
  • the image at a first wavelength can be provided by narrow band imaging instead of fluorescence imaging, and the image at the second wavelength can be provided without narrow band imaging as described herein.
  • images from two different cameras at the same or substantially the same wavelength, but having other differing properties, can be superimposed.
  • one image could be a natural image of tissue, and another view could be an unnatural image (e.g., an image with false color or an image with exaggerated or extreme contrast).
  • such superimposed images can advantageously show margins between healthy and unhealthy tissue.
  • the example embodiments of the medical apparatuses shown in FIGS. 31 and 32A-32C can also be modified to produce a composite image of two or more images.
  • FIG. 33A illustrates a schematic of an example composite image 2500, where a first (e.g., a background) image 2501 is produced on a first portion 2511 of the composite image 2500, and a second (e.g., a picture-in-picture (PIP)) image 2502 is produced on a second portion 2512 of the composite image 2500.
  • the images can include a fluorescence image and a non-fluorescence image.
  • the images are not necessarily fluorescence and non-fluorescence images.
  • one image can be a surgical microscope view of the surgical field from a camera producing the surgical microscope view.
  • the other image can be the image of the surgical field from a camera disposed on a surgical tool or other medical device.
  • One or more images could also be from sources other than cameras, e.g., a data file, a computed tomography (CT) scan, a computer aided tomography (CAT) scan, magnetic resonance imaging (MRI), an x-ray, an ultrasound imaging instrument, etc.
  • FIG. 33B schematically illustrates a front view of an embodiment of a medical apparatus incorporating the example left and right assemblies from FIG. 31 or 32A-32C to produce a composite image of two or more images for both left and right eyes.
  • the processor 2131a can be configured to receive an image from the first camera 2121a and to display on the display 2110 the image as the background image 2501 of the composite image 2500.
  • the processor 2132a can be configured to receive an image from the second camera 2122a and to display on the display 2110 the image as the PIP image 2502 of the composite image 2500.
  • the processor 2131b can be configured to receive an image from the first camera 2121b and to display on the display 2110 the image as the background image 2501 of the composite image 2500.
  • the processor 2132b can be configured to receive an image from the second camera 2122b and to display on the display 2110 the image as the PIP image 2502 of the composite image 2500.
  • the position of the PIP image 2502 in the composite image 2500 can be the same as or different from that illustrated in the figures. Additional cameras or sources can also be used to produce multiple PIP images.
  • a beam combiner 2230a, 2230b can be placed within each eye's optical path to produce the composite image 2500.
  • the background image from a camera can be resized or the row count of pixels of the background image can be reduced.
  • the background image can be resized from the full frame to the size of the first portion 2511 (e.g., about 1/2, 2/3, 3/4, etc.) of the composite image 2500.
  • the beam combiner 2230a, 2230b in each eye's optical path between the viewer and the displays can superimpose the background image with a PIP image such that the background image appears on the first portion 2511 of the composite image 2500, and the PIP image forms within the remaining portion 2512 (e.g., about 1/2, 1/3, 1/4, etc.) of the composite image 2500.
  • the remaining portion 2512 can include a border 2513 surrounding the PIP image 2502 to help prevent the viewer from seeing similar types of images as being falsely contiguous (e.g., similar types of tissues from multiple sources).
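The sketch below illustrates the resulting picture-in-picture layout electronically (the 2/3-1/3 split, border width, and crude resizing are placeholder assumptions); in the embodiments above this compositing is performed optically by the beam combiner rather than in software:

```python
import numpy as np

def _resize(img: np.ndarray, shape) -> np.ndarray:
    """Crude nearest-neighbor resize, enough for a layout illustration."""
    rows = np.linspace(0, img.shape[0] - 1, shape[0]).astype(int)
    cols = np.linspace(0, img.shape[1] - 1, shape[1]).astype(int)
    return img[rows][:, cols]

def compose_pip(background: np.ndarray, pip: np.ndarray,
                background_fraction: float = 2.0 / 3.0,
                border_px: int = 8) -> np.ndarray:
    """Lay out a composite frame: background on one portion, PIP on the remainder.

    `background` and `pip` are HxWx3 uint8 frames; each is resized to its
    allotted portion, and the PIP is framed by a dark border so it is not
    read as contiguous with the background tissue.
    """
    h, w, _ = background.shape
    out = np.zeros_like(background)

    bg_w = int(w * background_fraction)
    out[:, :bg_w] = _resize(background, (h, bg_w))

    pip_w = w - bg_w - 2 * border_px
    pip_h = h - 2 * border_px
    out[border_px:border_px + pip_h,
        bg_w + border_px:bg_w + border_px + pip_w] = _resize(pip, (pip_h, pip_w))
    return out

if __name__ == "__main__":
    microscope_view = np.full((1080, 1920, 3), 90, dtype=np.uint8)   # assumed background source
    tool_view = np.full((720, 1280, 3), 200, dtype=np.uint8)          # assumed PIP source
    composite = compose_pip(microscope_view, tool_view)
    print(composite.shape)
```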
  • a first camera 2221a for providing a surgical microscope view can provide the background image on a first display 2211a
  • a second camera 2222a disposed on a surgical tool or other medical device can provide the smaller image on a second display 2212a.
  • the beam combiner 2230a can produce the background image from the first display 2211a as the first portion 2511 (e.g., about 2/3) of the composite image 2500.
  • the beam combiner 2230a can also combine the PIP image from the second display 2212a as part of a second portion 2512 (e.g., about 1/3) of the composite image 2500.
  • the background image 2501 can be produced in the majority (e.g., about 2/3) of the composite image 2500.
  • the PIP image 2502 can be produced as part of, e.g., within the remaining portion 2512 (e.g., about 1/3) of the composite image 2500.
  • the display 2211a for the background image can be a 5" display.
  • the smaller PIP image from the second camera 2222a can be displayed on a smaller panel viewed off of the beam combiner 2230a, or could be displayed on a 5" display using only a portion of the display (e.g., about 1/3 of the display or part of about 1/3 of the display). After properly baffling the optical pathways, the viewer can see the smaller image 2502 adjacent the background image 2501 as though it were a picture-in-picture.
  • the beam combiner 2230 can also produce additional PIP images from other displays as part of the composite image 2500. For example, multiple images (e.g., two, three, four, five, six, nine, twelve, etc.) from multiple displays (e.g., two, three, four, five, six, nine, twelve, etc.) can be viewed for each eye's view by using one or more beam combiners 2230.
  • the background image could include additional superimposed or overlapping images.
  • Some embodiments can include a switch to determine which image is to be displayed.
  • the background image could be switched off and not be displayed so that a different image(s) can be displayed in the first portion 251 1 of the view 2500.
  • the images can appear adjacent to one another or tiled in a manner that is not restricted to a PIP arrangement.
  • more than two images may be included, for example, tiled with respect to each other.
  • more than one beam combiner and more than two displays may be employed in various embodiments to combine images, for example for the left eye (or for the right eye).
  • some embodiments as shown in FIGS. 32A-32C can, by the use of beam combiners 2230, advantageously reduce latency by decreasing the time to produce an image for viewing. For example, multiple images can be tiled to view the multiple images from a variety of sources as opposed to being aligned and combined using an image processing technique that consumes computing power.
  • an advantage of additional displays in each eye's path in certain embodiments is that they can present to the viewer superimposed images without the complexity of electrical registration and timing issues.
  • the brain can also merge the images if the additional displays are reasonably aligned optically.
  • the surface of the cameras or optical elements, such as lenses can become fogged or otherwise obstructed (e.g., with blood).
  • At least one of the cameras may be disposed on a surgical device.
  • at least one camera may be disposed on a retractor.
  • at least one camera may be disposed on a surgical tool.
  • blood or other body fluids and/or biomaterial may be disposed on the camera and block or limit the field of view of the camera, or otherwise degrade the images produced by the camera.
  • the cameras can be cleansed while remaining in place within the surgical site.
  • the cameras can be configured to be cleaned while on the surgical device, retractor, and/or surgical tool.
  • one or more cameras outfitted with cleaning apparatus are included on retractors, tools, or both.
  • a central hydraulic system can be used to provide hydraulic fluid (e.g., saline) and/or air to provide said cleaning for said retractor cameras, surgical tool cameras, or both.
  • surgical visualization systems comprise retractor cameras and surgical tool cameras each having cleaning apparatus for cleaning said cameras.
  • Cleansing fluids may be, for example, distilled water, deionized water, or saline (including possibly physiological saline), among others. In some embodiments, these pulses may be brief, high-pressure, and low-volume.
  • the pulse can be produced in a number of ways, for example, using a pop off valve (e.g., disposable elastomeric pop off valves) and a three way valve as discussed elsewhere herein. Pulses can also be produced other ways including by a diaphragm, actuated by a cam and motor in conjunction with a one-way valve.
  • Fluid pressure can be supplied by air pressure in a double-spike IV bottle.
  • a disposable diaphragm pump can be used to increase pulse pressure.
  • two pumps can be used to eliminate interruption in pulse pressure.
  • passive hydraulic amplifiers can be used to increase the fluid pressure.
  • solenoids, piezoelectric actuators, or other techniques may be used.
  • a rolling edge diaphragm, Bourdon tube, or bellows can likewise be used to produce the pulse.
  • a reed valve can be configured to alternate between air and saline operating at the natural mechanical resonance frequency of the reed valve and associated fluid and air column dynamics.
  • the fluid can be saline or other biocompatible liquid.
  • the lens elements are configured such that a stop is affixed to a first lens element wherein the stop covers a large fraction of the first lens element.
  • the lens elements are configured such that a first element comprises a plano window. The stop can be located behind the plano window or one or more lens elements and can be relatively small. In such embodiments, the light collected by the stop is correspondingly small such that the area of the plano window that should remain clean is relatively small. This configuration can facilitate cleaning the lens system.
  • the plano window is secured to the lens system using a structure that does not extend over the top of the plano window so as to not interfere with mechanisms configured to clean the lens system.
  • the plano window can have a step edge or be retained by a support member (e.g., a metal ring or an edge of the lens system housing) that extends along a side portion of the plano window without extending beyond the distal face of the plano window through which light is collected for the sensor.
  • the camera cleaning sequence may comprise alternate delivery of liquid and air/gas multiple times.
  • actuation of the camera cleaning system may cause the system to deliver pulses of liquid and air/gas three times.
  • the camera cleaning system may deliver a pulse of liquid followed by a pulse of air/gas a first time, then a pulse of liquid followed by a pulse of air/gas a second time, then a pulse of liquid followed by a pulse of air/gas a third time.
  • the number of pulses can be larger or smaller.
  • the sequence of liquid and air/gas pulses can vary.
  • the camera cleaning sequence may comprise the delivery of a pulse of liquid, followed by the delivery of a pulse of air/gas, without further delivery of liquid or air/gas, until the camera cleaning system is actuated again.
  • multiple pulses of liquid can be grouped together as can multiple pulses of air or gas. In one illustrative example, for instance, two pulses of liquid can be followed by two pulses of air or gas. Or a plurality of pulses of liquid can be followed by a pulse of air or gas. Multiple pulses of air or gas may also follow a pulse of liquid.
  • a wide range of sequences is possible and, as discussed below, may be selected by the user, for example, via a graphic user interface or other interface. Custom sequences may be selected or programmed by the user, pre-programmed for selection by the user, or programmed in the system without possible modification by the user.
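As one way such a programmable sequence might be represented (an assumption for illustration; the actual pulse drivers, timings, and interfaces are not specified here), a sequence can be described as an ordered list of medium/duration steps:

```python
import time
from typing import List, Tuple

# A cleaning sequence: ordered (medium, duration_s) pulses.
Sequence = List[Tuple[str, float]]

DEFAULT_SEQUENCE: Sequence = [
    ("liquid", 0.05), ("air", 0.10),
    ("liquid", 0.05), ("air", 0.10),
    ("liquid", 0.05), ("air", 0.10),
]  # three liquid/air pairs, as in one of the examples described above

def run_sequence(sequence: Sequence, open_valve, close_valve) -> None:
    """Drive the valves through one cleaning sequence.

    `open_valve(medium)` and `close_valve(medium)` are stand-ins for whatever
    hardware interface actually actuates the liquid and air valves.
    """
    for medium, duration_s in sequence:
        open_valve(medium)
        time.sleep(duration_s)
        close_valve(medium)

if __name__ == "__main__":
    log = []
    run_sequence(DEFAULT_SEQUENCE,
                 open_valve=lambda m: log.append(("open", m)),
                 close_valve=lambda m: log.append(("close", m)))
    print(log)
```

A user-selected or manufacturer-preprogrammed sequence, as described in the surrounding passages, would simply replace `DEFAULT_SEQUENCE` with a different list of steps.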
  • the camera cleansing system may be pre-programmed by the manufacturer.
  • the camera cleansing system may be programmed by the surgeon or other operator.
  • the surgeon or other operator may decide to alter the parameters of the camera cleansing system preprogrammed by the manufacturer. For example, depending on the surgical procedure, the surgeon or other operator may desire a different duration of the fluid and air/gas pulses. As another example, depending on the surgical procedure, the surgeon or other operator may desire a different number of pulses to be ejected with an actuation of a foot pedal or other form of input device. As another example, the surgeon or other operator may desire a different pressure for the pulses and/or volume of liquid or air to be ejected.
  • actuation of the camera cleansing system can be manually controlled.
  • a proportional foot pedal can control actuation of the liquid and/or air/gas pulses.
  • depression of the foot pedal can actuate the delivery of liquid and/or air/gas pulses to the camera(s).
  • the foot pedal may be configured to actuate the delivery of a single liquid and air pulse combination when initially depressed and depressing the foot pedal further to the ground may increase the number of and/or rate of liquid and air/gas pulses delivered to the camera(s). Two or more distinct regions of foot pedal depression may thus be provided that cause pulses with different parameters to be delivered so as to give the surgeon more control of the pulses delivered.
  • the foot pedal is not proportional.
  • the fluid and/or air/gas pulses can be voice controlled.
  • the fluid and/or air/gas pulses can be controlled via a touchscreen.
  • the graphical user interface or other control can enable control of fluid and/or air/gas delivery to the cameras for cleaning the cameras and pulse washing and/or air drying cameras. Other input devices can also be used.
  • the liquid, air, or gas delivered to the camera(s) may be provided by a plurality of fluid lines.
  • the fluid lines may include an inlet at one end connected to a fluid source and an outlet at the other end.
  • the outlet includes a nozzle or is in fluid communication with a nozzle configured to deliver fluid to the camera(s) to be cleaned.
  • these lines may connect to a hydraulic and/or pneumatic cassette used in a hydraulic and/or pneumatic system with pressurized hydraulic and/or pneumatic supplies.
  • the cassette comprises elastomeric proportional valves.
  • valves in the cassette control flow of the liquid and air to deliver the pulses to the cameras.
  • pop off valves (such as disposable elastomeric pop off valves) and three way valves connected to the liquid source (e.g., a saline source) and to the air source may be employed to switch from a liquid pulse to an air pulse while reducing dribble. Other configurations, however, may be employed.
  • pinch valves for controlling the flow or emission of pulses can be connected to the liquid and gas lines that extend from the cassette to the retractor. Actuating the pinch valves can cut off the supply of liquid or air/gas to the camera(s), while releasing the pinch valve may open the supply of liquid or air/gas. In some embodiments, actuation of the pinch valves can be driven by solenoids. In some embodiments, the pinch valves may be disposable roller pinch valves or thumb wheel valves.
  • liquid and/or air/gas pulses can be delivered to each of the plurality of cameras simultaneously.
  • the cameras can be connected to the same fluid line in communication with the pressurized fluid source.
  • one valve (e.g., a pop off valve, pinch valve, roller or thumb wheel valve, etc.) may control the flow through this fluid line, which splits into or is coupled to multiple lines directed to the different cameras or groups of cameras.
  • opening the one valve may cause all the cameras to be cleaned at the same time.
  • closing the one valve may cease the delivery of liquid and/or air/gas to all the cameras at the same time.
  • including only one valve with the camera cleansing system can potentially reduce the complexity, cost, and bulk of the camera cleansing system, compared to adding multiple valves to the camera cleansing system.
  • the flex cable may include fluidic channels to convey the air, gas, or liquid to the camera optics to provide for cleaning. Fluidic channels can also transport other fluids such as pharmaceuticals, saline for irrigation, fluorescent dyes, etc. to the surgical site. Fluidic channels can also be provided for aspiration, to provide egress of gases or liquids from the surgical site.
  • the fluidic-channel-containing flex cable may be an overlay or surrounding member affixed over the electronic flex cable, thereby allowing the fluid-carrying component to be disposable, whereas the electronic flex cable with integrated optics module may be sterilizable and reusable.
  • the distal end of the fluidic flex cable can contain an outer housing that is secured over the imaging module. In some embodiments, it is the annular space and shape of the inner surface of said outer housing that directs the fluid and/or fluid/air pulses over the most distal surface of the optics for cleaning.
  • FIGS. 34A-C show an embodiment wherein an irrigation pathway 403 is provided by an outer sheath 401 comprising a cable 405, including a fluidic-channel-containing cable, and a portion that covers the sensor and imaging optics 409.
  • the portion 402 of the sheath 401 that covers the sensor and imaging optics 409 can be shaped to provide a conformal fit yet leave a space 411 between the sheath 401 and the sensor and imaging optics 409 for air flow.
  • a section 410 of the outer sheath 401 forward of the imaging optics can be shaped to direct the fluid across the distal surface of the lens.
  • the outer sheath 401 that delivers the fluid can be a separable assembly that can be added to or attached to the optical stack 409.
  • a pop off valve is included for the saline line and a pop off valve is provided for the air line to provide a pressurized pulse when the pressure exceeds a threshold.
  • the pressurized pulse of saline exits the annular output port.
  • one or more (e.g., a pair or multiple pairs) of nozzles provide egress of liquid and gas for cleaning.
  • This pulse of pressurized saline is followed by a pulse of pressurized air to force ("squeegee") residual saline from the optical window.
  • a valve comprises a linear actuator such as a linear motor, a member attached to the motor such that the motor can move the member, and tubing having a pathway therethrough.
  • the movable member is configured such that the motor can cause the movable member to contact the tubing to compress and close the pathway through the tubing.
  • the motor can also cause the movable member to return from such a position so as to reduce the compression and open the pathway through the tubing.
  • the motor can cause the movable member to move back and forth in a linear direction between these positions, thereby opening and closing the pathway at a rapid rate.
  • the valve will be changed into a more open state.
  • the waveform driving the motor can thus be modulated, for example, using pulse width modulation or frequency modulation (e.g., when the duty cycle is not 50%:50%). This waveform can thereby determine how much time the moveable member spends compressing the tubing and thus the amount of resistance to the flow of fluid through the tube.
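A simplified numeric sketch of that modulation follows; the assumption that the pathway's time-averaged opening is simply one minus the duty cycle is illustrative only, since a real pinch valve's flow response would need to be calibrated:

```python
def pwm_waveform(duty_cycle: float, frequency_hz: float, n_cycles: int = 3):
    """Generate (time_s, closed) samples for a square PWM drive signal.

    `duty_cycle` is the fraction of each period the movable member spends
    compressing (closing) the tubing; for the remaining fraction it is retracted.
    """
    period = 1.0 / frequency_hz
    samples = []
    for i in range(n_cycles):
        start = i * period
        samples.append((start, True))                         # member compresses tubing
        samples.append((start + duty_cycle * period, False))  # member retracts
    return samples

def mean_opening(duty_cycle: float) -> float:
    """Time-averaged fraction of the period the pathway is open (illustrative)."""
    return 1.0 - duty_cycle

if __name__ == "__main__":
    # A 30% duty cycle leaves the pathway open ~70% of the time,
    # offering less resistance to flow than a 70% duty cycle.
    print(pwm_waveform(duty_cycle=0.3, frequency_hz=50.0, n_cycles=2))
    print(mean_opening(0.3), mean_opening(0.7))
```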
  • Different types of motors and tubing and different configurations may be employed.
  • the tubing may be disposable tubing.
  • First and second pressures can be applied to respective opposite first and second sides of the piston. If the first pressure exceeds the second pressure, the moveable element may be moved so as to increase blockage of the pathway. Conversely, if the second pressure exceeds the first pressure, the moveable element may be moved in the opposite direction so as to reduce the amount of blockage of the pathway.
  • the first and second pressures, therefore, can be controlled to control the occlusion and thus the flow of fluid through the pathway.
  • the common chamber, which is configured to receive both saline and air, may be disposed distal to the saline valve. With continued reference to FIG. 37, the common chamber may then extend to the camera optics in the optics housing, so that air and/or saline from the common chamber may travel to the camera optics in the optics housing and clean the camera optics.
  • the Kerrison 1900 includes a base 1930.
  • the base 1930 can include a cutting portion at a distal end (e.g., the left end of FIG. 39B).
  • the base 1930 can be fixed axially (e.g., parallel to the handle axis 1927) with respect to the distal handle portion 1923 and/or with respect to the proximal handle portion 1918.
  • the base 1930 and/or distal handle portion 1923 are rotatable about the handle axis 1927 with respect to the proximal handle portion 1918.
  • the proximal handle portion 1918 can define an actuation chamber 1919.
  • at least a portion of the actuation chamber 1919 along a length of the actuation chamber 1919 parallel to the handle axis 1927 has a substantially constant cross-section.
  • at least a portion of the actuation chamber 1919 has a circular cross-section
  • the distal handle portion 1923 defines a distal actuation chamber 1917.
  • the distal actuation chamber 1917 has a cross-section with substantially the same shape and/or size as a cross-section of at least a portion of the actuation chamber 1919.
  • the Kerrison 1900 can include a piston 1920.
  • the piston 1920 can be operably coupled with and/or attached to a Kerrison top portion 1928.
  • the piston 1920 can be a unitary part with, or attached/adhered/welded to, the Kerrison top portion 1928.
  • the piston 1920 and top portion 1928 are connected via a releasable connection (e.g., a protrusion-slot connection).
  • the piston 1920 can be fixed axially (e.g., parallel to the handle axis 1927) with respect to the top portion 1928. In some embodiments, the piston 1920 is fixed rotationally with respect to the top portion 1928 (e.g., rotation about the handle axis 1927).
  • the top portion 1928 can include a cutting edge on the distal end of the top portion 1928.
  • the cutting edge of the top portion 1928 can be configured to operate with the cutting portion of the base 1930 to cut medical material (e.g., bone and/or other tissue).
  • the top portion 1928 is connected to the base 1930 via a track-protrusion engagement.
  • the top portion 1928 can include a protrusion configured to slidably engage with a track in the base 1930. Engagement between the track of the base 1930 and the protrusion of the top portion 1928 can limit the movement of the top portion 1928 with respect to the base 1930 to the axial direction (e.g., parallel to the handle axis 1927).
  • the piston 1920 is configured to fit within the actuation chamber 1919 and/or within the distal actuation chamber 1917.
  • the piston 1920 can have a first guide portion 1921a configured to fit snugly within the actuation chamber 1919 (e.g., fit such that movement of the first guide portion 1921a within the actuation chamber is substantially limited to axial movement and rotational movement about the handle axis 1927).
  • the piston 1920 includes a second guide portion 1921b. The second guide portion 1921b can be configured to fit snugly within the distal actuation chamber 1917.
  • Axial movement of the piston 1920 can be limited by interaction with a radially-inward projection 1913 of the proximal handle portion 1918.
  • proximal axial movement of the piston 1920 can be limited by interaction between the second guide portion 1921b and the radially-inward projection 1913.
  • distal axial movement of the piston 1920 is limited by interaction between the first guide portion 1921a and the radially-inward projection 1913.
  • the distal handle portion 1923 can include a distal opening 1905.
  • the distal opening 1905 can be sized and/or shaped to accommodate passage of the top portion 1928 therethrough.
  • the top portion 1928 is sized and shaped to fit snugly within the distal opening 1905.
  • the top portion 1928 can have a non-circular cross-section sized to substantially match a cross- section shape of the distal opening 1905.
  • the top portion 1928 is rotationally locked to the distal handle portion 1923 via interaction between the distal opening 1905 and the top portion 1928.
  • the grip 1915 can be rotated relative to the top portion 1928 and the base 1930.
  • sensors and/or optical devices (e.g., cameras, CMOS sensors, etc.) can be attached to the proximal handle portion 1918 such that the relative alignment of the sensors and/or optical devices with respect to the handle portion 1918 remains consistent independent of rotation of the top portion 1928 and base 1930 with respect to the handle portion 1918.
  • the actuation element 1916 can be configured to exert an axial force on the piston 1920 (e.g., a force upon the first guide portion 1921a) to move the piston 1920 in the distal axial direction.
  • the Kerrison 1900 can include a biasing structure 1924 (e.g., a spring or other resilient structure) configured to bias the piston 1920 in the proximal axial direction.
  • the biasing structure 1924 can provide a return force to push the piston 1920 in the proximal axial direction when the axial force from the actuation element 1916 is reduced and/or removed.
  • the Kerrison 1900 includes a return valve (not shown) configured to introduce physiological saline so as to provide compression to the actuation element 1916.
  • the return valve may, for example, allow injection of pressurized gas into the distal actuation chamber 1917 or in the region of the proximal actuation chamber 1919 forward of the first guide portion 1921a.
  • the fluid introduced via the return valve can be used to move the piston 1920 in the proximal direction.
  • the Kerrison 1900 does not include a biasing structure 1924.
  • the actuation element 1916 can be fluidly connected to a conduit 1914 through which physiological saline can be input into and pulled out from the actuation element 1916.
  • hydraulic controls associated with the actuation element 1916 are operated via a foot pedal.
  • Such embodiments can allow for greater dexterity for the user of the Kerrison 1900 by reducing the operating variables controlled by the Kerrison 1900 handle portions 1918, 1923.
  • Elastomeric and/or proportional valves can be used to enhance the responsiveness of the Kerrison 1900 to operation of a foot pedal.
  • a pneumatically-driven Kerrison can be used.
  • the Kerrison can be driven by fluid (e.g., hydraulic or pneumatic) by a bellows actuator.
  • the cutting surface or top surface of the tool can be bayonetted.
  • the bayonetted structure of the tool can allow the tool to be inserted into the surgical area without interfering or obscuring the views of the surgical site or overhead views of the surgical field.
  • the bayonet style tool can be utilized for the Kerrison, forceps, scissors, or other tools described herein.
  • the bayonet configuration can be advantageous for small surgical sites or external viewing of the surgical site.
  • the bayonet feature reduces the area obscured by the tool within the surgical site.
  • the housing supporting or comprising the tool can be configured to have a port or lumen therein arranged to facilitate the removal of tissue and bone extracted from the surgical site.
  • the Kerrison can have a side port or opening located proximal of the cutting head, through which cut tissue can be removed (e.g., pushed through the port or opening as the cutter withdraws and the Kerrison returns to the default position).
  • a source of suction, or a source of saline and suction can be supplied to the port.
  • the removal port or lumen of the housing can also support a mechanical removal mechanism, such as but not limited to a screw-type auger (which can be hydraulically actuated, via for example a gear motor, gerotor, or vane motor), to facilitate removal of bone debris and extracted tissue from the surgical site. In some embodiments, the removed tissue can be extracted to a waste reservoir supported by or tethered to the housing of the tool.
  • the movable cutting head of the Kerrison can be a generally cylindrical tube that can be actuated (in the manner described above) to slidably move against the fixed cutting surface 1730.
  • said cylindrical tube can be slidable within an outer
  • the housing supporting or comprising the tool can be configured to have a suction port and a source of saline so that the tool and/or the surgical site can be flushed with saline and the saline and debris can be removed via the suction line simultaneously or sequentially with the flushing.
  • the saline can be provided through the conduit used to provide saline to the second inflatable element, through the same or a different lumen of such conduit.
  • Any of the hydraulic system or pneumatic system embodiments disclosed herein can be configured to incorporate or use any suitable surgical tools, including without limitation scissors, micro-scissors, forceps, micro-forceps, bipolar forceps, clip appliers including aneurysm clip appliers, rongeur, and, as described, Kerrison tools.
  • physiological saline is directed through the nozzle frame 2072 toward an impeller 2076.
  • the impeller 2076 can include a plurality of impeller blades 2077 around the outer periphery of the hub of the impeller 2076.
  • the impeller blades 2077 can rotate within a blade cavity 2077a. (See FIG. 40C.)
  • the impeller 2076 can be integral with or otherwise rotationally coupled with an output shaft 2079 for driving the tool 2082, which can be a drill or other rotational tool.
  • the nozzle outlets can be positioned close to the impeller blades 2077 in the axial direction and can direct physiological saline at a highly-radial angle toward impeller blades 2077 whose surfaces are close to parallel to the axis of rotation of the impeller 2076.
  • the hydraulic turbine 2070 can be configured to operate at rotational speeds of 40,000 rpm to 60,000 rpm, though higher and lower rpm values may be possible. In some embodiments, the hydraulic turbine is configured to operate at rotational speeds of 100,000 rpm. The hydraulic turbine 2070 can be configured to operate at operating pressures between 70 psi and 190 psi, though greater and lesser operating pressures are possible. In some embodiments, the operating pressure of the hydraulic turbine 2070 is designed to be approximately 120 psi.
  • the viscous frictional losses that would be otherwise incurred from interaction between the reflected fluid Fl and the impeller 2076 and/or output shaft 2079 can be reduced.
  • the diverted high velocity fluid F2 and scavenged reflected fluid Fl can be diverted back to the cassette 2020 for re-pressurization.
  • scavenging reflected fluid Fl and diverting it back to the cassette 2020 can reduce the amount of physiological saline required to operate the tools and/or other components of the system.
  • the housing 2071 can include one or more ports open to ambient. Such ports can be configured to receive pressurized air or other pneumatic gas.
  • the turbine 2070 is configured to operate as a dual hydro/pneumatic turbine driven via hydraulic power, pneumatic power, or a continuously variable combination of hydraulic and pneumatic power.
  • a controller or switch(es) can be used to vary the amount of hydraulic fluid or pneumatic air or gas that is applied to the turbine.
  • the controller or switch(es) allow the user to increase the pneumatic gas or air and decrease the hydraulic fluid, or vice versa.
  • the pneumatic gas or air and hydraulic fluid can be provided by output ports on the display console.
  • the controller and/or switch(es) may be located on the display console or remotely located.
  • multiple impellers 2076 can be utilized in the same turbine housing 2071.
  • the overall diameter of the turbine 2070 and/or some of its components can be reduced relative to a single-impeller turbine 2070 without sacrificing output torque.
  • a Kerrison for bone removal generally includes a handle mechanically coupled to a head including a stationary portion and a movable portion. When a user squeezes the handle, the movable portion moves closer to the stationary portion in a cutting manner (e.g., in a shearing manner), for example to remove bone by trapping the removed bone between the stationary portion and the movable portion (e.g., within a channel between the stationary portion and the movable portion).
  • tools include an aneurysm clipper, a rongeur, forceps, scissors, and the like, although many other hand-operated tools are known to those skilled in the art.
  • Patent Application No. 14/283,106 (Attorney Docket No. CAMPLX.039A), which is incorporated herein by reference in its entirety.
  • the fluid reservoir (e.g., an IV bag)
  • the compressed gas source pressurizes the physiological saline within the fluid reservoir.
  • a reservoir pressurization line connects the compressed gas source (e.g., hospital compressed gas system) to the fluid reservoir (e.g., an IV bag).
  • the actuator chambers can receive input both from a pressure source (e.g., hospital pressure) and a vacuum source (e.g., hospital vacuum). Input from both sources may provide more precise and responsive control of pressure within the actuator chambers.
  • a proportional-integral-derivative (PID) controller
  • the PID controller may be utilized to sense the pressure and/or vacuum level and provide feedback so that the pressure and/or vacuum inputs may be adjusted according to the desired setpoints (a minimal control-loop sketch appears after this list).
  • Proportional operation of the valves can also enhance the precision with which a user of the hydraulic pressure circuit can regulate the hydraulic fluid pressure within the actuator chambers. Such precision can be useful for controlling surgical tools and other surgical equipment, especially in delicate procedures that require extreme precision, like neurosurgery.
  • a surgical tool including the stereo camera system of Embodiment 13 disposed thereon.
  • Embodiment 13 The medical apparatus of Embodiment 12, further comprising at least one mirror between said imaging optics and an exit pupil of said imaging optics.
  • the medical apparatus of Embodiment 1 further comprising processing electronics configured to communicate with said one or more electronic displays to provide images for said one or more electronic displays.
  • Embodiment 75 or 76 wherein said sources of images other than cameras comprise Computer Aided Tomography (CAT) scan, MRI, x-ray, and ultrasound imaging instruments.
  • CAT Computer Aided Tomography
  • Embodiment 75 or 76 wherein said source comprises a source of artificially generated image data.
  • Embodiment 1 further comprising at least one beam splitter disposed in one or both of said first and second optical paths configured to receive images to be viewable by a binocular assembly connected to said primary housing in addition to images from said electronic displays.
  • the medical apparatus of Embodiment 79 further comprising at least one separate electronic display disposed with respect to said at least one beam splitter such that said one or both of said first and second optical paths receives images produced on said at least one electronic display through said at least one beam splitter for viewing through said binocular assembly connected to said housing in addition to images from said electronic displays.
  • said at least one beam splitter comprises first and second beam splitters and said at least one separate electronic display comprises first and second displays configured to display a pair of two-dimensional images which together when viewed through said binocular assembly produces a three-dimensional image.
  • The medical apparatus of Embodiment 1, further comprising an assistant display housing containing at least one assistant electronic display and assistant display imaging optics for imaging images produced on said at least one assistant electronic display.
  • Embodiment 82 further comprising processing electronics in communication with said at least one electronic display in said assistant display configured to adjust the images presented on said at least one electronic display in said assistant display based on the orientation of the assistant display housing with respect to the primary housing.
  • Embodiment 101 further comprising sensors to determine an orientation of the assistant housing that provides input to said processing electronics to adjust the images presented on said at least one assistant electronic display depending on said orientation.
  • Embodiment 102 further comprising at least four cameras for providing a surgical microscope view, said processing electronics selecting images from different pairs of said four cameras depending on said orientation of said assistant housing.
  • Embodiment 104 The medical apparatus of Embodiment 103, wherein said at least four cameras comprise four cameras in a square 2×2 array and said electronics select a pair of said four cameras depending on said orientation of said assistant housing (a minimal selection sketch appears after this list).
  • a medical apparatus comprising:
  • one or more electronic displays comprising a plurality of pixels configured to produce a two-dimensional image;
  • first and second imaging optics disposed respectively in first and second optical paths from said one or more electronic displays to form respective first and second substantially collimated optical beams;
  • a primary housing at least partially enclosing said displays and said imaging optics
  • first and second imaging optics are configured to direct said first and second beams through said opening.
  • a medical apparatus comprising:
  • an assistant display assembly comprising: one or more electronic displays comprising a plurality of pixels configured to produce a two-dimensional image;
  • first and second imaging optics disposed respectively in first and second optical paths from said one or more electronic displays to form respective first and second collimated optical beams and images disposed at infinity;
  • an assistant display housing at least partially enclosing said displays and said imaging optics
  • first and second imaging optics are configured to direct said first and second beams so that they are substantially parallel to each other and have cross-sections with centers separated from each other by between about 22 mm and 25 mm.
  • the housing can include an opening, and the first and second imaging optics can be configured to direct the first and second beams through the opening, and the first and second beams can be substantially parallel to each other at the opening.
  • Embodiment 6 The medical apparatus of Embodiment 1, further comprising a plurality of reflective surfaces in the optical paths of the optical beams to fold the optical beams.
  • the medical apparatus of Embodiment 1 further comprising a plurality of reflective surfaces to fold the optical beams.
  • Embodiment 10 The medical apparatus of Embodiment 1, further comprising at least one mirror between said one or more electronic displays and said imaging optics.
  • Embodiment 11 The medical apparatus of Embodiment 1, further comprising between 0 and 2 mirrors between said one or more electronic displays and said imaging optics.
  • said at least one mirror comprises at least one mirror between said electronic display and said imaging optics, and at least one mirror disposed between lenses in said imaging optics, said former mirror being larger than said latter mirror.
  • imaging optics comprise a first lens configured to reduce a cross-section of the first beam.
  • Embodiment 1 wherein the imaging optics comprise a plurality of lenses including a first lens and a last lens in said optical paths and said imaging optics has an optical path length from said first lens to the last lens that is between about 50 mm and 250 mm.
  • the imaging optics comprise a plurality of lenses including a first lens and an exit pupil in said optical paths and said imaging optics has an optical path length from said first lens to the exit pupil that is between about 10 mm and 50 mm.
  • said one or more electronic displays comprise first and second electronic displays having centers spaced apart by a distance Wdisplay, wherein said first and second imaging optics have exit pupils having centers spaced apart by a distance Weyepaths, and wherein Wdisplay > Weyepaths.
  • Embodiment 41 The medical apparatus of Embodiment 1, further comprising baffles in said housing for reducing stray light.
  • Embodiment 47 The medical apparatus of Embodiment 1, further comprising a binocular assembly comprising first and second objectives, first and second beam positioning optics, and first and second oculars.
  • The medical apparatus of Embodiment 65, wherein said electronics is configured to receive images from one or more cameras on a surgical device.
  • a surgical visualization system comprising:
  • Embodiment 92 wherein the reduced-size real-time video streams are arranged on the periphery of the central video stream and correspond to a number such that a number of user interface signals received from the actuator indicates which of the reduced-size real-time video streams to display as the central video stream (a minimal layout sketch follows this list).
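
The pressure- and vacuum-regulated actuator control described above (a PID controller sensing the chamber pressure and trimming the pressure and vacuum inputs toward a desired setpoint) can be illustrated with a minimal control-loop sketch. This is an editorial illustration only, not part of the disclosed apparatus: the class name, gains, valve interface, and the 120 psi setpoint are assumptions.

```python
# Minimal, hypothetical sketch of PID-based regulation of actuator-chamber
# pressure using separate pressure and vacuum inputs. Names, gains, and the
# 120 psi setpoint are illustrative assumptions, not specification details.

class PIDController:
    def __init__(self, kp: float, ki: float, kd: float, setpoint_psi: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint_psi
        self.integral = 0.0
        self.prev_error = None

    def update(self, measured_psi: float, dt: float) -> float:
        error = self.setpoint - measured_psi
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def regulate_chamber(pid, read_pressure_psi, open_pressure_valve, open_vacuum_valve, dt=0.01):
    """One control step: a positive command opens the pressure-source valve,
    a negative command opens the vacuum-source valve (proportional valves assumed)."""
    command = pid.update(read_pressure_psi(), dt)
    if command >= 0:
        open_pressure_valve(min(command, 1.0))  # normalized 0..1 valve command
        open_vacuum_valve(0.0)
    else:
        open_pressure_valve(0.0)
        open_vacuum_valve(min(-command, 1.0))


if __name__ == "__main__":
    # Stubbed hardware interfaces, for illustration only.
    chamber = {"psi": 100.0}
    pid = PIDController(kp=0.05, ki=0.01, kd=0.0, setpoint_psi=120.0)
    for _ in range(10):
        regulate_chamber(
            pid,
            read_pressure_psi=lambda: chamber["psi"],
            open_pressure_valve=lambda cmd: chamber.update(psi=chamber["psi"] + 5.0 * cmd),
            open_vacuum_valve=lambda cmd: chamber.update(psi=chamber["psi"] - 5.0 * cmd),
        )
    print(f"final chamber pressure: {chamber['psi']:.1f} psi")
```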
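The orientation-dependent camera selection of Embodiments 102-104 above (processing electronics choosing a pair from four cameras in a square 2×2 array according to the orientation of the assistant display housing) can be sketched as a simple lookup. The function name, camera indexing, and 90-degree sectors below are illustrative assumptions, not the claimed electronics.

```python
# Hypothetical sketch: choosing a stereo pair from four cameras arranged in a
# square 2x2 array, based on the assistant display's yaw (degrees) relative to
# the primary housing. Camera indices and sector thresholds are assumptions.
#
#   Camera layout (viewed from above):   0 --- 1
#                                        |     |
#                                        3 --- 2

def select_camera_pair(assistant_yaw_deg: float) -> tuple[int, int]:
    """Return (left_camera, right_camera) indices for the assistant's viewpoint.

    The yaw is normalized to [0, 360) and quantized to the nearest 90-degree
    sector, so each sector maps to one edge of the square camera array.
    """
    yaw = assistant_yaw_deg % 360.0
    sector = int((yaw + 45.0) // 90.0) % 4
    pairs = {
        0: (0, 1),  # assistant roughly aligned with the primary viewer
        1: (1, 2),  # rotated ~90 degrees clockwise
        2: (2, 3),  # opposite side of the surgical field
        3: (3, 0),  # rotated ~90 degrees counterclockwise
    }
    return pairs[sector]


if __name__ == "__main__":
    for yaw in (0, 85, 180, 265):
        print(yaw, select_camera_pair(yaw))
```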
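The central/peripheral stream arrangement of Embodiment 92 above (reduced-size real-time video streams around a central stream, with the number of actuator signals selecting which peripheral stream becomes central) can be sketched as a small layout object. The class, method, and stream names are hypothetical, chosen only to illustrate the swapping behavior.

```python
# Hypothetical sketch of the multi-stream layout described above: one central
# video stream surrounded by reduced-size streams; the count of user-interface
# signals from the actuator selects which peripheral stream is promoted.

from dataclasses import dataclass, field


@dataclass
class StreamLayout:
    central: str                                          # stream shown full size
    peripheral: list[str] = field(default_factory=list)   # reduced-size streams

    def handle_actuator_count(self, count: int) -> None:
        """Promote the peripheral stream indicated by the (1-based) count of
        user-interface signals received from the actuator."""
        if count < 1 or not self.peripheral:
            return
        idx = (count - 1) % len(self.peripheral)
        selected = self.peripheral[idx]
        self.peripheral[idx] = self.central   # demote the old central stream
        self.central = selected


if __name__ == "__main__":
    layout = StreamLayout(
        central="surgical_microscope_view",
        peripheral=["endoscope", "ultrasound", "preoperative_mri"],
    )
    layout.handle_actuator_count(2)  # two actuator signals select the second thumbnail
    print(layout.central)            # -> ultrasound
    print(layout.peripheral)         # -> ['endoscope', 'surgical_microscope_view', 'preoperative_mri']
```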

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Pathology (AREA)
  • Optics & Photonics (AREA)
  • General Physics & Mathematics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Radiology & Medical Imaging (AREA)
  • Analytical Chemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • Biophysics (AREA)
  • Theoretical Computer Science (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Dentistry (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Gynecology & Obstetrics (AREA)
  • Pulmonology (AREA)
  • Microscopes, Condensers (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Lenses (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Endoscopes (AREA)
PCT/US2014/072121 2013-12-23 2014-12-23 Surgical visualization systems Ceased WO2015100310A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP14873324.9A EP3087424A4 (en) 2013-12-23 2014-12-23 Surgical visualization systems
JP2016542194A JP2017507680A (ja) 2013-12-23 2014-12-23 Surgical visualization systems (手術可視化システム)

Applications Claiming Priority (12)

Application Number Priority Date Filing Date Title
US201361920451P 2013-12-23 2013-12-23
US61/920,451 2013-12-23
US201361921051P 2013-12-26 2013-12-26
US61/921,051 2013-12-26
US201361921389P 2013-12-27 2013-12-27
US61/921,389 2013-12-27
US201361922068P 2013-12-30 2013-12-30
US61/922,068 2013-12-30
US201461923188P 2014-01-02 2014-01-02
US61/923,188 2014-01-02
US201462088470P 2014-12-05 2014-12-05
US62/088,470 2014-12-05

Publications (1)

Publication Number Publication Date
WO2015100310A1 true WO2015100310A1 (en) 2015-07-02

Family

ID=53479644

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/072121 Ceased WO2015100310A1 (en) 2013-12-23 2014-12-23 Surgical visualization systems

Country Status (4)

Country Link
US (3) US20150297311A1 (en)
EP (1) EP3087424A4 (en)
JP (2) JP2017507680A (ja)
WO (1) WO2015100310A1 (en)

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9216068B2 (en) 2012-06-27 2015-12-22 Camplex, Inc. Optics for video cameras on a surgical visualization system
EP3156839A1 (en) * 2015-10-14 2017-04-19 Leica Instruments (Singapore) Pte. Ltd. Beam splitter device having at least two beamsplitting surfaces with different reflection-to-transmission ratios
US9642606B2 (en) 2012-06-27 2017-05-09 Camplex, Inc. Surgical visualization system
WO2017147070A1 (en) * 2015-11-06 2017-08-31 Tait Towers Manufacturing, LLC Coordinated view display device
US9782159B2 (en) 2013-03-13 2017-10-10 Camplex, Inc. Surgical visualization systems
US10028651B2 (en) 2013-09-20 2018-07-24 Camplex, Inc. Surgical visualization systems and displays
JP2018130375A (ja) * 2017-02-16 2018-08-23 池上通信機株式会社 映像データ表示装置
WO2019059314A1 (ja) * 2017-09-22 2019-03-28 株式会社ニコン 画像表示装置及び画像表示システム
US10568499B2 (en) 2013-09-20 2020-02-25 Camplex, Inc. Surgical visualization systems and displays
EP3442446A4 (en) * 2016-04-11 2020-03-04 Relign Corporation ARTHROSCOPIC DEVICES AND METHODS
US10667868B2 (en) 2015-12-31 2020-06-02 Stryker Corporation System and methods for performing surgery on a patient at a target site defined by a virtual object
US10702353B2 (en) 2014-12-05 2020-07-07 Camplex, Inc. Surgical visualizations systems and displays
EP3681150A4 (en) * 2017-09-04 2021-01-13 Kajita, Hiroki MULTI-POINT-OF-VIEW VIDEO IMAGE VIEWING SYSTEM AND CAMERA SYSTEM
US10918455B2 (en) 2017-05-08 2021-02-16 Camplex, Inc. Variable light source
US10966798B2 (en) 2015-11-25 2021-04-06 Camplex, Inc. Surgical visualization systems and displays
US11076816B2 (en) 2015-08-02 2021-08-03 P-Cure Ltd. Imaging system and method
US11154378B2 (en) 2015-03-25 2021-10-26 Camplex, Inc. Surgical visualization systems and displays
US11172953B2 (en) 2016-04-11 2021-11-16 RELIGN Corporation Arthroscopic devices and methods
US11207119B2 (en) 2016-03-11 2021-12-28 RELIGN Corporation Arthroscopic devices and methods
US11464579B2 (en) 2013-03-13 2022-10-11 Stryker Corporation Systems and methods for establishing virtual constraint boundaries
US12167888B2 (en) 2016-03-10 2024-12-17 RELIGN Corporation Arthroscopic devices and methods
US12408998B2 (en) 2019-07-03 2025-09-09 Stryker Corporation Obstacle avoidance techniques for surgical navigation

Families Citing this family (193)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11871901B2 (en) 2012-05-20 2024-01-16 Cilag Gmbh International Method for situational awareness for surgical network or surgical network connected device capable of adjusting function based on a sensed situation or usage
US20240148357A1 (en) * 2012-06-21 2024-05-09 Globus Medical, Inc. Medical imaging systems using robotic actuators and related methods
US10009658B2 (en) 2013-03-11 2018-06-26 Sony Corporation Multiview TV template creation and display layout modification
US11547446B2 (en) 2014-01-13 2023-01-10 Trice Medical, Inc. Fully integrated, disposable tissue visualization device
CN105722450B (zh) * 2014-02-14 2018-01-23 奥林巴斯株式会社 内窥镜系统
GB2524498A (en) * 2014-03-24 2015-09-30 Scopis Gmbh Electromagnetic navigation system for microscopic surgery
KR20160014933A (ko) * 2014-07-30 2016-02-12 삼성전자주식회사 초음파 장치 및 그 제어방법
US10709611B2 (en) 2014-09-25 2020-07-14 Amo Development, Llc Systems and methods for lenticular laser incision
AU2015320445B2 (en) 2014-09-25 2020-06-25 Amo Development, Llc Systems for lenticular laser incision
US11504192B2 (en) 2014-10-30 2022-11-22 Cilag Gmbh International Method of hub communication with surgical instrument systems
EP3212080B1 (en) * 2014-10-31 2022-10-05 Rtthermal, LLC Magnetic resonance imaging patient temperature monitoring system and related methods
US10869592B2 (en) 2015-02-23 2020-12-22 Uroviu Corp. Handheld surgical endoscope
DE102015204868A1 (de) * 2015-03-18 2016-09-22 Charité - Universitätsmedizin Berlin Elastographieeinrichtung und Elastographieverfahren
US20160331584A1 (en) * 2015-05-14 2016-11-17 Novartis Ag Surgical tool tracking to control surgical system
US10063850B2 (en) * 2015-06-23 2018-08-28 Mitaka Kohki Co., Ltd. Surgical stereoscopic observation apparatus
CN107735046B (zh) 2015-06-26 2021-04-30 索尼奥林巴斯医疗解决方案公司 手术用显微镜装置和手术用显微镜系统
EP3165153A1 (en) * 2015-11-05 2017-05-10 Deutsches Krebsforschungszentrum Stiftung des Öffentlichen Rechts System for fluorescence aided surgery
SG10201510773PA (en) * 2015-12-30 2017-07-28 Creative Tech Ltd A method for creating a stereoscopic image sequence
WO2017122541A1 (ja) * 2016-01-13 2017-07-20 ソニー株式会社 画像処理装置、画像処理方法、プログラム、及び、手術システム
US10284900B2 (en) 2016-03-15 2019-05-07 Sony Corporation Multiview as an application for physical digital media
US10455270B2 (en) 2016-03-15 2019-10-22 Sony Corporation Content surfing, preview and selection by sequentially connecting tiled content channels
JP7026645B2 (ja) 2016-03-17 2022-02-28 トライス メディカル インコーポレイテッド 凝血塊の排出及び視覚化装置及び使用方法
DE102017109021B4 (de) 2016-05-11 2022-10-27 Carl Zeiss Meditec Ag System für das stereoskopische Visualisieren eines Objektbereichs sowie Objektbereich-Visualisierungsverfahren
US10650621B1 (en) 2016-09-13 2020-05-12 Iocurrents, Inc. Interfacing with a vehicular controller area network
US10973585B2 (en) 2016-09-21 2021-04-13 Alcon Inc. Systems and methods for tracking the orientation of surgical tools
US9772497B1 (en) 2016-09-23 2017-09-26 Robert Troy Hewlett Customized viewing system for an optical device
WO2018058053A1 (en) 2016-09-23 2018-03-29 Robert Troy Hewlett Customized viewing system for an optical device
US11832797B2 (en) 2016-09-25 2023-12-05 Micronvision Corp. Endoscopic fluorescence imaging
US11350816B2 (en) 2020-09-13 2022-06-07 Micron Vision Corp. Portable and ergonomic endoscope with disposable cannula
US11684248B2 (en) 2017-09-25 2023-06-27 Micronvision Corp. Endoscopy/stereo colposcopy medical instrument
US11330973B2 (en) 2017-09-25 2022-05-17 Micronvision Corp Portable and ergonomic endoscope with disposable cannula
US11051886B2 (en) * 2016-09-27 2021-07-06 Covidien Lp Systems and methods for performing a surgical navigation procedure
US10666923B2 (en) * 2017-02-24 2020-05-26 Immervision, Inc. Wide-angle stereoscopic vision with cameras having different parameters
CN106910224B (zh) * 2017-02-27 2019-11-22 清华大学 宽视场高分辨显微成像中像感器阵列标定方法
DE102017003231A1 (de) * 2017-04-03 2018-10-04 Mühlbauer Gmbh & Co. Kg Optisches Bauteilerfassungssystem und Verfahren zum Erfassen mindestens eines Bauteils
DE102017108371B4 (de) * 2017-04-20 2020-08-27 Carl Zeiss Meditec Ag Medizinisch-optisches Darstellungssystem und Verfahren zum Betreiben desselben
WO2018217951A1 (en) * 2017-05-24 2018-11-29 Camplex, Inc. Surgical visualization systems and displays
DE102017214246B3 (de) * 2017-08-16 2018-10-31 Siemens Healthcare Gmbh Vorrichtung und Verfahren zur Feinjustage der Rekonstruktionsebene eines digitalen Kombinationsbildes sowie zugehöriges Bildauswertesystem und/oder Radiologiesystem nebst zugehörigem Computerprogrammprodukt und computerlesbaren Medium
US12262866B2 (en) * 2017-09-22 2025-04-01 Carl Zeiss Meditec Ag Visualization system comprising an observation apparatus and an endoscope
US11980342B2 (en) 2020-11-12 2024-05-14 Micronvision Corp. Minimally invasive endoscope
US12268358B2 (en) 2019-12-05 2025-04-08 Uroviu Corp. Portable endoscope with side-mountable disposable portion
US11771304B1 (en) 2020-11-12 2023-10-03 Micronvision Corp. Minimally invasive endoscope
FI3689295T3 (fi) * 2017-09-29 2023-12-07 J Morita Mfg Corp Hammastarkkailulaite ja hammaskuvan näyttömenetelmä
US11317919B2 (en) 2017-10-30 2022-05-03 Cilag Gmbh International Clip applier comprising a clip crimping system
US11229436B2 (en) 2017-10-30 2022-01-25 Cilag Gmbh International Surgical system comprising a surgical tool and a surgical hub
US11291510B2 (en) 2017-10-30 2022-04-05 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11564756B2 (en) 2017-10-30 2023-01-31 Cilag Gmbh International Method of hub communication with surgical instrument systems
US10980560B2 (en) 2017-10-30 2021-04-20 Ethicon Llc Surgical instrument systems comprising feedback mechanisms
US11026687B2 (en) 2017-10-30 2021-06-08 Cilag Gmbh International Clip applier comprising clip advancing systems
US11801098B2 (en) 2017-10-30 2023-10-31 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11510741B2 (en) 2017-10-30 2022-11-29 Cilag Gmbh International Method for producing a surgical instrument comprising a smart electrical system
US11311342B2 (en) 2017-10-30 2022-04-26 Cilag Gmbh International Method for communicating with surgical instrument systems
US11911045B2 (en) 2017-10-30 2024-02-27 Cllag GmbH International Method for operating a powered articulating multi-clip applier
US11389164B2 (en) 2017-12-28 2022-07-19 Cilag Gmbh International Method of using reinforced flexible circuits with multiple sensors to optimize performance of radio frequency devices
US11304763B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Image capturing of the areas outside the abdomen to improve placement and control of a surgical device in use
US11202570B2 (en) 2017-12-28 2021-12-21 Cilag Gmbh International Communication hub and storage device for storing parameters and status of a surgical device to be shared with cloud based analytics systems
US11633237B2 (en) 2017-12-28 2023-04-25 Cilag Gmbh International Usage and technique analysis of surgeon / staff performance against a baseline to optimize device utilization and performance for both current and future procedures
US11100631B2 (en) 2017-12-28 2021-08-24 Cilag Gmbh International Use of laser light and red-green-blue coloration to determine properties of back scattered light
US11602393B2 (en) 2017-12-28 2023-03-14 Cilag Gmbh International Surgical evacuation sensing and generator control
US11432885B2 (en) 2017-12-28 2022-09-06 Cilag Gmbh International Sensing arrangements for robot-assisted surgical platforms
US20190206569A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Method of cloud based data analytics for use with the hub
US10932872B2 (en) 2017-12-28 2021-03-02 Ethicon Llc Cloud-based medical analytics for linking of local usage trends with the resource acquisition behaviors of larger data set
US11903601B2 (en) 2017-12-28 2024-02-20 Cilag Gmbh International Surgical instrument comprising a plurality of drive systems
US11304699B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Method for adaptive control schemes for surgical network control and interaction
US10892995B2 (en) 2017-12-28 2021-01-12 Ethicon Llc Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US11612408B2 (en) 2017-12-28 2023-03-28 Cilag Gmbh International Determining tissue composition via an ultrasonic system
US11147607B2 (en) 2017-12-28 2021-10-19 Cilag Gmbh International Bipolar combination device that automatically adjusts pressure based on energy modality
US10987178B2 (en) 2017-12-28 2021-04-27 Ethicon Llc Surgical hub control arrangements
US11324557B2 (en) 2017-12-28 2022-05-10 Cilag Gmbh International Surgical instrument with a sensing array
US12127729B2 (en) 2017-12-28 2024-10-29 Cilag Gmbh International Method for smoke evacuation for surgical hub
US11786245B2 (en) 2017-12-28 2023-10-17 Cilag Gmbh International Surgical systems with prioritized data transmission capabilities
US11832899B2 (en) 2017-12-28 2023-12-05 Cilag Gmbh International Surgical systems with autonomously adjustable control programs
US11696760B2 (en) 2017-12-28 2023-07-11 Cilag Gmbh International Safety systems for smart powered surgical stapling
US11896443B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Control of a surgical system through a surgical barrier
US10898622B2 (en) 2017-12-28 2021-01-26 Ethicon Llc Surgical evacuation system with a communication circuit for communication between a filter and a smoke evacuation device
US11304745B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Surgical evacuation sensing and display
US10758310B2 (en) 2017-12-28 2020-09-01 Ethicon Llc Wireless pairing of a surgical device with another device within a sterile surgical field based on the usage and situational awareness of devices
US10944728B2 (en) 2017-12-28 2021-03-09 Ethicon Llc Interactive surgical systems with encrypted communication capabilities
US11786251B2 (en) 2017-12-28 2023-10-17 Cilag Gmbh International Method for adaptive control schemes for surgical network control and interaction
US11069012B2 (en) 2017-12-28 2021-07-20 Cilag Gmbh International Interactive surgical systems with condition handling of devices and data capabilities
US11419667B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Ultrasonic energy device which varies pressure applied by clamp arm to provide threshold control pressure at a cut progression location
US11857152B2 (en) 2017-12-28 2024-01-02 Cilag Gmbh International Surgical hub spatial awareness to determine devices in operating theater
US11423007B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Adjustment of device control programs based on stratified contextual data in addition to the data
US11969216B2 (en) 2017-12-28 2024-04-30 Cilag Gmbh International Surgical network recommendations from real time analysis of procedure variables against a baseline highlighting differences from the optimal solution
US11096693B2 (en) 2017-12-28 2021-08-24 Cilag Gmbh International Adjustment of staple height of at least one row of staples based on the sensed tissue thickness or force in closing
US11253315B2 (en) 2017-12-28 2022-02-22 Cilag Gmbh International Increasing radio frequency to create pad-less monopolar loop
US11076921B2 (en) 2017-12-28 2021-08-03 Cilag Gmbh International Adaptive control program updates for surgical hubs
US11678881B2 (en) 2017-12-28 2023-06-20 Cilag Gmbh International Spatial awareness of surgical hubs in operating rooms
US11969142B2 (en) 2017-12-28 2024-04-30 Cilag Gmbh International Method of compressing tissue within a stapling device and simultaneously displaying the location of the tissue within the jaws
US10918310B2 (en) 2018-01-03 2021-02-16 Biosense Webster (Israel) Ltd. Fast anatomical mapping (FAM) using volume filling
US11659023B2 (en) 2017-12-28 2023-05-23 Cilag Gmbh International Method of hub communication
US11308075B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Surgical network, instrument, and cloud responses based on validation of received dataset and authentication of its source and integrity
WO2019133143A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Surgical hub and modular device response adjustment based on situational awareness
US11304720B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Activation of energy devices
US11257589B2 (en) 2017-12-28 2022-02-22 Cilag Gmbh International Real-time analysis of comprehensive cost of all instrumentation used in surgery utilizing data fluidity to track instruments through stocking and in-house processes
US12458351B2 (en) 2017-12-28 2025-11-04 Cilag Gmbh International Variable output cartridge sensor assembly
US11771487B2 (en) 2017-12-28 2023-10-03 Cilag Gmbh International Mechanisms for controlling different electromechanical systems of an electrosurgical instrument
US11540855B2 (en) 2017-12-28 2023-01-03 Cilag Gmbh International Controlling activation of an ultrasonic surgical instrument according to the presence of tissue
US11278281B2 (en) 2017-12-28 2022-03-22 Cilag Gmbh International Interactive surgical system
US11419630B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Surgical system distributed processing
US12062442B2 (en) 2017-12-28 2024-08-13 Cilag Gmbh International Method for operating surgical instrument systems
US11744604B2 (en) 2017-12-28 2023-09-05 Cilag Gmbh International Surgical instrument with a hardware-only control circuit
US12396806B2 (en) 2017-12-28 2025-08-26 Cilag Gmbh International Adjustment of a surgical device function based on situational awareness
US11026751B2 (en) 2017-12-28 2021-06-08 Cilag Gmbh International Display of alignment of staple cartridge to prior linear staple line
US11166772B2 (en) 2017-12-28 2021-11-09 Cilag Gmbh International Surgical hub coordination of control and communication of operating room devices
US11311306B2 (en) 2017-12-28 2022-04-26 Cilag Gmbh International Surgical systems for detecting end effector tissue distribution irregularities
US11424027B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Method for operating surgical instrument systems
US11559308B2 (en) 2017-12-28 2023-01-24 Cilag Gmbh International Method for smart energy device infrastructure
US11410259B2 (en) 2017-12-28 2022-08-09 Cilag Gmbh International Adaptive control program updates for surgical devices
US20190201142A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Automatic tool adjustments for robot-assisted surgical platforms
US11291495B2 (en) 2017-12-28 2022-04-05 Cilag Gmbh International Interruption of energy due to inadvertent capacitive coupling
US11051876B2 (en) 2017-12-28 2021-07-06 Cilag Gmbh International Surgical evacuation flow paths
US20190201090A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Capacitive coupled return path pad with separable array elements
US11832840B2 (en) 2017-12-28 2023-12-05 Cilag Gmbh International Surgical instrument having a flexible circuit
US11529187B2 (en) 2017-12-28 2022-12-20 Cilag Gmbh International Surgical evacuation sensor arrangements
US11464559B2 (en) 2017-12-28 2022-10-11 Cilag Gmbh International Estimating state of ultrasonic end effector and control system therefor
US11864728B2 (en) 2017-12-28 2024-01-09 Cilag Gmbh International Characterization of tissue irregularities through the use of mono-chromatic light refractivity
US11266468B2 (en) 2017-12-28 2022-03-08 Cilag Gmbh International Cooperative utilization of data derived from secondary sources by intelligent surgical hubs
US20190201112A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Computer implemented interactive surgical systems
US11937769B2 (en) 2017-12-28 2024-03-26 Cilag Gmbh International Method of hub communication, processing, storage and display
US11132462B2 (en) 2017-12-28 2021-09-28 Cilag Gmbh International Data stripping method to interrogate patient records and create anonymized record
US11317937B2 (en) 2018-03-08 2022-05-03 Cilag Gmbh International Determining the state of an ultrasonic end effector
US11998193B2 (en) 2017-12-28 2024-06-04 Cilag Gmbh International Method for usage of the shroud as an aspect of sensing or controlling a powered surgical device, and a control algorithm to adjust its default operation
US11576677B2 (en) 2017-12-28 2023-02-14 Cilag Gmbh International Method of hub communication, processing, display, and cloud analytics
US11364075B2 (en) 2017-12-28 2022-06-21 Cilag Gmbh International Radio frequency energy device for delivering combined electrical signals
US11666331B2 (en) 2017-12-28 2023-06-06 Cilag Gmbh International Systems for detecting proximity of surgical end effector to cancerous tissue
US11160605B2 (en) 2017-12-28 2021-11-02 Cilag Gmbh International Surgical evacuation sensing and motor control
US12376855B2 (en) 2017-12-28 2025-08-05 Cilag Gmbh International Safety systems for smart powered surgical stapling
US11559307B2 (en) 2017-12-28 2023-01-24 Cilag Gmbh International Method of robotic hub communication, detection, and control
US11589888B2 (en) 2017-12-28 2023-02-28 Cilag Gmbh International Method for controlling smart energy devices
US11109866B2 (en) 2017-12-28 2021-09-07 Cilag Gmbh International Method for circular stapler control algorithm adjustment based on situational awareness
US11234756B2 (en) 2017-12-28 2022-02-01 Cilag Gmbh International Powered surgical tool with predefined adjustable control algorithm for controlling end effector parameter
US11273001B2 (en) 2017-12-28 2022-03-15 Cilag Gmbh International Surgical hub and modular device response adjustment based on situational awareness
US11284936B2 (en) 2017-12-28 2022-03-29 Cilag Gmbh International Surgical instrument having a flexible electrode
US11446052B2 (en) 2017-12-28 2022-09-20 Cilag Gmbh International Variation of radio frequency and ultrasonic power level in cooperation with varying clamp arm pressure to achieve predefined heat flux or power applied to tissue
US11056244B2 (en) 2017-12-28 2021-07-06 Cilag Gmbh International Automated data scaling, alignment, and organizing based on predefined parameters within surgical networks
US11376002B2 (en) 2017-12-28 2022-07-05 Cilag Gmbh International Surgical instrument cartridge sensor assemblies
US10966791B2 (en) 2017-12-28 2021-04-06 Ethicon Llc Cloud-based medical analytics for medical facility segmented individualization of instrument function
US11896322B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Sensing the patient position and contact utilizing the mono-polar return pad electrode to provide situational awareness to the hub
US11464535B2 (en) 2017-12-28 2022-10-11 Cilag Gmbh International Detection of end effector emersion in liquid
US11844579B2 (en) 2017-12-28 2023-12-19 Cilag Gmbh International Adjustments based on airborne particle properties
US11571234B2 (en) 2017-12-28 2023-02-07 Cilag Gmbh International Temperature control of ultrasonic end effector and control system therefor
US11179208B2 (en) 2017-12-28 2021-11-23 Cilag Gmbh International Cloud-based medical analytics for security and authentication trends and reactive measures
US11818052B2 (en) 2017-12-28 2023-11-14 Cilag Gmbh International Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US10943454B2 (en) 2017-12-28 2021-03-09 Ethicon Llc Detection and escalation of security responses of surgical instruments to increasing severity threats
US20190201113A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Controls for robot-assisted surgical platforms
US12096916B2 (en) 2017-12-28 2024-09-24 Cilag Gmbh International Method of sensing particulate from smoke evacuated from a patient, adjusting the pump speed based on the sensed information, and communicating the functional parameters of the system to the hub
US11678927B2 (en) 2018-03-08 2023-06-20 Cilag Gmbh International Detection of large vessels during parenchymal dissection using a smart blade
US12303159B2 (en) 2018-03-08 2025-05-20 Cilag Gmbh International Methods for estimating and controlling state of ultrasonic end effector
US11259830B2 (en) 2018-03-08 2022-03-01 Cilag Gmbh International Methods for controlling temperature in ultrasonic device
US11534196B2 (en) 2018-03-08 2022-12-27 Cilag Gmbh International Using spectroscopy to determine device use state in combo instrument
DE102018105515A1 (de) * 2018-03-09 2019-09-12 Haimer Gmbh Vorrichtung zur Einstellung und/oder Vermessung eines Werkzeugs
US11219453B2 (en) 2018-03-28 2022-01-11 Cilag Gmbh International Surgical stapling devices with cartridge compatible closure and firing lockout arrangements
US11090047B2 (en) 2018-03-28 2021-08-17 Cilag Gmbh International Surgical instrument comprising an adaptive control system
US11166716B2 (en) 2018-03-28 2021-11-09 Cilag Gmbh International Stapling instrument comprising a deactivatable lockout
US11471156B2 (en) 2018-03-28 2022-10-18 Cilag Gmbh International Surgical stapling devices with improved rotary driven closure systems
US11278280B2 (en) 2018-03-28 2022-03-22 Cilag Gmbh International Surgical instrument comprising a jaw closure lockout
US11259806B2 (en) 2018-03-28 2022-03-01 Cilag Gmbh International Surgical stapling devices with features for blocking advancement of a camming assembly of an incompatible cartridge installed therein
US11207067B2 (en) 2018-03-28 2021-12-28 Cilag Gmbh International Surgical stapling device with separate rotary driven closure and firing systems and firing member that engages both jaws while firing
US11096688B2 (en) 2018-03-28 2021-08-24 Cilag Gmbh International Rotary driven firing members with different anvil and channel engagement features
US10973520B2 (en) 2018-03-28 2021-04-13 Ethicon Llc Surgical staple cartridge with firing member driven camming assembly that has an onboard tissue cutting feature
US10292769B1 (en) * 2018-08-07 2019-05-21 Sony Corporation Surgical assistive device and method for providing assistance in surgery of anatomical portions of internal organ affected by intraoperative shift
US11389126B2 (en) * 2018-10-31 2022-07-19 General Electric Company Gantry housing, and medical apparatus
CN109464202B (zh) * 2018-12-26 2021-05-14 韩成冰 一种口腔颌面外科肿瘤切除装置
WO2020138521A1 (ko) * 2018-12-26 2020-07-02 쓰리디메디비젼 주식회사 수술 동영상 생성 시스템
US10959716B2 (en) * 2019-02-11 2021-03-30 Warsaw Orthopedic, Inc. Surgical retractor system and method
US11317915B2 (en) 2019-02-19 2022-05-03 Cilag Gmbh International Universal cartridge based key feature that unlocks multiple lockout arrangements in different surgical staplers
US11464511B2 (en) 2019-02-19 2022-10-11 Cilag Gmbh International Surgical staple cartridges with movable authentication key arrangements
US11357503B2 (en) 2019-02-19 2022-06-14 Cilag Gmbh International Staple cartridge retainers with frangible retention features and methods of using same
US11369377B2 (en) 2019-02-19 2022-06-28 Cilag Gmbh International Surgical stapling assembly with cartridge based retainer configured to unlock a firing lockout
US11298129B2 (en) 2019-02-19 2022-04-12 Cilag Gmbh International Method for providing an authentication lockout in a surgical stapler with a replaceable cartridge
TWI743473B (zh) * 2019-04-26 2021-10-21 財團法人國家實驗研究院 外科手術攝影系統
USD964564S1 (en) 2019-06-25 2022-09-20 Cilag Gmbh International Surgical staple cartridge retainer with a closure system authentication key
USD952144S1 (en) 2019-06-25 2022-05-17 Cilag Gmbh International Surgical staple cartridge retainer with firing system authentication key
USD950728S1 (en) 2019-06-25 2022-05-03 Cilag Gmbh International Surgical staple cartridge
CN110236787B (zh) * 2019-06-26 2024-04-05 北京大学深圳医院 耳后骨膜撑开器
DE102019118508B3 (de) * 2019-07-09 2020-12-24 Carl Zeiss Meditec Ag Optische Abbildungsvorrichtung und Verfahren zur Verbesserung dargestellter Bilder
EP4003138A4 (en) 2019-07-25 2023-08-30 Uroviu Corp. Disposable endoscopy cannula with integrated grasper
US10820871B1 (en) 2019-08-09 2020-11-03 GE Precision Healthcare LLC Mobile X-ray imaging system including a parallel robotic structure
KR102447954B1 (ko) * 2019-12-09 2022-09-28 김영철 다관절 고정암 기반 촬영제어기능 구비형 작업대 시스템
DE102019134329B4 (de) * 2019-12-13 2021-11-25 Carl Zeiss Meditec Ag Aufhängung für digitales Operationsmikroskop mit Positionskorrektur, optisches Gerät und Verfahren zu dessen Betrieb
WO2022056400A1 (en) * 2020-09-13 2022-03-17 Micron Vision Corp. Portable and ergonomic endoscope with disposable cannula
US12214116B2 (en) * 2020-09-14 2025-02-04 Alcon Inc. Methods and systems for providing control stability in a vacuum generation system using cascade proportional-integral-derivative (PID) controller
US20240000513A1 (en) * 2021-01-25 2024-01-04 Smith & Nephew, Inc. Systems and methods for fusing arthroscopic video data
KR102658988B1 (ko) * 2021-04-29 2024-04-22 주식회사 삼육오엠씨(365mc) 입체 체형 스캐닝 장치
CN114209354B (zh) * 2021-12-20 2024-10-01 深圳开立生物医疗科技股份有限公司 一种超声图像的显示方法、装置、设备及可读存储介质
DE102022200819A1 (de) * 2022-01-25 2023-07-27 Carl Zeiss Meditec Ag Verfahren zum Betreiben eines stereoskopischen medizinischen Mikroskops und medizinisches Mikroskop
WO2023177785A1 (en) 2022-03-17 2023-09-21 Mako Surgical Corp. Techniques for securing together components of one or more surgical carts
US12231612B2 (en) * 2022-04-02 2025-02-18 Mantis Health, Inc. Stereoscopic camera adapter for enabling down-hole data capture and transmission
DE102022120203A1 (de) * 2022-08-10 2024-02-15 Carl Zeiss Meditec Ag System zum Erfassen und Visualisieren von OCT-Signalen
JP2025531249A (ja) * 2022-09-21 2025-09-19 アルコン インコーポレイティド 手術処置のための薄型光学システム
CN115574214B (zh) * 2022-10-14 2024-05-07 季华实验室 一种可多自由度调节的相机对中装置
CN118210189B (zh) * 2024-05-22 2024-08-02 福建医科大学附属第一医院 一种脑部手术录像辅助装置

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040070822A1 (en) * 1999-09-21 2004-04-15 Olympus Optical Co., Ltd. Surgical microscopic system
US7471301B2 (en) * 2002-07-24 2008-12-30 Total Immersion Method and system enabling real time mixing of synthetic images and video images by a user
US8187167B2 (en) * 2004-10-28 2012-05-29 Jae-Hwang Kim Monitoring apparatus for laparoscopic surgery and display method thereof
US8294733B2 (en) * 2007-06-08 2012-10-23 Olympus Corporation Endoscopic image viewing program and method

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3827429B2 (ja) * 1997-04-03 2006-09-27 オリンパス株式会社 手術用顕微鏡
US20010055062A1 (en) * 2000-04-20 2001-12-27 Keiji Shioda Operation microscope
DE10203215B4 (de) * 2002-01-28 2004-09-09 Carl Zeiss Jena Gmbh Mikroskop, insbesondere Operationsmikroskop
US20050096646A1 (en) * 2003-10-31 2005-05-05 Parris Wellman Surgical system for retracting and severing tissue
DE102005018432A1 (de) * 2005-04-21 2006-10-26 Leica Microsystems (Schweiz) Ag Optisches System mit Display
US9386914B2 (en) * 2007-04-04 2016-07-12 Karl Storz Endovision, Inc. Video endoscopic device with detachable control circuit
AU2009325140A1 (en) * 2008-12-10 2011-06-30 Minimally Invasive Devices, Inc Systems and methods for optimizing and maintaining visualization of a surgical field during the use of surgical scopes
US20130250081A1 (en) * 2012-03-21 2013-09-26 Covidien Lp System and method for determining camera angles by using virtual planes derived from actual images
WO2014102561A1 (en) * 2012-12-26 2014-07-03 Verathon Medical (Canada) Ulc Video retractor

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040070822A1 (en) * 1999-09-21 2004-04-15 Olympus Optical Co., Ltd. Surgical microscopic system
US7471301B2 (en) * 2002-07-24 2008-12-30 Total Immersion Method and system enabling real time mixing of synthetic images and video images by a user
US8187167B2 (en) * 2004-10-28 2012-05-29 Jae-Hwang Kim Monitoring apparatus for laparoscopic surgery and display method thereof
US8294733B2 (en) * 2007-06-08 2012-10-23 Olympus Corporation Endoscopic image viewing program and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3087424A4 *

Cited By (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9723976B2 (en) 2012-06-27 2017-08-08 Camplex, Inc. Optics for video camera on a surgical visualization system
US11166706B2 (en) 2012-06-27 2021-11-09 Camplex, Inc. Surgical visualization systems
US9615728B2 (en) 2012-06-27 2017-04-11 Camplex, Inc. Surgical visualization system with camera tracking
US9216068B2 (en) 2012-06-27 2015-12-22 Camplex, Inc. Optics for video cameras on a surgical visualization system
US11889976B2 (en) 2012-06-27 2024-02-06 Camplex, Inc. Surgical visualization systems
US9629523B2 (en) 2012-06-27 2017-04-25 Camplex, Inc. Binocular viewing assembly for a surgical visualization system
US9642606B2 (en) 2012-06-27 2017-05-09 Camplex, Inc. Surgical visualization system
US9681796B2 (en) 2012-06-27 2017-06-20 Camplex, Inc. Interface for viewing video from cameras on a surgical visualization system
US9492065B2 (en) 2012-06-27 2016-11-15 Camplex, Inc. Surgical retractor with video cameras
US10555728B2 (en) 2012-06-27 2020-02-11 Camplex, Inc. Surgical visualization system
US10925472B2 (en) 2012-06-27 2021-02-23 Camplex, Inc. Binocular viewing assembly for a surgical visualization system
US9936863B2 (en) 2012-06-27 2018-04-10 Camplex, Inc. Optical assembly providing a surgical microscope view for a surgical visualization system
US11129521B2 (en) 2012-06-27 2021-09-28 Camplex, Inc. Optics for video camera on a surgical visualization system
US10022041B2 (en) 2012-06-27 2018-07-17 Camplex, Inc. Hydraulic system for surgical applications
US11389146B2 (en) 2012-06-27 2022-07-19 Camplex, Inc. Surgical visualization system
US10925589B2 (en) 2012-06-27 2021-02-23 Camplex, Inc. Interface for viewing video from cameras on a surgical visualization system
US10231607B2 (en) 2012-06-27 2019-03-19 Camplex, Inc. Surgical visualization systems
US9782159B2 (en) 2013-03-13 2017-10-10 Camplex, Inc. Surgical visualization systems
US12396802B2 (en) 2013-03-13 2025-08-26 Stryker Corporation Systems and methods for establishing virtual constraint boundaries
US11918305B2 (en) 2013-03-13 2024-03-05 Stryker Corporation Systems and methods for establishing virtual constraint boundaries
US11464579B2 (en) 2013-03-13 2022-10-11 Stryker Corporation Systems and methods for establishing virtual constraint boundaries
US10932766B2 (en) 2013-05-21 2021-03-02 Camplex, Inc. Surgical visualization systems
US10028651B2 (en) 2013-09-20 2018-07-24 Camplex, Inc. Surgical visualization systems and displays
US10881286B2 (en) 2013-09-20 2021-01-05 Camplex, Inc. Medical apparatus for use with a surgical tubular retractor
US10568499B2 (en) 2013-09-20 2020-02-25 Camplex, Inc. Surgical visualization systems and displays
US11147443B2 (en) 2013-09-20 2021-10-19 Camplex, Inc. Surgical visualization systems and displays
US10702353B2 (en) 2014-12-05 2020-07-07 Camplex, Inc. Surgical visualizations systems and displays
US11154378B2 (en) 2015-03-25 2021-10-26 Camplex, Inc. Surgical visualization systems and displays
US11076816B2 (en) 2015-08-02 2021-08-03 P-Cure Ltd. Imaging system and method
CN108139579A (zh) * 2015-10-14 2018-06-08 徕卡仪器(新加坡)有限公司 具有至少两个有不同反射-透射比的分束表面的分束器装置
WO2017065688A1 (en) * 2015-10-14 2017-04-20 Leica Instruments (Singapore) Pte Ltd Beam splitter device having at least two beamsplitting surfaces with different reflection-to-transmission ratios
EP3156839A1 (en) * 2015-10-14 2017-04-19 Leica Instruments (Singapore) Pte. Ltd. Beam splitter device having at least two beamsplitting surfaces with different reflection-to-transmission ratios
US10516880B2 (en) 2015-11-06 2019-12-24 Tait Towers Manufacturing, LLC Coordinated view display device
WO2017147070A1 (en) * 2015-11-06 2017-08-31 Tait Towers Manufacturing, LLC Coordinated view display device
US10966798B2 (en) 2015-11-25 2021-04-06 Camplex, Inc. Surgical visualization systems and displays
US12364548B2 (en) 2015-12-31 2025-07-22 Stryker Corporation Systems and methods for comparing localization and vision data to identify an avoidance region
US10667868B2 (en) 2015-12-31 2020-06-02 Stryker Corporation System and methods for performing surgery on a patient at a target site defined by a virtual object
US11103315B2 (en) 2015-12-31 2021-08-31 Stryker Corporation Systems and methods of merging localization and vision data for object avoidance
US11806089B2 (en) 2015-12-31 2023-11-07 Stryker Corporation Merging localization and vision data for robotic control
US12167888B2 (en) 2016-03-10 2024-12-17 RELIGN Corporation Arthroscopic devices and methods
US11207119B2 (en) 2016-03-11 2021-12-28 RELIGN Corporation Arthroscopic devices and methods
US12096969B2 (en) 2016-03-11 2024-09-24 RELIGN Corporation Arthroscopic devices and methods
US11172953B2 (en) 2016-04-11 2021-11-16 RELIGN Corporation Arthroscopic devices and methods
US11622784B2 (en) 2016-04-11 2023-04-11 RELIGN Corporation Arthroscopic devices and methods
US12042167B2 (en) 2016-04-11 2024-07-23 RELIGN Corporation Arthroscopic devices and methods
EP3442446A4 (en) * 2016-04-11 2020-03-04 Relign Corporation ARTHROSCOPIC DEVICES AND METHODS
JP2018130375A (ja) * 2017-02-16 2018-08-23 池上通信機株式会社 映像データ表示装置
US10918455B2 (en) 2017-05-08 2021-02-16 Camplex, Inc. Variable light source
EP3681150A4 (en) * 2017-09-04 2021-01-13 Kajita, Hiroki MULTI-POINT-OF-VIEW VIDEO IMAGE VIEWING SYSTEM AND CAMERA SYSTEM
JPWO2019059314A1 (ja) * 2017-09-22 2020-10-08 株式会社ニコン 画像表示装置及び画像表示システム
WO2019059314A1 (ja) * 2017-09-22 2019-03-28 株式会社ニコン 画像表示装置及び画像表示システム
US12408998B2 (en) 2019-07-03 2025-09-09 Stryker Corporation Obstacle avoidance techniques for surgical navigation

Also Published As

Publication number Publication date
US20150297311A1 (en) 2015-10-22
EP3087424A4 (en) 2017-09-27
JP2020000921A (ja) 2020-01-09
EP3087424A1 (en) 2016-11-02
JP2017507680A (ja) 2017-03-23
US20240382174A1 (en) 2024-11-21
US20230145221A1 (en) 2023-05-11

Similar Documents

Publication Publication Date Title
US20240382174A1 (en) Surgical visualization systems
US11389146B2 (en) Surgical visualization system
US20220249078A1 (en) Optics for video camera on a surgical visualization system
US10932766B2 (en) Surgical visualization systems

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14873324

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2016542194

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

REEP Request for entry into the european phase

Ref document number: 2014873324

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2014873324

Country of ref document: EP