US20200221939A1 - Portable surgical methods, systems, and apparatus - Google Patents
Portable surgical methods, systems, and apparatus
- Publication number
- US20200221939A1 (application US 16/828,686)
- Authority
- US
- United States
- Prior art keywords
- camera
- stand
- kit
- inches
- case
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- A61B 1/00045: Operational features of endoscopes provided with output arrangements; display arrangement
- A61B 1/04: Endoscopes combined with photographic or television appliances
- A61B 1/00032: Operational features of endoscopes; power supply internally powered
- A61B 1/00108: Endoscope body with self-sufficient functionality for stand-alone use
- A61B 1/00149: Holding or positioning arrangements using articulated arms
- A61B 1/06: Endoscopes with illuminating arrangements
- A61B 90/30: Devices for illuminating a surgical field
- A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
- A61B 90/361: Image-producing devices, e.g. surgical cameras
- A61B 90/50: Supports for surgical instruments, e.g. articulated arms
- G06F 3/017: Gesture-based interaction, e.g. based on a set of recognized hand gestures
- H04N 23/00: Cameras or camera modules comprising electronic image sensors; control thereof
- H04N 23/51: Housings
- H04N 23/555: Picking up images in sites inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
- H04N 23/66: Remote control of cameras or camera parts
- H04N 23/90: Arrangement of cameras or camera modules, e.g. multiple cameras
- H04N 5/225; H04N 5/2252; H04N 5/23203; H04N 5/247 (legacy codes)
- A61B 1/044: Endoscopes for absorption imaging
- A61B 2017/00203: Electrical control of surgical instruments with speech control or speech recognition
- A61B 2017/00207: Electrical control of surgical instruments with hand gesture control or hand gesture recognition
- A61B 2017/00216: Electrical control of surgical instruments with eye tracking or head position tracking control
- A61B 2017/00221: Electrical control of surgical instruments with wireless transmission of data
- A61B 2017/00734: Battery operated
- A61B 2050/311: Carrying cases
- A61B 2090/309: Surgical field illumination using white LEDs
- A61B 2090/3616: Magnifying glass
- A61B 2090/365: Augmented reality, i.e. correlating a live optical image with another image
- A61B 2090/371: Surgical systems with images on a monitor during operation with simultaneous use of two cameras
- A61B 2090/372: Details of monitor hardware
- A61B 2090/502: Headgear, e.g. helmet, spectacles
- H04N 2005/2255 (legacy code)
- H04N 2213/001: Stereoscopic systems; constructional or mechanical details
Definitions
- Existing surgical visualization systems typically include a surgeon's microscope, beam splitters, an assistant's microscope, a light source, stand and stabilization systems, video cameras, etc. These existing systems are large and heavy and, due to the components typically found in such systems, have complex assembly requirements and require complex sterilization and draping procedures. Additionally, in use, these systems require that the surgeon constantly look through a fixed eyepiece of the surgeon's microscope while performing delicate surgeries for prolonged periods, which increases the risk of surgeon fatigue. Also, in addition to being expensive and requiring dedicated infrastructure, conventional surgical visualization systems (optical, digital, or a combination thereof) are not easy to move and require tedious balancing and calibration procedures, which can be a major concern in developing countries during transport of operating room (OR) equipment from one remote site to another.
- the invention is embodied in portable surgical methods, systems, and apparatus.
- the surgical systems may include a camera configured to capture images, viewing equipment configured to receive and display the captured images, a processor, and a stand.
- the camera, the viewing equipment, the processor, and the stand are configured to be housed in a case.
- Surgery may be performed using the surgical system by retrieving surgical components from the case, assembling the retrieved surgical components into a surgical system, positioning a patient within the surgical system for surgery, configuring the surgical system, performing the surgery with the surgical system, reconfiguring the surgical system during the surgery, disassembling the surgical system after the surgery, and placing the components in the case.
- FIG. 1 depicts a surgical system in accordance with aspects of the invention
- FIG. 2A depicts a case for transporting the surgical system of FIG. 1 in accordance with an aspect of the invention
- FIG. 2B depicts a wireless viewing equipment headset for use in the surgical system of FIG. 1 in accordance with an aspect of the invention
- FIG. 3 depicts a method for setting up a surgical system to perform a surgery in accordance with aspects of the invention
- FIG. 4A depicts an assembled stand in accordance with aspects of the invention
- FIG. 4B depicts the stand of FIG. 4A in a disassembled state
- FIG. 5A depicts a front view of a visor for use with the headset of FIG. 2B in accordance with aspects of the invention
- FIG. 5B depicts a side view of the visor of FIG. 5A.
- FIGS. 6A-6E and 7A-7D depict additional case-stand embodiments.
- FIG. 1 depicts a surgical system in accordance with aspects of the invention being used by a surgeon 102 and an assistant 104 to operate on a patient 106 positioned on a table 108 .
- the patient 106 is shown in a horizontal position (e.g., lying down on an operating table), it will be understood that the surgical system may be used with the patient oriented in other planes such as in a vertical plane (e.g., seated upright in an examination setting) or in an oblique plane (e.g., slanted seating in a dentist's chair).
- the illustrated surgical system includes a battery 110 , a stand 112 , a processor 114 , a light source 116 , a camera 118 , and viewing equipment 120 .
- the battery 110 may be a rechargeable battery that is rechargeable via a single power cord 111 .
- the battery may supply six or more hours of operation on a single charge.
- the processor 114 may be a processor in a conventional mobile device such as a smart phone or a tablet computer.
- the light source 116 may be a high luminosity “cold” light source such as a smart light emitting diode (LED) and may be configured to deliver coaxial or collimated light.
- the LEDs may be white, warm, or arranged in a combination array to produce a desired color temperature and wavelength(s) depending on the type of surgery to be performed and/or the type of tissue being operated upon.
- the camera 118 may be a three-dimensional (3D) stereo camera with voice-activated zoom and positioning (e.g., in the x, y, and z directions). Suitable batteries 110, processors 114, light sources 116, and cameras 118 will be understood by one of skill in the art from the description herein.
- the illustrated stand 112 supports the battery 110 , the processor 114 , the light source 116 , and the camera 118 .
- the stand 112 may also support additional ports 119 for transferring information between the equipment supported by the stand 112 and other equipment in the operating room.
- the battery 110 , stand 112 , processor 114 , light source 116 , and camera 118 may be configured for releasable assembly (e.g., using friction, snap fit, and/or twist connections).
- one or more of the components may each be implemented as an individual system module (hardware) designed to facilitate a quick and easy electrical/electronic connection through a releasable assembly; e.g., the operating system, such as a Linux-based kernel, may be optimized for a rapid boot time and support 'plug-and-play' features for instantly integrating the other components.
- the stand 112 may be configured to position/orient a device, such as the camera 118 mounted on a stage of the stand, along one or more axes and/or around one or more axes.
- the stand may be configured to orient the stage/camera along three orthogonal axes (e.g., positioning in the x, y, and z directions) and to rotate the stage/camera about those axes (e.g., to pan, tilt, and rotate the camera), enabling the stage/camera to be positioned/oriented to accommodate positioning of the patient in multiple planes.
- the stand may be configurable/adjustable/customizable for use with one or more accessories, e.g., to serve the needs of a particular surgical specialty and/or procedure.
- the stand may be configured to serve as a holding and positioning arm for a neuro endoscope for performing neurological procedures.
- the stand includes a base, a first arm configured for attachment to the base, a second arm configured for attachment to the camera, and a rotatable elbow joint coupled between the first and second arms. At least one of the first and second arms may be a telescoping arm.
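The base/arm/elbow arrangement described above can be illustrated with a simple planar forward-kinematics sketch. This is an illustrative simplification, not part of the patent: both arms are treated as rigid links in a vertical plane, the function name is assumed, and the arm lengths stand in for the telescoping extension.

```python
import math

def camera_position(first_arm_len, elbow_angle_deg, second_arm_len):
    """Return the (horizontal, vertical) offset of the camera from the base.

    Assumes the first arm points straight up from the base and the rotatable
    elbow joint bends the second arm away from it by elbow_angle_deg.
    """
    a = math.radians(elbow_angle_deg)
    x = second_arm_len * math.sin(a)                  # horizontal reach
    y = first_arm_len + second_arm_len * math.cos(a)  # height above the base
    return x, y

# Camera offset for a 90-degree elbow bend (arm lengths in cm, assumed):
offset = camera_position(40.0, 90.0, 30.0)
```

A telescoping arm would simply vary `first_arm_len` or `second_arm_len` at run time.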
- a handheld endoscope may be incorporated as an accessory to the core system through the video feed of the endoscope.
- the video feed from the endoscope may be fed to and processed by the processor 114 .
- the processor 114 may then display an image from the endoscope in a similar manner to that from the camera module when it is mounted on the stage.
- the light source for the endoscope may be a separate light or light siphoned from the light source 116, e.g., via a fiber optic cable.
- the stand 112 may be positioned manually and/or may be robotically positioned based on instructions received from an operator to change the position, orientation, and/or field of view of the camera 118 on the stage.
- the operator may provide instructions via hand/foot movement, hand/head gestures, and/or with voice activated controls for raising/lowering/positioning/orienting the stand, which, in turn, raises/lowers/positions/orients the stage on which the camera 118 is mounted.
- Hand movements may be received via a manual input device such as a joystick or mouse coupled to the processor 114 .
- Foot movements may be received via a manual input device such as one or more foot pedals (e.g., one foot pedal to raise/lower the stage and one foot pedal to move the stage in/out) coupled to the processor 114 .
- Head gestures may be received via an input device such as one or more motion sensors positioned in a headset (e.g., a wireless headset) coupled to the processor 114 .
- Hand gestures may be received via an input device such as one or more motion sensors (e.g., IR motion sensors) coupled to the processor 114 .
- Voice/verbal commands may be received via an input device such as a microphone coupled to the processor 114 .
- the input devices may be coupled to the processor via a wired connection or a wireless connection (e.g., Infrared (IR), Bluetooth, near field communication (NFC), WiFi, etc.).
- the processor 114 may be configured to interpret signals received from one or more of the input devices and to control the stand in accordance with the operator's intentions.
- the processor may be configured to convert conventional speech to commands (e.g., trigger words) that may then be used to position/orient the stand and, in turn, the stage/camera.
- voice commands/trigger words to operate the X, Y, Z stage include: “Scope, move right” (which may cause the stage to move the camera one increment in the +X direction), “Scope, move left” (one increment in the −X direction), “Scope, move up” (one increment in the +Y direction), and “Scope, move closer” (one increment in the −Z direction).
- auto positioning may be enabled, for example, utilizing computer vision algorithms, e.g., “Scope, auto position to left pupil” (which may cause the stage/camera to track, locate, and lock its field of view on the pupil of the patient's left eye).
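The trigger-word scheme above can be sketched as a lookup from a recognized phrase to a stage increment. This sketch is illustrative only: the increment size, function names, and axis sign conventions are assumptions not specified in the patent.

```python
# Illustrative dispatch of the voice trigger words to X/Y/Z stage increments.
INCREMENT_MM = 5.0  # assumed step size per voice command

# Map each trigger phrase to a (dx, dy, dz) stage increment in millimetres.
COMMANDS = {
    "scope, move right":  ( INCREMENT_MM, 0.0, 0.0),   # +X
    "scope, move left":   (-INCREMENT_MM, 0.0, 0.0),   # -X
    "scope, move up":     (0.0,  INCREMENT_MM, 0.0),   # +Y
    "scope, move closer": (0.0, 0.0, -INCREMENT_MM),   # -Z, toward patient
}

def dispatch(phrase, position):
    """Return the new (x, y, z) stage position for a recognized phrase,
    or the unchanged position if the phrase is not a known trigger word."""
    delta = COMMANDS.get(phrase.strip().lower())
    if delta is None:
        return position
    return tuple(p + d for p, d in zip(position, delta))

# Moves the stage one increment in the +X direction:
pos = dispatch("Scope, move right", (0.0, 0.0, 100.0))
```

In practice a speech-recognition front end would produce the phrase and the processor 114 would forward the increment to the stand's motor controller.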
- a surgeon can activate head-tracking for camera adjustment by using a voice command such as “Activate, Head-tracking”.
- input devices such as motion sensors embedded in the headset monitor the position of the surgeon's head and translate head movements into corresponding positions calibrated for the camera. For example, if the surgeon turns his head to the right, the camera pans to the right (with a corresponding change in the field of view (FOV) in the headset); a turn of the head to the left pans the camera to the left (with a corresponding change in the FOV); and looking up/down tilts the camera up/down (with a corresponding change in the FOV).
- the surgeon can deactivate head-tracking for camera adjustment by using voice commands such as “Lock Field of view” and/or “Deactivate, head-tracking”.
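The head-tracking mode described above, gated by the activation and deactivation voice commands, can be sketched as follows. The 1:1 mapping of head yaw/pitch to camera pan/tilt follows the description; the class and method names, and the use of degrees, are assumptions for illustration.

```python
# Sketch of voice-gated head tracking: while active, head yaw maps to
# camera pan and head pitch maps to camera tilt.
class HeadTracker:
    def __init__(self):
        self.active = False
        self.pan = 0.0   # camera pan angle, degrees
        self.tilt = 0.0  # camera tilt angle, degrees

    def on_voice(self, phrase):
        p = phrase.strip().lower()
        if p == "activate, head-tracking":
            self.active = True
        elif p in ("lock field of view", "deactivate, head-tracking"):
            self.active = False

    def on_head_motion(self, yaw_deg, pitch_deg):
        """Translate a head movement into an equivalent camera movement."""
        if self.active:
            self.pan += yaw_deg     # turning right pans the camera right
            self.tilt += pitch_deg  # looking up tilts the camera up

tracker = HeadTracker()
tracker.on_head_motion(10.0, 0.0)           # ignored: tracking not active
tracker.on_voice("Activate, Head-tracking")
tracker.on_head_motion(10.0, -5.0)          # pan 10 deg right, tilt 5 deg down
```

A real system would also update the FOV shown in the headset as the camera moves.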
- a proximity sensor may be coupled to the processor 114 and positioned on the stage adjacent the camera 118 to accurately determine the distance between the stage/camera and an object of interest, e.g., the patient's tissue.
- the processor 114 may continually monitor the distance to ensure that a safe distance is maintained at all times between the camera and, for example, tissues being operated upon. For example, the processor 114 may ignore instructions received from an input device and/or display a warning indicator to an operator in the event that the instruction would cause the minimum distance to no longer be maintained.
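The safety behavior described above, rejecting a move that would violate the minimum camera-to-tissue distance, can be sketched as a simple guard. The threshold value, function name, and warning text are assumptions for illustration.

```python
# Sketch of the minimum-distance guard: a move command is ignored (and a
# warning raised) if executing it would bring the camera too close to the
# tissue, based on the proximity sensor reading.
MIN_SAFE_DISTANCE_MM = 50.0  # assumed minimum camera-to-tissue distance

def apply_move(current_distance_mm, requested_delta_mm):
    """Return (new_distance, warning). A negative delta moves the camera
    toward the tissue; the move is rejected if it would leave less than
    the minimum safe distance."""
    new_distance = current_distance_mm + requested_delta_mm
    if new_distance < MIN_SAFE_DISTANCE_MM:
        return current_distance_mm, "warning: minimum safe distance"
    return new_distance, None

# A 60 mm approach from 100 mm would leave only 40 mm, so it is ignored:
rejected = apply_move(100.0, -60.0)  # -> (100.0, "warning: minimum safe distance")
accepted = apply_move(100.0, -30.0)  # -> (70.0, None)
```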
- the light source 116 may be configurable by the processor 114 .
- brightness levels and/or color temperature may be adjusted/controlled using commands/instructions received from input devices such as those described above with reference to adjusting the position/orientation of the stage, e.g., hand movements, foot movements, head gestures, hand gestures, and/or voice/verbal commands received via an input device coupled to the processor 114.
- the processor may be configured to interpret signals received from one or more of the input devices and to control the light source in accordance with the operator's intentions.
- the processor may be configured to convert conventional speech to commands (e.g., trigger words) that may then be used to configure the light source.
- voice commands/trigger words to operate the light source include “Light, ON” (turns the light ON), “Light, 50%” (sets the light to 50% intensity), “Light, temperature 4000 K” (adjusts the color temperature to 4000 K), “Light, dimmer” (decreases intensity by one increment), “Light, brighter” (increases intensity by one increment), “Light, auto adjust for capsulorhexis” (auto-adjusts settings optimized for visualizing and performing capsulorhexis), and “Light, OFF” (turns the light OFF).
- Lighting conditions may be used to achieve optimal visibility for certain surgical procedures (e.g., capsulorhexis).
- Algorithms for providing optimal visibility during procedures such as capsulorhexis may be implemented by processor 114 .
- Such algorithms take into consideration intrinsic and/or static conditions such as those involving the patient's medical case (e.g., specific type of cataract; chamber to be operated on: anterior or posterior), as well as extrinsic and/or dynamic factors (e.g., the ambient light in the room).
- the conditions in the room may be monitored by processor 114 (e.g., through inputs from camera 118 or other components such as a light sensor on a surgeon's headset) and the processor 114 may actively control and optimize output of the light source in terms of wavelength, intensity and/or color temperature for a larger and more stable red reflex zone based on the algorithm.
- This light source may be a single unit or, for greater illumination and/or flexibility, multiple light source modules may be arranged and attached to each other via interlinks (e.g., magnetic interfaces and/or mechanical snap fits). These modules when connected may communicate with each other via NFC to optimize illumination.
- the light source may include an auto mode or the processor 114 may be configured with an auto-mode to automatically adjust the light source.
- when the processor 114 is set to auto-mode, information from various sensors such as color sensors (e.g., for adjustments based on the type of tissue being operated upon) and ambient light sensors is integrated by the light source or by the processor 114 , for example, to automatically adjust/optimize brightness levels and color temperatures of the light source. Additional computer vision algorithms may be implemented to enhance the auto-mode.
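The auto-mode sensor fusion described above can be illustrated with a toy adjustment policy. Everything in this sketch is an assumption for illustration: the sensor inputs, the weighting constants, and the day/warm color-temperature rule are invented, since the disclosure does not specify the algorithm.

```python
# Minimal sketch of auto-mode light adjustment from sensor inputs.
# All constants and the policy itself are assumptions, not from the text.

def auto_adjust(ambient_lux, tissue_reflectance):
    """Pick brightness (%) and color temperature (K) from sensor readings.

    ambient_lux:        ambient light sensor reading
    tissue_reflectance: 0.0-1.0 estimate derived from a color sensor
    """
    # Brighter room -> more light needed to keep contrast at the field.
    brightness = min(100, 40 + ambient_lux / 20)
    # Highly reflective tissue -> dial brightness back to avoid glare.
    brightness = max(10, brightness * (1.0 - 0.5 * tissue_reflectance))
    # Warmer light in bright rooms, cooler in dim ones (assumed policy).
    color_temp_k = 4000 if ambient_lux > 500 else 5500
    return round(brightness), color_temp_k
```

A real implementation would replace these heuristics with the optimization algorithm the processor runs (e.g., for a larger and more stable red reflex zone), but the input/output shape would be similar.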
- Additional light sources may also be included.
- a high power LED may be incorporated into a headset as described in further detail below.
- the additional light sources may be controlled via voice/trigger words, e.g., “Headset—light on”, “Headset—light brighter”, “Headset—light dimmer”, etc.
- the camera 118 may include two or more cameras (e.g., high-definition (HD) cameras) to provide a stereo configuration.
- One or more IR LED cameras and/or other small IR cameras may also be used.
- an IR LED camera may be added to the system and may provide a video feed that may be used for enhanced visualization of blood vessels.
- Functionality of the camera 118 such as zooming in and out, white balance, etc. may be adjusted and controlled using commands/instructions received from input devices such as those described above with reference to adjusting the position/orientation of the stage, e.g., hand movements, foot movements, head gestures, hand gestures, and/or voice/verbal commands received via an input device coupled to the processor 114 .
- the processor may be configured to interpret signals received from one or more of the input devices and to control the camera in accordance with the operator's intentions.
- the processor may be configured to convert conventional speech to commands (e.g., trigger words) that may then be used to adjust the camera 118 .
- voice commands/trigger words to control the camera include “Camera, Zoom in” (magnifies the field of view by one increment), “Camera, zoom out” (de-magnifies the field of view by one increment), “Camera, Zoom to 25×” (adjusts magnification to 25×), etc.
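The camera trigger phrases above combine fixed commands with a parameterized one (“Zoom to 25×”), which a small parser can handle. This is a hypothetical sketch: the zoom bounds and the one-step increment size are assumptions.

```python
import re

# Hypothetical parser for camera zoom trigger phrases.
# Zoom bounds and increment size are assumed for illustration.

ZOOM_MIN, ZOOM_MAX, ZOOM_STEP = 1.0, 40.0, 1.0

def apply_camera_command(phrase, zoom):
    """Return the new zoom level after a recognized voice command."""
    # Normalize case and the multiplication sign so "25×" and "25x" match.
    p = phrase.strip().lower().replace("\u00d7", "x")
    if p == "camera, zoom in":
        return min(ZOOM_MAX, zoom + ZOOM_STEP)
    if p == "camera, zoom out":
        return max(ZOOM_MIN, zoom - ZOOM_STEP)
    if m := re.fullmatch(r"camera, zoom to (\d+(?:\.\d+)?)x", p):
        return min(ZOOM_MAX, max(ZOOM_MIN, float(m.group(1))))
    return zoom  # unrecognized phrase: leave zoom unchanged
```

Clamping to the allowed range means an out-of-range request like “Zoom to 500×” degrades gracefully to the maximum rather than failing mid-procedure.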
- the illustrated viewing equipment 120 includes a surgeon's viewing equipment 120 a.
- the viewing equipment 120 may also include an assistant's viewing equipment 120 b.
- the surgeon's viewing equipment may be a wireless 3D virtual reality (VR)/augmented reality (AR) headset (see FIG. 2B ) that emulates an 80 inch high-definition (HD) screen. In-situ AR visualization using pre-op images from MRI/CT/Ultrasound may be superimposed in real-time for viewing via the headset.
- the assistant's viewing equipment 120 b may be a wireless headset that displays live feeds from the camera 118 , real time diagnostics (e.g., from a remote database (not shown)), and/or on-demand surgical aids.
- a suitable headset for use as the surgeon's and/or the assistant's viewing equipment 120 a / 120 b is the Epson Moverio BT-200 headset available from Epson America, Inc. of Long Beach, Calif.
- the headset may include one or more ports (e.g., a powered micro USB port) for coupling the headset to an accessory such as a visor. Modifications to the headset to implement one or more of the features described herein will be understood by one of skill in the art from the description herein.
- the headset may incorporate a visor or the visor may be a separate piece of equipment that may be attached to the headset.
- the visor may include a shield that is transparent, tinted, or contains a material such as liquid crystals to digitally adjust its transparency/opacity.
- the visor may be controlled through voice commands, e.g., “visor-full transparency” or “visor-full opacity.”
- the visor may be mounted to the 3D headset using micro servo motors, enabling hands free control to deploy/disengage the use of this accessory.
- An example visor that attaches to a headset is described below with reference to FIGS. 5A and 5B .
- the viewing equipment may additionally include optical loupes, which can be permanently affixed, mounted via clips, or detachable via ring magnets to the headset (or the optical loupes may be incorporated into a visor that attaches to the headset).
- the optical loupes may include optical lenses/lens systems that have a magnification range from 2.5× to 6×, for example.
- the zoom functions of these loupes may be adjusted using voice control (e.g., “Loupes—zoom to 4×”, “Loupes—zoom out”, etc.).
- the optical loupes are digital loupes that produce a digital feed that can be processed using computer vision algorithms to display surgical overlays.
- the surgical loupes include two 1080p HD Digital camera modules, with each module providing an image resolution of 1920 by 1080 pixels.
- the system may be configured such that the surgeon can toggle the 3D video display in a headset between the feed from the camera 118 and the feed from the digital loupes.
- the surgeon may toggle between the views, and/or completely mute the video (e.g., all headset displays are turned off) to enable viewing through the optical loupes, for example, using a voice command (e.g., “switch to microscope”; “switch to loupes”; “video mute”; etc.), or he can use a physical action such as a tap to, for example, perform a complete video mute and begin viewing through the optical loupes.
- the headset may be configured such that an action such as a double tap on the right temple area of the headset completely mutes the video.
- the system provides one or more of the following five viewing modes:
- MODE 1 Normal viewing: similar to an unobstructed view as seen through clear safety goggles; in this mode the visor is clear/transparent, and the video display in the headset is muted or OFF;
- MODE 2 View through optical loupes: in this mode, the video display in the headset is muted or OFF;
- MODE 3 View through digital loupes: video feed from the two 1080P HD cameras located on the visor attachment is displayed in stereo in the headset;
- MODE 4 View through the surgical camera(s) mounted on the robotic stage: video feed is displayed in 2D or stereo in the headset; and
- MODE 5 Split-screen viewing: simultaneously view video feeds displayed in 2D from the camera and from the digital loupes.
- MODE 1 and MODE 2 do not require any power from the power supply.
- the visor is configured to be clear in the absence of power and the optical loupes are conventional non-digital loupes.
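The five viewing modes and the commands that switch between them can be sketched as a small mode controller. The mode numbering follows the list above; the command-to-mode mapping and the `double_tap` gesture token are assumptions layered on the behaviors the text describes.

```python
# Illustrative mode controller for the five viewing modes.
# The command strings follow the text; the transition table is assumed.

MODES = {
    1: "normal",           # visor clear, headset video muted or OFF
    2: "optical_loupes",   # headset video muted or OFF
    3: "digital_loupes",   # stereo feed from the two visor cameras
    4: "surgical_camera",  # 2D/stereo feed from the stage camera
    5: "split_screen",     # camera + digital loupes side by side
}

COMMANDS = {
    "switch to loupes": 3,
    "switch to microscope": 4,
    "video mute": 2,  # muting falls back to optical-loupe viewing (assumed)
}

def next_mode(current, event):
    """event is a voice-command string or the gesture token 'double_tap'."""
    if event == "double_tap":  # double tap on right temple: full video mute
        return 2
    return COMMANDS.get(event, current)
```

Because MODE 1 and MODE 2 draw no power, mapping both the mute command and the double-tap gesture to MODE 2 gives a power-independent fallback view, consistent with the fail-safe behavior described for the visor.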
- the headset may function as a replacement for a Surgeons' Loupe used in conventional surgical systems.
- VR may be combined with AR to provide surgeons with high resolution mixed reality (MR) including graphical surgical overlays, real-time diagnostic aids, and monitors.
- Wireless communication may be performed using conventional wireless communication protocols, e.g., Bluetooth™, WiFi, near field communication (NFC), etc.
- the viewing equipment 120 may additionally include a full-size monitor 120 c such as an HD television and/or a projector 120 d such as a 3D projection system.
- a dial-in, tele-surgery conference call system 134 may be provided to enable remote viewing of a surgical procedure. All data from a surgery, including patient information, audio/video feeds, diagnostic logs, etc., may be stored, e.g., in a memory associated with the processor 114 and/or via simultaneous secure backup to the cloud (not shown), e.g., via an encrypted transmission.
- the processor 114 may retrieve visual information from the camera 118 and transmit the visual information (e.g., via a wireless transceiver) to the viewing equipment 120 . Additionally, the processor 114 may receive audio signals from the wireless headsets, convert the audio signals to control signals (e.g., using conventional voice recognition technology), and send the control signals to the stand 112 and/or camera 118 , e.g., to properly position the camera 118 to obtain optimum images of a patient 106 during a surgical procedure. Additional voice enabled commands, ‘smart-gestures’ and/or eye-gaze tracking may be employed for zoom control, X, Y positioning, and activating inline surgical aids such as augmented reality visual overlays and additional diagnostic features. A video-mute feature may be implemented through the processor 114 , e.g., for micro-pauses during surgery.
- FIG. 2A depicts a case 200 for housing and transporting the surgical system of FIG. 1 .
- the case 200 may be a briefcase including cushioning components with cutouts for receiving the various components of the surgical system and transporting them in a secure manner.
- the case 200 may be robust, e.g., shock proof and weather-proof.
- the case 200 is dimensioned to enable the surgical system to comply with carry-on luggage requirements on commercial airline flights, e.g., having dimensions of approximately 22″×14″×9″ or less.
- FIG. 3 depicts a method 300 in accordance with one example for performing a surgical procedure using a portable surgical system such as the system described above with reference to FIG. 1 . It will be understood that one or more of the steps depicted in FIG. 3 may be omitted and/or performed in a different order.
- components of the surgical system are retrieved from the case.
- a camera, viewing equipment, a processor, and a stand are retrieved from a case.
- a light source and a battery may also be retrieved from the case.
- the retrieved components are assembled.
- the stand is assembled and then the processor and the camera are coupled to the stand for support.
- the light source and the battery may additionally be coupled to the stand for support.
- the patient is positioned for surgery.
- the patient is positioned within the surgical system in a desired orientation, e.g., horizontal on a table, vertical in a chair, or at an angle in between.
- the surgical system is configured.
- the surgical system is configured manually and/or automatically (e.g., via voice commands) to perform the surgery.
- a surgery is performed using the surgical system.
- the surgeon performing the surgery periodically reconfigures the surgical system (e.g., via voice commands and/or hand/head gestures), as represented by the arrow leading from block 310 back to block 308 .
- the surgical system is disassembled.
- the processor and the camera are removed from the stand and then the stand is disassembled.
- the light source and the battery may additionally be removed from the stand prior to disassembling the stand.
- the components of the surgical system are placed back in the case.
- the camera, the viewing equipment, the processor, and the stand are placed in the case.
- the light source and the battery may also be placed in the case.
- FIG. 4A and FIG. 4B depict an example stand 400 in accordance with various aspects of the invention.
- FIG. 4A depicts the stand assembled and
- FIG. 4B depicts the stand disassembled.
- FIGS. 4A and 4B depict one example of a stand for use with the invention.
- Other configurations will be understood by one of skill in the art from the description herein.
- the stand 400 includes a stage 416 configured to support the camera 118 ( FIG. 1 ) and optionally a light module(s).
- the various components of the stand 400 enable the stage 416 (and, in turn, the camera 118 ) to be positioned along three axes of freedom and rotated about these axes.
- the stand 400 includes a base 402 .
- the base 402 includes three base modules 404 a, b, c.
- the base modules 404 may be assembled to form the base 402 .
- Each of the base modules 404 may have a length of 12 inches, an outside diameter (OD) of 1 inch, and a T-joint in the center to accommodate a 1 inch OD arm.
- the base modules 404 may be solid for stability and balance.
- the stand 400 additionally includes multiple connecting arms 406 a - e. In the illustrated embodiment there are five connecting arms.
- Four of the connecting arms (connecting arms 406 a - d ) have a length of 12 inches and an OD of 1 inch and one of the connecting arms 406 e has a length of 6 inches and an OD of 1 inch.
- the connecting arms 406 may be hollow, e.g., to reduce weight.
- a pair of couplers 408 a, b are provided for interconnection of components. The couplers may have a length of 1 inch and an inside diameter (ID) of 1 inch.
- a first coupler 408 a interconnects one connecting arm 406 a to another connection arm 406 b and a second coupler 408 b interconnects a connecting arm 406 b to a telescoping arm 410 .
- the telescoping arm 410 is provided to adjust the height of the stage 416 .
- the telescoping arm 410 may be adjustable between a collapsed state (see FIG. 4B ) in which the arm may have a length of 12 inches and an extended state (see FIG. 4A ) in which the telescoping arm may have a length of 18 inches.
- the telescoping arm may be motorized and controlled in accordance with the description herein.
- a rotating coupler 412 is provided to rotate the stage 416 about a vertical axis extending through the base of the stand.
- the rotating coupler 412 may have a length of 3 inches.
- a pair of rotating elbow joints 414 a, b are provided to enable further adjustability of the height of the stage 416 and its position.
- a third elbow joint 414 c is provided to orient the stage relative to the other components in the stand.
- the third elbow joint is a stationary elbow joint.
- the stationary elbow joint 414 c may be a 1 inch elbow joint.
- One or more of the telescoping arm 410 , the rotating coupler 412 , and the elbow joints 414 may be motorized and controlled in accordance with the description herein.
- the stand may be assembled by inserting the T-joints of base modules 404 a and 404 b into the ends of base module 404 c.
- a connecting arm 406 a may then be attached to the T-joint of base module 404 c.
- a first coupler 408 a may be attached between a first connecting arm 406 a and a second connecting arm 406 b.
- a second coupler 408 b may be attached between the second connecting arm 406 b and the telescoping arm 410 .
- the rotating coupler 412 may be attached between the telescoping arm 410 and the first elbow joint 414 a.
- the third connecting arm 406 c may be attached between the first elbow joint 414 a and the second elbow joint 414 b.
- a fourth connecting arm 406 d may be attached between the second elbow joint 414 b and the third elbow joint 414 c.
- a fifth connecting arm 406 e may be attached between the third elbow joint 414 c and the stage 416 .
- Appropriate materials for the construction of the various components of the stand 400 include metals, metal alloys, polymers, and polymer composites suitable for use in a surgical setting.
- Appropriate surface finishes include unfinished (e.g., for stainless steel), paint, or other coatings suitable for surgical use.
- FIGS. 5A and 5B depict an example visor 500 .
- FIG. 5A is a front view of the visor and FIG. 5B is a side view of the visor.
- the visor includes a frame 502 and a shield 504 .
- the visor 500 may additionally include an attachment mechanism (e.g., a pair of magnetic links 510 a and 510 b ) for attaching the visor to a headset 120 and a connector 550 (such as a micro USB connector) for receiving power from the headset and exchanging data with the processor 114 via the headset ( FIG. 1 ).
- an attachment mechanism e.g., a pair of magnetic links 510 a and 510 b
- a connector 550 such as a micro USB connector
- the visor 500 includes a light 520 , a pair of optical loupes 530 a, b, and a pair of digital loupes 540 a, b in-line with the optical loupes 530 .
- the light 520 may be a high power LED.
- the optical loupes 530 may be supported by the shield 504 of the visor 500 .
- the digital loupes 540 may be 1080p HD camera modules and may be supported by the frame 502 of the visor 500 . Communication between the components of the visor 500 and the headset 120 and/or processor 114 may be provided through the connector 550 .
- instructions to the visor 500 may be provided through the connector 550 .
- data from the visor 500 (e.g., images from the digital loupes) may be provided through the connector 550 .
- Connector 550 may also be used to supply power to components of the visor 500 (e.g., to the light 520 , the shield 504 , and/or the digital loupes 540 ).
- a digital platform to enable and facilitate the development, distribution, and deployment of surgical software/applications (apps) for use with the surgical system described above with respect to FIG. 1 will be made available.
- Software developers including third party vendors with appropriate licensing will be able to use this digital platform for creating, distributing, and selling software/apps for surgeries, which will complement the hardware and features associated with the surgical system.
- an online “store-front” is provided. End-users/surgeons of the surgical system will be able to access the “store-front” through a user-interface of the surgical system. For example, end-users/surgeons can search, find, and/or browse through a catalogue of software/apps and view features and pricing of software/apps available for the surgical system.
- An app may be available for instant download and deployment on the surgical system. Depending on the functionality of the downloaded app, it may be used via the surgical system, for example, prior to and/or during surgeries, for assessment during investigations of adverse events, and/or for training/educational purposes. Additionally, this digital distribution platform may be utilized to remotely provide and perform system maintenance and/or upgrades.
- the system may be configured to automatically provide contextual information by data mining (e.g., in real-time) the most recent publications relevant to a surgery while the surgery is being performed, providing the surgeon with access to the latest surgical techniques. Also, the system may be configured to use cognitive load sharing tools for virtual assistance with performing complex surgical procedures.
- the system provides features for automatic generation of comprehensive surgical reports.
- the reports generated may be text based and optimized for printing on paper; these may include snippets of speech (converted to text) from the surgical staff interspersed with other information and screen shots of the video footage from the surgery. Additionally, detailed electronic reports with interactive features and audio-visual inserts may be generated.
- the surgical system is designed to facilitate effective sterilization/disinfection. This may be accomplished through sanitary design of fittings, fixtures, and joints. Electronics and sensitive components can be bagged up/encased during surgery in specially designed sterile plastic bags/sleeves. These bags/sleeves can be supplied sterile (Gamma, EtO) or ready for onsite steam sterilization (single use or multiple use).
- the stand 112 may be formed from multiple components that can be quickly disassembled so that it can fit into a standard steam sterilization tray for autoclaving.
- aspects of the invention enable substantial reduction in the size, number of individual fixtures, and/or complexity of assembly that are common and inherent to existing surgical visualization systems; enhanced surgical outcomes by integrating intuitive hands-free controls and/or a variety of CAS (Computer Assisted Surgery) software tools; reduced surgeon fatigue; and/or economical pricing.
- aspects of the invention are particularly useful for a wide range of surgical procedures including general surgery, ophthalmic surgery, pediatric surgery, cardiothoracic surgery, neurosurgery, cosmetic surgery, microsurgery, ENT surgery, dental/micro endodontic surgery, and military/battlefield surgery. Additionally, the inventive surgical systems and methods described herein may be utilized for training, education, and research studies, e.g., during small/large animal surgery. The surgical system and methods may also be used in poorly equipped ORs scattered across remote areas in third world countries.
- an economically priced, all-inclusive, compact, digital, high resolution, 3D surgical visualization system including a microscope camera, wireless HD Virtual Reality/Augmented Reality headset(s), high luminosity LED based cold light source, and foldable stand, all integrated with cutting edge computer assisted surgical aids (augmented reality overlays, inline monitors & diagnostic tools), with voice activated control software and smart gesture controls for hands free operation, and an inbuilt rechargeable power source, all of which can be packed as a kit and transported as one single briefcase.
- a compact inexpensive all-inclusive high tech surgical visualization system such as this can radically transform the overall outcome of surgeries performed, especially in economically challenged nations (such as for those surgeons who operate in multiple make-shift and mobile clinics with very limited equipment across remote locations in the countries of South America, Africa, Asia).
- a case may contain and transport the surgical visualization apparatus.
- the case may just be a clamshell design with an open volume that contains the apparatus's components, perhaps with foam or other form-fitting packaging to minimize damage.
- FIGS. 6A-6E and 7A-7D show variations of this stand-case 600 , 700 in which only the base 610 , 710 is shown, although it should be appreciated that this base could be part of a clamshell case design or other container concept with a top closure piece that is not shown.
- FIGS. 6A-6E show a stand case 600 that includes the base 610 that defines an interior volume for storage and a stand 620 that extends therefrom.
- the stand 620 includes an arm 630 shown as telescoping through segments 632 , 634 , 636 , 638 , 639 that releasably lock in place, where the end-most of the telescoping segments 639 may connect to other segments to hold the camera mentioned above.
- the arm 630 pivots about an axis 631 that may be a pin from its stored configuration ( FIG. 6E ) to its extended configuration ( FIGS. 6A-6D ).
- a bracing 640 that may have two side arms 642 pivotally connects the lowermost segment 632 to a base track 616 attached to the base's bottom 612 via an engagement piece 644 .
- the bracing 640 arms 642 draw the engagement piece 644 along the track 616 in a sliding—but attached—engagement.
- the bracing 640 acts to additionally stabilize the arm 630 from movement that would adversely affect the camera. Reversing the movement of the arm 630 reverses the sequence, returning it to its stored configuration.
- the lowermost segment 632 may be engaged to an arm base piece 635 that is secured to the base 610 or another piece of track 633 .
- the base piece may contain the axis 631 pivot.
- FIGS. 7A-7D show a variation of the stand 720 extending from the base 710 , in which the arm 730 includes two segments 732 , 734 , where the segments rotate such that their movement relative to one another can be secured in a friction engagement.
- the lower segment 732 engages to an arm piece 735 and may rotate about an axis 735 a with respect thereto.
- when raising the arm 730 from the base 710 , the arm piece 735 rotates about an axis 751 , which may include a pin in a base piece 750 that is connected to the arm piece 735 via a pivot piece 739 , such that when the lower segment 732 is fully extended, it may engage or attach to a sidewall 712 of the base 710 .
- the glasses discussed herein may be OLED glasses that present a high quality, 1280 by 720 pixel image to the user.
- the pixel size may be well below minimum separable acuity under optimal conditions (1 arc-minute) as defined in the literature.
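The 1 arc-minute acuity claim can be sanity-checked with a back-of-the-envelope calculation. Note the horizontal field of view used below (20 degrees) is an assumption typical of compact AR headsets; it is not stated in the text.

```python
# Back-of-the-envelope check that a 1280-pixel-wide display spread over
# an assumed 20-degree horizontal field of view sits below 1 arc-minute
# of visual angle per pixel.

def arcmin_per_pixel(h_fov_deg=20.0, h_pixels=1280):
    # 1 degree = 60 arc-minutes; divide total arc-minutes by pixel count.
    return h_fov_deg * 60.0 / h_pixels

print(arcmin_per_pixel())  # prints 0.9375
```

Under this assumed field of view, each pixel subtends about 0.94 arc-minute, just under the 1 arc-minute minimum separable acuity figure; a wider field of view would push the pixel pitch above the acuity limit.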
- latency, for our purposes, is defined as the time delay between the actual physical motion within the view of the CMOS cameras and the time at which that motion is presented to the eyes. Latencies of less than 300 milliseconds have been shown not to degrade performance of simulated surgical and game tasks, and although meeting that literature benchmark would be sufficient, the system herein may be below 100 ms, which is undetectable by the user. Latency may be reduced by increased CMOS sensor speed and/or the elimination of redundant processors.
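The end-to-end latency figure discussed above is the sum of the delays along the capture-to-display pipeline, which can be expressed as a simple budget. The individual stage times below are invented for the example; only the sub-100 ms total reflects the text.

```python
# Hypothetical capture-to-display latency budget; the per-stage times
# are assumed values for illustration, not measured system figures.

STAGES_MS = {
    "cmos_exposure_readout": 16,  # roughly one frame at ~60 fps
    "image_processing": 25,       # debayer, stereo rectification, overlays
    "wireless_transport": 20,     # encode + radio link to the headset
    "display_refresh": 16,        # one refresh cycle of the headset panel
}

def total_latency_ms(stages=STAGES_MS):
    return sum(stages.values())

budget = total_latency_ms()
assert budget < 100, "end-to-end latency should stay below 100 ms"
```

Framing latency as a per-stage budget makes the two mitigations named above concrete: a faster CMOS readout shrinks the first entry, and removing a redundant processor removes an entire stage from the sum.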
- the invention described herein includes the stand and case as shown, which are also made with cleanliness and sterilization in mind. For example, they may allow for sterile draping of covering materials and include mounting/engagement points for such draping. Further, material choices are such that they can be effectively sterilized. Even further, the design may minimize cavities that cannot be easily reached for cleaning.
Abstract
Portable surgical systems, methods, and kits are described. The surgical systems may include a camera configured to capture images, viewing equipment configured to receive and display the captured images, a processor, and a stand. The camera, the viewing equipment, the processor, and the stand are configured to be housed in a case. Surgery may be performed using the surgical system by retrieving surgical components from the case, assembling the retrieved surgical components into a surgical system, positioning a patient within the surgical system for surgery, configuring the surgical system, performing the surgery with the surgical system, reconfiguring the surgical system during the surgery, disassembling the surgical system after the surgery, and placing the components in the case.
Description
- The present application is a continuation-in-part of the application that issued as U.S. Pat. No. 10,595,716 on Mar. 24, 2020, which was a 35 USC 371 application from PCT/US15/29888 filed May 18, 2015, which claimed priority to U.S. Provisional Application Ser. No. 61/990,938 filed on May 9, 2014. The application also claims priority to U.S. Provisional Application Ser. No. 62/981,704 filed Feb. 26, 2020. The contents of all of these applications are incorporated fully herein by reference.
- Existing surgical visualization systems typically include a surgeon's microscope, beam splitters, an assistant's microscope, a light source, stand and stabilization systems, video cameras, etc. These existing systems are large and heavy; and, due to the components typically found in such systems, have complex assembly requirements and require complex sterilization and draping procedures. Additionally, in use, these systems require that the surgeon constantly look through a fixed eye-piece of the surgeon's microscope while performing delicate surgeries for prolonged periods, which increases the risks of surgeon fatigue. Also, in addition to being expensive and requiring dedicated infrastructure, conventional surgical visualization systems (optical, digital, or a combination thereof) are not easy to move, and require tedious balancing and calibration procedures, which can be a major concern in developing countries during transport of operating room (OR) equipment from one remote site to another.
- The invention is embodied in portable surgical methods, systems, and apparatus. The surgical systems may include a camera configured to capture images, viewing equipment configured to receive and display the captured images, a processor, and a stand. The camera, the viewing equipment, the processor, and the stand are configured to be housed in a case. Surgery may be performed using the surgical system by retrieving surgical components from the case, assembling the retrieved surgical components into a surgical system, positioning a patient within the surgical system for surgery, configuring the surgical system, performing the surgery with the surgical system, reconfiguring the surgical system during the surgery, disassembling the surgical system after the surgery, and placing the components in the case.
- The invention is best understood from the following detailed description when read in connection with the accompanying drawings, with like elements having the same reference numerals. When a plurality of similar elements are present, a single reference numeral may be assigned to the plurality of similar elements with a small letter designation referring to specific elements. When referring to the elements collectively or to a non-specific one or more of the elements, the small letter designation may be dropped. Lines without arrows connecting components may represent a bi-directional exchange between these components. This emphasizes that according to common practice, the various features of the drawings are not drawn to scale. On the contrary, the dimensions of the various features are arbitrarily expanded or reduced for clarity. Included in the drawings are the following figures:
- FIG. 1 depicts a surgical system in accordance with aspects of the invention;
- FIG. 2A depicts a case for transporting the surgical system of FIG. 1 in accordance with an aspect of the invention;
- FIG. 2B depicts a wireless viewing equipment headset for use in the surgical system of FIG. 1 in accordance with an aspect of the invention;
- FIG. 3 depicts a method for setting up a surgical system to perform a surgery in accordance with aspects of the invention;
- FIG. 4A depicts an assembled stand in accordance with aspects of the invention;
- FIG. 4B depicts the stand of FIG. 4A in a disassembled state;
- FIG. 5A depicts a front view of a visor for use with the headset of FIG. 2B in accordance with aspects of the invention;
- FIG. 5B depicts a side view of the visor of FIG. 5A; and
- FIGS. 6A-6E and 7A-7D show different case-stand embodiments, respectively.
FIG. 1 depicts a surgical system in accordance with aspects of the invention being used by asurgeon 102 and anassistant 104 to operate on apatient 106 positioned on a table 108. Although, in the illustrated embodiment, thepatient 106 is shown in a horizontal position (e.g., lying down on an operating table), it will be understood that the surgical system may be used with the patient oriented in other planes such as in a vertical plane (e.g., seated upright in an examination setting) or in an oblique plane (e.g., slanted seating in a dentist's chair). - The illustrated surgical system includes a
battery 110, astand 112, aprocessor 114, alight source 116, a camera 118, and viewing equipment 120. Thebattery 110 may be a rechargeable battery that is rechargeable via asingle power cord 111. The battery may supply six or more hours of operation on a single charge. Theprocessor 114 may be a processor in a conventional mobile device such as a smart phone or a tablet computer. Thelight source 116 may be a high luminosity “cold” light source such as a smart light emitting diode (LED) and may be configured to deliver coaxial or collimated light. The LEDs may be white, warm, or arranged in a combination array to produce a desired color temperature and wavelength(s) depending of the type of surgery to be performed and/or the type of tissue being operated upon. The camera 118 may be a three-dimensional (3D) stereo camera with voice activated zoom and positioning (e.g., in the x, y, and z directions). Suitable batteries,processors 114,light sources 116, and cameras 118 will be understood by one of skill in the art from the description herein. - The illustrated
stand 112 supports the battery 110, the processor 114, the light source 116, and the camera 118. The stand 112 may also support additional ports 119 for transferring information between the equipment supported by the stand 112 and other equipment in the operating room. The battery 110, stand 112, processor 114, light source 116, and camera 118 may be configured for releasable assembly (e.g., using friction, snap fit, and/or twist connections). Additionally, one or more of the components may each be implemented as an individual system module (hardware) designed so that they facilitate a quick and easy electrical/electronic connection through a releasable assembly; e.g., the operating system, such as a Linux based kernel, is optimized for a rapid boot time supporting ‘plug-and-play’ features for instantly integrating the other components. - The
stand 112 may be configured to position/orient a device such as the camera 118 mounted on a stage of the stand along one or more axes and/or around one or more axes. In an example, the stand may be configured to orient the stage/camera in three orthogonal axes (e.g., positioning in the x, y, and z directions) and to rotate the stage/camera about those axes (e.g., to pan, tilt, and rotate the camera) to enable positioning/orienting the stage/camera to accommodate positioning of the patient in multiple planes. Additionally, the stand may be configurable/adjustable/customizable for use with one or more accessories, e.g., to serve the needs of a particular surgical specialty and/or procedure. For example, the stand may be configured to serve as a holding and positioning arm for a neuro endoscope for performing neurological procedures. - In one embodiment, the stand includes a base, a first arm configured for attachment to the base, a second arm configured for attachment to the camera, and a rotatable elbow joint coupled between the first and second arms. At least one of the first and second arms may be a telescoping arm.
- A handheld endoscope may be incorporated as an accessory to the core system through the video feed of the endoscope. The video feed from the endoscope may be fed to and processed by the
processor 114. The processor 114 may then display an image from the endoscope in a similar manner to that from the camera module when it is mounted on the stage. The light source for the endoscope may be a separate light or light siphoned from the light source 116, e.g., via fiber optic cable. - As neurosurgical procedures typically require endoscopes with relatively smaller diameters and a high degree of stabilization, stands such as those described herein are particularly well suited to support an endoscope for neurological procedures.
- The
stand 112 may be positioned manually and/or may be robotically positioned based on instructions received from an operator to change the position, orientation, and/or field of view of the camera 118 on the stage. The operator may provide instructions via hand/foot movement, hand/head gestures, and/or with voice activated controls for raising/lowering/positioning/orienting the stand, which, in turn, raises/lowers/positions/orients the stage on which the camera 118 is mounted. - Hand movements may be received via a manual input device such as a joystick or mouse coupled to the
processor 114. Foot movements may be received via a manual input device such as one or more foot pedals (e.g., one foot pedal to raise/lower the stage and one foot pedal to move the stage in/out) coupled to the processor 114. Head gestures may be received via an input device such as one or more motion sensors positioned in a headset (e.g., a wireless headset) coupled to the processor 114. Hand gestures may be received via an input device such as one or more motion sensors (e.g., IR motion sensors) coupled to the processor 114. Voice/verbal commands may be received via an input device such as a microphone coupled to the processor 114. The input devices may be coupled to the processor via a wired connection or a wireless connection (e.g., Infrared (IR), Bluetooth, near field communication (NFC), WiFi, etc.). - The
processor 114 may be configured to interpret signals received from one or more of the input devices and to control the stand in accordance with the operator's intentions. The processor may be configured to convert conventional speech to commands (e.g., trigger words) that may then be used to position/orient the stand and, in turn, the stage/camera. Some examples of voice commands/trigger words to operate the X, Y, Z stage include: “Scope, move right” (which may cause the stage to move the camera one increment in the X direction), “Scope, move left” (which may cause the stage to move the camera one increment in the −X direction), “Scope, move up” (which may cause the stage to move the camera one increment in the Y direction), and “Scope, move closer” (which may cause the stage to move the camera one increment in the −Z direction). Additionally, auto positioning may be enabled, for example, utilizing computer vision algorithms, e.g., “scope, auto position to left pupil” (which may cause the stage/camera to track, locate, and lock the field of view on the pupil in the patient's left eye). - In one example of various aspects of the invention, a surgeon can activate head-tracking for camera adjustment by using a voice command such as “Activate, Head-tracking”. Upon activation, input devices such as motion sensors embedded in the headset monitor the position of the surgeon's head and translate the head movements into a corresponding position calibrated for the camera. For example, if the surgeon turns his head to the right, an equivalent movement of the camera is produced as the camera pans to the right (along with a corresponding change in the field of view (FOV) in the headset); a turn of the head to the left produces an equivalent movement of the camera as the camera pans to the left (along with a corresponding change in the FOV in the headset); and looking up/down results in the camera tilting up/down (along with a corresponding change in the FOV in the headset). 
After the FOV has been satisfactorily adjusted by corresponding panning/tilting/rotating/zooming, the surgeon can deactivate head-tracking for camera adjustment by using voice commands such as “Lock Field of view” and/or “Deactivate, head-tracking”.
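The trigger-word scheme described above can be sketched as a simple dispatch table. This is an illustrative assumption only: the phrase set, the `Stage` class, and the one-increment step size are invented for the example and are not the patent's actual implementation.

```python
# Hypothetical sketch of mapping recognized trigger phrases to stage
# movements; the Stage API and STEP size are assumptions for illustration.

STEP = 1.0  # one increment (units assumed)

# Each trigger phrase maps to a (dx, dy, dz) increment for the stage.
TRIGGER_WORDS = {
    "scope, move right":  (+STEP, 0.0, 0.0),
    "scope, move left":   (-STEP, 0.0, 0.0),
    "scope, move up":     (0.0, +STEP, 0.0),
    "scope, move down":   (0.0, -STEP, 0.0),
    "scope, move closer": (0.0, 0.0, -STEP),
    "scope, move away":   (0.0, 0.0, +STEP),
}

class Stage:
    """Minimal stand-in for the motorized X, Y, Z stage."""
    def __init__(self):
        self.position = [0.0, 0.0, 0.0]

    def move_by(self, dx, dy, dz):
        self.position[0] += dx
        self.position[1] += dy
        self.position[2] += dz

def handle_utterance(stage, utterance):
    """Dispatch a recognized utterance to a stage movement.

    Returns True if the utterance matched a trigger phrase.
    """
    delta = TRIGGER_WORDS.get(utterance.strip().lower())
    if delta is None:
        return False  # unknown command; ignore rather than guess
    stage.move_by(*delta)
    return True

stage = Stage()
handle_utterance(stage, "Scope, move right")
handle_utterance(stage, "Scope, move closer")
print(stage.position)  # [1.0, 0.0, -1.0]
```

Matching on normalized text keeps the dispatch independent of the speech recognizer's casing; unknown phrases are ignored rather than guessed, which is the safer behavior in a surgical setting.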
- A proximity sensor may be coupled to the
processor 114 and positioned on the stage adjacent the camera 118 to accurately determine the distance between the stage/camera and an object of interest, e.g., the patient's tissue. In addition to enabling the processor 114 to optimally position the stage/camera at the site of surgery, the processor 114 may continually monitor the distance to ensure that a safe distance is maintained at all times between the camera and, for example, tissues being operated upon. For example, the processor 114 may ignore instructions received from an input device and/or display a warning indicator to an operator in the event that the instruction would cause the minimum distance to no longer be maintained. The light source 116 may be configurable by the processor 114. In one example, brightness levels and/or color temperature may be adjusted/controlled using commands/instructions received from input devices such as those described above with reference to adjusting the position/orientation of the stage, e.g., hand movements, foot movements, head gestures, hand gestures, and/or voice/verbal commands received via an input device coupled to the processor 114. The processor may be configured to interpret signals received from one or more of the input devices and to control the light source in accordance with the operator's intentions. The processor may be configured to convert conventional speech to commands (e.g., trigger words) that may then be used to configure the light source. Some examples of voice commands/trigger words to operate the light source include “Light, ON” (turns light ON), “Light, 50%” (turns light to 50% intensity), “Light, temperature 4000 K” (adjusts light color to 4000 K), “Light, dimmer” (decreases intensity by one increment), “Light, brighter” (increases intensity by one increment), “Light, auto adjust for capsulorhexis” (auto adjusts settings optimized for visualizing and performing capsulorhexis), and “Light, OFF” (turns light OFF). - Lighting conditions (e.g. 
in the green-blue wavelengths of the visible light spectrum for some cases) may be used to achieve optimal visibility for certain surgical procedures (e.g., capsulorhexis). Algorithms for providing optimal visibility during procedures such as capsulorhexis may be implemented by
processor 114. Such algorithms take into consideration intrinsic and/or static conditions such as those involving the patient's medical case (e.g., the specific type of cataract; the chamber to be operated on: anterior or posterior), as well as extrinsic and/or dynamic factors (e.g., the ambient light in the room). For extrinsic and/or dynamic factors, the conditions in the room may be monitored by the processor 114 (e.g., through inputs from the camera 118 or other components such as a light sensor on a surgeon's headset), and the processor 114 may actively control and optimize the output of the light source in terms of wavelength, intensity, and/or color temperature for a larger and more stable red reflex zone based on the algorithm. - This light source may be a single unit or, for greater illumination and/or flexibility, multiple light source modules may be arranged and attached to each other via interlinks (e.g., magnetic interfaces and/or mechanical snap fits). These modules, when connected, may communicate with each other via NFC to optimize illumination.
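The minimum-distance safeguard described a few paragraphs above (the proximity sensor adjacent the camera 118, with the processor 114 ignoring moves that would breach a safe distance) can be sketched as a guard function. The 25 mm threshold and the function interface are assumptions invented for illustration, not values from the patent.

```python
# Illustrative sketch of the minimum camera-to-tissue distance safeguard.
# The threshold value and the warn-callback interface are assumptions.

MIN_SAFE_DISTANCE_MM = 25.0  # assumed threshold, not from the patent

def safe_move_z(current_distance_mm, requested_dz_mm, warn):
    """Apply a Z move only if it preserves the minimum safe distance.

    current_distance_mm: proximity-sensor reading (camera to tissue).
    requested_dz_mm: negative values move the camera closer.
    warn: callback used to surface a warning indicator to the operator.
    Returns the dz actually applied (0.0 if the command was ignored).
    """
    predicted = current_distance_mm + requested_dz_mm
    if predicted < MIN_SAFE_DISTANCE_MM:
        warn("move ignored: would breach minimum safe distance")
        return 0.0  # ignore the instruction, as described above
    return requested_dz_mm

applied = safe_move_z(30.0, -10.0, warn=print)  # would leave 20 mm -> ignored
print(applied)  # 0.0
applied = safe_move_z(30.0, -4.0, warn=print)   # leaves 26 mm -> allowed
print(applied)  # -4.0
```

Checking the predicted distance before actuating, rather than reacting after the move, is what lets the system reject an unsafe instruction outright instead of merely warning about it.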
- In one embodiment, the light source may include an auto mode or the
processor 114 may be configured with an auto-mode to automatically adjust the light source. In accordance with this embodiment, when set to auto-mode, information from various sensors such as color sensors (e.g., for adjustments based on the type of tissues being operated upon) and ambient light sensors is integrated by the light source or by the processor 114, for example, to automatically adjust/optimize brightness levels and color temperatures of the light source. Additional computer vision algorithms may be implemented to enhance the auto-mode. - Additional light sources may also be included. For example, a high power LED may be incorporated into a headset as described in further detail below. The additional light sources may be controlled via voice/trigger words (e.g., “Headset—light on”, “headset—light brighter”, “headset—light dimmer”, etc.).
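The auto-mode described above could be sketched as a small control step that nudges brightness and color temperature toward targets derived from the ambient-light and color-sensor readings. Every target value and gain below is invented for illustration; only the overall idea (sensor inputs driving brightness/color-temperature adjustment) comes from the description above.

```python
# Minimal sketch of a hypothetical auto-mode control step; all constants
# (targets, gains, value ranges) are assumptions for illustration only.

def auto_adjust(ambient_lux, tissue_warmth, brightness, color_temp_k):
    """One control step of the hypothetical auto-mode.

    ambient_lux: ambient light sensor reading.
    tissue_warmth: 0..1 estimate from a color sensor (1 = warm/red tissue).
    brightness: current output, 0..100 (%).
    color_temp_k: current color temperature in kelvin.
    Returns the adjusted (brightness, color_temp_k).
    """
    # Brighter room -> raise output so the surgical field stays dominant.
    target_brightness = min(100.0, 40.0 + 0.05 * ambient_lux)
    # Warmer (redder) tissue -> cooler light for contrast, and vice versa.
    target_temp = 3500.0 + 2500.0 * tissue_warmth

    gain = 0.2  # move 20% of the way per step to avoid visible flicker
    brightness += gain * (target_brightness - brightness)
    color_temp_k += gain * (target_temp - color_temp_k)
    return brightness, color_temp_k

b, t = 50.0, 4000.0
b, t = auto_adjust(ambient_lux=400.0, tissue_warmth=0.8,
                   brightness=b, color_temp_k=t)
print(round(b, 1), round(t, 1))  # 52.0 4300.0
```

Applying only a fraction of the correction per step is a common way to keep a lighting loop stable; a real implementation would also clamp outputs to the hardware's supported range.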
- The camera 118 may include two or more cameras (e.g., high-definition (HD) cameras) to provide a stereo configuration. One or more IR LED cameras and/or other small IR cameras may also be used. For example, an IR LED camera may be added to the system and may provide a video feed that may be used for enhanced visualization of blood vessels.
- Functionality of the camera 118 such as zooming in and out, white balance, etc. may be adjusted and controlled using commands/instructions received from input devices such as those described above with reference to adjusting the position/orientation of the stage, e.g., hand movements, foot movements, head gestures, hand gestures, and/or voice/verbal commands received via an input device coupled to the
processor 114. The processor may be configured to interpret signals received from one or more of the input devices and to control the camera in accordance with the operator's intentions. The processor may be configured to convert conventional speech to commands (e.g., trigger words) that may then be used to adjust the camera 118. Some examples of voice commands/trigger words to control the camera include “Camera, Zoom in” (magnifies the field of view by one increment), “Camera, zoom out” (de-magnifies the field of view by one increment), “Camera, Zoom to 25×” (adjusts magnification to 25×), etc. - The illustrated viewing equipment 120 includes a surgeon's
viewing equipment 120 a. The viewing equipment 120 may also include an assistant's viewing equipment 120 b. The surgeon's viewing equipment may be a wireless 3D virtual reality (VR)/augmented reality (AR) headset (see FIG. 2B) that emulates an 80 inch high-definition (HD) screen. In-situ AR visualization using pre-op images from MRI/CT/Ultrasound may be superimposed in real-time for viewing via the headset. The assistant's viewing equipment 120 b may be a wireless headset that displays live feeds from the camera 118, real time diagnostics (e.g., from a remote database (not shown)), and/or on-demand surgical aids. A suitable headset for use as the surgeon's and/or the assistant's equipment 120 a/120 b includes an Epson Moverio BT-200 headset available from Epson America, Inc. of Long Beach, Calif. The headset may include one or more ports (e.g., a powered micro USB port) for coupling the headset to an accessory such as a visor. Modifications to the headset to implement one or more of the features described herein will be understood by one of skill in the art from the description herein. - The headset may incorporate a visor or the visor may be a separate piece of equipment that may be attached to the headset. The visor may include a shield that is transparent, tinted, or contains a material such as liquid crystals to digitally adjust its transparency/opacity. The visor may be controlled through voice commands, e.g., “visor-full transparency” or “visor-full opacity.” The visor may be mounted to the 3D headset using micro servo motors, enabling hands free control to deploy/disengage the use of this accessory. An example visor that attaches to a headset is described below with reference to
FIGS. 5A and 5B. - The viewing equipment may additionally include optical loupes, which can be permanently affixed, mounted via clips, or detachable via ring magnets to the headset (or the optical loupes may be incorporated into a visor that attaches to the headset). The optical loupes may include optical lenses/lens systems that have a magnification range from 2.5× to 6×, for example. The zoom functions of these loupes may be adjusted using voice control (e.g., “Loupes zoom to 4×”, “Loupes—zoom out”, etc.). In one embodiment, the optical loupes are digital loupes that produce a digital feed that can be processed using computer vision algorithms to display surgical overlays.
- In one embodiment, the surgical loupes include two 1080p HD digital camera modules, with each module providing an image resolution of 1920 by 1080 pixels. The system may be configured such that the surgeon can toggle between a 3D video display in a headset from the camera 118 or from the digital loupes. The surgeon may toggle between the views, and/or completely mute the video (e.g., all headset displays are turned off) to enable viewing through the optical loupes, for example, using a voice command (e.g., “switch to microscope”; “switch to loupes”; “video mute”; etc.), or he can use a physical action such as a tap to, for example, perform a complete video mute and begin viewing through the optical loupes. The headset may be configured such that an action such as a double tap on the right temple area of the headset completely mutes the video.
- In one embodiment, the system provides one or more of the following five viewing modes:
- MODE 1—Normal viewing: similar to an unobstructed view as seen through clear safety goggles; in this mode the visor is clear/transparent, and the video display in the headset is muted or OFF;
- MODE 2—View through optical loupes: in this mode, the video display in the headset is muted or OFF;
- MODE 3—View through digital loupes: video feed from the two 1080P HD cameras located on the visor attachment is displayed in stereo in the headset;
- MODE 4—View through the surgical camera(s) mounted on the robotic stage: video feed is displayed in 2D or stereo in the headset; and
- MODE 5—Split-screen mode: simultaneously view video feeds displayed in 2D from the camera and from the digital loupes.
- In accordance with one embodiment, MODE 1 and MODE 2 do not require any power from the power supply. In accordance with this embodiment, the visor is configured to be clear in the absence of power and the optical loupes are conventional non-digital loupes.
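The five viewing modes above, and the observation that only MODES 3-5 drive the headset display (and therefore draw power), can be sketched as a small selection table. The enum names and command vocabulary are assumptions chosen to mirror the voice commands mentioned earlier, not the patent's actual software.

```python
# Sketch of the five viewing modes; names and commands are illustrative
# assumptions based on the mode descriptions above.

from enum import Enum

class ViewMode(Enum):
    NORMAL = 1           # clear visor, headset video muted/OFF
    OPTICAL_LOUPES = 2   # headset video muted/OFF
    DIGITAL_LOUPES = 3   # stereo feed from the visor's HD cameras
    SURGICAL_CAMERA = 4  # 2D/stereo feed from the stage-mounted camera
    SPLIT_SCREEN = 5     # 2D feeds from camera and digital loupes

# Only MODES 3-5 drive the headset display and therefore require power.
POWERED_MODES = {ViewMode.DIGITAL_LOUPES, ViewMode.SURGICAL_CAMERA,
                 ViewMode.SPLIT_SCREEN}

COMMANDS = {
    "video mute": ViewMode.NORMAL,
    "switch to loupes": ViewMode.OPTICAL_LOUPES,
    "switch to digital loupes": ViewMode.DIGITAL_LOUPES,
    "switch to microscope": ViewMode.SURGICAL_CAMERA,
    "split screen": ViewMode.SPLIT_SCREEN,
}

def select_mode(command, current):
    """Return the new mode for a recognized command, else keep the current one."""
    return COMMANDS.get(command.strip().lower(), current)

mode = ViewMode.NORMAL
mode = select_mode("Switch to microscope", mode)
print(mode.name, mode in POWERED_MODES)  # SURGICAL_CAMERA True
```

Keeping the current mode on an unrecognized command means a misheard phrase never leaves the surgeon in an undefined display state.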
- Through the use of HD viewing equipment, the headset may function as a replacement for a surgeon's loupe used in conventional surgical systems. VR may be combined with AR to provide surgeons with high resolution mixed reality (MR) including graphical surgical overlays, real-time diagnostic aids, and monitors. Wireless communication may be performed using conventional wireless communication protocols, e.g., Bluetooth™, WiFi, near field communication (NFC), etc.
- The viewing equipment 120 may additionally include a full-
size monitor 120 c such as an HD television and/or a projector 120 d such as a 3D projection system. A dial-in, tele-surgery conference call system 134 may be provided to enable remote viewing of a surgical procedure. All data from a surgery including patient information, audio/video feeds, diagnostic logs, etc. may be stored, e.g., in a memory associated with the processor 114 and/or via simultaneous secure backup to the cloud (not shown), e.g., via an encrypted transmission. - The
processor 114 may retrieve visual information from the camera 118 and transmit the visual information (e.g., via a wireless transceiver) to the viewing equipment 120. Additionally, the processor 114 may receive audio signals from the wireless headsets, convert the audio signals to control signals (e.g., using conventional voice recognition technology), and send the control signals to the stand 112 and/or camera 118, e.g., to properly position the camera 118 to obtain optimum images of a patient 106 during a surgical procedure. Additional voice enabled commands, ‘smart-gestures’, and/or eye-gaze tracking may be employed for zoom control, X, Y positioning, and activating inline surgical aids such as augmented reality visual overlays and additional diagnostic features. A video-mute feature may be implemented through the processor 114, e.g., for micro-pauses during surgery. -
FIG. 2A depicts a case 200 for housing and transporting the surgical system of FIG. 1. The case 200 may be a briefcase including cushioning components with cutouts for receiving the various components of the surgical system and transporting them in a secure manner. The case 200 may be robust, e.g., shock proof and weather-proof. In one embodiment, the case 200 is dimensioned to enable the surgical system to comply with carry-on luggage requirements on commercial airline flights, e.g., having dimensions of approximately 22″×14″×9″ or less. -
FIG. 3 depicts a method 300 in accordance with one example for performing a surgical procedure using a portable surgical system such as the system described above with reference to FIG. 1. It will be understood that one or more of the steps depicted in FIG. 3 may be omitted and/or performed in a different order. - At
block 302, components of the surgical system are retrieved from the case. In an embodiment, a camera, viewing equipment, a processor, and a stand are retrieved from a case. A light source and a battery may also be retrieved from the case. - At
block 304, the retrieved components are assembled. In an embodiment, the stand is assembled and then the processor and the camera are coupled to the stand for support. The light source and the battery may additionally be coupled to the stand for support. - At
block 306, the patient is positioned for surgery. In an embodiment, the patient is positioned within the surgical system in a desired orientation, e.g., horizontal on a table, vertical in a chair, or at an angle in between. - At
block 308, the surgical system is configured. In an embodiment, the surgical system is configured manually and/or automatically (e.g., via voice commands) to perform the surgery. - At
block 310, a surgery is performed using the surgical system. In an embodiment, the surgeon performing the surgery periodically reconfigures the surgical system (e.g., via voice commands and/or hand/head gestures), as represented by the arrow leading from block 310 back to block 308. - At
block 312, the surgical system is disassembled. In an embodiment, the processor and the camera are removed from the stand and then the stand is disassembled. The light source and the battery may additionally be removed from the stand prior to disassembling the stand. - At
block 314, the components of the surgical system are placed back in the case. In an embodiment, the camera, the viewing equipment, the processor, and the stand are placed in the case. The light source and the battery may also be placed in the case. -
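The workflow of FIG. 3 (blocks 302-314, including the loop from block 310 back to block 308) can be sketched as a simple sequence. The function name and the step labels are placeholders; only the block numbers and their ordering come from the description above.

```python
# Sketch of the FIG. 3 workflow; the reconfigure/perform loop mirrors the
# arrow from block 310 back to block 308. Step names are placeholders.

def run_procedure(surgical_steps):
    """Return the ordered log of method-300 blocks for the given steps."""
    log = []
    log.append("302: retrieve components from case")
    log.append("304: assemble stand; couple camera and processor")
    log.append("306: position patient")
    for step in surgical_steps:
        # blocks 308 and 310 repeat for each reconfiguration of the system
        log.append("308: configure system for " + step)
        log.append("310: perform surgical step: " + step)
    log.append("312: disassemble system")
    log.append("314: repack components in case")
    return log

for line in run_procedure(["incision", "capsulorhexis"]):
    print(line)
```

Modeling blocks 308/310 as the loop body makes the "periodically reconfigures" behavior explicit: setup and teardown happen once, while configuration repeats per surgical step.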
FIG. 4A and FIG. 4B depict an example stand 400 in accordance with various aspects of the invention. FIG. 4A depicts the stand assembled and FIG. 4B depicts the stand disassembled. FIGS. 4A and 4B depict one example of a stand for use with the invention. Other configurations will be understood by one of skill in the art from the description herein. - The
stand 400 includes a stage 416 configured to support the camera 118 (FIG. 1) and optionally a light module(s). The various components of the stand 400 enable the stage 416 (and, in turn, the camera 118) to be positioned along three axes of freedom and rotated about these axes. - The
stand 400 includes a base 402. The base 402 includes three base modules 404 a, b, c. The base modules 404 may be assembled to form the base 402. Each of the base modules 404 may have a length of 12 inches, an outside diameter (OD) of 1 inch, and a T-joint in the center to accommodate a 1 inch OD. The base modules 404 may be solid for stability and balance. - The
stand 400 additionally includes multiple connecting arms 406 a-e. In the illustrated embodiment there are five connecting arms. Four of the connecting arms (connecting arms 406 a-d) have a length of 12 inches and an OD of 1 inch, and one of the connecting arms 406 e has a length of 6 inches and an OD of 1 inch. The connecting arms 406 may be hollow, e.g., to reduce weight. A pair of couplers 408 a, b are provided for interconnection of components. The couplers may have a length of 1 inch and an inside diameter (ID) of 1 inch. A first coupler 408 a interconnects one connecting arm 406 a to another connecting arm 406 b and a second coupler 408 b interconnects a connecting arm 406 b to a telescoping arm 410. - The
telescoping arm 410 is provided to adjust the height of the stage 416. The telescoping arm 410 may be adjustable between a collapsed state (see FIG. 4B) in which the arm may have a length of 12 inches and an extended state (see FIG. 4A) in which the telescoping arm may have a length of 18 inches. The telescoping arm may be motorized and controlled in accordance with the description herein. A rotating coupler 412 is provided to rotate the stage 416 about a vertical axis extending through the base of the stand. The rotating coupler 412 may have a length of 3 inches. A pair of rotating elbow joints 414 a, b are provided to enable further adjustability of the height of the stage 416 and its position. A third elbow joint 414 c is provided to orient the stage relative to the other components in the stand. In the illustrated embodiment, the third elbow joint is a stationary elbow joint. The stationary elbow joint 414 c may be a 1 inch elbow joint. One or more of the telescoping arm 410, the rotating coupler 412, and the elbow joints 414 may be motorized and controlled in accordance with the description herein. - The stand may be assembled by inserting the T-joints of
base modules 404 a and 404 b onto the ends of base module 404 c. A connecting arm 406 a may then be attached to the T-joint of base module 404 c. A first coupler 408 a may be attached between the first connecting arm 406 a and a second connecting arm 406 b. A second coupler 408 b may be attached between the second connecting arm 406 b and the telescoping arm 410. The rotating coupler 412 may be attached between the telescoping arm 410 and the first elbow joint 414 a. The third connecting arm 406 c may be attached between the first elbow joint 414 a and the second elbow joint 414 b. A fourth connecting arm 406 d may be attached between the second elbow joint 414 b and the third elbow joint 414 c. A fifth connecting arm 406 e may be attached between the elbow joint 414 c and the stage 416. - Appropriate materials for the construction of the various components of the
stand 400 include metals, metal alloys, polymers, and polymer composites suitable for use in a surgical setting. Appropriate surface finishes include unfinished (e.g., for stainless steel), paint, or other coatings suitable for surgical use. -
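As a worked check of the dimensions given above, the vertical run of the example stand can be summed. Treating arms 406 a/b, the couplers 408 a/b, the rotating coupler 412, and the telescoping arm 410 as a straight vertical stack is a simplifying assumption (the elbow joints and any insertion overlap at the couplers are ignored), so the totals are only indicative.

```python
# Worked arithmetic from the segment lengths stated above; the straight
# vertical-stack geometry is an assumption, so totals are approximate.

SEGMENTS_IN = {
    "connecting arm 406 a": 12,  # inches
    "coupler 408 a": 1,
    "connecting arm 406 b": 12,
    "coupler 408 b": 1,
    "rotating coupler 412": 3,
}

def vertical_reach(telescoping_in):
    """Length of the vertical run for a given telescoping-arm length (inches)."""
    return sum(SEGMENTS_IN.values()) + telescoping_in

print(vertical_reach(12))  # collapsed telescoping arm: 41 inches
print(vertical_reach(18))  # extended telescoping arm: 47 inches
```

The roughly 41-47 inch vertical run, before the elbow-jointed horizontal arms, is consistent with a stand whose 12-inch segments still pack into a carry-on-sized case.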
FIGS. 5A and 5B depict an example visor 500. FIG. 5A is a front view of the visor and FIG. 5B is a side view of the visor. The visor includes a frame 502 and a shield 504. The visor 500 may additionally include an attachment mechanism (e.g., a pair of magnetic links) and may be coupled to the processor 114 via the headset (FIG. 1). - The
visor 500 includes a light 520, a pair of optical loupes 530 a, b, and a pair of digital loupes 540 a, b in-line with the optical loupes 530. The light 520 may be a high power LED. The optical loupes 530 may be supported by the shield 504 of the visor 500. The digital loupes 540 may be 1080P HD camera modules and may be supported by the frame 502 of the visor 500. Communication between the components of the visor 500 and the headset 120 and/or processor 114 may be provided through the connector 550. For example, instructions to the visor 500, e.g., to turn on/off the light 520, tint the shield 504, or turn on/off the digital loupes 540, may be provided through the connector 550. Additionally, data from the visor 500, e.g., images from the digital loupes, may be provided through the connector 550. Connector 550 may also be used to supply power to components of the visor 500 (e.g., to the light 520, the shield 504, and/or the digital loupes 540). - In accordance with aspects of the invention, a digital platform to enable and facilitate the development, distribution, and deployment of surgical software/applications (apps) for use with the surgical system described above with respect to
FIG. 1 will be made available. Software developers, including third party vendors with appropriate licensing, will be able to use this digital platform for creating, distributing, and selling software/apps for surgeries, which will complement the hardware and features associated with the surgical system. Before an app is made available for distribution and/or sale on the digital platform, it may undergo a variety of robust test measures and/or have in place all necessary regulatory clearances/approvals. - In accordance with other aspects, an online “store-front” is provided. End-users/surgeons of the surgical system will be able to access the ‘store-front’ through a user-interface of the surgical system. For example, end-users/surgeons can search, find, and/or browse through a catalogue of software/apps, and view features and pricing of software/apps available for the surgical system. An app may be available for instant download and deployment on the surgical system. Depending on the functionality of the downloaded app, it may be used via the surgical system, for example, prior to and/or during surgeries, for assessment during investigations of adverse events, and/or for training/educational purposes, etc. Additionally, this digital distribution platform may be utilized to remotely provide and perform system maintenance and/or upgrades.
- Surgeons can access these features on the system for training, education, and real-time guidance in an interactive format.
- These features, such as interactive medical encyclopedias and anatomical models associated with particular pathologies and/or their surgical specialty, may be available locally on the system and/or accessible via applications run on the cloud.
- Additionally, the system may be configured to automatically provide contextual information by data mining (e.g., in real-time) of the most recent publications relevant to a surgery while the surgery is being performed to provide the surgeon with access to the latest surgical techniques. Also, the system may be configured to use cognitive load sharing tools for virtual assistance with performing complex surgical procedures.
- In addition to saving all information and feeds from the surgery, the system provides features for automatic generation of comprehensive surgical reports.
- The reports generated may be text based and optimized for printing on paper; these may include snippets of speech (converted to text) from the surgical staff interspersed with other information, as well as screen shots of the video footage from the surgery. Additionally, detailed electronic reports with interactive features and audio visual inserts may be generated.
- In accordance with one aspect of the invention, the surgical system is designed to facilitate effective sterilization/disinfection. This may be accomplished through sanitary design of fittings, fixtures, and joints. Electronics and sensitive components can be bagged up/encased during surgery in specially designed sterile plastic bags/sleeves. These bags/sleeves can be supplied sterile (gamma, EtO) or ready for onsite steam sterilization (single use or multiple use). The
stand 112 may be formed from multiple components that can be quickly disassembled so that it can fit into a standard steam sterilization tray for autoclaving. - Aspects of the invention enable a substantial reduction in the size, number of individual fixtures, and/or required complexity of assembly that are common and inherent to existing surgical visualization systems; enhance surgical outcomes by integrating intuitive hands free controls and/or a variety of CAS (Computer Assisted Surgery) software tools; reduce surgeon fatigue; and/or provide economical pricing.
- Aspects of the invention are particularly useful for a wide range of surgical procedures including general surgery, ophthalmic surgery, pediatric surgery, cardiothoracic surgery, neurosurgery, cosmetic surgery, microsurgery, ENT surgery, dental/micro endodontic surgery, and military/battlefield surgery. Additionally, the inventive surgical systems and methods described herein may be utilized for training, education, and research studies, e.g., during small/large animal surgery. The surgical system and methods may also be used in poorly equipped ORs scattered across remote areas in third world countries.
- In one embodiment, an economically priced, all-inclusive, compact, digital, high resolution, 3D surgical visualization system is provided including a microscope camera, wireless HD Virtual Reality/Augmented Reality headset(s), high luminosity LED based cold light source, and foldable stand, all integrated with cutting edge computer assisted surgical aids (augmented reality overlays, inline monitors & diagnostic tools), with voice activated control software and smart gesture controls for hands free operation, and an inbuilt rechargeable power source, all of which can be packed as a kit and transported as one single briefcase.
- A compact inexpensive all-inclusive high tech surgical visualization system such as this can radically transform the overall outcome of surgeries performed, especially in economically challenged nations (such as for those surgeons who operate in multiple make-shift and mobile clinics with very limited equipment across remote locations in the countries of South America, Africa, Asia).
- Additionally, such a surgical visualization system would have particular applicability in an Emergency room (ER). There are numerous occasions wherein a procedure may benefit from enhanced visualization and magnification of a surgical scope, but for logistic reasons it is not possible to do so in an ER setting. An ultra-compact visualization system with the aforementioned capabilities and reasonably priced could greatly transform outcomes for emergency healthcare on a global scale.
- Case
- As discussed above, a case may contain and transport the surgical visualization apparatus. In its simplest form, the case may just be a clamshell design with an open volume that contains the apparatus's components, perhaps with foam or other form-fitting packaging to minimize damage.
- But in order to further minimize the volume required for the camera, battery, headset, and other gear to be contained in the case, the case itself may act as the base so that the base need not be transported separately within the case.
- FIGS. 6A-6E and 7A-7D show variations of this stand-case base.
- FIGS. 6A-6E show a stand case 600 that includes the base 610, which defines an interior volume for storage, and a stand 620 that extends therefrom. The stand 620 includes an arm 630 shown as telescoping through segments; the telescoping segments 639 may connect to other segments to hold the camera mentioned above. The arm 630 pivots about an axis 631, which may be a pin, from its stored configuration (FIG. 6E) to its extended configuration (FIGS. 6A-6D).
- To stabilize the arm 630 in its extended position, a bracing 640, which may have two side arms 642, pivotally connects the lowermost segment 632 to a base track 616 attached to the base's bottom 612 via an engagement piece 644. In operation, as the arm 630 pivots out of the base 610, the bracing 640 arms 642 draw the engagement piece 644 along the track 616 in a sliding but attached engagement. Once the lowermost segment 632 is perpendicular, contacts a side of the base 610 (to which the segment 632 may clip or otherwise engage through a removable fastener), or is in its desired position, the bracing 640 acts to additionally stabilize the arm 630 against movement that would adversely affect the camera. Reversing the movement of the arm 630 reverses the sequence to the stored configuration.
- The lowermost segment 632 may be engaged to an arm base piece 635 that is secured to the base 610 or another piece of track 633. The base piece may contain the axis 631 pivot.
- FIGS. 7A-7D show a variation of the stand 720 extending from the base 710, in which the arm 730 includes two segments. The lower segment 732 engages an arm piece 735 and may rotate about an axis 735a with respect thereto.
- When raising the arm 730 from the base 710, the arm piece 735 rotates about an axis 751, which may include a pin in a base piece 750 connected thereby to the arm piece 735 via a pivot piece 739, such that when the lower segment 732 is fully extended, it may engage or attach to a sidewall 712 of the base 710.
- Glasses/Viewing Equipment and Camera
- The glasses discussed herein may be OLED glasses that present a high-quality, 1280×720-pixel image to the user. The pixel size may be well below the minimum separable acuity under optimal conditions (1 arc-minute) as defined in the literature.
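As an illustrative sanity check (not from the patent text), the angular size of one display pixel follows from the horizontal pixel count and the headset's horizontal field of view. The 20° FOV below and the helper names are assumptions for illustration only; the patent does not specify a FOV:

```python
def pixel_pitch_arcmin(h_pixels: int, h_fov_deg: float) -> float:
    """Angular size of one pixel, in arc-minutes, for a display that is
    h_pixels wide spanning an assumed horizontal FOV of h_fov_deg degrees."""
    return (h_fov_deg * 60.0) / h_pixels

def max_fov_for_acuity(h_pixels: int, acuity_arcmin: float = 1.0) -> float:
    """Widest horizontal FOV (degrees) at which each pixel still subtends
    no more than acuity_arcmin (1 arc-minute = minimum separable acuity)."""
    return h_pixels * acuity_arcmin / 60.0

# Illustrative numbers: 1280-pixel-wide panel, assumed 20-degree FOV.
print(pixel_pitch_arcmin(1280, 20.0))          # → 0.9375 (arc-min per pixel)
print(round(max_fov_for_acuity(1280), 1))      # → 21.3 (degrees)
```

In other words, under these assumed numbers a 1280-pixel-wide panel keeps each pixel under the 1 arc-minute acuity limit as long as the horizontal field of view stays below roughly 21°.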
- One of the challenges faced by any video-based camera display system used for real-time human-motion control is latency. For our purposes, latency is defined as the time delay between an actual physical motion within the view of the CMOS cameras and the time at which that motion is presented to the eyes. Latencies of less than 300 milliseconds have been shown not to degrade performance of simulated surgical and game tasks, and the system described herein may operate below 100 ms, which is undetectable by the user and well within these literature benchmarks. Latency may be reduced by increased CMOS sensor speed and/or the elimination of redundant processors.
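End-to-end latency is the sum of the pipeline stages between sensor and display. The per-stage numbers below are hypothetical (the text cites only the 100 ms end-to-end target, not a breakdown), but the sketch shows how such a budget can be checked:

```python
# Hypothetical pipeline stage latencies in milliseconds; the patent text
# gives only the ~100 ms end-to-end target, not a per-stage breakdown.
stages_ms = {
    "cmos_exposure_readout": 16.7,   # one frame period at ~60 fps
    "image_processing": 20.0,        # assumed ISP / stereo processing time
    "wireless_transmission": 15.0,   # assumed camera-to-headset link delay
    "display_refresh": 16.7,         # one refresh period at ~60 Hz
}

total_ms = sum(stages_ms.values())
budget_ms = 100.0  # end-to-end target cited in the text

assert total_ms <= budget_ms, f"over budget: {total_ms:.1f} ms"
print(f"end-to-end latency ≈ {total_ms:.1f} ms (budget {budget_ms:.0f} ms)")
```

Under these assumed stage times the pipeline comes in well under the 100 ms target; raising sensor frame rate shrinks the readout term, and removing a redundant processing hop removes its stage entirely, which is the mechanism the paragraph above describes.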
- The invention described herein includes the stand and case as shown, which are also made with cleanliness and sterilization in mind. For example, they may allow for sterile draping with covering materials and include mounting/engagement points for such draping. Further, materials are chosen such that they can be effectively sterilized. Even further, the design may minimize cavities that cannot be easily reached for cleaning.
- The invention is described in the attached documents and figures, and a person of ordinary skill in the art would understand that various changes or modifications may be made thereto without departing from the scope of the claims.
Claims (20)
1. A portable surgical visualization kit comprising:
a camera configured to capture live feed images;
viewing equipment configured to receive and display the captured live feed images, wherein the viewing equipment includes a visor upon which the captured live feed images are displayed;
a processor in communication with the camera and the viewing equipment;
a stand configured to support the camera; and
a case configured to house the camera, the viewing equipment, and the stand.
2. The kit of claim 1 , wherein the stand is configured for releasable assembly and disassembly, wherein the viewing equipment is located remote from the stand and camera.
3. The kit of claim 1 , wherein the stand is attached to the case and extends therefrom in operation of the viewing equipment.
4. The kit of claim 1 , wherein the stand is integral to the case.
5. The kit of claim 1 , further comprising:
a battery; and
a light source;
wherein the case is further configured to house the battery, the stand, and the light source; and
wherein the stand is further configured to support the light source.
6. The kit of claim 1 , wherein the stand is configured to position the camera along three orthogonal axes and to rotate the camera about the three orthogonal axes.
7. The kit of claim 1 , wherein the stand comprises:
a base;
a first arm configured for attachment to the base;
a second arm configured for attachment to the camera; and
a rotatable elbow joint coupled between the first and second arms;
wherein at least one of the first and second arms is a telescoping arm.
8. The kit of claim 7 , wherein the base, the first arm, the second arm, and the rotatable elbow joint are each configured for releasable assembly.
9. The kit of claim 1 , wherein the camera is a three-dimensional camera with voice activated zoom and positioning.
10. The kit of claim 1 , wherein the case is dimensioned to comply with commercial airline carry-on luggage requirements.
11. The kit of claim 1 , wherein the case dimensions are 22 inches or less×14 inches or less×9 inches or less.
12. The kit of claim 1 , wherein the camera, viewing equipment, and processor are configured to fit within the dimensions of a 22 inches or less×14 inches or less×9 inches or less volume.
13. The kit of claim 1 , wherein the case includes a total volume of 2,772 cubic inches or less.
14. The kit of claim 1 , further comprising a base that supports the stand and camera, wherein the base is stable and balanced when the camera and stand rotate about three orthogonal axes, wherein the case is also configured to house the base.
15. The kit of claim 1 , wherein the viewing equipment displays the images in stereo.
16. A portable surgical system comprising:
a camera configured to capture live feed images;
viewing equipment configured to receive and display the captured live feed images, wherein the viewing equipment includes glasses upon which the captured live feed images are displayed;
a processor coupled to the camera and the viewing equipment; and
a stand supporting the camera;
wherein the camera, the viewing equipment, and the stand are configured to be housed in a case.
17. The system of claim 16 , wherein the stand is attached to the case and extends therefrom in operation of the viewing equipment.
18. The system of claim 16 , wherein the stand is configured to position the camera along three orthogonal axes and to rotate the camera about the three orthogonal axes.
19. The system of claim 16 , wherein the case dimensions are 22 inches or less×14 inches or less×9 inches or less.
20. The system of claim 16 , wherein the camera, viewing equipment, and stand are configured to fit within the dimensions of a 22 inches or less×14 inches or less×9 inches or less volume.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/828,686 US20200221939A1 (en) | 2014-05-09 | 2020-03-24 | Portable surgical methods, systems, and apparatus |
US17/336,519 US20210290046A1 (en) | 2014-05-09 | 2021-06-02 | Portable surgical methods, systems, and apparatus |
US18/519,469 US20240090742A1 (en) | 2014-05-09 | 2023-11-27 | Portable surgical methods, systems, and apparatus |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201461990938P | 2014-05-09 | 2014-05-09 | |
PCT/US2015/029888 | 2015-05-08 | ||
US201615309962A | 2016-11-09 | 2016-11-09 | |
US202062981704P | 2020-02-26 | 2020-02-26 | |
US16/828,686 US20200221939A1 (en) | 2014-05-09 | 2020-03-24 | Portable surgical methods, systems, and apparatus |
Related Parent Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2015/029888 Continuation | 2014-05-09 | 2015-05-08 |
PCT/US2015/029888 Continuation WO2015172021A1 (en) | 2014-05-09 | 2015-05-08 | Portable surgical methods, systems, and apparatus |
US15/309,962 Continuation US10595716B2 (en) | 2014-05-09 | 2015-05-08 | Portable surgical methods, systems, and apparatus |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/336,519 Continuation-In-Part US20210290046A1 (en) | 2014-05-09 | 2021-06-02 | Portable surgical methods, systems, and apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
US20200221939A1 true US20200221939A1 (en) | 2020-07-16 |
Family
ID=54393048
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/309,962 Active - Reinstated 2035-10-15 US10595716B2 (en) | 2014-05-09 | 2015-05-08 | Portable surgical methods, systems, and apparatus |
US16/828,686 Abandoned US20200221939A1 (en) | 2014-05-09 | 2020-03-24 | Portable surgical methods, systems, and apparatus |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/309,962 Active - Reinstated 2035-10-15 US10595716B2 (en) | 2014-05-09 | 2015-05-08 | Portable surgical methods, systems, and apparatus |
Country Status (5)
Country | Link |
---|---|
US (2) | US10595716B2 (en) |
EP (1) | EP3139810B1 (en) |
JP (2) | JP2017523817A (en) |
KR (1) | KR102375662B1 (en) |
WO (1) | WO2015172021A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022204365A1 (en) * | 2021-03-25 | 2022-09-29 | Lazurite Holdings Llc | Portable medical imaging system |
Families Citing this family (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015172021A1 (en) | 2014-05-09 | 2015-11-12 | Nazareth Godfrey | Portable surgical methods, systems, and apparatus |
JP2016087248A (en) * | 2014-11-07 | 2016-05-23 | ソニー株式会社 | Observation device and observation system |
US10013808B2 (en) | 2015-02-03 | 2018-07-03 | Globus Medical, Inc. | Surgeon head-mounted display apparatuses |
CA3016266A1 (en) * | 2015-03-07 | 2016-09-15 | Dental Wings Inc. | Medical device user interface with sterile and non-sterile operation |
GB2536650A (en) | 2015-03-24 | 2016-09-28 | Augmedics Ltd | Method and system for combining video-based and optic-based augmented reality in a near eye display |
CN108882964B (en) * | 2015-10-09 | 2021-10-22 | 柯惠Lp公司 | Method for visualizing a body cavity using an angled endoscope employing a robotic surgical system |
US9844321B1 (en) * | 2016-08-04 | 2017-12-19 | Novartis Ag | Enhanced ophthalmic surgical experience using a virtual reality head-mounted display |
EP3285107B2 (en) * | 2016-08-16 | 2024-02-28 | Leica Instruments (Singapore) Pte. Ltd. | Surgical microscope with gesture control and method for a gesture control of a surgical microscope |
CN106558310B (en) * | 2016-10-14 | 2020-09-25 | 北京百度网讯科技有限公司 | Virtual reality voice control method and device |
US10660728B2 (en) * | 2016-10-20 | 2020-05-26 | Baliram Maraj | Systems and methods for dental treatment utilizing mixed reality and deep learning |
US10973391B1 (en) * | 2017-05-22 | 2021-04-13 | James X. Liu | Mixed reality viewing of a surgical procedure |
US10403046B2 (en) * | 2017-10-20 | 2019-09-03 | Raytheon Company | Field of view (FOV) and key code limited augmented reality to enforce data capture and transmission compliance |
WO2019083805A1 (en) * | 2017-10-23 | 2019-05-02 | Intuitive Surgical Operations, Inc. | Systems and methods for presenting augmented reality in a display of a teleoperational system |
EP3716839A1 (en) | 2017-11-30 | 2020-10-07 | Tec Med S.r.l. Tecnologie Mediche | Immersive display system for eye therapies |
EP3525023A1 (en) * | 2018-02-09 | 2019-08-14 | Leica Instruments (Singapore) Pte. Ltd. | Arm adapted to be attached to a microscope, and microscope |
US20190254753A1 (en) | 2018-02-19 | 2019-08-22 | Globus Medical, Inc. | Augmented reality navigation systems for use with robotic surgical systems and methods of their use |
WO2019163890A1 (en) * | 2018-02-21 | 2019-08-29 | オリンパス株式会社 | Medical system and medical system activation method |
EP3871143A4 (en) * | 2018-10-25 | 2022-08-31 | Beyeonics Surgical Ltd. | Ui for head mounted display system |
US11766296B2 (en) | 2018-11-26 | 2023-09-26 | Augmedics Ltd. | Tracking system for image-guided surgery |
US11382712B2 (en) | 2019-12-22 | 2022-07-12 | Augmedics Ltd. | Mirroring in image guided surgery |
US11166006B2 (en) | 2020-01-22 | 2021-11-02 | Photonic Medical Inc. | Open view, multi-modal, calibrated digital loupe with depth sensing |
US11464581B2 (en) | 2020-01-28 | 2022-10-11 | Globus Medical, Inc. | Pose measurement chaining for extended reality surgical navigation in visible and near infrared spectrums |
US11382699B2 (en) | 2020-02-10 | 2022-07-12 | Globus Medical Inc. | Extended reality visualization of optical tool tracking volume for computer assisted navigation in surgery |
US11207150B2 (en) | 2020-02-19 | 2021-12-28 | Globus Medical, Inc. | Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment |
JP2021145788A (en) * | 2020-03-17 | 2021-09-27 | ソニー・オリンパスメディカルソリューションズ株式会社 | Control unit and medical observation system |
US11607277B2 (en) | 2020-04-29 | 2023-03-21 | Globus Medical, Inc. | Registration of surgical tool with reference array tracked by cameras of an extended reality headset for assisted navigation during surgery |
US11382700B2 (en) | 2020-05-08 | 2022-07-12 | Globus Medical Inc. | Extended reality headset tool tracking and control |
US11153555B1 (en) | 2020-05-08 | 2021-10-19 | Globus Medical Inc. | Extended reality headset camera system for computer assisted navigation in surgery |
US11510750B2 (en) | 2020-05-08 | 2022-11-29 | Globus Medical, Inc. | Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications |
US11737831B2 (en) | 2020-09-02 | 2023-08-29 | Globus Medical Inc. | Surgical object tracking template generation for computer assisted navigation during surgical procedure |
KR102430468B1 (en) * | 2020-10-13 | 2022-08-09 | 서울대학교 산학협력단 | Surgical robot system based on Headset using voice recognition Microphone |
US11896445B2 (en) | 2021-07-07 | 2024-02-13 | Augmedics Ltd. | Iliac pin and adapter |
US11792499B2 (en) | 2021-10-21 | 2023-10-17 | Raytheon Company | Time-delay to enforce data capture and transmission compliance in real and near real time video |
US11696011B2 (en) | 2021-10-21 | 2023-07-04 | Raytheon Company | Predictive field-of-view (FOV) and cueing to enforce data capture and transmission compliance in real and near real time video |
EP4275645A1 (en) | 2022-04-15 | 2023-11-15 | Luca Riccardi | Improved multimedia dental station |
US11700448B1 (en) | 2022-04-29 | 2023-07-11 | Raytheon Company | Computer/human generation, validation and use of a ground truth map to enforce data capture and transmission compliance in real and near real time video of a local scene |
Family Cites Families (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5207303A (en) * | 1991-07-15 | 1993-05-04 | Oswalt Brenda K | Medical emergency carrying case |
US5428660A (en) * | 1993-11-19 | 1995-06-27 | Medical University Of South Carolina | Portable medical panoramic radiographic device |
US6591239B1 (en) | 1999-12-09 | 2003-07-08 | Steris Inc. | Voice controlled surgical suite |
US6454097B1 (en) * | 2000-05-09 | 2002-09-24 | Juan Carlos Aceves Blanco | Prioritized first aid kit |
AU2002222543A1 (en) * | 2001-11-27 | 2003-06-10 | Freni Brembo S.P.A. | Duo-servo drum brake inside shoes adjusting device |
US20030153978A1 (en) * | 2002-02-08 | 2003-08-14 | Whiteside Biomechanics, Inc. | Apparatus and method of ligament balancing and component fit check in total knee arthroplasty |
JP3963779B2 (en) * | 2002-05-29 | 2007-08-22 | オリンパス株式会社 | Surgical microscope |
JP2004288474A (en) * | 2003-03-24 | 2004-10-14 | Takeuchi Seisakusho:Kk | Led array type lighting device |
US7982763B2 (en) | 2003-08-20 | 2011-07-19 | King Simon P | Portable pan-tilt camera and lighting unit for videoimaging, videoconferencing, production and recording |
US7432949B2 (en) * | 2003-08-20 | 2008-10-07 | Christophe Remy | Mobile videoimaging, videocommunication, video production (VCVP) system |
US7837473B2 (en) * | 2006-04-11 | 2010-11-23 | Koh Charles H | Surgical training device and method |
US8214016B2 (en) * | 2006-12-12 | 2012-07-03 | Perception Raisonnement Action En Medecine | System and method for determining an optimal type and position of an implant |
US8068648B2 (en) * | 2006-12-21 | 2011-11-29 | Depuy Products, Inc. | Method and system for registering a bone of a patient with a computer assisted orthopaedic surgery system |
US20080247749A1 (en) | 2007-04-03 | 2008-10-09 | David Law | Camera Wrap Cover |
JP2009098570A (en) * | 2007-10-19 | 2009-05-07 | Mitaka Koki Co Ltd | Head-mount type binocular loupe device |
JP4409594B2 (en) | 2007-11-02 | 2010-02-03 | オリンパス株式会社 | Endoscope system |
GB0722592D0 (en) | 2007-11-16 | 2007-12-27 | Birmingham City University | Surgeons headgear |
US8599097B2 (en) | 2008-05-15 | 2013-12-03 | Air Systems, Inc. | Collapsible portable stand with telescoping support and integral storage case |
KR101190265B1 (en) * | 2009-06-30 | 2012-10-12 | 고려대학교 산학협력단 | Head mouted operating magnifying apparatus |
US8900138B2 (en) * | 2009-11-05 | 2014-12-02 | James P. Horvath | Headlight apparatus and method |
WO2011137034A1 (en) * | 2010-04-27 | 2011-11-03 | Kopin Corporation | Wearable electronic display |
US8988483B2 (en) * | 2011-06-06 | 2015-03-24 | Ted Schwartz | Mobile conferencing system |
AP2014007420A0 (en) * | 2011-07-20 | 2014-02-28 | Stephen Teni Ayanruoh | Integrated portable medical diagnostic system |
US20140066700A1 (en) | 2012-02-06 | 2014-03-06 | Vantage Surgical Systems Inc. | Stereoscopic System for Minimally Invasive Surgery Visualization |
IL221863A (en) | 2012-09-10 | 2014-01-30 | Elbit Systems Ltd | Digital system for surgical video capturing and display |
US9729831B2 (en) * | 2012-11-29 | 2017-08-08 | Sony Corporation | Wireless surgical loupe |
EP2999414B1 (en) * | 2013-05-21 | 2018-08-08 | Camplex, Inc. | Surgical visualization systems |
WO2015172021A1 (en) | 2014-05-09 | 2015-11-12 | Nazareth Godfrey | Portable surgical methods, systems, and apparatus |
US20160119593A1 (en) * | 2014-10-24 | 2016-04-28 | Nurep, Inc. | Mobile console |
-
2015
- 2015-05-08 WO PCT/US2015/029888 patent/WO2015172021A1/en active Application Filing
- 2015-05-08 EP EP15789189.6A patent/EP3139810B1/en active Active
- 2015-05-08 KR KR1020167034636A patent/KR102375662B1/en active IP Right Grant
- 2015-05-08 JP JP2016567373A patent/JP2017523817A/en active Pending
- 2015-05-08 US US15/309,962 patent/US10595716B2/en active Active - Reinstated
-
2020
- 2020-03-24 US US16/828,686 patent/US20200221939A1/en not_active Abandoned
- 2020-05-06 JP JP2020081658A patent/JP7071434B2/en active Active
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022204365A1 (en) * | 2021-03-25 | 2022-09-29 | Lazurite Holdings Llc | Portable medical imaging system |
Also Published As
Publication number | Publication date |
---|---|
EP3139810A1 (en) | 2017-03-15 |
KR102375662B1 (en) | 2022-03-16 |
JP2017523817A (en) | 2017-08-24 |
WO2015172021A1 (en) | 2015-11-12 |
EP3139810B1 (en) | 2022-12-21 |
JP7071434B2 (en) | 2022-05-18 |
US20170273549A1 (en) | 2017-09-28 |
KR20170016363A (en) | 2017-02-13 |
US10595716B2 (en) | 2020-03-24 |
EP3139810A4 (en) | 2018-05-02 |
JP2020127770A (en) | 2020-08-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20200221939A1 (en) | Portable surgical methods, systems, and apparatus | |
US20210290046A1 (en) | Portable surgical methods, systems, and apparatus | |
US11147443B2 (en) | Surgical visualization systems and displays | |
US10197803B2 (en) | Augmented reality glasses for medical applications and corresponding augmented reality system | |
JP6521982B2 (en) | Surgical visualization system and display | |
US11628038B2 (en) | Multi-option all-digital 3D surgery visualization system and control | |
US20100013910A1 (en) | Stereo viewer | |
EP3725254A2 (en) | Microsurgery system with a robotic arm controlled by a head-mounted display | |
JP2016536093A5 (en) | ||
US20210335483A1 (en) | Surgery visualization theatre | |
US11448868B2 (en) | Ergonomic EZ scope digital imaging system | |
CN103340686A (en) | General surgery three-dimensional micrography camera shooting presentation device | |
US20200030054A1 (en) | Observation system for dental and medical treatment | |
US20240090742A1 (en) | Portable surgical methods, systems, and apparatus | |
US20230232105A1 (en) | Alignment of User's Field of View With Head-Mounted Camera and/or Light | |
US20230179755A1 (en) | Stereoscopic imaging apparatus with multiple fixed magnification levels | |
US20230129708A1 (en) | Procedure guidance and training apparatus, methods and systems | |
WO2023052535A1 (en) | Devices and systems for use in imaging during surgery | |
WO2023052566A1 (en) | Devices and systems for use in imaging during surgery | |
Jessup | Smartphones and consumer electronics for eye examinations and ophthalmology teaching–proof of concepts for five novel and inexpensive optical instruments. | |
WO2023052474A1 (en) | Devices and systems for use in imaging during surgery | |
EP4146115A1 (en) | Surgery visualization theatre |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |