EP4034028A1 - Brille der erweiterten realität zur verwendung in der chirurgischen visualisierung und telechirurgie - Google Patents

Brille der erweiterten realität zur verwendung in der chirurgischen visualisierung und telechirurgie

Info

Publication number
EP4034028A1
Authority
EP
European Patent Office
Prior art keywords
axr
images
user
eye
surgical system
Prior art date
Legal status
Pending
Application number
EP20867781.5A
Other languages
English (en)
French (fr)
Other versions
EP4034028A4 (de)
Inventor
Michael Hayes Freeman
Mitchael C. Freeman
Jordan BOSS
Brian Santee
David Cary
Current Assignee
Raytrx LLC
Original Assignee
Raytrx LLC
Priority date
Filing date
Publication date
Application filed by Raytrx LLC filed Critical Raytrx LLC
Priority claimed from PCT/US2020/053098 external-priority patent/WO2021062375A1/en
Publication of EP4034028A1 publication Critical patent/EP4034028A1/de
Publication of EP4034028A4 publication Critical patent/EP4034028A4/de
Pending legal-status Critical Current

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/50Supports for surgical instruments, e.g. articulated arms
    • A61B90/53Supports for surgical instruments, e.g. articulated arms connected to the surgeon's body, e.g. by a belt
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • A61B34/35Surgical robots for telesurgery
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017Electrical control of surgical instruments
    • A61B2017/00203Electrical control of surgical instruments with speech control or speech recognition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017Electrical control of surgical instruments
    • A61B2017/00216Electrical control of surgical instruments with eye tracking or head position tracking control
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2048Tracking techniques using an accelerometer or inertia sensor
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/371Surgical systems with images on a monitor during operation with simultaneous use of two cameras
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/372Details of monitor hardware
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/50Supports for surgical instruments, e.g. articulated arms
    • A61B2090/502Headgear, e.g. helmet, spectacles

Definitions

  • This invention relates generally to augmented and extended reality glasses, and more particularly, but not by way of limitation, to augmented and extended reality glasses for use in surgery visualization. Description of the Related Art: There is a need for surgeons to be able to access real-time and pre-recorded computer-generated and/or camera images in the operating room.
  • VR virtual reality
  • RR real-reality
  • some virtual reality surgery systems include immersive goggles with micro-displays facing the user’s eyes, much like small televisions, and noise-cancelling headphones.
  • the micro-displays completely replace the surgeon’s view of the operating room and outside world. The surgeon is thus cut-off from the surgery room and the video from the micro-displays is surrounded by complete darkness.
  • Such virtual reality surgery systems are designed as a 3D mobile theater, essentially simulating holding two small televisions in front of the user’s eyes while completely eliminating outside light and sound.
  • Such existing virtual reality surgery systems are generally uncomfortable and must be worn tight on the head, blocking out reality.
  • VR systems seal out real-world light, sound, and air around the surgeon’s eyes and cheeks, making the device hot and uncomfortable.
  • the heat generated by the surgeon wearing the VR headset and from the headset itself often causes condensation on the interior lenses, which makes the images appear foggy and requires the surgeon to take off the VR headset for cleaning during the surgery. Clearing the lenses typically only helps temporarily.
  • Some such systems use a trackpad that is turned 90 degrees from the user interface, so that swiping forward actually moves right and swiping backward moves left. This can be frustrating for the user, particularly if the user is left-handed.
  • typing within a VR headset menu is a painstaking and time-consuming chore, making entering HIPAA-compliant passwords for sensitive data difficult.
  • such virtual reality systems are typically heavy, with most of the weight forward on the head, making it uncomfortable for the user.
  • augmented/extended reality surgical systems
  • virtual reality immerses the user in the images presented and closes off RR
  • AXR permits the user to see RR and what is actually happening in the user’s world and then adds computer-generated, computer-manipulated, or secondary camera images to RR.
  • virtual reality completely covers and replaces the user’s field-of-vision with virtual images
  • augmented/extended reality provides the user with vision of the real-world plus an overlay of computer-generated and/or manipulated photographic imagery or video (“virtual”) images, which positions the user in the RR with virtual images added.
  • an augmented/extended reality system permits the surgeon to view and magnify the virtual image or area of operation, while still having a sense of the operating or diagnostic room and of everything else happening in that space.
  • the problem with current AXR surgical systems is that they offer a small field of vision on a heavy wearable that is often tethered to the system by a large cord, limiting the surgeon’s movements and putting strain on the surgeon’s neck and back.
  • current AXR surgical systems must block out a great deal of ambient light to make the AXR images visible and are difficult to see in daylight or highly-lighted conditions, making the systems function more like a virtual reality system than an AXR system.
  • it is further desirable for the system to be lightweight, comfortable, untethered, and feature- and user-friendly
  • the invention in general, in a first aspect, relates to an AXR surgical system comprising: a wearable device comprising one or more micro-displays, one or more lenses, where the micro-displays are capable of projecting images onto the lenses, a head-tracking subsystem, and an eye-tracking subsystem; and a central processing unit in communication with and capable of controlling the micro-displays, lenses, head-tracking subsystem, and eye-tracking subsystem.
  • the system may be capable of displaying images on the lenses with a position based on a user’s head position as tracked by the head-tracking subsystem and the user’s eye position as tracked by the eye-tracking subsystem, while allowing the user to see through the lenses where the images are not being projected.
  • the lenses may comprise a reflective layer and a layer of cholesteric liquid crystal comprising a plurality of pixels where each pixel is capable of independently becoming opaque.
  • the system may be capable of selectively making pixels opaque only where images are projected by the micro-displays, while any pixels located where the images are not projected remain see-through.
  • the AXR surgical system may further comprise at least one collimator located between the micro-displays and the reflective layer such that the at least one collimator is capable of concentrating rays from the micro-displays in an eye box while utilizing less resolution in a periphery.
  • the camera lenses may be capable of capturing a wider field of vision than the micro-displays may be capable of projecting.
  • the reduced field of vision may comprise the view where the user is gazing, as tracked and analyzed by the eye-tracking subsystem.
  • the wearable device may further comprise one or more forward-facing cameras, where the images projected by the micro-displays are at least partially images obtained from the forward facing cameras.
  • the wearable device may further comprise one or more at least partially downward-facing or upward-facing cameras, where the images projected by the micro-displays are at least partially images obtained from the at least partially downward-facing or upward-facing cameras, the angle of which may be adjustable.
  • the AXR surgical system may further comprise one or more microphones in communication with the central processing unit, where the system is capable of being controlled via voice input via the microphone, input from the eye-tracking subsystem, or a combination of voice input via the microphone and input from the eye-tracking subsystem.
  • the microphones may have noise cancelling features capable of reducing ambient noise.
  • the wearable device may further comprise one or more batteries.
  • the wearable device may further comprise a remote communication device such that the wearable device is wireless.
  • the system may be in communication with one or more second systems such that one or more remote users can view the images from the system on the one or more second systems and communicate with the user and other remote users.
  • the images projected by the micro-displays may be from preoperative imaging, and the system may be capable of aligning the images with a patient.
  • the system may further comprise a remote camera system in wireless communication with the wearable device, where the images come from the remote camera system.
  • the remote camera system may be mounted on a six-axis cobotic arm, which may be in communication with the system such that the cobotic arm is controlled by the user.
  • Figure 1 is a perspective view of the AXR surgical system in use
  • Figure 2 is a perspective view of the AXR surgical system headset
  • Figure 3 is a front view of the AXR surgical system headset;
  • Figure 4 is an exploded view of the lenses;
  • Figure 5 is a diagrammatic illustration of an eye box
  • Figure 6 is a diagrammatic view of the micro-displays
  • Figure 7 is a close-up view of the dynamic opacity
  • Figure 8 is a diagrammatic view of three camera options
  • Figure 9 is a back view of a person wearing the AXR headset, illustrating different views presented by the virtual overlay;
  • Figure 10 is a perspective view of a 3D surgical camera with two sensors.
  • Figure 11 is a perspective view of the camera on a cobotic arm.
  • AXR Augmented and Extended Reality
  • AXR is defined herein in its common scientific use, which may include an interactive experience typically in a see-through headset with lenses of a real-world environment where the objects that reside in the real world are enhanced by computer-generated perceptual images and information, sometimes across multiple sensory modalities, including visual, auditory, haptic technologies, somatosensory, and/or olfactory.
  • Extended Reality is defined in its common scientific use, which is typically an umbrella term encapsulating augmented reality (AR) and/or virtual reality (VR) and/or mixed reality (MR) and/or real reality (RR) and everything in between. It may also include combined environments and human-machine interactions generated by computer technology such as 6DoF and SLAM, and artificial intelligence (AI), including machine learning (ML), where the 'X' represents a variable for any current or future spatial computing technologies, including digital content of any sort; for instance, in the medical field, 3D MRI or CT scan images or data visualizations, like patient vitals, superimposed on an AR headset.
  • AR augmented reality
  • VR virtual reality
  • MR mixed reality
  • RR real reality
  • AI artificial intelligence
  • ML machine learning
  • “Six Degrees of Freedom” (6DoF) is defined herein in its common meaning, including the way virtual objects can be moved in virtual space in AR. There are six total degrees of freedom in placing virtual images in AR. Three (3) correspond to rotational movement around the x, y, and z axes, commonly termed pitch, yaw, and roll. The other three (3) correspond to translational movement along those axes, which can be thought of as moving forward or backward, moving left or right, and moving up or down.
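As an illustration only (not part of the claimed system), the 6DoF placement described above can be expressed as a rigid transform. The following Python sketch uses an assumed angle convention and hypothetical values; it composes pitch, yaw, and roll with a translation and applies the result to a virtual-object vertex.

```python
# Illustrative sketch (assumption, not the patent's implementation): a 6DoF pose
# as a 4x4 homogeneous transform built from pitch/yaw/roll plus translation.
import numpy as np

def pose_6dof(pitch, yaw, roll, tx, ty, tz):
    """Rotation about x (pitch), y (yaw), z (roll), followed by translation."""
    cx, sx = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    cz, sz = np.cos(roll), np.sin(roll)
    rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    t = np.eye(4)
    t[:3, :3] = rz @ ry @ rx
    t[:3, 3] = [tx, ty, tz]
    return t

# Move a hypothetical virtual-object vertex 0.5 m forward and yaw it 30 degrees.
vertex = np.array([0.1, 0.0, 0.0, 1.0])
print(pose_6dof(0.0, np.radians(30), 0.0, 0.0, 0.0, 0.5) @ vertex)
```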
  • IMUs Inertial Measurement Units
  • referencing devices for measuring rotational movements such as an accelerometer, a gyroscope, and a magnetometer, all located within the headset.
  • IMUs may measure the headset’s velocity, orientation, and gravitational forces to infer rotational orientation and movement.
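A minimal sketch of how IMU readings might be fused to infer rotational orientation; the complementary-filter scheme, sample rate, and gain below are assumptions for illustration, not a disclosed algorithm.

```python
# Illustrative sketch (assumed filter, not the patent's method): blend integrated
# gyroscope rate with gravity-derived tilt from the accelerometer to estimate pitch.
import math

def complementary_pitch(pitch_prev, gyro_rate_x, accel, dt, alpha=0.98):
    """accel is (ax, ay, az) in g; gyro_rate_x is rad/s about the pitch axis."""
    ax, ay, az = accel
    accel_pitch = math.atan2(-ax, math.hypot(ay, az))   # tilt implied by gravity
    gyro_pitch = pitch_prev + gyro_rate_x * dt           # dead-reckoned pitch
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

pitch = 0.0
for _ in range(100):                                     # 100 samples at a hypothetical 100 Hz
    pitch = complementary_pitch(pitch, gyro_rate_x=0.01, accel=(0.0, 0.0, 1.0), dt=0.01)
print(round(pitch, 4))
```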
  • Haptic technologies is used herein in its common scientific meaning and is sometimes called kinaesthetic communication or 3D touch. It may also refer to any technology which may create an experience of touch by applying forces, vibrations, or motions to the user or to an object. Haptics may enable users to feel the sense of touch via vibrations or forced motion. Haptic technologies can be used to create virtual objects in a computer simulation or virtual space, or to control those virtual objects, and may be used to enhance remote control of machines and devices (telerobotics). Haptic devices may incorporate tactile sensors that measure forces exerted by the user on the interface. This technology may employ touch sensors for control.
  • AI Artificial Intelligence
  • ML Machine Learning
  • AI may enable AR to interact with the physical environment in a multidimensional way. For instance, AI may permit object recognition and tracking, gestural input, eye tracking, and voice command recognition to combine to let the user manipulate 2D and 3D objects in virtual space with the user’s hands, eyes, and/or words.
  • Object Recognition (OR) or “Object Identification” (OI) is used herein in its common scientific meaning, including a computer vision technique for identifying objects in images or videos.
  • Object recognition may be a key output of deep learning and AI algorithms. When humans look at a photograph or watch a video, we can readily spot people, objects, scenes, and visual details. OR/OI does this from visual analysis based on a neural network algorithm’s reconciliation with pre-existing information.
  • SLAM Simultaneous Localization and Mapping
  • “image(s)” or “virtual image(s)” or “imaging” or “virtual objects” or “AXR imaging” is defined for the purpose of this patent as visualization of either 2D images or video or 3D images or video.
  • the definition also includes the concept that one or more 2D images can be viewed in stereoscopy to create one or more virtual 3D perspectives.
  • also included in the image(s) definition herein is the idea that AXR 3D models may be viewed as a single or series of 2D images, as in a still picture or video, or a single or series of stereoscopic 3D images, as in a 3D image or video.
  • the 3D effect may be created in the AXR headset by using an off-set paired perspective of a 3D model.
  • 3D models in AXR can be viewed from different perspectives by the user or multiple users can view the same image from multiple perspectives.
  • wireless means the electromagnetic transfer of information between two or more points which are not connected by an electrical conductor, or a communication by technologies, such as light, magnetic, or electric fields, or the use of sound.
  • wireless communication includes all methods of wireline communication including, but not limited to, directly connected devices, telephone networks, ethernet connections, cable networks, internet access, fiber-optic communications, and waveguide (electromagnetism) connections.
  • the invention in general, in a first aspect, relates to an augmented and extended reality (AXR) surgical system.
  • the system may comprise a wearable device 1, such as a head mounted display (HMD) or glasses, that provides the user with virtual reality (VR), augmented reality (AR), and/or mixed-extended reality (XR) for surgery visualization, as shown in Figure 1.
  • HMD head mounted display
  • AR augmented reality
  • XR mixed-extended reality
  • This may allow the user to access 2D or 3D imaging, magnification, virtual visualization, six-degrees-of-freedom (6DoF) image and simultaneous localization and mapping (SLAM) management, and/or other images while still viewing real reality (RR) and thus maintaining a presence in the operating room.
  • 6DoF six- degrees of freedom
  • SLAM simultaneous localization and mapping
  • the system may comprise one or more micro-displays 2, a head-tracking subsystem 3, an eye-tracking subsystem 4, and one or more cameras 5, all of which may be included on the wearable device 1.
  • the system may further comprise one or more lenses 10, where the micro-displays 2 are capable of projecting images on the lenses 10, where the images may be reflected back to the user’s eyes.
  • the wearable device 1 may be a head mounted display with a pair of lenses 10, one in front of each of the user’s eyes.
  • One or more micro-displays 2 may be located above the user’s eyes and may be pointed toward the lenses 10.
  • the cameras 5 may provide image input, while the head-tracking subsystem 3 and the eye-tracking subsystem 4 may provide positional input, allowing the system to project the desired images to the desired location for the user to view the images. Additional image input may be provided from other sources. All components may be controlled by a CPU, which may be located on the wearable device 1 or remotely. Other components may include additional central processing units, one or more graphics processing units, one or more digital signal processors, firmware, hardware, software, and/or memory components, as well as other desired components. The high-level components may control the features and functions of the AXR headset 1, including, but not limited to its cameras 5, micro-displays 2, lenses 10, sensors, communications, and subsystems.
  • the system may be capable of displaying both real reality and computer generated images (CG or CGI) or computer captured and manipulated images (CMI), effectively creating the illusion of AXR.
  • CMI may mean previously recorded, captured, or created images or video from a different reality than the RR displayed in the AXR headset.
  • the system may be capable of functioning as a “heads-up” system, allowing the user to look at the images on the micro-displays 2 or look beyond the display to the larger environment of the real-world operating room and attendants.
  • the system may provide a full field of vision, unlike existing systems.
  • the micro-displays 2 may provide a wide field of vision of, for instance, 120 degrees, namely 60 degrees horizontally and 36 or more degrees vertically in each eye, or another desired field of vision. This may allow a high angular resolution of 60 pixels per degree in the eye box, which is the highest resolution the eye can distinguish at 20/20. Humans have a forward-facing visual field of slightly over 210 degrees.
  • the cameras 5 of the system may capture all or most of the human forward-facing degrees, when needed.
  • the user may view 120 degrees field-of-view (FOV) of AXR through the cameras 5 and micro-displays 2 and 210 degrees of RR with the system functioning as a heads-up display (HUD).
  • FOV field-of-view
  • This field of vision may actually be even larger from a practical standpoint as the user may, for example, look down at his or her hands, which are outside the AR/RR presented field of vision.
  • the availability of viewing the real reality environment may be important to a surgeon when he or she is trying to pick up a tool or adjust his or her hands during surgery. This type of viewing is not possible with existing VR systems, which require the eye of the surgeon to be always exactly aligned, something that might well prove exhausting in a lengthy surgery.
  • the cameras 5 may be two on-board 4K or higher resolution cameras and may, as noted above, capture a wide field-of-view, such as 180 to 210 degrees forward-facing vision. This oversampling of the field of vision may then be stored per frame and used in conjunction with the eye-tracking subsystem 4 to present the actual field of vision depending on the user’s gaze. In this fashion, the system may use images from the entirety of the 180 degrees captured or a reduced sample of the full captured FOV. As the system’s eye tracking follows the eye of the surgeon as his or her eyes move, the system may be able to provide imagery from the fully captured 180 or more degrees.
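A minimal sketch of the gaze-dependent cropping described above, assuming a simple linear pixels-per-degree model; the resolutions and gaze angle are hypothetical, not specified values.

```python
# Illustrative sketch (assumption): crop an oversampled wide capture down to the
# displayed sub-field centred on the tracked gaze direction.
import numpy as np

def gaze_crop(frame, capture_fov_deg, display_fov_deg, gaze_deg):
    """frame: H x W x 3 array spanning capture_fov_deg horizontally.
    Returns the horizontal slice spanning display_fov_deg centred on gaze_deg."""
    h, w, _ = frame.shape
    px_per_deg = w / capture_fov_deg
    centre = w / 2 + gaze_deg * px_per_deg
    half = (display_fov_deg * px_per_deg) / 2
    left = int(max(0, min(w - 2 * half, centre - half)))
    return frame[:, left:left + int(2 * half)]

wide = np.zeros((1600, 7200, 3), dtype=np.uint8)      # hypothetical 180-degree capture
view = gaze_crop(wide, capture_fov_deg=180, display_fov_deg=120, gaze_deg=10)
print(view.shape)                                      # (1600, 4800, 3)
```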
  • the virtual images projected by the micro-displays 2 may come from existing data, pictures, graphs, videos, MRI’s, CT scans, or other pre-recorded images or information.
  • the large field of vision of the system may result in a large eye box for the user, as shown in Figure 5.
  • the eye box of any AXR or VR system may be crucial as it may serve as the connection between the device and the user.
  • the eye box of the system may be large enough to provide comfortable viewing of the full field of vision with the highest resolution even if the headset moves while wearing. Further, the eye box of the system may be large enough to account for eye relief for the user, including allowances for brow size and how deep-set the user’s eyes are, as well as clearance for eyeglasses and allowances for lateral pupil movement.
  • eye box is analogous to the term “eye relief” of a typical optical instrument, such as a telescope, a microscope, or binoculars, which is the distance from the last surface of an eyepiece within which the user’s eye can obtain a full viewing angle. If a viewer’s eye is outside this distance, a reduced field of view may be obtained. Thus, the smaller eye box of previous VR systems is inferior to the large eye box of the current system.
  • the eye box of the current system may be as large as 20 x 20 mm at 60 pixels per degree. This may be achieved by providing three micro-displays 2, with the outer two displays 2 sharing pixels with the central display 2 through the system’s algorithms, as shown in Figure 6.
  • the system may present approximately 50 degrees horizontal by 20 degrees vertical field of vision at 60 pixels per degree.
  • the remainder of the field of vision may have approximately 20 pixels per degree, which may be equal to or better than the acuity in the outer parts of the retina.
  • the lenses 10 may be concave in construction, helping to focus and enlarge the view at the surgeon’s pupil for the largest possible eye box. Thus, it is almost impossible for the user to lose the view of the AXR or surgery visualization image due to the large FOV and very high resolution of the system.
  • the resolution of the micro-displays 2 may specifically be 22 pixels per degree, or 2560 x 1440 (Quad HD); 25 pixels per degree, at 3200 x 1440; 60 pixels per degree at 7200 x 1600; or any other desired resolution.
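The pixels-per-degree figures quoted above follow directly from dividing panel resolution by field of view; a short illustrative calculation (only the 7200-pixel and 120-degree values come from the text, the rest is arithmetic):

```python
# Illustrative arithmetic (a sketch of the relationship, not a specification):
# angular resolution in pixels per degree is panel resolution divided by field of view.
def pixels_per_degree(pixels, fov_deg):
    return pixels / fov_deg

# The 7200-pixel-wide mode over a 120-degree horizontal field gives the
# 60 pixels per degree that the text associates with 20/20 acuity in the eye box.
print(pixels_per_degree(7200, 120))   # 60.0
```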
  • the luminance may be 1,000 cd/m2, or higher, while contrast may be 100,000:1 or higher.
  • the micro-displays 2 may support 110 percent of the sRGB color gamut ratio.
  • the system may utilize voice control, eye tracking, or gesture recognition technologies. Alternatively, two or more of these technologies may be used together in order to access and manipulate a control.
  • This method may allow the user to control the AXR system or other external equipment or systems via wired or wireless connection without requiring input through foot pedals, buttons, hand dials, or other hardwired methods of control.
  • Combining two or more methods of control, e.g. voice with eye tracking, may provide redundancy and ensure proper operation of controls.
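One possible way to realize such redundancy is to act only on commands confirmed by two modalities within a short time window; the following sketch is an assumption for illustration, not the system's actual control logic, and every name and value in it is hypothetical.

```python
# Illustrative sketch (assumed control pattern): require agreement between two
# input modalities, e.g. a voice command plus the eye gaze selecting the same
# on-screen control, before the command is executed.
from dataclasses import dataclass

@dataclass
class ControlEvent:
    command: str          # e.g. "increase_magnification"
    source: str           # "voice" or "eye"
    timestamp: float      # seconds

def confirmed(events, window_s=2.0):
    """Return commands seen from two different modalities within the time window."""
    confirmed_cmds = set()
    for a in events:
        for b in events:
            if (a.command == b.command and a.source != b.source
                    and abs(a.timestamp - b.timestamp) <= window_s):
                confirmed_cmds.add(a.command)
    return confirmed_cmds

log = [ControlEvent("increase_magnification", "voice", 10.1),
       ControlEvent("increase_magnification", "eye", 10.8),
       ControlEvent("enable_laser", "voice", 12.0)]
print(confirmed(log))   # {'increase_magnification'}
```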
  • the system may utilize a high-resolution high-speed wireless video connection with approximately the same latency as an actual wired system and may be synced with one or more wireless headsets and one or more wired display systems to present a simultaneous view on all the wireless headset(s) and wired display systems from the original camera(s) source.
  • the one or more original source cameras may be mounted on the AXR headset 1 or as part of an external camera system like the 3D 4K camera system described herein.
  • the networked headsets and monitors of the system may allow multiple participants to see and experience the same surgery or diagnostic view or provide an experienced surgeon with the ability to remotely assist an immediately present or remote inexperienced surgeon. This technique may be used to also teach and train healthcare workers.
  • the system may enable remote virtual telemedicine collaboration between multiple surgeons, assistants, techs, students, or others.
  • the system may optionally exactly align the CG or CMI image with the real environment. This alignment may be accomplished by creating an overlay, which permits the alignment of preoperative CT or MRI 3D images with the currently treated patient’s body, body parts, or internal organs.
  • the surgeon may be able to view the whole person in RR while seeing images of internal items like the person’s internal organs, blood, bone, or tissue while using 6DoF, SLAM, and gesture recognition technologies or other techniques mentioned herein, where the user can change the orientation and registry of the virtual image to match the real organ.
  • the system may utilize dynamic opacity, described below, making the AXR image either a complete view, blocking RR and the real organ, or a partial transparency, where the AXR organ image or model and the RR organ can be viewed at the same time to align them together.
  • surgery precision may be increased as the areas identified in the lab on the CT or MRI can be superimposed over the real organ to know exactly where to inject, incise, resect, or otherwise operate.
  • the dynamic opacity subsystem that allows the system to function as a true AXR system may be provided by a multilayered lens 10, which may be part of the wearable device 1.
  • without such opacity control, the reflected image can be washed out by ambient light.
  • Other systems solve this problem with dark lenses. Having the lens shaded all the time, however, makes the wearer vulnerable to falling or tripping over unseen obstacles.
  • the dynamic opacity of the lens 10 of the current system may only obscure that portion of the lens 10 where the eyes are viewing the AXR image as alpha matte composites, meaning the combining of several images from different sources into a single image.
  • Figure 7 illustrates the dynamic opacity of the present system.
  • the system may utilize alpha matte software that works in conjunction with eye tracking technology and software to map the user’s eye gaze and adjust not only the image, but also move or vary the opacity of the exterior of the lens 10 where the eyes are gazing and the image is projected.
  • the software may automatically or manually adjust the opaqueness of the alpha matte display up or down to meet ambient lighting conditions.
  • the lens 10 may have multiple layers, as shown in Figure 4, with dynamic opacity provided on the outer layer 11, furthest from the user’s eye.
  • This layer 11 may be pixelated, which may permit the system to create a shadow or mirrored image of whatever virtual image is being displayed. This may provide a backdrop for the virtual image, blocking out light that might otherwise wash out the image. The remainder of the layer 11, where the image is not being displayed, may remain clear. Alternately, all of the pixels of the layer 11 may be activated, making the layer 11 fully opaque and blocking out the RR.
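An illustrative sketch, under the assumption that the pixelated layer is driven from the alpha channel of the rendered overlay, of how a per-pixel opacity mask could be computed so that only pixels behind the projected image are darkened; the function, gain, and resolutions are hypothetical.

```python
# Illustrative sketch (assumption, not the disclosed implementation): derive a
# per-pixel opacity map for the outer dynamic-opacity layer from the overlay alpha.
import numpy as np

def opacity_mask(overlay_rgba, ambient_gain=1.0, threshold=0.05):
    """overlay_rgba: H x W x 4 float array in [0, 1].
    Returns an H x W opacity map in [0, 1]; 0 means the lens pixel stays clear."""
    alpha = overlay_rgba[..., 3]
    return np.where(alpha > threshold, np.clip(alpha * ambient_gain, 0.0, 1.0), 0.0)

overlay = np.zeros((1440, 2560, 4))
overlay[400:900, 800:1800, 3] = 0.9          # a hypothetical projected image region
print(opacity_mask(overlay, ambient_gain=1.2).max())   # 1.0 behind the image, 0.0 elsewhere
```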
  • surgeons who do not like surgery room distractions can choose to engage the dynamic opacity via voice command and make the system more like a VR headset, blocking out the view through the lens 10 behind the AXR image or video when ultra-concentration is needed.
  • the surgeon can choose to turn the dynamic opacity off or clear in the portion of the lens where there is no reflected image, to use the system in normal mode, where only the AXR image is shadowed from the back.
  • the dynamic opacity of the lens 10 may provide a buffer between the displayed image and exterior light, giving the AXR image greater brightness to the eye.
  • the system may allow the dynamic opacity to be enabled automatically, under pre set conditions, manually, or with voice, gesture, or eye tracking command.
  • the layer 11 may comprise a plurality of pixels of cholesteric liquid crystal, each of which may be independently capable of becoming clear or opaque, or in between, as desired.
  • the lens 10 may further comprise a reflective layer 12, which may be a lens or a coating.
  • the reflective layer 12 may be located closest to the user’s eye and may be the surface upon which images are projected by the micro-displays 2 for reflection back to the user’s eyes.
  • An anti-reflective layer 13 may be positioned next and may be a layer or optical coating that may prevent unwanted artifacts, such as ghosting.
  • the lens 10 may further comprise one or more collimators 14.
  • the collimator 14 may be a separate layer included in the lens 10; additionally or alternately, layer 11 or layer 12 may have aspects of a collimator, and thus may function as the collimator 14; additionally or alternately, the collimator 14 may be a separate lens located between the micro-displays 2 and the reflective layer 12.
  • the collimator 14 may be capable of concentrating rays from the micro-displays 2 in the eye box while utilizing less resolution in the periphery for an overall highest resolution and field of vision.
  • the eye tracking subsystem 4 may work through hardware and software.
  • the software may be connected to the system’s GPU working in connection with the system’s modular controller.
  • the eye tracking may be captured by infrared light being projected into the user’s eye, which may create a glint or reflection, which may then be captured by one or more IR sensitive cameras 8.
  • the eye tracking subsystem 4 may be capable of capturing the glint from the eye from 30 frames per second to 500 frames per second. This information may be stored in real-time in the CPU and DSP, and then processed into a virtual space represented by x,y,z or Cartesian coordinates.
  • These coordinates may provide the system with the information about where the user’s gaze is in relation to the reflective lens and the alpha matte layer so that both stay aligned with the user’s gaze.
  • the eye tracking subsystem may be used to map the user’s eye gaze and adjust not only the reflected images or video but also the alpha matte image located on the separate plane to keep the alpha combined image aligned with the eye box.
  • the eye-gaze and the alpha matte layer may be controlled by the eye tracking subsystem 4 to always stay in sync.
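A minimal, assumed calibration sketch showing how pupil-to-glint offsets from the IR eye cameras could be mapped to gaze coordinates on the lens plane; the patent does not specify this mapping, and all names and calibration values below are hypothetical.

```python
# Illustrative sketch (assumed approach): fit an affine map from glint offsets to
# lens-plane gaze coordinates with least squares, using a short calibration routine.
import numpy as np

def fit_gaze_map(glint_offsets, screen_points):
    """glint_offsets: N x 2 pupil-minus-glint vectors; screen_points: N x 2 lens coords."""
    a = np.hstack([glint_offsets, np.ones((len(glint_offsets), 1))])   # affine design matrix
    coeffs, *_ = np.linalg.lstsq(a, screen_points, rcond=None)
    return coeffs                                                       # 3 x 2 matrix

def gaze_point(coeffs, offset):
    return np.array([offset[0], offset[1], 1.0]) @ coeffs

calib_offsets = np.array([[-0.2, -0.1], [0.2, -0.1], [-0.2, 0.1], [0.2, 0.1]])
calib_targets = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
m = fit_gaze_map(calib_offsets, calib_targets)
print(gaze_point(m, (0.0, 0.0)))   # roughly the centre of the lens, (0.5, 0.5)
```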
  • the cameras 5 may comprise two forward facing cameras 5.
  • the system may further comprise two additional cameras 5, which may be adjustable from 60 to 90 degrees. These additional, adjustable cameras 5 may permit the surgeon to operate with his or her head in a comfortable position while the view on the lens 10 is that of straight down.
  • Optional camera configurations are shown in Figure 8.
  • the one or more cameras 5 may be housed within their own system 40, like a digital 3D 4K system, and may be connected wired or wirelessly to the AXR headset 1.
  • the camera system 40 may consist of two parallax mounted cameras 5 to create a 3D viewing experience and may be mounted on a six-axis robotic/cobotic arm 41 for surgery use, as shown in Figure 11.
  • the six-axis arms are called cobotic, which means the combination of robotic automated movement with the collaboration of an operator, who may activate and control the cobotic arm 41 by voice control, eye tracking, gesture recognition, haptic technologies, touch, or other control technologies mentioned herein, or with a joystick.
  • two or more of these controls may work in combination with another for control.
  • the cobotic arm 41, when activated by voice or otherwise using the technologies described herein, may recognize from pre-programmed information the exact site of the surgery on a patient’s body to be viewed during the surgery and may travel, as directed by the software and within the range of the six-axis arm 41, to the exact position where the camera is needed for surgery.
  • the six-axis arm 41 may be connected to a stationary or movable side-cart component stationed on the floor or connected to a boom on the ceiling or a wall.
  • the arms 41 may be powered by motors and may be gravity-compensated and may respond to either the touch of an assistant, or by voice command, or any other of the control technologies mentioned herein.
  • the six-axis cobotic arm 41 may receive and transmit sound, light, vision, movement, and/or sensitive sense-of-touch (force tactile transmission) to a remotely located user or controller in real time.
  • the precision motors contained within the cobotic arm 41 may use the haptic sensors or internal algorithms to work with a user’s touch so that, for instance, a slight touch in the direction of its repose may cause the cobotic arm 41 to continue to its position of repose.
  • the cobotic arm 41 may also be manually placed in a certain position by a user, and the cobotic arm’s controller may remember the exact movement so that it can duplicate that movement automatically upon command by any of the technologies mentioned herein. For instance, if a cobotic arm 41 is manually placed at a surgery location needed for viewing the surgery site, and then, during the surgery the patient developed a bleed or other cause of emergency, the cobotic arm 41 could be activated to move to its repose position. Once the issue was resolved, the cobotic arm 41 with the camera may, on command, return to the exact location needed to continue the surgery.
  • a surgeon or tech may slightly push the robotic arm 41 in one direction, and the cobotic arm 41 would continue to move to that direction until it ended up in the intended position. Likewise, if a surgeon or assistant pulled on an arm 41, it would continue until it reached a predestined spot.
  • the surgeon may additionally or alternately wirelessly receive a 3D video feed from a digital microscope, providing the surgeon with an alternative surgical video input.
  • the system may be capable of overlaying information, such as text and graphs, in a virtual display over the operating view, as shown in Figure 9. The system may allow the surgeon or user to control and present the overlayed information, pictures, graphs, or videos in other views inside the headset via a visual presentation subsystem.
  • the visual presentation subsystem powered by IMU, SLAM, and/or eye tracking technologies, may provide an overlay of vital information, such as text and graphs, in virtual display over the 3D operating view.
  • the visual presentations may be like windows or chyron-generated views visible within the AXR FOV and may be virtually presented in a certain pre-set location of the user’s view.
  • the system may display intraocular pressure, cut rate, and flow rate or may show which mode a surgeon is in, such as vitrectomy, extrusion, dense tissue, etc., and may retrieve and track a preloaded surgery plan.
  • this information displayed may vary depending on the equipment or information useful to the surgeon during the surgery.
  • the overlay may be used to view preoperative or intraoperative images in virtual format, including pictures, videos, MRI’s, CT scans, and the like.
  • This information may be visible upon voice command of the surgeon, and the system may provide the user the option of displaying information at the bottom, side, or top of the AXR lens view.
  • the surgeon may move his or her head or eyes at a predetermined degree of rotation, for instance 15 or 30 degrees either to the side or up and down. With this turn of the eyes or head, the surgery video feed images may disappear and alternative information like patient vitals may appear.
  • equipment readouts, preoperative information, or other important information may appear.
  • upon returning the head or eyes to the original position, the surgery images may reappear to return focus to the surgery.
  • the surgery images may disappear, and the surgeon could refocus on the RR patient and surgery.
  • the system can be set to leave the information always in view.
  • a surgeon who does not like distractions can have the option of making a slight head or eye adjustment as needed to see the information.
  • if a retina surgeon is in laser mode, he or she may enable the information display to show power, standby versus on, duration, and intensity.
  • the AXR headset may use IMU, SLAM, eye tracking, and/or other technology to permit the surgeon to move his or her head forward or use an eye movement or other manner of control described herein to cause the z coordinate to reorient to magnify or reduce the surgery image.
  • the AXR headset 1 may be embedded with SLAM, 6DOF, inertial measurement units (IMU), or eye tracking technology, which may interpret the angle of the user’s head or eyes versus the displayed image. Then, when either the eyes or head move to focus on a portion of the image that is originally on the edge of the view, the system may digitally reposition the image to the center of the user’s visual field, providing a high-focus view independent of where the image was originally located.
  • IMU inertial measurement units
  • the head tracking subsystem 3 may include an internal array of IMUs 7, which may include one or more accelerometers, gyros, and/or magnetometers. Using these sensors, the system may be capable of automatically enabling and switching between camera systems depending on the position of the surgeon’s head. For example, when the surgeon looks down, the system may enable the front-facing cameras 5, and then when the surgeon looks up or straight ahead, the system may enable the downward cameras 5 to permit the surgeon to comfortably find a forward-looking position while the downward facing cameras 5 capture the surgeon’s hands and the operating space. Upon a voice command issued from the surgeon, the system may switch off the RR cameras 5 and convert to projecting the images from a scope or digital microscope.
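A sketch of one possible camera-selection policy based on IMU-derived head pitch with a voice override; this is an assumption about how the switching described above might be expressed, and the threshold, names, and feed labels are hypothetical.

```python
# Illustrative sketch (assumed policy, not the patent's firmware): pick the active
# camera feed from head pitch, with a voice command able to switch to a scope feed.
def select_feed(head_pitch_deg, voice_override=None, down_threshold=-20.0):
    if voice_override == "microscope":
        return "digital_microscope"            # voice command overrides the RR cameras
    if head_pitch_deg <= down_threshold:        # surgeon looking down at the field
        return "forward_cameras"
    return "downward_cameras"                   # head level: downward cameras show the hands

print(select_feed(-35.0))                # forward_cameras
print(select_feed(5.0))                  # downward_cameras
print(select_feed(5.0, "microscope"))    # digital_microscope
```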
  • the accurate alignment of AXR images with RR images may be achieved by using AI and a set of trackers 6, which may be used to determine the exact position of the cameras 5 and the patient’s body.
  • the AI engine, together with the trackers 6, may identify and track fiducial markers placed on the surface of specific structures that remain still during surgery, such as the iliac crest, clavicles, etc., and thus provide the system with points of reference.
  • the system may take the fiducial marker information and fuse it with other inertial measurement data, which may be provided by the internal array of inertial measurement units 7, to provide a stable localization of the overlay system.
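An assumed, simplified fusion of marker-derived and IMU-derived position, sketched only to illustrate the kind of blending described above; the weighting scheme and all values are hypothetical.

```python
# Illustrative sketch (assumption, not the disclosed fusion): weight toward the
# absolute fiducial-marker pose when markers are visible, otherwise fall back to
# the drift-prone inertial estimate.
import numpy as np

def fuse_position(imu_position, marker_position, marker_visible, marker_weight=0.8):
    """Both positions are 3-vectors in the operating-room frame."""
    if not marker_visible:
        return np.asarray(imu_position, dtype=float)
    w = marker_weight
    return w * np.asarray(marker_position, float) + (1 - w) * np.asarray(imu_position, float)

print(fuse_position([0.52, 0.10, 1.00], [0.50, 0.12, 1.01], marker_visible=True))
```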
  • the system may utilize proprietary 2D/3D software maximized for surgery.
  • the system may include a six degrees of freedom (6DoF) sub-system capable of providing real-time interfacing and no time loss between accessing 3D-type CT or MRI scans and projecting those images for surgery.
  • 6DoF degrees of freedom
  • the system may be capable of displaying CG images over the top of pass-through RR. This may include presenting images generated using a fusion of optical images with near- infrared fluorescence images not visible to the human eye. These images can provide more useful immediate feedback that is overlayed in context to what is needed, such as blood flow. This technique may be used to increase precision by providing additional data for the surgeon to consider. Using this technique, surgeons could be able to detect blood vessels under the organ surface or detect other tissue abnormalities.
  • the system may be used as a digital magnifier, providing up to 10x magnification.
  • the AXR surgical system may further comprise one or more microphones in communication with the central processing unit, where the system is capable of being controlled via voice input via the microphone, input from the eye-tracking subsystem, or a combination of voice input via the microphone and input from the eye-tracking subsystem.
  • the one or more microphones may be configured to create noise cancellation, or the AXR headset may include noise cancelling microphones to reduce, eliminate, or remove background noise so that the receiving person, device, or AXR headset itself can better understand the speech of the user.
  • the wearable device may further comprise a battery and a remote communication device such that the wearable device is wireless and has communication features.
  • the AXR headset may contain one or more batteries.
  • the primary battery may be located in an external position on the headset in a manner to facilitate removal and replacement of the battery during use.
  • the primary battery may include a mechanism for a spring-loaded battery to facilitate removal during use.
  • a surgery tech may press the spring-loaded battery in the back of the headset, then reattach a new, fully charged battery.
  • the AXR headset in this instance may include a hot-swap feature, which may include one or more secondary, typically smaller, batteries, which typically would only carry enough capacity to last a few minutes.
  • the common battery control circuit may shift to the auxiliary battery to keep the headset functioning with all features continuing until the primary battery is replaced.
  • the system may include a battery full/battery empty capacity feature which alerts the user and others that there is only a certain amount of battery charge remaining so that a timely battery change may be planned.
  • the system may be in communication with one or more second systems such that one or more remote users can view the images from the system on the one or more second systems and communicate with the user and other remote users.
  • any number of wired or wireless connected, or networked users may see the same virtual image.
  • any connected user can point to a specific point or set of points, or define one or more areas in virtual space, on the commonly seen virtual image in all the users’ AXR headsets, which may then communicate and correspond that same reference information into the view of a select set of, or all, other users’ AXR headsets or to any other monitors and displays in the network.
  • This technique may work with either a current simultaneous view or a current or past picture or video feed. Since the controller on all connected and simultaneously viewing headsets knows exactly where each pixel exists in the displayed virtual image, it may be able to identify the specific point or set of points, or area or areas, of interest and transmit that information wirelessly, or over a wired connection, to create a corresponding marker on all connected users’ headsets so that all connected users can see and understand the specific point, set of points, area, or areas of interest originated by the initial pointing user. Likewise, the point or set of points or area or areas commonly displayed may be used as a point of reference or a measurement. In addition to images, any textual, graphical, or other information may also be commonly viewed by connected users.
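As an assumed illustration of the shared-annotation idea (the message format below is hypothetical, not a disclosed protocol): because every connected headset renders the same virtual image, a point of interest can be expressed in normalized image coordinates and re-scaled locally on each receiving headset or monitor.

```python
# Illustrative sketch (hypothetical message format): broadcast a point of interest
# as normalized (u, v) coordinates of the shared virtual image, then map it back
# to pixel coordinates on each receiver.
import json

def make_annotation(user_id, frame_id, points, label=""):
    """points: list of (u, v) in [0, 1] relative to the shared virtual image."""
    return json.dumps({"user": user_id, "frame": frame_id,
                       "points": points, "label": label})

def apply_annotation(message, image_width, image_height):
    note = json.loads(message)
    return [(round(u * image_width), round(v * image_height)) for u, v in note["points"]]

msg = make_annotation("surgeon_1", 1042, [(0.41, 0.63)], "resect here")
print(apply_annotation(msg, 2560, 1440))   # [(1050, 907)]
```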
  • any connected AXR headset user using the technologies mentioned herein or through AI techniques may choose to view commonly displayed 2D or 3D images or 2D or 3D models in the same perspective as another; or any user may choose a different perspective of the same 2D or 3D images or models.
  • one or more current or stored images can be analyzed and compared against a data model or composite of data models to find either specific or general information based on specific criteria, and show a result as a virtual image(s) on one AXR headset in real-time or on a plurality of connected AXR headsets.
  • the AI engine may make a similar comparison with a number of specific criteria or no criteria at all, and surface novel information inferred by the system.
  • a 3D MRI virtual image of an organ could be managed or compared with AI by one of the techniques set out herein in virtual space by the surgeon without getting contaminated by touching a real object, in order to change the registry and orientation of the virtual organ image to match the RR organ to increase the surgeon’s understanding of where to incise, inject, or perform some other similar surgical act.
  • the system may increase or decrease the opacity gradient of the lenses, so that both the virtual organ and the real organ may be viewed, or aligned, by the surgeon seeing them both at the same time.
  • the wearable device 1 may be lightweight and may be wireless.
  • One of the ways to reduce weight is to have only the cameras, a battery, and sensors in the headset 1 with connectors to a WiGig modem using the fastest wireless protocol available, such as the IEEE 802.11 (ad, ay) protocol.
  • Another embodiment places the intelligence in the headset 1, such as a Qualcomm XR-2 chipset, and has the chipset circuit board connected to WiGig modems to send/receive streaming video to/from another WiGig-connected location, such as a digital microscope, endoscope, or other surgery imaging device.
  • WiGig wireless gigabit connection
  • While WiFi IEEE 802.11 would work, the best method would be to use WiGig 802.11 (ad, ay, ax) so that uncompressed video can be sent from any image-producing system to the AXR headset.
  • the AXR headset may include a 5G modem to be capable of edge computing at super-fast speeds.
  • Edge Computing is a technique to bring data back from the cloud to a localized network where all computing goes to an on-site or close-by data center.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Robotics (AREA)
EP20867781.5A 2019-09-27 2020-09-28 Brille der erweiterten realität zur verwendung in der chirurgischen visualisierung und telechirurgie Pending EP4034028A4 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962907300P 2019-09-27 2019-09-27
PCT/US2020/053098 WO2021062375A1 (en) 2019-09-27 2020-09-28 Augmented and extended reality glasses for use in surgery visualization and telesurgery

Publications (2)

Publication Number Publication Date
EP4034028A1 true EP4034028A1 (de) 2022-08-03
EP4034028A4 EP4034028A4 (de) 2024-01-03

Family

ID=82199322

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20867781.5A Pending EP4034028A4 (de) 2019-09-27 2020-09-28 Brille der erweiterten realität zur verwendung in der chirurgischen visualisierung und telechirurgie

Country Status (1)

Country Link
EP (1) EP4034028A4 (de)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10149958B1 (en) * 2015-07-17 2018-12-11 Bao Tran Systems and methods for computer assisted operation
WO2018148845A1 (en) * 2017-02-17 2018-08-23 Nz Technologies Inc. Methods and systems for touchless control of surgical environment
US10716643B2 (en) * 2017-05-05 2020-07-21 OrbisMV LLC Surgical projection system and method

Also Published As

Publication number Publication date
EP4034028A4 (de) 2024-01-03

Similar Documents

Publication Publication Date Title
US11819273B2 (en) Augmented and extended reality glasses for use in surgery visualization and telesurgery
US11628038B2 (en) Multi-option all-digital 3D surgery visualization system and control
WO2021062375A1 (en) Augmented and extended reality glasses for use in surgery visualization and telesurgery
JP6043821B2 (ja) 仮想インタラクティブプレゼンスのシステムおよび方法
EP3146715B1 (de) Systeme und verfahren zur chirurgischen visualisierung mit vermittelter realität
Rolland et al. Comparison of optical and video see-through, head-mounted displays
Rolland et al. Optical versus video see-through head-mounted displays in medical visualization
US6847336B1 (en) Selectively controllable heads-up display system
EP2903551B1 (de) Digitales system für videoerfassung und -anzeige in der chirurgie
US20210335483A1 (en) Surgery visualization theatre
US20220387128A1 (en) Surgical virtual reality user interface
JP6364022B2 (ja) 多重現実環境における役割切り替えのためのシステムおよび方法
Satava 3-D vision technology applied to advanced minimally invasive surgery systems
KR20160033721A (ko) 정보 처리 장치, 정보 처리 방법 및, 정보 처리 시스템
JP2015019679A (ja) 情報処理装置、情報処理方法、および、情報処理システム
US11094283B2 (en) Head-wearable presentation apparatus, method for operating the same, and medical-optical observation system
WO2021226134A1 (en) Surgery visualization theatre
JP6311393B2 (ja) 情報処理装置、情報処理方法、および、情報処理システム
Rolland et al. Optical versus video see-through head-mounted displays
US20210278671A1 (en) Head wearable device with adjustable image sensing modules and its system
JP2023526716A (ja) 外科手術ナビゲーションシステムおよびそのアプリケーション
EP4034028A1 (de) Brille der erweiterten realität zur verwendung in der chirurgischen visualisierung und telechirurgie
US20240127931A1 (en) Surgery visualization theatre
EP4146115A1 (de) Operationsvisualisierungstheater
EP4106664A1 (de) Volldigitales multioptionales visualisierungssystem für die 3d-chirurgie und steuerung

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20220420

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
RIC1 Information provided on ipc code assigned before grant

Ipc: A61B 17/00 20060101ALN20230825BHEP

Ipc: A61B 34/20 20160101ALN20230825BHEP

Ipc: A61B 90/50 20160101ALN20230825BHEP

Ipc: G06F 3/01 20060101ALI20230825BHEP

Ipc: A61B 90/53 20160101ALI20230825BHEP

Ipc: A61B 34/35 20160101ALI20230825BHEP

Ipc: A61B 90/00 20160101AFI20230825BHEP

A4 Supplementary search report drawn up and despatched

Effective date: 20231205

RIC1 Information provided on ipc code assigned before grant

Ipc: A61B 17/00 20060101ALN20231129BHEP

Ipc: A61B 34/20 20160101ALN20231129BHEP

Ipc: A61B 90/50 20160101ALN20231129BHEP

Ipc: G06F 3/01 20060101ALI20231129BHEP

Ipc: A61B 90/53 20160101ALI20231129BHEP

Ipc: A61B 34/35 20160101ALI20231129BHEP

Ipc: A61B 90/00 20160101AFI20231129BHEP