WO2023052474A1 - Devices and systems used for imaging during surgery - Google Patents

Devices and systems used for imaging during surgery

Info

Publication number
WO2023052474A1
Authority
WO
WIPO (PCT)
Prior art keywords
head mounted
mounted display
content
real time
processing device
Prior art date
Application number
PCT/EP2022/077058
Other languages
English (en)
Inventor
Manon ROSTYKUS
Original Assignee
Leica Instruments (Singapore) Pte. Ltd.
Leica Microsystems Cms Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Leica Instruments (Singapore) Pte. Ltd., Leica Microsystems Cms Gmbh filed Critical Leica Instruments (Singapore) Pte. Ltd.
Publication of WO2023052474A1 publication Critical patent/WO2023052474A1/fr

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

Definitions

  • Examples relate to image processing, displays, and communication devices which are used in a surgical setting.
  • Surgical procedures carried out in operating rooms often utilize imaging apparatuses such as microscopes to aid medical professionals in viewing the surgical site.
  • In a surgical environment, it can be useful to have more than one device which may provide data such as vital signs, and more than one type of real-time video.
  • Various medical practitioners, who can include surgeons, nurses, and other personnel (within the operating room and outside of it), can be aided by having access to multiple sources of information, data, and/or video during the surgery. Such access can improve the quality of care and/or aid in the instruction of new medical professionals.
  • an image processing device configured for: communicatively coupling to at least one head mounted display, communicatively coupling to at least one sensor configured for generating a real time video of a surgical site, determining a content for display by the at least one head mounted display, and transmitting the content to the at least one head mounted display.
  • the content is selectable from at least two of: the real time video, real time data from an auxiliary data source, or stored data.
  • the ability to select content, including real time content can improve patient care and/or provide flexibility to healthcare workers in a surgical environment.
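  • As an illustrative sketch only (the class and value names below are assumptions for illustration, not part of the disclosed device), the selectable content sources and a user's choice could be modeled as follows:

```python
from enum import Enum, auto
from dataclasses import dataclass
from typing import Tuple


class ContentSource(Enum):
    """Kinds of content the image processing device can offer for display."""
    REAL_TIME_VIDEO = auto()   # e.g. video derived from the surgical-site sensor
    AUXILIARY_DATA = auto()    # e.g. real-time vital signs from an auxiliary device
    STORED_DATA = auto()       # e.g. pre-op images or patient identifying data


@dataclass
class ContentSelection:
    """At least two source kinds are offered; the user picks one or more."""
    offered: Tuple[ContentSource, ...] = tuple(ContentSource)
    chosen: Tuple[ContentSource, ...] = (ContentSource.REAL_TIME_VIDEO,)


# Example: a user selects the real time video plus stored (e.g. pre-op) data.
selection = ContentSelection(chosen=(ContentSource.REAL_TIME_VIDEO,
                                     ContentSource.STORED_DATA))
print([source.name for source in selection.chosen])
```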
  • an image processing device configured to communicatively couple to a plurality of auxiliary data sources, and possibly receive a real-time vital sign data from the auxiliary data source which is selected from the plurality of auxiliary data sources.
  • the ability to receive data from auxiliary data sources can provide more flexibility to providing relevant information for various healthcare workers such as those involved with patient care in a surgical environment.
  • an image processing device configured for: communicatively coupling to a second sensor for generating a second real time video of the surgical site.
  • the content can be further selectable from the second real time video.
  • the ability to select content, including real time content, can improve patient care and/or provide flexibility to healthcare workers in a surgical environment.
  • the real time video is a superposition image of a white light video and a fluorescence video.
  • Selectable real time content which includes white light video and/or fluorescence, e.g. from a surgical site, can be particularly useful for surgeons and other medical professionals in a surgical environment.
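  • Purely as an illustration of one way such a per-frame superposition could be computed (the green false-color mapping and the fixed blending weight are assumptions, not the disclosed processing), see the sketch below.

```python
import numpy as np


def superpose(white_light: np.ndarray, fluorescence: np.ndarray,
              alpha: float = 0.6) -> np.ndarray:
    """Blend a white-light RGB frame with a single-channel fluorescence frame.

    The fluorescence signal is mapped to a green false color and alpha-blended
    onto the white-light frame; both inputs are assumed to be uint8 frames of
    the same height and width.
    """
    fluo_rgb = np.zeros_like(white_light)
    fluo_rgb[..., 1] = fluorescence          # false color: green channel
    blended = (1.0 - alpha) * white_light + alpha * fluo_rgb
    return blended.astype(np.uint8)


# Example with small synthetic frames.
wl = np.full((4, 4, 3), 120, dtype=np.uint8)
fl = np.full((4, 4), 200, dtype=np.uint8)
frame = superpose(wl, fl)
```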
  • an image processing device in which the real time video is a stereoscopic image.
  • Selectable real time content which includes stereoscopic images, e.g. from a surgical site, can be particularly useful for surgeons and other medical professionals in a surgical environment.
  • an image processing device in which the stored data includes at least one of: a pre-op image or patient identifying data.
  • Selectable content which includes pre-op and/or patient identifying data can be particularly useful for surgeons and other medical professionals in a surgical environment.
  • an image processing device configured to provide: a first mode in which a first user interface is configured for receiving input of the user selection of the content and the format from a first user. Having a mode available which allows for user selection of content provides more flexibility in providing relevant information for various healthcare workers, such as those involved with patient care in a surgical environment.
  • each head mounted display receives the content selected by the first user. It can be convenient for one user to determine the content for the other users, e.g. when explaining or collaborating regarding a surgery. Having the head mounted display for receiving the selection can be convenient for the user.
  • an image processing device in which the at least one head mounted display is a plurality of head mounted displays.
  • the image processing device is configured to provide: a second mode in which: at each head mounted display, a respective user interface is provided.
  • Each respective user interface can be configured for receiving a user selection for selection of at least one of a respective content or a respective format for the respective head mounted display.
  • the user selection can be received by the image processing device for determination of the respective content or the respective format.
  • the user interfaces at each head mounted display can provide greater flexibility for the users, e.g. to provide more relevant information (e.g. real time videos and/or data) for various professionals associated with the surgery, who may have different responsibilities and/or needs/interests in different types of data.
  • an image processing device in which the format is selectable from a plurality of possible formats including at least one of: a superposition, a picture in picture, a rotation, full-screen, portion-screen, zoom, stereoscopic, monoscopic, augmentation, left eye only, or right eye only; wherein augmentation is a superposition of an image on a semitransparent display, the semitransparent display passing ambient light to a user.
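  • A minimal sketch of how these selectable formats could be enumerated in software is shown below; the enum and helper names are hypothetical, not taken from the disclosure.

```python
from enum import Enum


class DisplayFormat(Enum):
    SUPERPOSITION = "superposition"
    PICTURE_IN_PICTURE = "picture in picture"
    ROTATION = "rotation"
    FULL_SCREEN = "full-screen"
    PORTION_SCREEN = "portion-screen"
    ZOOM = "zoom"
    STEREOSCOPIC = "stereoscopic"
    MONOSCOPIC = "monoscopic"
    AUGMENTATION = "augmentation"      # superposition on a semitransparent display
    LEFT_EYE_ONLY = "left eye only"
    RIGHT_EYE_ONLY = "right eye only"


def requires_semitransparent_display(fmt: DisplayFormat) -> bool:
    """Augmentation passes ambient light through a semitransparent display."""
    return fmt is DisplayFormat.AUGMENTATION


print(requires_semitransparent_display(DisplayFormat.AUGMENTATION))  # True
```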
  • a surgical system including a sensor configured for generating a real time video of a surgical site, at least one head mounted display, and an image processing device.
  • the image processing device can communicatively couple to the at least one head mounted display, communicatively couple to the sensor for generating a real time video of a surgical site, determine a content for display by the at least one head mounted display, and transmit the content to at least one head mounted display.
  • the content can be selectable from at least two of: the real time video, real time data from an auxiliary data source, or stored data.
  • the ability to select content, including real time content can improve patient care and/or provide flexibility to healthcare workers in a surgical environment.
  • a surgical system which includes a stereoscopic surgical microscope which includes the sensor.
  • the real time video can be a stereoscopic image of a surgical site.
  • the ability to select content, including real time content, can improve patient care and/or provide flexibility to healthcare workers in a surgical environment.
  • Selectable real time content which includes stereoscopic images, e.g. from a surgical site, can be particularly useful for surgeons and other medical professionals in a surgical environment.
  • a head mounted display which includes one or more displays, and a mounting structure for mounting on a head of a user.
  • the head mounted display is configured for: communicatively coupling to a surgical system, and receiving a user input for determining a content.
  • the content is selectable from at least two of: a real time video of a surgical site, real time data from an auxiliary data source, and stored data.
  • the ability to select content, including real time content can improve patient care and/or provide flexibility to healthcare workers in a surgical environment.
  • a head mounted display which is configured for selecting the content from a plurality of video streams receivable from a surgical system.
  • the capability of selecting content can improve patient care and/or provide flexibility to healthcare workers in a surgical environment.
  • a head mounted display which is further configured for: transmitting a user input to the surgical system which determines the video stream.
  • Patient care and/or flexibility of information provided to healthcare workers in a surgical environment can be improved, for example, by enabling the selection of content.
  • a head mounted display which is further configured for: communicatively coupling to an auxiliary device, and displaying data received from the auxiliary device.
  • Patient care and/or flexibility of information provided to healthcare workers in a surgical environment can be improved, for example, by enabling the selection of content.
  • a head mounted display which is further configured for: user selection of a format of the displayed content.
  • the format can be selectable from at least: monoscopic display, stereoscopic display, picture-in-picture, or superpositional display. Having various formats available for user selection can provide more flexibility in providing relevant information, and/or improved accessibility to information, for various healthcare workers such as those involved with patient care in a surgical environment.
  • a head mounted display which further includes an adjuster for variable interpupillary distance and/or an adjuster for variable diopter.
  • adjustability can aid in providing health care workers in a surgical environment surgical information such as real time videos and other information, while possibly reducing eye strain.
  • Fig. 1 shows a surgical system and auxiliary data source
  • Fig. 2 shows an image processing device and head mounted displays
  • Fig. 3 A shows a head mounted display
  • Fig. 3B shows a head mounted display
  • Fig. 3C shows a head mounted display
  • Fig. 4 shows a method of communicating a surgical procedure.
  • a trailing “(s)” indicates one or more; for example display(s) indicates one or more displays.
  • a “head mounted display” can include at least one display which can be mounted on a human head for viewing content such as real time video based on video captured at a surgical site.
  • real time video may be provided to multiple users at the same time, e.g. to multiple users, each having a head mounted display for displaying the real time video.
  • image can mean a moving image, e.g. a video.
  • image can mean a static single image.
  • a stored image can be a static image accessible in a memory.
  • a stereoscopic image is a stereoscopic video/moving image.
  • augmentation can be a format of display which is a superposition of an image (video) on a semitransparent display, the semitransparent display passing ambient light to a user.
  • a fluorescence microscope can be a microscope with optics for capturing fluorescence images.
  • a fluorescence microscope may also include optics for capturing other images, such as reflectance white light images.
  • a fluorescence microscope herein may be capable of simultaneously capturing fluorescence and white light images.
  • auxiliary device can be used interchangeably with “auxiliary data source.”
  • the surgical system 100 can include at least one sensor 170 such as a camera that can be for generating a real time video of a surgical site.
  • the system can include at least one head mounted display 160, e.g. for displaying content such as a real-time video of the surgical site.
  • the system 100 can include an image processing device 101, which may include a processor(s) 110 (computer and/or graphics processor) and/or memory 120.
  • Memory 120 may alternatively/additionally include remote memory such as memory accessible in a network coupled to the image processing device 101.
  • the processor(s) may alternatively/additionally include remote processor(s), e.g. in an accessible network.
  • the image processing device 101 can communicatively couple to the at least one head mounted display 160.
  • the image processing device 101 can communicatively couple to the sensor 170.
  • the image processing device 101 can be configured, e.g. programmed, for generating a real time video of a surgical site, such as by processing the data from the sensor(s) 170.
  • the image processing device 101 can determine content for display by the at least one head mounted display 160, and transmit the content to the at least one head mounted display 160.
  • the content can be selectable from: a real time video (e.g. a real-time video determined from the sensor), real time data from an auxiliary data source 180, and stored data (e.g. data stored in the memory 120).
  • the sensor 170 can be part of a surgical imaging device 150 which may be part of the surgical system 100.
  • the device 150 can be a surgical microscope (such as a stereoscopic surgical microscope and/or fluorescence microscope).
  • the system 100 can include an arm 155 which may be movable, and can connect the surgical imaging 150 to the image processing device 101.
  • a head mounted display 160 can be optionally connectable to the surgical system 100, such as at the image processing device 101, surgical microscope 150, and/or arm 155. Alternatively/additionally, the head mounted display(s) 160 can be wirelessly connected for receiving content, e.g. real time video(s).
  • the configurations described herein can improve ergonomics, such as for allowing flexibility in the positioning of the medical professionals using the imaging device and/or viewing the generated content.
  • Remote access to the content including possibly from ranges that may be beyond the typical wireless transmission range capability, can provide for collaboration and/or teaching events with remote users.
  • Head mounted displays 160 for users can be particularly helpful in allowing direct visualization of the content, e.g. video from the surgical site.
  • a head mounted display 160 can provide improved ergonomics, for example by removing the constraint of being positioned to access oculars and/or a shared panel display.
  • Fig. 2 illustrates an image processing device and head mounted displays.
  • the image processing device 101 can process images/data received from the sensor(s) 170, auxiliary data device 180, and/or surgical imaging device 150.
  • the image processing device 101 can generate at least one real-time video 210a, 210b as content 220.
  • the image processing device 101 can transmit selectable content 220, including real-time video(s) 210a, 210b to the head mounted display(s) 160. It can be desirable to have selectable content 220 for the head mounted display(s), such as when multiple users are using headsets 160.
  • the content can be selected from real time video (e.g. real time video from the sensor(s) 170), real time data from the auxiliary data source, and/or stored data.
  • Real-time video(s) 210a, 210b can include video of white light reflectance microscopy, fluorescence microscopy, a superposition of white light and fluorescence microscopy, and/or optical coherence tomography.
  • the content 220 can include ultrasound (e.g. real time ultrasound which can be in the form of video).
  • the selectable content 220 can include selectable real-time video(s) 210a, 210b that can be generated/transmitted by image processing device 101.
  • the image processing device 101 can generate/transmit any content 220 that is selected by the user, e.g. at a user interface of the image processing device 101, surgical system 100, and/or head mounted display 160.
  • the content 220 can be transmitted to each of a plurality of head mounted displays 160.
  • the entirety of the content 220 (the selected content) is transmitted to each head mounted display 160.
  • the format 230 of the content 220 is also transmitted to each head mounted display 160.
  • the format 230 is determined by user input from a first user, e.g. at a first head mounted display, the surgical system 100, the image processing device 101, and/or the surgical imaging device 150.
  • the content 220 can be transmitted with the format 230.
  • a picture-in-picture format 230 is transmitted, the content 220 including a stereoscopic video in a main portion of the displayed content, and a stored image (e.g. a pre-op image) displayed in the smaller portion of the displayed content.
  • a superimposed format is transmitted, the content 220 including a stereoscopic video of a white light image and a fluorescence video.
  • the content 220 can be pieced out such that each head mounted display 160 may receive any portion of the content 220.
  • the content 220 includes a real time stereoscopic video and a fluorescence video.
  • a first head mounted display can receive and display, in a first format 230a (e.g. a superposition format), the real time stereoscopic video and the real time fluorescence video (simultaneously); and a second head mounted display 160 can display only part of the content 220, e.g. the real time stereoscopic video.
  • a second user at the second head mounted display can select the format 230b and/or content 220 for the second head mounted display.
  • the second user can select auxiliary data (e.g. vital signs) to be displayed in a picture-in-picture format with the real time stereoscopic video.
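  • As an illustrative sketch only, a picture-in-picture composition such as the one described above (e.g. a rendered vital-signs panel inset into the real time video) could be produced as follows; the inset position, scale, and nearest-neighbour resizing are assumptions.

```python
import numpy as np


def picture_in_picture(main_frame: np.ndarray, inset: np.ndarray,
                       scale: float = 0.25) -> np.ndarray:
    """Place a scaled-down inset (e.g. a rendered vital-signs panel) in the
    lower-right corner of the main frame.

    Nearest-neighbour scaling keeps the sketch dependency-free; a real
    implementation would use proper resampling and configurable placement.
    """
    h, w = main_frame.shape[:2]
    ih = max(1, int(inset.shape[0] * scale))
    iw = max(1, int(inset.shape[1] * scale))
    rows = np.arange(ih) * inset.shape[0] // ih
    cols = np.arange(iw) * inset.shape[1] // iw
    small = inset[rows][:, cols]
    out = main_frame.copy()
    out[h - ih:h, w - iw:w] = small
    return out


# Example: 480x640 main video frame, 120x160 vital-signs panel.
main = np.zeros((480, 640, 3), dtype=np.uint8)
panel = np.full((120, 160, 3), 255, dtype=np.uint8)
composited = picture_in_picture(main, panel)
```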
  • Each user such as a first user and second user, can possibly determine the respective formats 230a, 230b, e.g. by user input at the respective head mounted displays 160. It is possible that there is a mode selection, e.g. at the image processing device 101, surgical system 100, and/or first head mounted display, that authorizes user input from each head mounted display 160 to be used to determine the respective content(s) and/or format(s) displayed at each respective head mounted display 160.
  • the content 220 can include real time data from an auxiliary data source 180.
  • vital sign data 210c can be selected as content 220 for display to one or more of the head mounted displays 160.
  • Vital sign data can include data such as heartrate, breathing rate, and/or blood pressure, for example.
  • Data such as vital sign data can come from one or more auxiliary data sources 180, which may be communicatively coupled to the image processing device 101 and/or surgical system 100.
  • the real time vital sign data and/or one or more of the auxiliary data source(s) 180 can be selected from a plurality of auxiliary data sources, e.g. as content to be displayed by at least one head mounted display 160.
  • Stored data can be, for example, patient data such as patient identifying data, weight, age, height, and/or images (e.g. pre-op images) that are stored on local and/or remotely located memory 120.
  • Alternative/additional stored data can be a brain map.
  • a first selectable real time video 101a may be generated by a first sensor.
  • a first and second sensor may be used to generate the real time video 101a.
  • the real time video is a stereoscopic video.
  • the real time video is a fluorescence video.
  • a second real time video 101b can be selected for display.
  • a first real time video 101a is a stereoscopic video
  • a second real time video 101b is a fluorescence video.
  • the real time video(s) is a superposition of a white light video and a fluorescence video.
  • the image processing device 101 and/or surgical imaging device 100 can be capable of more or fewer real time videos.
  • One or more sensors 170 may provide data for the image processing device 101 to generate real time video(s) 101a, 101b as selected by user(s).
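  • One common convention for transporting two sensor streams as a single stereoscopic video is side-by-side frame packing, sketched below purely as an assumption for illustration; the packing actually used by the image processing device 101 is not specified here.

```python
import numpy as np


def stereoscopic_frame(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Pack the frames of two sensors into one side-by-side stereoscopic frame.

    The receiving head mounted display would split the packed frame back into
    per-eye images; side-by-side packing is only one possible convention.
    """
    if left.shape != right.shape:
        raise ValueError("left and right frames must have the same shape")
    return np.concatenate([left, right], axis=1)


# Example with two synthetic 480x640 RGB frames.
left = np.zeros((480, 640, 3), dtype=np.uint8)
right = np.ones((480, 640, 3), dtype=np.uint8)
packed = stereoscopic_frame(left, right)   # shape (480, 1280, 3)
```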
  • the image processing device 101 can have multiple modes of operation which may be selected by user(s).
  • a first mode can be one in which a user interface receives, from a first user, input of the user selection of the content and/or the format.
  • Each head mounted display 160 can receive the content 220 that is selected by the first user.
  • the surgeon can be the first user who may have the ability to select and/or determine the content and/or format for all the head mounted displays 160.
  • having the ability of a single user to select/determine the displayed content can allow for efficient communication to the users.
  • the user interface can be at the image processing device 101, the surgical system 100, and/or at one of the head mounted displays 160, e.g. a first head mounted display which may be attached to the image processing device 101 and/or surgical system.
  • the format can be selectable from a plurality of possible formats including at least one of: a superposition, a picture in picture, a rotation, full-screen, portion-screen, zoom, stereoscopic, monoscopic, augmentation, left eye only, or right eye only.
  • the image processing device 101 and/or surgical system 100 can provide user interfaces at each head mounted display 160.
  • Each user interface can receive a user selection for content and/or format for each respective head mounted display.
  • the user selection can be received by the image processing device for determination of the respective content and/or the respective format for each head mounted display.
  • the second mode can be helpful to provide flexibility in the type of content and/or format that is viewed by each user. For example, during surgery, tasks and/or responsibilities can be delegated to different medical professionals who may each be better able to perform his/her respective tasks or meet responsibilities by having access to different content and/or format.
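  • A rough sketch of the two modes follows; the router object, its names, and the string-based mode flag are hypothetical. In the first mode one user's selection is applied to every coupled head mounted display, while in the second mode each head mounted display keeps its own selection.

```python
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class DisplayState:
    content: str = "real time video"
    fmt: str = "stereoscopic"


@dataclass
class ContentRouter:
    """Hypothetical router inside the image processing device."""
    mode: str = "first"                       # "first": one user selects for all
    displays: Dict[str, DisplayState] = field(default_factory=dict)

    def apply_selection(self, hmd_id: str, content: str, fmt: str) -> None:
        if self.mode == "first":
            # First mode: the first user's selection is pushed to every display.
            for state in self.displays.values():
                state.content, state.fmt = content, fmt
        else:
            # Second mode: each head mounted display keeps its own selection.
            self.displays[hmd_id] = DisplayState(content, fmt)


router = ContentRouter(displays={"hmd_1": DisplayState(), "hmd_2": DisplayState()})
router.apply_selection("hmd_1", "white light + fluorescence", "superposition")
print(router.displays["hmd_2"].fmt)   # "superposition" in the first mode
```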
  • the surgical system 100 can be used for communication of the surgical procedure, for example.
  • the detector 170 can capture a captured video of the surgical site in an operating room, and the system 100 can include an output 190 which outputs a real-time video to at least one remote user outside of the operating room.
  • the real-time video can be based on the captured video, e.g. the real time video is processed from video captured at least one sensor 170.
  • Fig. 3A illustrates a head mounted display.
  • the head mounted display 300 described herein may be one of a plurality which may be coupled to the image processing device 101, surgical system 100, and/or surgical imaging device 150.
  • the head mounted display can include at least one display, such as two displays 30, 40.
  • the head mounted display 300 can be configured for augmented reality (AR), mixed reality (MR), and/or virtual reality (VR).
  • the head mounted display 300 can have settings for selecting AR, MR, or VR.
  • In AR, the head mounted display 300 passes ambient light so that the user sees the environment along with variable superimposed images generated at the display of the head mounted display.
  • When a plurality of head mounted displays 300 are used, it is possible for one or more to be located outside of the operating room in which the video of the surgical procedure is captured. This can be advantageous, for example, in teaching/training environments, and/or when a surgeon would like to consult with a colleague remotely, e.g. to discuss the content 220. This may improve patient outcome, e.g. by allowing for remote collaboration in real time during surgery. In environments in which there are multiple users (e.g. in teaching environments), having head mounted displays 300 for the users can improve the learning experience, e.g. by facilitating visualization, to each user, of the content 220, rather than having a single screen for which multiple viewers may limit each other's view. Head mounted displays 300 can also reduce the amount of space in the operating room used by panel displays.
  • the head mounted display(s) 300 can be communicatively coupled to any of the surgical imaging devices 150, surgical systems 100, and/or image processing devices 101 described herein.
  • the head mounted display 300 of Fig. 3A includes a display 310 and a mounting structure 320 for mounting on the head of a user.
  • the mounting structure can be a pair of legs, as shown. Alternatively/additionally, a head band can be used.
  • the head mounted display 300 can receive a real time video stream of a surgical site which is based on a video captured by a surgical imaging device 150.
  • the head mounted display 300 can display selectable content 220, e.g. selected from any one or more of real time video of a surgical site, real time data from an auxiliary data source, and stored data.
  • the head mounted display 300 can receive user input, e.g. for determining the content 220.
  • the user input can be made by a graphical user interface which may be displayed such as superimposed on the content 220.
  • the user input can be determined by voice activation, e.g. through a microphone(s) 340 which may be integrated in the head mounted display 300.
  • User input can be for determining the content 220, e.g. the content 220 transmitted and/or displayed at the head mounted display 300.
  • each head mounted display 300 may determine which content 220 and/or format to display.
  • a head mounted display 300 can be configured for audio communication by the microphone 340 and speaker 350, for example.
  • the head mounted display 300 can communicatively couple to an audio communication device, such as an external device like an audio headset.
  • The audio communication, e.g. via the microphone 340, can also be used for selection (e.g. menu-driven selection) of content 220 and/or format 230 of the displayed content 220.
  • a menu e.g. for user selection of content and/or format, can be provided visually (e.g. at the head mounted display 300) and/or audibly (e.g. at the speaker 350 of the head mounted display 300 and/or via a coupled audio communication device).
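  • As a sketch of how a menu-driven selection could map recognized voice commands (or button presses) to content and format choices, under the assumption of a small fixed command vocabulary (the commands and names below are illustrative only):

```python
# Hypothetical mapping from recognized voice commands (or button presses) to
# content/format selections; the vocabulary below is purely illustrative.
MENU_COMMANDS = {
    "show fluorescence": ("fluorescence video", None),
    "show white light": ("white light video", None),
    "picture in picture": (None, "picture in picture"),
    "superposition": (None, "superposition"),
}


def handle_command(command: str, current_content: str, current_format: str):
    """Return the (content, format) pair after applying a recognized command;
    unrecognized commands leave the current selection unchanged."""
    content, fmt = MENU_COMMANDS.get(command.lower(), (None, None))
    return content or current_content, fmt or current_format


print(handle_command("Picture in picture", "white light video", "full-screen"))
```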
  • the content 220 can be selected, e.g. at the respective head mounted display(s) 300, from a plurality of video streams receivable from a surgical system 100 and/or image processing device 101.
  • User input can be received, e.g. at each respective head mounted display 300, for determining the content 220 for the respective head mounted display 300 which receives the user input. It is possible to transmit the user input to the surgical system 100 and/or image processing device 101.
  • the surgical system 100 and/or image processing device 101 can determine the content 220 (e.g. any one or more of the video streams 210a, 210b can be transmitted to the head mounted display(s) 300 as at least part of the content 220).
  • a single user can determine the content 220 and/or format for all of the coupled head mounted display(s) 300.
  • each user can determine content 220 and/or format at each head mounted display 300.
  • the format e.g. of displayed content, can be selected from, for example, monoscopic display, stereoscopic display, picture-in-picture, or superpositional display.
  • the head mounted display(s) 300 can be communicatively coupled to one or more auxiliary devices 180. Data received from the auxiliary device can be displayed by the head mounted display(s) 300.
  • the head mounted display(s) can be directly communicatively coupled to one or more auxiliary devices 180 or, for example, through the image processing device 101 and/or surgical imaging system 100.
  • the real time auxiliary data can include vital signs (e.g. heart rate, breathing rate, blood pressure).
  • the auxiliary device(s) 180 can be directly coupled to the head mounted display(s) 300.
  • User input (e.g. audible input received by the microphone 340 and/or buttons on the head mounted display 300) can be used to activate or deactivate different modes of the imaging device 150, such as fluorescence imaging modes, modes which utilize image guided surgery (IGS), and/or modes which include activation of a communication channel with auxiliary equipment, e.g. IGS equipment.
  • the head mounted display(s) 300 can display content 220 which is selectable from at least the real time video of the surgical site, the real time data from the auxiliary data source, and stored data.
  • There can be more than one real time video which is selectable as content 220.
  • a second real time video based on video captured by the surgical imaging device 150 can be selected as content 220 and displayed.
  • the selectable content 220 can include real time video which is a processed video, e.g. a superposition image of a white light image and a fluorescence image.
  • the content 220 can include, for example, real time video which is a stereoscopic image.
  • Fig. 3B illustrates a head mounted display.
  • the head mounted display 300b described herein may be one of a plurality which may be coupled to the image processing device 101, surgical system 100, and/or surgical imaging device 150.
  • Features described with respect to the illustrated head mounted display 300b can be used in any head mounted display described herein.
  • the head mounted display 300b can display content 220, e.g. content transmitted by the surgical imaging device 150, image processing device 101, and/or surgical imaging system 100.
  • the head mounted display 300b can include a display 10, e.g. for displaying the content 220.
  • the head mounted display 300b can include a mounting structure 20 for mounting on a head of a user.
  • the mounting structure 20 can fasten the head mounted display 300b to the head.
  • the head mounted display 300b can include at least one adjuster 30, 50 for variable diopter(s).
  • Each adjuster(s) 30 can include one or more lenses 32, 34, 36.
  • at least one lens 32, 34, 36 of the adjuster 30 is movable so as to alter the effective diopter of the head mounted display 300b, e.g. for allowing a user to adjust the focus.
  • the user can use the adjuster 30 to focus the image of the display 10 in the user’s eye.
  • the adjuster may allow at least partially for some correction of the user’s myopia or hyperopia.
  • the diopter adjustment by the adjuster 30 can allow the user to better focus the image plane of the ambient image (from the surgical site, for example) on the user’s eye.
  • the adjuster 30 can allow the image plane of the user’s field of view to be moved so as to allow better focus.
  • the head mounted display 300b can include at least one adjuster 30 (such as one or two) for the adjustment/correction of diopter (such as for each eye).
  • At least three lenses of the adjuster can be between the display 10 and the eye of the user.
  • Three or more lenses can allow for a suitable range of correction (e.g. diopter adjustment) and/or magnification.
  • One of the lenses can be an aspheric lens. By using at least one aspheric lens, the size and weight of the optical arrangement may be significantly reduced in comparison to a system using only spherical lenses.
  • the first optical arrangement 30 comprises a first lens 32, a second lens 34 and a third lens 36.
  • the first lens 32 may be the lens of the three lenses closest to the first display 10.
  • the second lens 34 may be arranged between the first lens 32 and the third lens 36.
  • the first optical arrangement 30 may comprise exactly three lenses or may comprise more than three lenses.
  • the three lenses may be glass lenses or may be made of other suitable material.
  • the aspheric lens of the three lenses may be the first, second or third lens.
  • the aspheric lens may be the second lens 34 while the first lens 32 and the third lens may be spherical lenses.
  • all three lenses may be aspheric lenses. In this way, size and weight of the first optical arrangement 30 may be kept low.
  • each lens of the three lenses may comprise a first surface and a second surface.
  • the surfaces of the three lenses may represent or form a sequence of surfaces.
  • the first surface of the first aspheric lens may be a first spherical surface
  • the second surface of the first aspheric lens may be a first aspherical surface
  • the first surface of the second aspheric lens may be a second spherical surface
  • the second surface of the second aspheric lens may be a second aspherical surface
  • the first surface of the third aspheric lens may be a third spherical surface
  • the second surface of the third aspheric lens may be a third aspherical surface.
  • the sequence of surfaces may comprise a first spherical surface followed by a first aspherical surface followed by a second spherical surface followed by a second aspherical surface followed by a third spherical surface followed by a third aspherical surface.
  • each lens of the three lenses comprises a different glass material.
  • Three different glass materials may be used for the three lenses.
  • the first lens may comprise or consist of a first glass material
  • the second lens may comprise or consist of a second glass material
  • the third lens may comprise or consist of a third glass material.
  • the first glass material, the second glass material and the third glass material are three different glass materials.
  • the first lens 32 may be a positive lens and/or an aspheric lens.
  • a focal length of the first lens 32 may be at most 25mm (or at most 20mm or at most 30mm) and/or at least 15mm (or at least 10mm or at least 20mm).
  • the second lens 34 may be a negative lens and/or an aspheric lens.
  • a focal length of the second lens 34 may be at most -15mm (or at most -20mm or at most -13mm) and/or at least -5mm (or at least -10mm or at least -3mm).
  • the third lens 36 may be a positive lens and/or an aspheric lens.
  • a focal length of the third lens 36 may be at most 20mm (or at most 25mm or at most 17mm) and/or at least 10mm (or at least 13 mm or at least 7mm).
  • a desired viewing angle, overall size, weight and/or exit pupil size may be obtained.
  • one or more of the three aspheric lenses may be free form lenses. In this way, the size and weight may be further reduced.
  • a total weight of the three lenses may be at most 30 g (or at most 25 g, at most 20 g or at most 35 g).
  • a diameter of each lens of the three lenses may be at most 25 mm (or at most 20mm or at most 30mm).
  • a total focal length of the adjuster 30 may be at most 30 mm (or at most 35mm or at most 25mm) and/or at least 15 mm (or at least 10 mm or at least 20 mm).
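  • As a rough sanity check only, treating the three lenses as thin lenses in contact (real designs also depend on the lens separations and glass choices), example focal lengths picked from the ranges above combine to a total focal length inside the stated 15 mm to 30 mm window:

```python
# Thin-lens-in-contact approximation: 1/f_total = 1/f1 + 1/f2 + 1/f3.
# The example focal lengths are assumptions chosen from the ranges above.
f1, f2, f3 = 15.0, -15.0, 20.0                    # focal lengths in mm
f_total = 1.0 / (1.0 / f1 + 1.0 / f2 + 1.0 / f3)
print(f_total)                                    # 20.0 mm, within 15-30 mm
```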
  • a total optimal distance between the first display 10 and an eye of a user caused by the adjuster 30 may be at most 60mm (or at most 70 mm, at most 55 mm or at most 50 mm).
  • size and/or weight of the first optical arrangement 30 and the head mounted display 300b may be kept low.
  • a display diagonal of the display 10 may be at least 125 mm (or at least 150 mm) and/or at most 250 mm (or at most 200 mm). In this way, a sufficiently large image can be displayed while the weight may be kept low.
  • the head mounted display 300b may include a second display and a second adjuster.
  • a single display spans across the fields of view of each eye, and each eye has a corresponding adjuster 30 for the diopter adjustment.
  • the head mounted display 300b can include at least one adjuster 30, 50 for the adjustment/correction of diopter.
  • the head mounted display 300b can include one or more additional optional features corresponding to one or more aspects of any examples described herein.
  • Fig. 3C illustrates schematically a head mounted display 300c.
  • the head mounted display 300c described herein may be one of a plurality which may be coupled to the image processing device 101, surgical system 100, and/or surgical imaging device 150.
  • the head mounted display 300c can include an adjuster 310 (e.g. an interpupillary adjuster) for adjusting the interpupillary distance (IPD).
  • the adjuster can adjust the relative positions of lenses and/or displays 10, 40 (e.g. in a direction substantially parallel to the direction of a line connecting the pupils of a user).
  • the head mounted display 300c can include one or more adjusters 320, 330 for the diopter adjustment.
  • the adjuster(s) 320, 330 for the diopter adjustment can include respective lenses in the optical paths between the user’s respective eyes and respective displays 10, 40.
  • the diopter adjuster(s) 320, 330 can allow the user to improve the focus of the displays 10, 40 of the head mounted display 300c.
  • the adjusters 320, 330 can include respective optics.
  • the head mounted display 300c can include a first adjuster for variable interpupillary distance and/or a second adjuster for variable diopter, and possibly a third adjuster for variable diopter of a second eye. It is possible to adjust the focus of the surgical imaging device 150 such that the third adjuster is not strictly necessary for one user. When multiple users each have a respective head mounted display 300c, it can be advantageous for each user to have the capability of adjusting diopter for each eye.
  • the display(s) 10, 40 may be an LCD display (Liquid Crystal Display), a TFT display (Thin-film transistor display) or an OLED display (organic light-emitting diode display).
  • the head mounted display(s) 300 can have multiple modes.
  • a first mode can be that each head mounted display 300 receives the content 220 selected by a first user, e.g. the surgeon. This can be convenient particularly when the surgeon is communicating to other users, e.g. remote collaborators and/or students.
  • a second mode can be one in which, at each head mounted display, a respective user interface is provided.
  • Each respective user interface can be configured for receiving a user selection for selection of at least one of a mode, a respective content, or a respective format for the respective head mounted display.
  • a second user for example, can select the mode in which the content is determined by the first user (e.g. the surgeon).
  • a user can select a mode in which the content is determined by the respective user.
  • the content 220 can be selected by the respective user and/or the format.
  • Examples of various formats which can be selected are a superposition, a picture in picture, full-screen, portion-screen, zoom, stereoscopic, monoscopic, augmentation, left eye only, or right eye only.
  • the selection can be transmitted to the image processing device 101, surgical imaging device 150, and/or surgical imaging system 100 for transmission of the respective content 220 to the respective head mounted display 300 or for transmission of the respective format to the respective head mounted display.
  • the image processing device 101, surgical imaging device 150, and/or surgical imaging system 100 transmits the selected respective content 220 to the respective head mounted display 300, and the head mounted display 300 controls the format.
  • a user may wish to view the real time video of the white light image of the surgical site as the main portion of the screen, and to view, as a smaller picture in picture display, another source (such as video of an OCT image, a stored image, or the like).
  • the content 220 can be transmitted to the head mounted display 300.
  • The format, e.g. the picture-in-picture selection, can be transmitted by the image processing device 101, surgical imaging device 150, and/or surgical imaging system 100 to the respective head mounted display 300.
  • the format can be determined at the head mounted display 300.
  • the content 220 and the format may be transmitted, e.g. after a selection of picture-in-picture format, from the image processing device 101, surgical imaging device 150, and/or surgical imaging system 100 to the respective head mounted display 300.
  • the respective head mounted display 300 may display the content 220 and format as received from the image processing device 101, surgical imaging device 150, and/or surgical imaging system 100.
  • the content and format are transmitted by the image processing device 101 including the superposition format of a stereoscopic white light image and a false-color fluorescence image. These cases illustrate that the content and format can be transmitted to the head mounted display(s) 300.
  • the image processing device 101, surgical imaging device 150, and/or surgical imaging system 100 transmits the selected content 220, and the headset 300 determines the format (e.g. after selection of picture in picture or superposition formats).
  • the image processing device 101 transmits a real time stereoscopic video of a white light image of the surgical site, and a real time video of a fluorescence image.
  • the user at the head mounted display 300, can change the format without the selection of format being transmitted to the image processing device 101.
  • the user can change from a picture-in-picture format to a superposition format.
  • the format selection can be done by transmitting the selected format to the image processing device 101, surgical imaging device 150, and/or surgical imaging system 100; the content 220 and format are transmitted by the image processing device 101, surgical imaging device 150, and/or surgical imaging system 100 to the head mounted display(s) 300.
  • the content 220 can possibly be continuously transmitted from the image processing device 101, surgical imaging device 150, and/or surgical imaging system 100 to the head mounted display(s) 300.
  • the user may select different formats of the content 220, e.g. without transmitting the selection of the format to the image processing device 101, surgical imaging device 150, and/or surgical imaging system 100.
  • Each of the head mounted displays 300, 300b, 300c described herein includes features which can be used in the other head mounted displays 300, 300b, 300c.
  • each can have features for user input, displays, and/or adjusters for IPD and/or diopter(s).
  • Each can be communicatively coupled to auxiliary devices 180 and/or image processing devices 101. The coupling may be wired and/or wireless.
  • Fig. 4 illustrates a method of communicating a surgical procedure.
  • the method 400 can include capturing 410, using a detector 170, a captured video of a surgical site in an operating room, and outputting 420 a real time video to at least one remote user outside of the operating room (e.g. outputting 420 to a head mounted display 300 which is outside of the operating room).
  • the real time video is based on the captured video.
  • the method 400 can be done by an apparatus, such as the surgical system 100 including the surgical imaging device 150 and image processing device 101.
  • the surgical imaging device can be a stereomicroscope.
  • the apparatus can include a head mounted display for displaying the real time video to a user(s) in the operating room, e.g. a surgeon.
  • the output can include the real time video displayed to the user(s) in the operating room.
  • Any of the surgical imaging devices 150, image processing devices 101, and/or surgical systems 100 described herein can be configured, such as by a computer program stored in memory 120, to perform the method 400.
  • An apparatus for performing the method 400 can include at least one detector 170 for capturing a captured video of a surgical site in an operating room, and an output configured to output 420 the real time video.
  • A second detector, for example, can be used for generating the real time video or a second real time video. Any one or more of the real time videos generated can be a stereoscopic video. Alternatively/additionally, a stereoscopic video can be generated on the user side, e.g. at a remotely located head mounted display 300, from two or more real time videos.
  • the apparatus may communicatively couple to one or more auxiliary data sources 180.
  • the method 400 can include receiving a real-time vital sign data from the auxiliary data source(s) 180.
  • the apparatus can determine the content 220 for output, and transmit the content to the remote user(s).
  • the content can be selectable from at least one of: the real time video, real time data from an auxiliary data source, or stored data.
  • the content includes the real time video and a selectable content of at least one of: real time data from an auxiliary data source, or stored data.
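  • A minimal end-to-end sketch of method 400 is given below, with stand-in callables for the detector, the processing, and the outputs; all names are hypothetical.

```python
from typing import Callable, Iterable


def communicate_surgical_procedure(capture_frame: Callable[[], object],
                                   process: Callable[[object], object],
                                   outputs: Iterable[Callable[[object], None]],
                                   n_frames: int = 3) -> None:
    """Sketch of method 400: capture video of the surgical site (410), derive
    a real time video from it, and output it (420) to each connected display,
    including remote head mounted displays outside the operating room."""
    outputs = list(outputs)
    for _ in range(n_frames):
        raw = capture_frame()        # 410: captured frame from the detector 170
        frame = process(raw)         # e.g. white light / fluorescence processing
        for send in outputs:         # 420: output to local and remote users
            send(frame)


# Example with stand-in callables.
communicate_surgical_procedure(
    capture_frame=lambda: "raw frame",
    process=lambda f: f"processed({f})",
    outputs=[lambda f: print("OR display:", f),
             lambda f: print("remote HMD:", f)],
)
```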
  • Some or all of the method steps described herein may be executed by (or using) a hardware apparatus, like for example, a processor, a microprocessor, a programmable computer or an electronic circuit.
  • the methods described herein can be implemented in hardware or in software.
  • the implementation can be performed using a non-transitory storage medium such as a digital storage medium, for example a floppy disc, a DVD, a Blu-Ray, a CD, a ROM, a PROM, an EPROM, an EEPROM or a FLASH memory, having electronically readable control signals stored thereon, which cooperate (or are capable of cooperating) with a programmable computer system such that the respective method is performed.
  • the digital storage medium may be computer readable.
  • Some embodiments include a data carrier having electronically readable control signals, which are capable of cooperating with a programmable computer system, such that one of the methods described herein is performed.
  • Embodiments described herein can be implemented as a computer program product with a program code, the program code being operative for performing one of the methods when the computer program product runs on a computer.
  • the program code may, for example, be stored on a machine readable carrier.
  • Embodiments include the computer program for performing one of the methods described herein, stored on a machine readable carrier.
  • a computer program having a program code for performing the methods described herein, when the computer program runs on a computer.
  • a computer program configured to operate any one or more of the head mounted display(s) described herein, the image processing device 101 described herein, the surgical system 100 described herein, and/or the surgical imaging device 150 described herein.
  • a storage medium (or a data carrier, or a computer-readable medium) comprising, stored thereon, the computer program for performing the methods described herein when it is performed by a processor.
  • an apparatus as described herein comprising a processor and the storage medium for executing the methods described herein.
  • a data stream or a sequence of signals representing the computer program for performing one of the methods described herein.
  • the data stream or the sequence of signals may, for example, be configured to be transferred via a data communication connection, for example, via the internet.
  • A processing means, for example a computer or a programmable logic device, can be configured to, or adapted to, perform the methods described herein.
  • a field programmable gate array may cooperate with a microprocessor in order to perform one of the methods described herein. The methods described herein are preferably performed by any hardware apparatus.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Endoscopes (AREA)

Abstract

The invention relates to an image processing device configured to: be communicatively coupled to at least one head mounted display, be communicatively coupled to at least one sensor configured for generating a real time video of a surgical site, determine a content for display by the at least one head mounted display, and transmit the content to at least one head mounted display. The content can be selected from at least two of: the real time video, real time data from an auxiliary data source, or stored data.
PCT/EP2022/077058 2021-09-30 2022-09-28 Devices and systems used for imaging during surgery WO2023052474A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102021125417.5 2021-09-30
DE102021125417 2021-09-30

Publications (1)

Publication Number Publication Date
WO2023052474A1 true WO2023052474A1 (fr) 2023-04-06

Family

ID=84044448

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/077058 WO2023052474A1 (fr) 2021-09-30 2022-09-28 Dispositifs et systèmes utilisés pour l'imagerie pendant la chirurgie

Country Status (1)

Country Link
WO (1) WO2023052474A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018217951A1 (fr) * 2017-05-24 2018-11-29 Camplex, Inc. Surgical visualization systems and displays
WO2021168449A1 (fr) * 2020-02-21 2021-08-26 Raytrx, Llc Fully digital multi-option 3D surgical visualization system and control


Similar Documents

Publication Publication Date Title
US10197803B2 (en) Augmented reality glasses for medical applications and corresponding augmented reality system
US20230122367A1 (en) Surgical visualization systems and displays
US20230255446A1 (en) Surgical visualization systems and displays
US20220054223A1 (en) Surgical visualization systems and displays
US11147443B2 (en) Surgical visualization systems and displays
US11154378B2 (en) Surgical visualization systems and displays
US10028651B2 (en) Surgical visualization systems and displays
CN103238339B (zh) System and method for viewing and tracking stereoscopic video images
EP2903551B1 (fr) Digital system for capturing and displaying surgical video
US12062430B2 (en) Surgery visualization theatre
US20210282887A1 (en) Surgical augmented reality
EP3547095A1 (fr) Information processing apparatus and method, and program
WO2021226134A1 (fr) Surgery visualization theatre
US20240266033A1 (en) Surgery visualization theatre
Mueller-Richter et al. Possibilities and limitations of current stereo-endoscopy
JP2004320722A (ja) Stereoscopic observation system
WO2023052474A1 (fr) Devices and systems used for imaging during surgery
WO2023052535A1 (fr) Devices and systems for use in imaging during surgery
WO2023052566A1 (fr) Devices and systems for use in imaging during surgery
Southern et al. Video microsurgery: early experience with an alternative operating magnification system
US20230179755A1 (en) Stereoscopic imaging apparatus with multiple fixed magnification levels
US20230380912A1 (en) Medical imaging control apparatus, medical imaging system and method of operating a medical imaging system
US20240090742A1 (en) Portable surgical methods, systems, and apparatus
EP4146115A1 (fr) Surgery visualization theatre

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22797720

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE