EP2769270A1 - Holographic user interfaces for medical procedures - Google Patents

Holographic user interfaces for medical procedures

Info

Publication number
EP2769270A1
EP2769270A1 (application EP12798835.0A)
Authority
EP
European Patent Office
Prior art keywords
anatomical image
recited
rendered anatomical
monitored
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP12798835.0A
Other languages
German (de)
French (fr)
Other versions
EP2769270B1 (en)
Inventor
Laurent VÉRARD
Raymond Chan
Daniel Simon Anna Ruijters
Sander Hans DENISSEN
Sander Slegt
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV
Publication of EP2769270A1
Application granted
Publication of EP2769270B1
Legal status: Not-in-force
Anticipated expiration

Classifications

    • G - PHYSICS
    • G03 - PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03H - HOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H 1/00 - Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
    • G03H 1/04 - Processes or apparatus for producing holograms
    • G03H 1/0005 - Adaptation of holography to specific applications
    • G03H 1/22 - Processes or apparatus for obtaining an optical image from holograms
    • G03H 1/2202 - Reconstruction geometries or arrangements
    • G03H 2001/0061 - Adaptation of holography to specific applications in haptic applications when the observer interacts with the holobject
    • G03H 2210/00 - Object characteristics
    • G03H 2210/30 - 3D object
    • G03H 2210/33 - 3D/2D, i.e. the object is formed of stratified 2D planes, e.g. tomographic data
    • G03H 2226/00 - Electro-optic or electronic components relating to digital holography
    • G03H 2226/04 - Transmission or communication means, e.g. internet protocol
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16 - Sound input; Sound output
    • G06F 3/167 - Audio in a user interface, e.g. using voice commands for navigating, audio feedback

Definitions

  • the present disclosure relates to medical systems, devices and methods, and more particularly to systems, devices and methods pertaining to integration of holographic image data with other information to improve accuracy and effectiveness in medical applications.
  • an interactive holographic display system includes a holographic generation module configured to display a holographically rendered anatomical image.
  • a localization system is configured to define a monitored space on or around the holographically rendered anatomical image.
  • One or more monitored objects have their position and orientation monitored by the localization system such that coincidence of spatial points between the monitored space and the one or more monitored objects triggers a response in the holographically rendered anatomical image.
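As a minimal sketch of this triggering behavior (hypothetical names and geometry; the patent does not specify an implementation), the monitored space can be modeled as a bounding region and coincidence as a point-in-region test:

```python
# Illustrative sketch: the localization system reports 3D positions of a
# monitored object (e.g., fingertips), and coincidence of any point with the
# monitored space around the rendered image triggers a response.

class MonitoredSpace:
    """Axis-aligned bounding region defined on or around the rendered image."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def contains(self, point):
        # Coincidence test: the point lies within the region on every axis.
        return all(l <= p <= h for l, p, h in zip(self.lo, point, self.hi))

def check_coincidence(space, object_points):
    """True if any monitored point coincides with the monitored space."""
    return any(space.contains(p) for p in object_points)

# A tracked fingertip outside the space, then one inside it.
space = MonitoredSpace(lo=(-0.1, -0.1, -0.1), hi=(0.1, 0.1, 0.1))
fingertips = [(0.5, 0.5, 0.5), (0.05, 0.0, 0.02)]
```

A real system would use the registered coordinate system of the localization system rather than a fixed box.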
  • Another interactive holographic display system includes a processor and memory coupled to the processor.
  • a holographic generation module is included in the memory and configured to display a holographically rendered anatomical image as an in-air hologram or on a holographic display.
  • a localization system is configured to define a monitored space on or around the holographically rendered anatomical image.
  • One or more monitored objects have their position and orientation monitored by the localization system such that coincidence of spatial points between the monitored space and the one or more monitored objects triggers a response in the holographically rendered anatomical image, wherein the response includes one or more of: translation or rotation of the holographically rendered anatomical image, magnification adjustment of the holographically rendered anatomical image, etc.
  • A method for interacting with a holographic display includes displaying a holographically rendered anatomical image; localizing a monitored space on or around the holographically rendered anatomical image to define a region for interaction; and monitoring a position and orientation of one or more monitored objects with the localization system.
  • FIG. 1 is a block/flow diagram showing a system for interfacing with holograms in accordance with exemplary embodiments.
  • FIG. 2 is a perspective view of a hologram rendered with a data map or overlay thereon in accordance with an illustrative embodiment
  • FIG. 3 is a block diagram showing an illustrative process flow for displaying a data map or overlay in a holographic image in accordance with an illustrative embodiment
  • FIG. 4 is a block diagram showing an illustrative system and process flow for displaying static or animated objects in a holographic image in accordance with an illustrative embodiment
  • FIG. 5 is a diagram showing an illustrative image for displaying an objects menu for selecting virtual objects during a procedure for display in a holographic image in accordance with an illustrative embodiment
  • FIG. 6 is a block diagram showing an illustrative system for controlling a robot using a holographic image in accordance with an illustrative embodiment
  • FIG. 7 is a block diagram showing an illustrative system which employs haptic feedback with a holographic image in accordance with an illustrative embodiment
  • FIG. 8 is a diagram showing multiple views provided to different perspectives in an illustrative system for displaying a holographic image or the like in accordance with one embodiment
  • FIG. 9 is a block diagram showing an illustrative system for controlling a robot remotely over a network using a holographic image in accordance with an illustrative embodiment.
  • FIG. 10 is a flow diagram showing a method for interfacing with a hologram in accordance with an illustrative embodiment.
  • Systems, devices and methods are provided which leverage holographic display technology for medical procedures.
  • This can be done using 3D holographic technologies (e.g., in-air holograms) and real-time 3D input sensing methods such as optical shape sensing to provide a greater degree of human-data interaction during a procedure.
  • Employing holographic technology with other technologies potentially simplifies procedure workflow, instrument selection, and manipulation within the anatomy of interest.
  • Such exemplary embodiments described herein can utilize 3D holographic displays for real-time visualization of volumetric datasets with exemplary localization methods for sensing movements in free space during a clinical procedure, thereby providing new methods of human-data interaction in the interventional suite.
  • 3D holography may be used to fuse anatomical data with functional imaging and "sensing" information.
  • A fourth dimension (e.g., time, color, texture, etc.) may also be incorporated into the display.
  • a display can be in (near) real-time and use color-coded visual information and/or haptic feedback/tactile information, for example, to convey different effects of states of the holographically displayed object of interest.
  • Such information can include morphological information about the target, functional information about the object of interest, etc.
  • the exemplary 3D holographic display can be seen from (virtually) any angle/direction so that, e.g., multiple users can simultaneously interact with the same understanding and information.
  • Such "touch” can also be used to, e.g., rotate the virtual organ, zoom, tag points in 3D, draw a path and trajectory plan (e.g., for treatment, targeting, etc.), select critical zones to avoid, create alarms, and drop virtual objects (e.g., implants) in 3D in the displayed 3D anatomy.
  • Exemplary embodiments according to the present disclosure can also be used to facilitate a remote procedure (e.g., where the practitioner "acts" on the virtual organ and a robot simultaneously or subsequently performs the procedure on the actual organ), to practice a procedure before performing the actual procedure in a training or simulation setting, and/or to review/study/teach a procedure after it has been performed (e.g., through data recording, storage, and playback of the 3D holographic display and any associated multimodality signals relevant to the clinical procedure).
  • Exemplary embodiments according to the present disclosure are further described herein below with reference to the appended figures. While such exemplary embodiments are largely described separately from one another (e.g., for ease of presentation and understanding), one having ordinary skill in the art shall appreciate in view of the teachings herein that such exemplary embodiments can be used independently and/or in combination with each other. Indeed, the implementation and use of the exemplary embodiments described herein, including combinations and variations thereof, all of which are considered a part of the present disclosure, can depend on, e.g., particular laboratory or clinical use/application, integration with other related technologies, available resources, etc.
  • a real-time 3D holographic display in accordance with the present principles may include a real-time six degree of freedom (DOF) input via localization technology embedded into a data interaction device (e.g., a haptic device for sensory feedback).
  • An imaging / monitoring system for multidimensional data acquisition may also be employed. Datalinks between the holography display, localization system / interaction device, and imaging / monitoring system may be provided for communication between these systems.
  • The display, feedback devices, localization devices and measurement devices may be employed with or integrated with a computational workstation for decision support and data libraries of case information that can be dynamically updated / recalled during a live case for training / teaching / procedure guidance purposes (e.g., for similar archived clinical cases relative to the procedure and patient undergoing treatment).
  • the present invention will be described in terms of medical instruments; however, the teachings of the present invention are much broader and are applicable to any systems that can benefit from holographic visualization.
  • the present principles are employed in tracking or analyzing complex biological or mechanical systems.
  • the present principles are applicable to internal tracking procedures of biological systems, procedures in all areas of the body such as the lungs, gastro-intestinal tract, excretory organs, blood vessels, etc.
  • The elements depicted in the FIGS. may be implemented in various combinations of hardware and software and provide functions which may be combined in a single element or multiple elements.
  • processor can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software.
  • the functions can be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which can be shared.
  • explicit use of the term "processor" or "controller" should not be construed to refer exclusively to hardware capable of executing software, and can implicitly include, without limitation, digital signal processor ("DSP") hardware, read-only memory ("ROM") for storing software, random access memory ("RAM"), non-volatile storage, etc.
  • embodiments of the present invention can take the form of a computer program product accessible from a computer-usable or computer-readable storage medium providing program code for use by or in connection with a computer or any instruction execution system.
  • A computer-usable or computer-readable storage medium can be any apparatus that may include, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium.
  • Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk.
  • Current examples of optical disks include compact disk - read only memory (CD-ROM), compact disk - read/write (CD-R/W), Blu-Ray™ and DVD.
  • System 100 may include a workstation or console 112 from which a procedure is supervised and/or managed.
  • Workstation 112 preferably includes one or more processors 114 and memory 116 for storing programs and applications.
  • Memory 116 may store a holographic generation module 115 configured to render a holographic image on a display 158 or in-air depending on the application.
  • the holographic generation module 115 codes image data to generate a three dimensional hologram.
  • the coding may provide the hologram on a 2D display or in 3D media or 3D display.
  • Data from 3D imaging (e.g., computed tomography, ultrasound, magnetic resonance) may be transformed into a hologram using spatial distribution and light intensity to render the hologram.
  • a localization system 120 includes a coordinate system 122 to which a holographic image or hologram 124 is registered.
  • the localization system 120 may also be employed to register a monitored object 128, which may include virtual instruments, which are separately created and controlled, real instruments, a physician's hands, fingers or other anatomical parts, etc.
  • the localization system 120 may include an electromagnetic tracking system, a shape sensing system, such as a fiber optic based shape sensing system, an optical sensing system, including light sensors and arrays, or other sensing modality, etc.
  • the localization system 120 is employed to define spatial regions in and around the hologram or the holographic image 124 to enable a triggering of different functions or actions as a result of movement in the area of the hologram 124.
  • For example, dynamic locations of a physician's hands may be tracked using a fiber optic shape sensing device; as the hands approach the hologram, the intensity of the hologram may be increased.
  • the physician's hand movements may be employed to spatially alter the position or orientation of the hologram 124 or to otherwise interact with the hologram 124.
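A toy sketch of this interaction (hypothetical names; the patent does not give an algorithm): while a tracked hand moves within the interaction space, its displacement between localization samples translates the hologram, and a measured twist angle rotates it about the vertical axis:

```python
import math

# Illustrative sketch: hand displacement translates the hologram; a twist
# angle rotates it. The pose representation and update rule are assumptions.

class HologramPose:
    def __init__(self):
        self.position = [0.0, 0.0, 0.0]   # meters, in the localization frame
        self.yaw = 0.0                    # radians about the vertical axis

    def apply_hand_motion(self, prev_hand, curr_hand, twist=0.0):
        # Translate by the hand's displacement between two tracked samples.
        for i in range(3):
            self.position[i] += curr_hand[i] - prev_hand[i]
        # Rotate by the measured twist, kept in [0, 2*pi).
        self.yaw = (self.yaw + twist) % (2 * math.pi)

pose = HologramPose()
pose.apply_hand_motion((0.0, 0.0, 0.0), (0.02, 0.0, -0.01), twist=math.pi / 6)
```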
  • a monitored object or sensing system 128 may be spatially monitored relative to the hologram 124 or the space 126 around the hologram 124.
  • the monitored object 128 may include the physician's hands, a real or a virtual tool, another hologram, etc.
  • the monitored object 128 may include a sensor or sensors 132 adapted to monitor the position of the monitored object 128 such that when a position of the object or a portion thereof is within the hologram 124 or the space 126 around the hologram 124, a reaction occurs that is consistent with the type of the monitored object 128 and the action performed or to be performed by the monitored object 128.
  • the sensor or sensors 132 may include EM sensors, fiber optic shape sensors, etc.
  • the sensors 132 include fiber optic shape sensors.
  • a sensor interpretation module 134 may be employed to interpret feedback signals from a shape sensing device or system (132).
  • Interpretation module 134 is configured to use the signal feedback (and any other feedback, e.g., optical, electromagnetic (EM) tracking, etc.) to reconstruct motion, deflection and other changes associated with the monitored object 128, which may include a medical device or instrument, virtual tools, human anatomical features, etc.
  • the medical device may include a catheter, a guidewire, a probe, an endoscope, a robot, an electrode, a filter device, a balloon device, or other medical component, etc.
  • the shape sensing system (132) may include one or more optical fibers which are coupled to the monitored object 128 in a set pattern or patterns.
  • the optical fibers connect to the workstation 112 through cabling 127.
  • the cabling 127 may include fiber optics, electrical connections, other instrumentation, etc., as needed.
  • Shape sensing system (132) may be based on fiber optic Bragg grating sensors.
  • a fiber optic Bragg grating (FBG) is a short segment of optical fiber that reflects particular wavelengths of light and transmits all others. This is achieved by adding a periodic variation of the refractive index in the fiber core, which generates a wavelength-specific dielectric mirror.
  • a fiber Bragg grating can therefore be used as an inline optical filter to block certain wavelengths, or as a wavelength-specific reflector.
  • a fundamental principle behind the operation of a fiber Bragg grating is Fresnel reflection at each of the interfaces where the refractive index is changing. For some wavelengths, the reflected light of the various periods is in phase so that constructive interference exists for reflection and, consequently, destructive interference for transmission.
  • the Bragg wavelength is sensitive to strain as well as to temperature. This means that Bragg gratings can be used as sensing elements in fiber optical sensors. In an FBG sensor, the measurand (e.g., strain) causes a shift in the Bragg wavelength.
  • One advantage of this technique is that various sensor elements can be distributed over the length of a fiber. Incorporating three or more cores with various sensors (gauges) along the length of a fiber that is embedded in a structure permits a three dimensional form of such a structure to be precisely determined, typically with better than 1 mm accuracy.
  • a multitude of FBG sensors can be located (e.g., 3 or more fiber sensing cores). From the strain measurement of each FBG, the curvature of the structure can be inferred at that position. From the multitude of measured positions, the total three-dimensional form is determined.
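The strain-to-curvature-to-shape chain above can be sketched under simplifying assumptions (three cores at 120°, small radial core offset r, planar bending). The per-core strain model eps_i = -kappa·r·cos(phi - theta_i) and the arc integration are textbook approximations, not the patent's algorithm:

```python
import math

# Hedged sketch of FBG shape sensing: recover bend curvature (kappa) and
# bend direction (phi) from three core strains, then integrate curvature
# along the fiber to obtain positions (planar case for brevity).

CORE_ANGLES = [0.0, 2 * math.pi / 3, 4 * math.pi / 3]  # cores at 120 degrees

def curvature_from_strains(strains, r):
    # Fit eps_i = a*cos(theta_i) + b*sin(theta_i); for three symmetric cores
    # a = (2/3) * sum(eps_i * cos(theta_i)), b likewise with sin.
    a = (2.0 / 3.0) * sum(e * math.cos(t) for e, t in zip(strains, CORE_ANGLES))
    b = (2.0 / 3.0) * sum(e * math.sin(t) for e, t in zip(strains, CORE_ANGLES))
    kappa = math.hypot(a, b) / r          # curvature magnitude
    phi = math.atan2(-b, -a)              # bend direction in the core frame
    return kappa, phi

def integrate_planar(kappas, ds):
    """Integrate heading angle along the fiber to get 2D positions."""
    x, y, heading = 0.0, 0.0, 0.0
    points = [(x, y)]
    for kappa in kappas:
        heading += kappa * ds
        x += ds * math.cos(heading)
        y += ds * math.sin(heading)
        points.append((x, y))
    return points
```

A full 3D reconstruction would additionally transport an orthonormal frame along the fiber and compensate temperature, which shifts the Bragg wavelength as noted above.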
  • workstation 112 includes an image generation module 148 configured to receive feedback from the shape sensing system 132 or other sensor to sense interactions with the hologram 124.
  • a position and status of the hologram 124 and its surrounding space 126 is known to the localization system 120.
  • a comparison module 142 determines whether an action is triggered depending on a type of motion, a type of monitored object 128, a type of procedure or activity and/or any other criteria.
  • the comparison module 142 informs the holographic generation module 115 that a change is needed.
  • the holographic generation module 115 recodes the image data, which is processed and output to the image generation module 148, which updates the hologram 124 in accordance with set criteria.
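One way the comparison module's decision could be organized (all names hypothetical) is a dispatch table from the type of monitored object and type of motion to the hologram update that the holographic generation module then applies:

```python
# Hypothetical dispatch table: (object type, motion type) -> hologram update.
# A real system would also condition on the type of procedure and other
# criteria mentioned in the text.

ACTIONS = {
    ("hand", "drag"): "translate",
    ("hand", "twist"): "rotate",
    ("hand", "pinch"): "adjust_magnification",
    ("instrument", "tap"): "tag_point",
}

def triggered_action(object_type, motion_type):
    """Return the hologram update to apply, or None if nothing is triggered."""
    return ACTIONS.get((object_type, motion_type))
```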
  • the hologram 124 may include an internal organ rendered based on 3D images 152 of a patient or subject 150.
  • the images 152 may be collected from the patient 150 preoperatively using an imaging system 110.
  • the imaging system 110 and the patient 150 need not be present to employ the present principles as the system 100 may be employed for training, analysis or other purposes at any time.
  • a physician employs a pair of gloves having sensors 132 disposed thereon.
  • When the gloves/sensors 132 enter the space 126 and coincide with the hologram 124, the physician is able to rotate or translate the hologram 124.
  • The gloves include a haptic device 156 that provides tactile feedback depending on a position of the gloves.
  • the haptic feedback is indicative of the tissue type corresponding with the hologram 124 and its representation.
  • the haptic device or system 156 may include ultrasound sources, speakers or other vibratory sources to convey differences in state of the hologram 124 using vibrations or sound.
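One simple way such state-dependent feedback could be organized (the profile values here are invented for illustration) is a lookup from the tissue type under the user's fingertip to a vibration profile for the haptic device:

```python
# Hypothetical mapping from tissue type in the hologram to a vibration
# profile, as one way tactile feedback could convey differences in state.

HAPTIC_PROFILES = {
    "soft_tissue": {"frequency_hz": 80, "amplitude": 0.2},
    "vessel_wall": {"frequency_hz": 150, "amplitude": 0.5},
    "calcification": {"frequency_hz": 250, "amplitude": 0.9},
}

def haptic_for(tissue_type):
    # Default to a faint pulse when the tissue type is unknown.
    return HAPTIC_PROFILES.get(tissue_type, {"frequency_hz": 50, "amplitude": 0.1})
```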
  • A display 118 and/or display 158 may also permit a user to interact with the workstation 112, the hologram 124 and its components and functions, or any other element within the system 100. This is further facilitated by an interface 130 which may include a keyboard, mouse, a joystick, a haptic device, or any other peripheral or control to permit user feedback from and interaction with the workstation 112.
  • a user can touch (or otherwise interact with) a specific region of interest (ROI) 154 within the 3D holographic display 158 or the hologram 124 within the 3D holographic display (or elsewhere) to display additional information related to the selected specific region of interest, e.g., tissue characteristics, such as temperature, chemical content, genetic signature, pressure, calcification percent, etc.
  • An overlay of information can be displayed or presented on a separate exemplary 2D display (118), whereby parts of the 2D display can be transparent, for example, for better viewing of displayed information.
  • the exemplary 2D display 118 presents or displays other graphics and text in high resolution (e.g., in exemplary embodiments where the 3D display may be of relatively low or limited resolution).
  • zones or regions of interest 154 can be automatically highlighted and/or outlined within the 3D holographic display 158 or hologram 124.
  • Such other zones of interest can be, e.g., zones which have similar characteristics as the selected zone of interest and/or zones which are otherwise related.
  • the 3D holographic display 158 or hologram 124 may be employed with six degrees of freedom (6DOF) user tracking, e.g., with shape enabled instruments 132 and/or with camera based sensors 137, allowing for use as a user interface in 3D and real-time 6DOF user interaction.
  • A user (e.g., a practitioner) can rotate the hologram, zoom in/out (e.g., changing the magnification), etc.
  • Seed points 162 may be created and dropped into the 3D holographic display 158 or hologram 124 by touching (and/or tapping, holding, etc.) a portion of the display 158 or the hologram 124.
  • the seed points 162 may be employed for, e.g., activation of virtual cameras which can provide individually customized viewing perspectives (e.g., orientation, zoom, resolution, etc.) which can be streamed (or otherwise transmitted) onto a separate high resolution 2D display 118.
  • the touch feature can be employed to create or drop virtual seed points 162 into the 3D display 158 for a plurality of tasks, e.g., initialization of segmentation, modeling, registration or other computation, visualization, planning step, etc.
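A minimal sketch of how dropped seed points could be stored so that later tasks (segmentation initialization, virtual cameras, planning) can consume them; the structure and labels are illustrative:

```python
# Illustrative store for seed points dropped by touching the display: each
# point keeps its 3D position and the purpose it was dropped for.

class SeedPoints:
    def __init__(self):
        self._points = []

    def drop(self, position, purpose):
        self._points.append({"position": tuple(position), "purpose": purpose})

    def for_purpose(self, purpose):
        return [p["position"] for p in self._points if p["purpose"] == purpose]

seeds = SeedPoints()
seeds.drop((0.01, 0.02, 0.00), purpose="segmentation")
seeds.drop((0.03, -0.01, 0.05), purpose="virtual_camera")
```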
  • the display can also be used to display buttons, drop down menus, pointers/trackers, optional functions, etc. allowing users to interact and give commands to the system and/or any computer included therein or connected thereto (e.g., directly connected or via the Internet or other network).
  • a microphone 164 may be employed to receive verbal information to connect, control, interact, etc. with the exemplary 3D holographic display 158 or hologram 124 via voice-controlled commands.
  • a speech recognition engine 166 may be provided to convert speech commands into program commands to allow a user (e.g., surgeon) to interact with the 3D holographic display 158 or hologram 124 without having to use their hands. For example, a user could say "SHOW LAO FORTY", and the volume displayed within the holographic image would rotate to the proper angle to provide the user with the desired view.
  • commands can range from those which are relatively simple, such as "ZOOM", followed by a specific amount e.g., "3 times” or so as to display particular (additional) information, to more complex commands, e.g., which can be related to a specific task or procedure.
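The two quoted utterances suggest a small command grammar; a hedged sketch of converting recognized speech text into program commands follows (the vocabulary and command structure are illustrative, not the patent's grammar):

```python
import re

# Illustrative parser covering the two examples in the text:
# "SHOW LAO FORTY" -> rotate to the LAO 40 view; "ZOOM 3 TIMES" -> zoom x3.

NUMBER_WORDS = {"TEN": 10, "TWENTY": 20, "THIRTY": 30, "FORTY": 40, "FIFTY": 50}

def parse_command(utterance):
    words = utterance.upper().split()
    if words and words[0] == "SHOW" and len(words) == 3 and words[2] in NUMBER_WORDS:
        return {"command": "rotate_to_view", "view": words[1],
                "angle_deg": NUMBER_WORDS[words[2]]}
    m = re.match(r"ZOOM\s+(\d+)\s*(TIMES)?", utterance.upper())
    if m:
        return {"command": "zoom", "factor": int(m.group(1))}
    return {"command": "unknown"}
```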
  • a recording mode can be provided in memory 116 and made available to, e.g., play back a case on a same device for full 3D replay and/or on conventional (2D or 3D) viewing devices with automatic conversion of recorded 3D scenes into multiple 2D viewing perspectives (or rotating 3D models, e.g., in virtual reality modeling language (VRML)).
  • Data connections between the holographic display 158 and recordings archived in a library/database 168 such as a picture archiving and communication system (PACS), Radiology Information Systems (RIS) or other electronic medical record system can be used to facilitate, e.g., visualization and diagnostic interpretation/data mining.
  • Recordings can be replayed and used for, e.g., teaching and training purposes, such as to teach or train others in an individual setting, (e.g., when a user wants to review a recorded procedure performed), a small group environment (e.g., with peers and/or management), a relatively large class, lecture, etc.
  • Such exemplary recordings may also be used for marketing presentations, research environments, etc. and may also be employed for quality and regulatory assessment, e.g., process evaluation or procedure assessment by hospital administrators, third-party insurers, investors, the Food and Drug Administration (FDA) and/or other regulatory bodies.
  • Virtual cameras may be employed to capture or record multiple viewpoints/angles and generate multiple 2D outputs for, e.g., video capture or simultaneous display of images on different 2D television screens or monitors (or sections thereof).
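The virtual-camera idea can be sketched as projecting the same 3D point set from several viewpoints into 2D image coordinates (a basic pinhole model with hypothetical parameters, not the system's renderer):

```python
import math

# Illustrative pinhole projection: each "virtual camera" views the scene
# from its own yaw angle and distance and produces 2D coordinates for a
# separate 2D display.

def project(points, yaw, distance, focal=1.0):
    out = []
    c, s = math.cos(yaw), math.sin(yaw)
    for x, y, z in points:
        # Rotate the scene about the vertical (y) axis, then place the
        # camera 'distance' back along the viewing axis.
        xr, zr = c * x + s * z, -s * x + c * z
        zc = zr + distance
        out.append((focal * xr / zc, focal * y / zc))
    return out

# Three viewpoints of the same point, e.g., for three 2D monitors.
views = {angle: project([(0.1, 0.0, 0.0)], math.radians(angle), 2.0)
         for angle in (0, 90, 180)}
```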
  • three-dimensional (3D) holography may be used to display volumetric data of an anatomy (e.g., from a 3D CT scan), for example, to fuse anatomical with functional imaging and "sensing" information, as well as temporal (time-related) information.
  • the information may be employed to create (generate, produce, display, etc.) a dynamic 3D multimodality representation 202 (e.g., a hologram) of an object (e.g., organ) and a status thereof using visual indicators 204, 206, such as colors, contrast levels and patterns from a display 210.
  • the object 202 (e.g., hologram 124) may show different regions 204, 206 to indicate useful data on the object 202.
  • epicardial and/or endocardial mapping data can be used to, e.g., display electrical activity data on a heart image during an electrophysiology procedure, superimposed with the anatomical imaging data of the heart (e.g., coming from CT, XperCT or MR).
  • Another example is the display of temperature maps which can be provided by MR during ablation, or magnetic resonance high-intensity focused ultrasound (MR-HIFU) 4D (four-dimensional) information during an intervention (e.g., using MR digital data transfer systems and procedures).
  • Information associated with a real-time radiation dose distribution map may also be superimposed over the anatomical target during a radiation oncology treatment (e.g., Linac, brachytherapy, etc.).
  • Other embodiments are also contemplated.
  • a volumetric image 302 of a heart is acquired and may be segmented to reduce computational space and to determine anatomical features of the heart as opposed to other portions of the image. This results in a segmented image 304.
  • Functional or device data is acquired by performing measurements or tests in block 306 on the heart or other anatomical feature.
  • an electroanatomical map or other map is generated corresponding with the heart or organ. The map is registered to the segmented image 304 to provide a registered image 310 that may be generated and displayed as a hologram.
  • data may be collected from within or about the heart using a localization technique (shape sensing, etc.).
  • Data traces of catheter positions or other related data (treatment locations, etc.) may be rendered in a holographic image 312 which includes both the anatomical data (e.g., segmented hologram) and the device data (e.g., catheter data).
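The registration of device traces into the segmented anatomical frame described above can be sketched as follows (a minimal sketch assuming numpy and a known rigid transform from the localization frame to the image frame; the transform values, point list, and function name are illustrative assumptions, not taken from the disclosure):

```python
import numpy as np

def register_points(points_local, rotation, translation):
    """Map localized points (e.g., catheter tip positions from shape
    sensing) into the segmented-image coordinate frame via a rigid
    transform: p_image = R @ p_local + t."""
    points_local = np.asarray(points_local, dtype=float)
    return points_local @ np.asarray(rotation).T + np.asarray(translation)

# Hypothetical example: 90-degree rotation about z plus an offset.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([10.0, 20.0, 30.0])
trace = [[1.0, 0.0, 0.0], [2.0, 0.0, 0.0]]  # two sampled catheter positions
registered = register_points(trace, R, t)
```

Once registered, the resulting points can be rendered in the same volume as the segmented hologram (image 312).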
  • Another exemplary embodiment according to the present disclosure includes the acquisition of incomplete data (e.g., projections rather than full 3D images).
  • This may include, for example, data in Fourier (frequency) space where intermittent or incomplete images are acquired.
  • undersampled image data in the frequency domain are collected.
  • it is possible to construct (generate, produce, display, etc.) a 3D holographic image display with relatively less or a reduced amount of input data, and thus a relatively less or reduced amount of associated radiation exposure and computational overhead.
  • the resultant 3D holographic image may be constructed/displayed with (some) limitations.
  • Such exemplary embodiments can help achieve real-time or near-real-time dynamic displays with significantly less radiation exposure (e.g., in the case of live X-ray imaging) as well as computational overhead, which benefits can be considered (e.g., balanced, weighed against) in view of the potential limitations associated with this exemplary embodiment.
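The incomplete-acquisition idea above can be illustrated with a zero-filled Fourier reconstruction, one common minimal approach to undersampled frequency-domain data (numpy assumed; the sampling mask and test image are toy examples, not data from the disclosure):

```python
import numpy as np

def reconstruct_zero_filled(kspace, mask):
    """Reconstruct an image from undersampled frequency-domain data by
    zero-filling the unacquired samples and applying an inverse FFT.
    `mask` is True where k-space samples were actually acquired."""
    filled = np.where(mask, kspace, 0.0)
    return np.abs(np.fft.ifft2(filled))

# Hypothetical example: fully sample a small image, then discard half of
# the k-space lines to mimic intermittent/incomplete acquisition.
image = np.zeros((8, 8))
image[3:5, 3:5] = 1.0
kspace = np.fft.fft2(image)
mask = np.zeros((8, 8), dtype=bool)
mask[::2, :] = True  # keep every other k-space line only
approx = reconstruct_zero_filled(kspace, mask)
```

The undersampled result exhibits aliasing, which is the kind of "(some) limitations" that would be weighed against the reduced acquisition cost.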
  • another exemplary embodiment includes inputting virtual instruments or objects into a holographic display.
  • objects 402 may be digitized or otherwise rendered into a virtual environment 404 and displayed.
  • the objects 402 may be drawn or loaded into the workstation 112 as object data 405 and may be coded into the display 158 and concurrently rendered with the hologram 124.
  • a static image of the object 402 may appear in the hologram 124 and may be separately manipulated with the hologram 124 (and/or on the display 158).
  • the static image may be employed for size comparisons or measurements between the object 402 and the hologram 124.
  • a converter box 406 may be included to employ a standardization protocol to provide for a "video-ready" interface to the 3D holographic display 158.
  • the converter box 406 can format the x, y, z coordinates from each localized instrument or object 402 (catheter, implant, etc.) into a space readable by the holographic display 158 (e.g., rasterized/scan converted voxel space, vector space, etc.). This can be performed using the workstation 112 in FIG. 1.
  • the 3D format should at least support voxels (for volumes) and graphical elements/primitives, e.g., meshes (a virtual catheter can be displayed as a tube) and lines in 3D.
  • the 3D format can be varied in accordance with the present disclosure based on, e.g., particular laboratory or clinical use or applications, integration with other related technologies, available resources, environmental conditions, etc.
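The converter box's coordinate formatting might be sketched as a world-to-voxel conversion (a minimal sketch under stated assumptions: the volume origin, voxel spacing, and grid size below are invented placeholders, not values from the disclosure):

```python
import numpy as np

def to_voxel(xyz, origin, spacing, shape):
    """Format localized x, y, z coordinates (e.g., from a tracked
    catheter or implant) into integer voxel indices of the display
    volume; returns None when the point falls outside the volume."""
    idx = np.round((np.asarray(xyz, dtype=float) - origin) / spacing).astype(int)
    if np.any(idx < 0) or np.any(idx >= shape):
        return None
    return tuple(idx)

# Hypothetical 128^3 display volume with 0.5 mm isotropic voxels.
origin = np.array([-32.0, -32.0, -32.0])   # mm, corner of the volume
spacing = np.array([0.5, 0.5, 0.5])        # mm per voxel
shape = np.array([128, 128, 128])
tip = to_voxel([0.0, 0.0, 0.0], origin, spacing, shape)
```

A vector-space variant would instead pass the floating-point coordinates through unchanged and let the display rasterize them.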
  • the object 402 may be, e.g., a computer aided design rendering, model, scan, etc. for an instrument, medical device, implant, etc.
  • the object 402 may be independently manipulated relative to the hologram 124 on the display 158 or in the air.
  • the object 402 can be placed in or around the hologram 124 to determine whether the object will fit within a portion of the hologram 124, etc.
  • an implant may be placed through a blood vessel to test the fit visually. It is also contemplated that other feedback may be employed.
  • a comparison module may be capable of determining interference between the hologram 124 and the objects 402 to enable, e.g., haptic feedback to indicate that a clearance for the implant is not possible.
  • Other applications are also contemplated.
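The comparison module's interference test could, under one simple assumption (a boolean occupancy grid marking the open lumen of the anatomy), look like the following sketch; the mask, implant geometry, and function name are all illustrative:

```python
import numpy as np

def fits_without_interference(lumen_mask, implant_voxels):
    """Return True when every voxel occupied by the virtual implant lies
    inside the open lumen of the anatomy (no overlap with tissue).
    `lumen_mask` is True where the anatomy is open space."""
    for v in implant_voxels:
        if not lumen_mask[tuple(v)]:
            return False  # implant touches tissue -> no clearance
    return True

# Hypothetical example: a straight lumen through a small volume.
mask = np.zeros((16, 16, 16), dtype=bool)
mask[5:10, 5:10, :] = True                  # open vessel lumen
stent = [(7, 7, z) for z in range(16)]      # centered: should fit
oversized = [(4, 7, z) for z in range(16)]  # clips the vessel wall
```

A negative result here is what would drive the haptic "no clearance" cue mentioned above.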
  • the system 100 of FIG. 1 and/or FIG. 4 may be employed as an education and/or training tool.
  • a practitioner (e.g., surgeon, physician, fellow, doctor, etc.) could practice (perform virtually) a procedure (surgery, case, etc.).
  • a tracked input device (e.g., an instrument tracked with shape sensing, electromagnetic tracking, acoustic tracking, or machine vision based optical tracking (time-of-flight cameras or structured light cameras)) may be employed in conjunction with the display 158 to access a virtual help mode trigger point 504 (or other functions) in the hologram 124 generated by the image generation module 148.
  • the virtual help trigger point 504 may include pixel regions within the display or hologram.
  • the region or trigger point 504 may be selected (e.g., virtually selected and displayed) by using the tip of the tracked virtual tool 402 (or using the monitored object 128), which is automatically registered with the hologram 124 in the image.
  • the trigger points 504 are selected in the hologram 124 and a menu 502 or other interactive space may open to permit further selections.
  • a fellow/practitioner could first select a program called "HIP" by activating a trigger point 504 to display a 3D CT image of a patient's (subject's) hip, and then select different "HIP IMPLANTS" from different manufacturers to see and "feel" which implant would fit best for the particular patient. It is also possible to use (e.g., physically hold and manipulate) the actual implant in the air and position it within the 3D holographic display to see, feel and assess fit (e.g., if and how well such implant may fit the particular patient).
  • FIG. 5 shows the virtual menu 502 that may be provided in the holographic display 158 or other display 118 to permit the selection of a stent 508.
  • the virtual menu 502 can be called using the display 158, the hologram 124 or by employing interface 130.
  • a virtual model is rendered (see FIG. 4) in the display 158 or hologram 124 to permit manipulation, measurement, etc.
  • the virtual menu 502 provides for clinical decision support tying together localization and an exemplary holographic user interface in accordance with an exemplary embodiment.
  • as the shape tracked instrument 128 (e.g., a catheter) moves, the virtual menu 502 can pop up automatically for each region, or the trigger point 504 may be activated by placing the instrument tip into the region of the trigger point 504 or otherwise activating the trigger point 504 (e.g., touching it, etc.).
  • An implant or other device may be selected, which is then introduced to allow for device sizing/selection to be performed in the virtual holography space (e.g., within or in close proximity to the holographic display).
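Trigger-point activation by a tracked tip entering a region might be sketched as follows (numpy assumed; the spherical region, radius, callback, and menu label are illustrative assumptions):

```python
import numpy as np

def check_trigger(tip_xyz, trigger_xyz, radius, on_activate):
    """Activate a trigger point when the tracked tool tip enters the
    spherical region around it (e.g., to pop up a virtual menu)."""
    inside = bool(np.linalg.norm(np.asarray(tip_xyz, dtype=float)
                                 - trigger_xyz) <= radius)
    if inside:
        on_activate()
    return inside

# Hypothetical trigger region and a menu-opening callback.
opened = []
trigger = np.array([10.0, 0.0, 0.0])
check_trigger([10.5, 0.0, 0.0], trigger, radius=1.0,
              on_activate=lambda: opened.append("implant menu"))
```

In practice the callback would open a menu such as menu 502 rather than append to a list.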
  • a 3D holographic display 158 or hologram 124 may be employed during surgery to interact with a device 602 inside the patient.
  • Robotics via a master/slave configuration can be used, where a shape sensed analog 604 of the device 602 moving within the display 158 is employed to actuate the motion of the actual device 602 within a target region 606.
  • a practitioner's (surgeon's, physician's, etc.) hands 610 or voice can be tracked by sensor-based and/or voice-based techniques, such as by, e.g., tracking a physician's hands using a shape-sensing device 608 and shape sensing system 614 in the 3D holographic display.
  • a practitioner's movements, including, e.g., (re)positioning, orientation, etc. of their hands, can be replicated by the device 602, such as a robot 612 (e.g., robotically controlled instruments), inside the patient's body, thereby performing the actual surgery, procedure or task.
  • a surgeon can see, touch and feel a 3D holographic display of an organ and perform a procedure thereon (i.e., within the 3D holographic display), causing such procedure to be performed inside of a patient on the actual organ via, e.g., robotically controlled instruments, or simply causing such instruments to move.
  • the movement of the physician creates sensing signals using sensor 608 (and/or sensors in device 604), which are converted to control signals by the system 100 for controlling the robot or other device 602.
  • the signals may be stored in memory (116) for delayed execution if needed or desired.
  • the actual procedure may be performed in real-time (or near real-time), e.g., a robot 612 performs a specific movement within a patient's body concurrently with a surgeon's performance within the 3D holographic display 158.
  • the actual procedure may be performed by, e.g., a robot (only) after a surgeon completes a certain task/movement (or series of tasks or movements), and/or the surgeon confirms that the robot should proceed (e.g., after certain predefined criteria and/or procedural milestone(s) are reached).
  • a delay e.g., between the virtual performance of a task or movement within the 3D holographic display to the actual performance within a patient's body
  • a practitioner could opt to redo a specific movement/task that is virtually performed within the 3D holographic display 158 if the practitioner is not satisfied with such movement or task (for any reason).
  • the surgeon could opt to redo such virtual movement or task (as many times as desired or may be necessary) until it is performed correctly.
  • the actual movement or task can be performed by a robot inside of the patient with or without dynamic adaptation of the task to adjust for changes in target or therapy instrument on-the-fly (e.g., dynamically, on a continuous basis, in real-time, etc.).
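The immediate/deferred/redo execution modes described above can be sketched as a small command buffer bridging the holographic display and the robot (all class, method, and command names are invented for illustration, not the disclosed implementation):

```python
class HolographicRobotBridge:
    """Sketch of the master/slave link: movements performed in the
    holographic display are recorded as commands, which may be executed
    immediately (real-time replication), redone until the operator is
    satisfied, or held in memory and dispatched only after the operator
    confirms (delayed execution)."""

    def __init__(self, execute_fn, deferred=True):
        self.execute_fn = execute_fn   # sends one command to the robot
        self.deferred = deferred
        self.pending = []

    def record(self, command):
        if self.deferred:
            self.pending.append(command)  # stored for delayed execution
        else:
            self.execute_fn(command)      # real-time replication

    def redo_last(self, command):
        if self.pending:
            self.pending[-1] = command    # replace unsatisfactory movement

    def confirm(self):
        for command in self.pending:      # operator approved: run all
            self.execute_fn(command)
        self.pending.clear()

# Hypothetical usage: record two movements, redo the second, confirm.
sent = []
bridge = HolographicRobotBridge(sent.append, deferred=True)
bridge.record("advance 2mm")
bridge.record("rotate 10deg")
bridge.redo_last("rotate 5deg")
bridge.confirm()
```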
  • a haptic device 710 may take many forms.
  • a set of ultrasonic probes 702 can be used to send customized waves towards a 3D holographic display 704 to give the user (e.g., a practitioner) a sense of feeling of structures being displayed and represented by the 3D holographic display 704.
  • such waves can be tailored or customized to represent hard or stiff materials of bony structures, while other tissues, such as of the liver and/or vessels, can be represented with a relatively softer "feel" by waves which are tailored or configured accordingly.
  • Such encoding can be realized by, e.g., modulation of the frequency, amplitude, phase, or other spatiotemporal modulation of the excitation imparted by a haptic device 710 to the observer.
  • the haptic feedback device 710 may be employed to, e.g., represent physical resistance of a particular structure or tissue in response to a movement or task performed by a practitioner.
  • a haptic device 712 such as, e.g., a glove(s), bracelet, or other garments or accessories having actuators or other vibratory elements.
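One minimal way to encode tissue "feel" as modulation parameters for the haptic device, as suggested above (the tissue table, frequencies, and amplitudes are illustrative placeholders, not values from the disclosure):

```python
def haptic_params(tissue_type):
    """Map a displayed tissue type to wave-modulation parameters so
    that bony structures feel hard/stiff while liver and vessels feel
    relatively softer. All numbers are illustrative placeholders."""
    table = {
        "bone":   {"frequency_hz": 200.0, "amplitude": 1.0},  # hard/stiff
        "liver":  {"frequency_hz": 60.0,  "amplitude": 0.4},  # soft
        "vessel": {"frequency_hz": 40.0,  "amplitude": 0.3},  # softer
    }
    # Unknown tissue falls back to a faint default response.
    return table.get(tissue_type, {"frequency_hz": 50.0, "amplitude": 0.2})
```

The same lookup could equally drive phase or other spatiotemporal modulation, per the encoding options listed above.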
  • the exemplary 3D holographic display 158 can be seen from (virtually) any angle, e.g., so that all users can interact with the same understanding and information. This is the case for an in-air hologram; however, other display techniques may be employed to provide multiple individuals with the same view.
  • display information may be provided to different users positioned in a room or area, by displaying the same or different information on a geometrical structure 802 (holographically), such as a multi- faceted holographic display where each face of the display (e.g., a cube or polyhedron) displays the information.
  • This can be achieved by projecting multiple 2D video streams 804 on the geometrical structure 802 (e.g., side by side, or partially overlapping) rendered within a holographic output 806.
  • a holographic "cube" display in 3D can show/display on one cube face information (e.g., a 2D live x-ray image) in one particular direction (e.g., the direction of a first practitioner 808), while another cube face of the same "cube" display can show/display another type of information (e.g., an ultrasound image) to a second practitioner 810 positioned elsewhere in the room (e.g., diametrically opposite the display from the first practitioner).
  • Such exemplary display can be configured at will depending on, e.g., the number of users in the room. It is also possible that the position (in the room) of each user/practitioner can be tracked and that each individual's display information follows each user's viewing perspective as the user moves (e.g., during a procedure). For example, one particular user (doctor, nurse, etc.) can be provided with the specific information that the user needs regardless of where in the room such particular user moves during a procedure. Further, it is also possible that each user is provided with a unique display, which can be a 2D cube face, such as described above, or a 3D holographic display customized or tailored for such user, and that such a unique display can "follow" the user as the user moves around a room.
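Assigning a stream to whichever cube face faces each practitioner could reduce to a bearing computation from tracked room positions (a sketch; the room coordinates, face numbering, and stream labels are invented for illustration):

```python
import math

def face_for_viewer(viewer_xy, display_xy, num_faces=4):
    """Pick which face of a multi-faceted ("cube") holographic display
    should carry a viewer's stream, from the viewer's bearing relative
    to the display center (faces numbered counterclockwise from +x)."""
    dx = viewer_xy[0] - display_xy[0]
    dy = viewer_xy[1] - display_xy[1]
    bearing = math.atan2(dy, dx) % (2 * math.pi)
    sector = 2 * math.pi / num_faces
    return int((bearing + sector / 2) // sector) % num_faces

# Hypothetical room layout: two practitioners on opposite sides of the
# display are served different streams on opposite faces.
streams = {0: "live x-ray", 2: "ultrasound"}
face_a = face_for_viewer((5.0, 0.0), (0.0, 0.0))   # east of display
face_b = face_for_viewer((-5.0, 0.0), (0.0, 0.0))  # west of display
```

Re-running the assignment as tracked positions update is what would let each user's information "follow" them around the room.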
  • text is an inherently 2D mode of communication.
  • the system may display shapes/symbols identifiable from multiple viewpoints, or represent the text oriented towards the viewer.
  • the oriented text may be shown in multiple directions simultaneously or shown to each viewer independently in different frames.
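Orienting 2D text toward a viewer is the classic billboarding computation; a minimal numpy sketch (the world "up" convention and label/viewer positions are assumptions):

```python
import numpy as np

def billboard_rotation(text_pos, viewer_pos, up=(0.0, 1.0, 0.0)):
    """Build an orthonormal basis (right, up, forward) that orients an
    inherently 2D text label toward the viewer's position."""
    forward = np.asarray(viewer_pos, dtype=float) - text_pos
    forward /= np.linalg.norm(forward)
    right = np.cross(up, forward)
    right /= np.linalg.norm(right)
    true_up = np.cross(forward, right)
    return np.column_stack([right, true_up, forward])  # 3x3 rotation

# Hypothetical label at the origin, viewer straight ahead along +z.
R = billboard_rotation(np.zeros(3), np.array([0.0, 0.0, 5.0]))
```

Computing one such rotation per tracked viewer yields the "oriented text shown to each viewer independently" behavior described above.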
  • a remote system 900 may include at least some of the capabilities of system 100 (FIG. 1) but is remotely disposed relative to a patient 902 and data collection instruments.
  • a user may conduct a procedure remotely (e.g., with the user being physically located off-site from the location where the subject/patient 902 is located) or assist or provide guidance remotely to the procedure.
  • a user can perform a procedure/task on an exemplary holographic display 904 located at their location.
  • the display 904 is connected (e.g., via the Internet or other network 910 (wired or wireless)) to the system 100 co-located with the patient 902.
  • System 100 can be in continuous communication with the remote system 900 via the network 910.
  • the system 100 may include robotically controlled instruments 906 (e.g., inside of a patient) which are controlled via commands provided (e.g., via the Internet) by the remote system 900, as described above. These commands are generated based on the user's interaction with the holographic display 904.
  • Holographic displays 158 and 904 may include the same subject matter at one or more locations so that the same information is conveyed at each location. For example, this embodiment may include, e.g., providing guidance or assistance to another doctor around the globe, for peer-to-peer review, expert assistance or a virtual classroom where many students could attend a live case from different locations throughout the world.
  • a method for interacting with a holographic display is shown in accordance with illustrative embodiments.
  • a holographically rendered anatomical image is generated and displayed.
  • the image may include one or more organs or anatomical regions.
  • the holographically rendered anatomical image may be generated in-air.
  • a monitored space is localized on or around the holographically rendered anatomical image to define a region for interaction.
  • the localization system may include one or more of a fiber optic shape sensing system, an electromagnetic tracking system, a light sensor array and/or other sensing modality.
  • the position and orientation of the monitored space and the one or more monitored objects are preferably determined in the same coordinate system.
  • the one or more monitored objects may include a medical instrument, an anatomical feature of a user, a virtual object, etc.
  • a position and orientation of one or more monitored objects is monitored by the localization system.
  • coincidence of spatial points is determined between the monitored space and the one or more monitored objects.
  • a response is triggered in the holographically rendered anatomical image.
  • the response may include moving the holographically rendered anatomical image (e.g., with six degrees of freedom (6DOF)) or changing its appearance.
  • the response may include adjusting a zoom (magnification) or other optical characteristics of the holographically rendered anatomical image.
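The move/zoom responses listed above amount to applying a similarity transform to the rendered geometry; a minimal numpy sketch (the function name and the centroid-based zoom convention are assumptions):

```python
import numpy as np

def apply_response(vertices, rotation=np.eye(3), translation=np.zeros(3),
                   zoom=1.0):
    """Apply a triggered response to the rendered image's geometry:
    rotate/translate it (6DOF) and/or adjust its magnification about
    its own centroid."""
    v = np.asarray(vertices, dtype=float)
    center = v.mean(axis=0)
    # Scale about the centroid, rotate, then translate.
    return (v - center) * zoom @ np.asarray(rotation).T + center + translation

# Hypothetical unit segment, magnified 2x in place.
seg = apply_response([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]], zoom=2.0)
```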
  • the holographically rendered anatomical image may be marked, tagged, targeted, etc.
  • camera viewpoints can be assigned (for other viewers or displays).
  • feedback may be generated to a user.
  • the feedback may include haptic feedback (vibrating device or air), optical feedback (visual or color differences), acoustic feedback (verbal, alarms), etc.
  • a response region may be provided and monitored by the localization system such that upon activating the response region a display event occurs.
  • the display event may include generating a help menu in block 1024; generating a menu of virtual objects to be included in the holographically rendered anatomical image upon selection in block 1026; and generating information to be displayed in block 1028.
  • the holographically rendered anatomical image may be generated with superimposed medical data mapped to positions on the holographically rendered anatomical image.
  • the response that is triggered may include generating control signals for operating robotically controlled instruments. The control signals may enable remote operations to be performed.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Radiology & Medical Imaging (AREA)
  • Pathology (AREA)
  • Optics & Photonics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Endoscopes (AREA)
  • Holography (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Architecture (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
  • Computer Hardware Design (AREA)
  • Image Generation (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)
  • Multimedia (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Human Computer Interaction (AREA)

Abstract

An interactive holographic display system includes a holographic generation module (115) configured to display a holographically rendered anatomical image. A localization system (120) is configured to define a monitored space (126) on or around the holographically rendered anatomical image. One or more monitored objects (128) have their position and orientation monitored by the localization system such that coincidence of spatial points between the monitored space and the one or more monitored objects triggers a response in the holographically rendered anatomical image.

Description

HOLOGRAPHIC USER INTERFACES FOR MEDICAL PROCEDURES
CROSS-REFERENCE TO RELATED APPLICATIONS:
This application claims priority to U.S. provisional application number 61/549,273 filed on October 20, 2011, the entire disclosure of which is hereby incorporated herein by reference in its entirety.
BACKGROUND:
Technical Field
The present disclosure relates to medical systems, devices and methods, and more particularly to systems, devices and methods pertaining to integration of holographic image data with other information to improve accuracy and effectiveness in medical applications.
Description of the Related Art
Auto-stereoscopic displays (ASDs) for three-dimensional (3D) visualization on a two-dimensional (2D) panel, without the need for user goggles/glasses, have been investigated. However, resolution and processing time limit the ability to render high quality images using this technology. Additionally, these displays have generally been confined to a 2D plane (e.g., preventing a physician from moving around or rotating the display to view the data from different perspectives). Although different perspectives may be permitted with a limited field of view, the field of view for this type of display still suffers from breakdown of movement parallax.
Similarly, user input for manipulation of data objects has largely been confined to mainstream 2D mechanisms, e.g., mice, tablets, keypads, touch panels, camera-based tracking, etc. Accordingly, there is a need for a system, device and method as disclosed and described herein which can be used to overcome the above-identified deficiencies.
SUMMARY
In accordance with the present principles, an interactive holographic display system includes a holographic generation module configured to display a holographically rendered anatomical image. A localization system is configured to define a monitored space on or around the holographically rendered anatomical image. One or more monitored objects have their position and orientation monitored by the localization system such that coincidence of spatial points between the monitored space and the one or more monitored objects triggers a response in the holographically rendered anatomical image.
Another interactive holographic display system includes a processor and memory coupled to the processor. A holographic generation module is included in the memory and configured to display a holographically rendered anatomical image as an in-air hologram or on a holographic display. A localization system is configured to define a monitored space on or around the holographically rendered anatomical image. One or more monitored objects has their position and orientation monitored by the localization system such that coincidence of spatial points between the monitored space and the one or more monitored objects triggers a response in the holographically rendered anatomical image, wherein the response in the holographically rendered anatomical image includes one or more of: translation or rotation of the holographically rendered anatomical image, magnification adjustment of the holographically rendered anatomical image, marking of the holographically rendered anatomical image and feedback generation.
A method for interacting with a holographic display includes displaying a holographically rendered anatomical image; localizing a monitored space on or around the holographically rendered anatomical image to define a region for interaction; monitoring a position and orientation of one or more monitored objects by the localization system; determining coincidence of spatial points between the monitored space and the one or more monitored objects; and, if coincidence is determined, triggering a response in the holographically rendered anatomical image.
These and other objects, features and advantages of the present disclosure will become apparent from the following detailed description of illustrative embodiments thereof, which is to be read in connection with the accompanying drawings.
BRIEF DESCRIPTION OF DRAWINGS
This disclosure will present in detail the following description of preferred embodiments with reference to the following figures wherein:
FIG. 1 is a block/flow diagram showing a system for interfacing with holograms in accordance with exemplary embodiments;
FIG. 2 is a perspective view of a hologram rendered with a data map or overlay thereon in accordance with an illustrative embodiment;
FIG. 3 is a block diagram showing an illustrative process flow for displaying a data map or overlay in a holographic image in accordance with an illustrative embodiment;
FIG. 4 is a block diagram showing an illustrative system and process flow for displaying static or animated objects in a holographic image in accordance with an illustrative embodiment;
FIG. 5 is a diagram showing an illustrative image for displaying an objects menu for selecting virtual objects during a procedure for display in a holographic image in accordance with an illustrative embodiment;
FIG. 6 is a block diagram showing an illustrative system for controlling a robot using a holographic image in accordance with an illustrative embodiment;
FIG. 7 is a block diagram showing an illustrative system which employs haptic feedback with a holographic image in accordance with an illustrative embodiment;
FIG. 8 is a diagram showing multiple views provided to different perspectives in an illustrative system for displaying a holographic image or the like in accordance with one embodiment;
FIG. 9 is a block diagram showing an illustrative system for controlling a robot remotely over a network using a holographic image in accordance with an illustrative embodiment; and
FIG. 10 is a flow diagram showing a method for interfacing with a hologram in accordance with an illustrative embodiment.
DETAILED DESCRIPTION OF EMBODIMENTS
In accordance with the present principles, systems, devices and methods are described which leverage holographic display technology for medical procedures. This can be done using 3D holographic technologies (e.g., in-air holograms) and real-time 3D input sensing methods such as optical shape sensing to provide a greater degree of human-data interaction during a procedure. Employing holographic technology with other technologies potentially simplifies procedure workflow, instrument selection, and manipulation within the anatomy of interest. Such exemplary embodiments described herein can utilize 3D holographic displays for real-time visualization of volumetric datasets with exemplary localization methods for sensing movements in free space during a clinical procedure, thereby providing new methods of human-data interaction in the interventional suite.
In one exemplary embodiment, 3D holography may be used to fuse anatomical data with functional imaging and "sensing" information. A fourth dimension (e.g., time, color, texture, etc.) can be used to represent a dynamic 3D multimodality representation of the status of an object of interest (e.g., organ). A display can be in (near) real-time and use color-coded visual information and/or haptic feedback/tactile information, for example, to convey different effects of states of the holographically displayed object of interest. Such information can include morphological information about the target, functional information about the object of interest (e.g., flow, contractility, tissue biomechanical or chemical composition, voltage, temperature, pH, pO2, pCO2, etc.), or the measured changes in target properties due to interaction between the target and therapy being delivered. The exemplary 3D holographic display can be seen from (virtually) any angle/direction so that, e.g., multiple users can simultaneously interact with the same understanding and information.
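The color-coded functional overlay described above could, in its simplest form, map a scalar measurement to a color per surface point (the voltage range and the red-to-blue ramp below are illustrative assumptions, not from the disclosure):

```python
def voltage_to_color(voltage_mv, v_min=0.0, v_max=5.0):
    """Map a functional measurement (e.g., electrogram voltage at a
    point on the organ surface) to an RGB color for overlay on the
    anatomical hologram: low values red, high values blue."""
    # Normalize into [0, 1], clamping out-of-range measurements.
    t = min(max((voltage_mv - v_min) / (v_max - v_min), 0.0), 1.0)
    return (1.0 - t, 0.0, t)
```

Applying this mapping per vertex of the registered surface yields the colored regions 204, 206 shown on object 202 in FIG. 2.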
Alternatively, it is possible to simultaneously display different information to different users positioned in the room, such as by displaying different information on each face of a cube or polyhedron, for example.
In one embodiment, one could "touch" or otherwise interact with a specific region of interest in the 3D holographic display (e.g., using one or multiple fingers, virtual tools, or physical instruments being tracked within the same interaction space), and tissue characteristics would become available and displayed in the 3D hologram. Such "touch" can also be used to, e.g., rotate the virtual organ, zoom, tag points in 3D, draw a path and trajectory plan (e.g., for treatment, targeting, etc.), select critical zones to avoid, create alarms, and drop virtual objects (e.g., implants) in 3D in the displayed 3D anatomy.
Exemplary embodiments according to the present disclosure can also be used to facilitate a remote procedure (e.g., where the practitioner "acts" on the virtual organ and a robot simultaneously or subsequently performs the procedure on the actual organ), to practice a procedure before performing the actual procedure in a training or simulation setting, and/or to review/study/teach a procedure after it has been performed (e.g., through data recording, storage, and playback of the 3D holographic display and any associated multimodality signals relevant to the clinical procedure).
Exemplary embodiments according to the present disclosure are further described herein below with reference to the appended figures. While such exemplary embodiments are largely described separately from one another (e.g., for ease of presentation and understanding), one having ordinary skill in the art shall appreciate in view of the teachings herein that such exemplary embodiments can be used independently and/or in combination with each other. Indeed, the implementation and use of the exemplary embodiments described herein, including combinations and variations thereof, all of which are considered a part of the present disclosure, can depend on, e.g., particular laboratory or clinical use/application, integration with other related technologies, available resources, environmental conditions, etc. Accordingly, nothing in the present disclosure should be interpreted as limiting of the subject matter disclosed herein.
A real-time 3D holographic display in accordance with the present principles may include a real-time six degree of freedom (DOF) input via localization technology embedded into a data interaction device (e.g., a haptic device for sensory feedback). An imaging / monitoring system for multidimensional data acquisition may also be employed. Datalinks between the holography display, localization system / interaction device, and imaging / monitoring system may be provided for communication between these systems. In one embodiment, the display, feedback devices, localization devices, measurement devices may be employed with or integrated with a computational workstation for decision support and data libraries of case information that can be dynamically updated / recalled during a live case for training / teaching / procedure guidance purposes (e.g., for similar archived clinical cases relative to the procedure and patient undergoing treatment).
It should be understood that the present invention will be described in terms of medical instruments; however, the teachings of the present invention are much broader and are applicable to any systems that can benefit from holographic visualization. In some embodiments, the present principles are employed in tracking or analyzing complex biological or mechanical systems. In particular, the present principles are applicable to internal tracking procedures of biological systems, procedures in all areas of the body such as the lungs, gastrointestinal tract, excretory organs, blood vessels, etc. The elements depicted in the FIGS. may be implemented in various combinations of hardware and software and provide functions which may be combined in a single element or multiple elements.
The functions of the various elements shown in the FIGS. can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions can be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which can be shared. Moreover, explicit use of the term "processor" or "controller" should not be construed to refer exclusively to hardware capable of executing software, and can implicitly include, without limitation, digital signal processor ("DSP") hardware, read-only memory ("ROM") for storing software, random access memory ("RAM"), non-volatile storage, etc.
Moreover, all statements herein reciting principles, aspects, and embodiments of the invention, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (i.e., any elements developed that perform the same function, regardless of structure). Thus, for example, it will be appreciated by those skilled in the art that the block diagrams presented herein represent conceptual views of illustrative system components and/or circuitry embodying the principles of the invention. Similarly, it will be appreciated that any flow charts, flow diagrams and the like represent various processes which may be substantially represented in computer readable storage media and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
Furthermore, embodiments of the present invention can take the form of a computer program product accessible from a computer-usable or computer-readable storage medium providing program code for use by or in connection with a computer or any instruction execution system. For the purposes of this description, a computer-usable or computer readable storage medium can be any apparatus that may include, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk - read only memory (CD-ROM), compact disk - read/write (CD-R/W), Blu-Ray™ and DVD.
Referring now to the drawings in which like numerals represent the same or similar elements and initially to FIG. 1, a system 100 for generating and interacting with holographic images is illustratively shown in accordance with one embodiment. System 100 may include a workstation or console 112 from which a procedure is supervised and/or managed.
Workstation 112 preferably includes one or more processors 114 and memory 116 for storing programs and applications. Memory 116 may store a holographic generation module 115 configured to render a holographic image on a display 158 or in-air depending on the application. The holographic generation module 115 codes image data to generate a three dimensional hologram. The coding may provide the hologram on a 2D display or in 3D media or 3D display. In one example, data from 3D imaging, e.g., computed tomography, ultrasound or magnetic resonance, may be transformed into a hologram using spatial distribution and light intensity to render the hologram.
A localization system 120 includes a coordinate system 122 to which a holographic image or hologram 124 is registered. The localization system 120 may also be employed to register a monitored object 128, which may include virtual instruments, which are separately created and controlled, real instruments, a physician's hands, fingers or other anatomical parts, etc. The localization system 120 may include an electromagnetic tracking system, a shape sensing system, such as a fiber optic based shape sensing system, an optical sensing system, including light sensors and arrays, or other sensing modality, etc. The localization system 120 is employed to define spatial regions in and around the hologram or the holographic image 124 to enable a triggering of different functions or actions as a result of movement in the area of the hologram 124. For example, dynamic locations of a physician's hands may be tracked using a fiber optic shape sensing device. When the physician's hands enter the same space, e.g., a monitored space 126 about a projected hologram 124, the intensity of the hologram may be increased. In another example, the physician's hand movements may be employed to spatially alter the position or orientation of the hologram 124 or to otherwise interact with the hologram 124.
A monitored object or sensing system 128 may be spatially monitored relative to the hologram 124 or the space 126 around the hologram 124. The monitored object 128 may include the physician's hands, a real or a virtual tool, another hologram, etc. The monitored object 128 may include a sensor or sensors 132 adapted to monitor the position of the monitored object 128 such that when a position of the object or a portion thereof is within the hologram 124 or the space 126 around the hologram 124, a reaction occurs that is consistent with the type of the monitored object 128 and the action performed or to be performed by the monitored object 128. The sensor or sensors 132 may include EM sensors, fiber optic shape sensors, etc.
In one embodiment, the sensors 132 include fiber optic shape sensors. A sensor interpretation module 134 may be employed to interpret feedback signals from a shape sensing device or system (132). Interpretation module 134 is configured to use the signal feedback (and any other feedback, e.g., optical, electromagnetic (EM) tracking, etc.) to reconstruct motion, deflection and other changes associated with the monitored object 128, which may include a medical device or instrument, virtual tools, human anatomical features, etc. The medical device may include a catheter, a guidewire, a probe, an endoscope, a robot, an electrode, a filter device, a balloon device, or other medical component, etc.
The shape sensing system (132) may include one or more optical fibers which are coupled to the monitored object 128 in a set pattern or patterns. The optical fibers connect to the workstation 112 through cabling 127. The cabling 127 may include fiber optics, electrical connections, other instrumentation, etc., as needed.
Shape sensing system (132) may be based on fiber optic Bragg grating sensors. A fiber optic Bragg grating (FBG) is a short segment of optical fiber that reflects particular wavelengths of light and transmits all others. This is achieved by adding a periodic variation of the refractive index in the fiber core, which generates a wavelength-specific dielectric mirror. A fiber Bragg grating can therefore be used as an inline optical filter to block certain wavelengths, or as a wavelength-specific reflector.
A fundamental principle behind the operation of a fiber Bragg grating is Fresnel reflection at each of the interfaces where the refractive index is changing. For some wavelengths, the reflected light of the various periods is in phase so that constructive interference exists for reflection and, consequently, destructive interference for transmission. The Bragg wavelength is sensitive to strain as well as to temperature. This means that Bragg gratings can be used as sensing elements in fiber optical sensors. In an FBG sensor, the measurand (e.g., strain) causes a shift in the Bragg wavelength.
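As a numerical illustration of the strain sensitivity described above (a sketch added for this edit, not part of the patent text), the fractional Bragg wavelength shift under axial strain is commonly approximated as Δλ/λ ≈ (1 − p_e)·ε at constant temperature, where p_e ≈ 0.22 is a typical effective photo-elastic coefficient for silica fiber; that coefficient value is an assumption here:

```python
def bragg_shift(wavelength_nm, strain, photoelastic=0.22):
    """Approximate Bragg wavelength shift (nm) under axial strain.

    Uses delta_lambda / lambda ~= (1 - p_e) * strain, temperature held
    constant. photoelastic=0.22 is a typical silica value (assumed).
    """
    return wavelength_nm * (1.0 - photoelastic) * strain

# At 1550 nm, one microstrain shifts the Bragg wavelength by roughly 1.2 pm.
shift_nm = bragg_shift(1550.0, 1e-6)
```

This per-grating shift is what an interrogator converts back into a local strain reading for each FBG along the fiber.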
One advantage of this technique is that various sensor elements can be distributed over the length of a fiber. Incorporating three or more cores with various sensors (gauges) along the length of a fiber that is embedded in a structure permits a three dimensional form of such a structure to be precisely determined, typically with better than 1 mm accuracy. Along the length of the fiber, at various positions, a multitude of FBG sensors can be located (e.g., 3 or more fiber sensing cores). From the strain measurement of each FBG, the curvature of the structure can be inferred at that position. From the multitude of measured positions, the total three-dimensional form is determined.
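The curvature-to-shape step can be sketched as follows (an illustrative simplification added for this edit, not from the patent): per-segment strain in an off-axis core at radial distance r from the neutral axis implies curvature κ = ε/r, and integrating the bend angle along the fiber recovers the curve. For brevity this sketch is planar (2D); a real multi-core system resolves bend direction from three or more cores to obtain the full 3D form:

```python
import numpy as np

def shape_from_strain(strains, core_radius, segment_len):
    """Reconstruct a planar fiber curve from per-segment FBG strain readings.

    strains: axial strain at each FBG position in one off-axis core
    core_radius: distance of the sensing core from the neutral axis
    segment_len: spacing between FBG positions (same units as the output)
    """
    curvature = np.asarray(strains, dtype=float) / core_radius  # kappa = eps / r
    heading = np.cumsum(curvature * segment_len)                # integrate bend angle
    x = np.cumsum(segment_len * np.cos(heading))
    y = np.cumsum(segment_len * np.sin(heading))
    return np.column_stack([x, y])

# Zero strain everywhere yields a straight fiber along x.
pts = shape_from_strain([0.0, 0.0, 0.0], core_radius=1e-4, segment_len=0.01)
```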
As an alternative to fiber-optic Bragg gratings, the inherent backscatter in
conventional optical fiber can be exploited. One such approach is to use Rayleigh scatter in standard single-mode communications fiber. Rayleigh scatter occurs as a result of random fluctuations of the index of refraction in the fiber core. These random fluctuations can be modeled as a Bragg grating with a random variation of amplitude and phase along the grating length. By using this effect in three or more cores running within a single length of multi-core fiber, the 3D shape and dynamics of the surface of interest can be followed.
In one embodiment, workstation 112 includes an image generation module 148 configured to receive feedback from the shape sensing system 132 or other sensor to sense interactions with the hologram 124. A position and status of the hologram 124 and its surrounding space 126 is known to the localization system 120. When the monitored object 128 enters the space 126 or coincides with the positions of the hologram 124, as determined by a comparison module 142, an action is triggered depending on a type of motion, a type of monitored object 128, a type of procedure or activity and/or any other criteria. The comparison module 142 informs the holographic generation module 115 that a change is needed. The holographic generation module 115 recodes the image data, which is processed and output to the image generation module 148, which updates the hologram 124 in accordance with set criteria.
In illustrative embodiments, the hologram 124 may include an internal organ rendered based on 3D images 152 of a patient or subject 150. The images 152 may be collected from the patient 150 preoperatively using an imaging system 110. Note the imaging system 110 and the patient 150 need not be present to employ the present principles as the system 100 may be employed for training, analysis or other purposes at any time. In this example, a physician employs a pair of gloves having sensors 132 disposed thereon. As the
gloves/sensors 132 enter the space 126 and coincide with the hologram 124, the physician is able to rotate or translate the hologram 124. In another embodiment, the gloves include a haptic device 156 that provides tactile feedback depending on a position of the
gloves/sensors relative to the hologram 124 or the space 126. In other embodiments, the haptic feedback is indicative of the tissue type corresponding with the hologram 124 and its representation. The haptic device or system 156 may include ultrasound sources, speakers or other vibratory sources to convey differences in state of the hologram 124 using vibrations or sound.
A display 118 and/or display 158 may also permit a user to interact with the workstation 112, the hologram 124 and its components and functions, or any other element within the system 100. This is further facilitated by an interface 130 which may include a keyboard, mouse, a joystick, a haptic device, or any other peripheral or control to permit user feedback from and interaction with the workstation 112.
In one embodiment, a user (practitioner, surgeon, fellow, etc.) can touch (or otherwise interact with) a specific region of interest (ROI) 154 within the 3D holographic display 158 or the hologram 124 within the 3D holographic display (or elsewhere) to display additional information related to the selected specific region of interest, e.g., tissue characteristics, such as temperature, chemical content, genetic signature, pressure, calcification percent, etc. An overlay of information can be displayed or presented on a separate exemplary 2D display (118), whereby parts of the 2D display can be transparent, for example, for better viewing of displayed information. It is also possible that the exemplary 2D display 118 presents or displays other graphics and text in high resolution (e.g., in exemplary embodiments where the 3D display may be of relatively low or limited resolution).
Other embodiments can provide a practitioner (e.g., doctor) with a "heads up" display (as display 158) or as a combination display (118 and 158) to accommodate the
display/presentation of such additional information. Additionally, other zones or regions of interest 154 can be automatically highlighted and/or outlined within the 3D holographic display 158 or hologram 124. Such other zones of interest can be, e.g., zones which have similar characteristics as the selected zone of interest and/or zones which are otherwise related.
According to yet another exemplary embodiment, the 3D holographic display 158 or hologram 124 may be employed with six degrees of freedom (6DOF) user tracking, e.g., with shape enabled instruments 132 and/or with camera based sensors 137, allowing for use as a user interface in 3D and real-time 6DOF user interaction. For example, a user (e.g., practitioner) is provided with the capability of touching a virtual organ being displayed as a 3D holographic image 124. The user can rotate, zoom in/out (e.g., changing the
magnification of the view), tag points in 3D, draw a path and/or trajectory plan, select (critical) zones to avoid, create alarms, insert and manipulate the orientation of virtual implants in 3D in the anatomy, etc. These functions are carried out using the localization system(s) 120 and image generation system or module 148 working in conjunction with the holographic data being displayed for the hologram 124.
Seed points 162 may be created and dropped into the 3D holographic display 158 or hologram 124 by touching (and/or tapping, holding, etc.) a portion of the display 158 or the hologram 124. The seed points 162 may be employed for, e.g., activation of virtual cameras which can provide individually customized viewing perspectives (e.g., orientation, zoom, resolution, etc.) which can be streamed (or otherwise transmitted) onto a separate high resolution 2D display 118.
The touch feature can be employed to create or drop virtual seed points 162 into the 3D display 158 for a plurality of tasks, e.g., initialization of segmentation, modeling, registration or other computation, visualization, planning step, etc. In addition to the 3D holographic display 158 or hologram 124 being used to display data of an anatomy, the display can also be used to display buttons, drop down menus, pointers/trackers, optional functions, etc., allowing users to interact and give commands to the system and/or any computer included therein or connected thereto (e.g., directly connected or via the Internet or other network).
In another embodiment, a microphone 164 may be employed to receive verbal information to connect, control, interact, etc. with the exemplary 3D holographic display 158 or hologram 124 via voice-controlled commands. A speech recognition engine 166 may be provided to convert speech commands into program commands to allow a user (e.g., surgeon) to interact with the 3D holographic display 158 or hologram 124 without having to use their hands. For example, a user could say "SHOW LAO FORTY", and the volume displayed within the holographic image would rotate to the proper angle to provide the user with the desired view. Other commands can range from those which are relatively simple, such as "ZOOM", followed by a specific amount e.g., "3 times" or so as to display particular (additional) information, to more complex commands, e.g., which can be related to a specific task or procedure.
According to another embodiment, a recording mode can be provided in memory 116 and made available to, e.g., play back a case on a same device for full 3D replay and/or on conventional (2D or 3D) viewing devices with automatic conversion of recorded 3D scenes into multiple 2D viewing perspectives (or rotating 3D models, e.g., in virtual reality modeling language (VRML)). Data connections between the holographic display 158 and recordings archived in a library/database 168 such as a picture archiving and communication system (PACS), Radiology Information Systems (RIS) or other electronic medical record system can be used to facilitate, e.g., visualization and diagnostic interpretation/data mining. Recordings can be replayed and used for, e.g., teaching and training purposes, such as to teach or train others in an individual setting, (e.g., when a user wants to review a recorded procedure performed), a small group environment (e.g., with peers and/or management), a relatively large class, lecture, etc. Such exemplary recordings may also be used for marketing presentations, research environments, etc. and may also be employed for quality and regulatory assessment, e.g., process evaluation or procedure assessment by hospital administrators, third-party insurers, investors, the Food and Drug Administration (FDA) and/or other regulatory bodies. Virtual cameras may be employed to capture or record multiple viewpoints/angles and generate multiple 2D outputs for, e.g., video capture or simultaneous display of images on different 2D television screens or monitors (or sections thereof).
Referring to FIG. 2, in another embodiment, three-dimensional (3D) holography may be used to display volumetric data of an anatomy (e.g., from a 3D CT scan), for example, to fuse anatomical with functional imaging and "sensing" information, as well as temporal (time-related) information. The information may be employed to create (generate, produce, display, etc.) a dynamic 3D multimodality representation 202 (e.g., a hologram) of an object (e.g., organ) and a status thereof using visual indicators 204, 206, such as colors, contrast levels and patterns from a display 210. The object 202 (e.g., hologram 124) may show different regions 204, 206 to indicate useful data on the object 202. For example, epicardial and/or endocardial mapping data can be used to, e.g., display electrical activity data on a heart image during an electrophysiology procedure, superimposed with the anatomical imaging data of the heart (e.g., coming from CT, XperCT or MR). Another example is the display of temperature maps which can be provided by MR during ablation, or magnetic resonance high-intensity focused ultrasound (MR-HIFU) 4D (four-dimensional) information during an intervention (e.g., using MR digital data transfer systems and procedures). It is also possible to use information associated with a real-time radiation dose distribution map superimposed over the anatomical target during a radiation oncology treatment (Linac, brachytherapy, etc.), for example. Other embodiments are also contemplated.
Referring to FIG. 3, an exemplary holographic visualization of functional and anatomical information, which may be employed during an interventional procedure in accordance with an exemplary embodiment, is illustratively shown. A volumetric image 302 of a heart, in this example, is acquired and may be segmented to reduce computational space and to determine anatomical features of the heart as opposed to other portions of the image. This results in a segmented image 304. Functional or device data is acquired by performing measurements or tests in block 306 on the heart or other anatomical feature. In the illustrative embodiment, an electroanatomical map or other map is generated corresponding with the heart or organ. The map is registered to the segmented image 304 to provide a registered image 310 that may be generated and displayed as a hologram. Real-time catheter
308 data may be collected from within or about the heart using a localization technique (shape sensing, etc.). Data traces of catheter positions or other related data (treatment locations, etc.) may be rendered in a holographic image 312 which includes both the anatomical data (e.g., segmented hologram) and the device data (e.g., catheter data).
Another exemplary embodiment according to the present disclosure includes the acquisition of incomplete data (e.g., projections rather than full 3D images). This may include, for example, data in Fourier (frequency) space where intermittent or incomplete images are acquired. For example, undersampled image data in the frequency domain are collected. According to this exemplary embodiment, it is possible to construct (generate, produce, display, etc.) a 3D holographic image display with relatively less or a reduced amount of input data, and thus a relatively less or reduced amount of associated
computational processing power and/or time. Depending on the incompleteness of the acquired data and what particular information may not be available, it is possible that the resultant 3D holographic image may be constructed/displayed with (some) limitations.
However, such exemplary embodiments can help achieve real-time or near-real-time dynamic displays with significantly less radiation exposure (e.g., in the case of live X-ray imaging) as well as computational overhead, which benefits can be considered (e.g., balanced, weighed against) in view of the potential limitations associated with this exemplary embodiment.
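The simplest reconstruction from undersampled frequency-domain data is zero-filling: unacquired k-space entries are set to zero before the inverse transform. The sketch below illustrates this (added for this edit; the patent does not prescribe a particular reconstruction, and zero-filling is only one possible choice, with the aliasing limitations noted above):

```python
import numpy as np

def zero_filled_recon(kspace, mask):
    """Reconstruct an image magnitude from undersampled 2D Fourier data.

    kspace: 2D complex k-space array (centered, i.e., DC in the middle)
    mask: boolean array, True where a sample was actually acquired;
          unsampled entries are zero-filled before the inverse FFT.
    """
    filled = np.where(mask, kspace, 0.0)
    return np.abs(np.fft.ifft2(np.fft.ifftshift(filled)))
```

With a fully sampled mask this recovers the original image exactly; as the mask becomes sparser, the reconstruction degrades gracefully, which is the trade-off between input data volume and image fidelity discussed above.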
Referring to FIG. 4, another exemplary embodiment includes inputting virtual instruments or objects into a holographic display. In one embodiment, objects 402 may be digitized or otherwise rendered into a virtual environment 404 and displayed. The objects 402 may be drawn or loaded into the workstation 112 as object data 405 and may be coded into the display 158 and concurrently rendered with the hologram 124. A static image of the object 402 may appear in the hologram 124 and may be separately manipulated with the hologram 124 (and/or on the display 158). The static image may be employed for size comparisons or measurements between the object 402 and the hologram 124.
In one embodiment, a converter box 406 may be included to employ a standardization protocol to provide for a "video-ready" interface to the 3D holographic display 158. For example, with respect to shape sensing technology, the converter box 406 can format the x, y, z coordinates from each localized instrument or object 402 (catheter, implant, etc.) into a space readable by the holographic display 158 (e.g., rasterized/scan converted voxel space, vector space, etc.). This can be performed using the workstation 112 in FIG. 1. The 3D format should at least support voxels (for volumes), and graphical elements / primitives e.g., meshes (a virtual catheter can be displayed as a tube) and lines in 3D (to encode
measurements and text rendering). The 3D format can be varied in accordance with the present disclosure based on, e.g., particular laboratory or clinical use or applications, integration with other related technologies, available resources, environmental conditions, etc. Using this video capability, the object 402 (e.g., a computer aided design rendering, model, scan, etc. for an instrument, medical device, implant, etc.) may be independently manipulated relative to the hologram 124 on the display 158 or in the air. In this way, the object 402 can be placed in or around the hologram 124 to determine whether the object will fit within a portion of the hologram 124, etc. For example, an implant may be placed through a blood vessel to test the fit visually. It is also contemplated that other feedback may be employed. For example, by understanding the space that the object 402 occupies and its orientation, a comparison module may be capable of determining interference between the hologram 124 and the objects 402 to enable, say, haptic feedback to indicate that a clearance for the implant is not possible. Other applications are also contemplated.
In another exemplary embodiment, the system 100 of FIG. 1 and/or FIG. 4 may be employed as an education and/or training tool. For example, a practitioner (e.g., surgeon, physician, fellow, doctor, etc.) could practice a procedure (surgery, case, etc.) virtually prior to actually performing the procedure by understanding the 3D anatomy and/or incorporating the use of actual or virtual tools or instruments (monitored objects 128 and/or objects 402, respectively). A fellow/practitioner could practice (perform virtually) a surgical
case/procedure by, e.g., sizing an implant to plan whether it would fit a particular patient's anatomy, etc.
Referring to FIG. 5 with continued reference to FIG. 1, a tracked input device (monitored object 128), e.g., an instrument tracked with shape sensing, electromagnetic tracking, acoustic tracking, or machine-vision-based optical tracking (time-of-flight cameras or structured light cameras), may be employed in conjunction with the display 158 to access a virtual help mode trigger point 504 (or other functions) in the hologram 124 and generated by the image generation module 148. The virtual help trigger point 504 may include pixel regions within the display or hologram. For example, when manipulating virtual instruments or objects 402, the region or trigger point 504 may be selected (e.g., virtually selected and displayed) by using the tip of the tracked virtual tool (402) (or using the monitored object 128), which is automatically registered with the hologram 124 in the image.
In one embodiment, the trigger points 504 are selected in the hologram 124 and a menu 502 or other interactive space may open to permit further selections. For example, a fellow/practitioner could first select a program called "HIP" by activating a trigger point 504 to display a 3D CT image of a patient's (subject's) hip, and then select different "HIP IMPLANTS" from different manufacturers to see and "feel" which implant would fit best for the particular patient. It is also possible to use (e.g., physically hold and manipulate) the actual implant in the air and position it within the 3D holographic display to see, feel and assess fit (e.g., if and how well such implant may fit the particular patient).
FIG. 5 shows the virtual menu 502 that may be provided in the holographic display 158 or other display 118 to permit the selection of a stent 508. The virtual menu 502 can be called using the display 158, the hologram 124 or by employing interface 130. Once the stent 508 is selected, a virtual model is rendered (see FIG. 4) in the display 158 or hologram 124 to permit manipulation, measurement, etc.
The virtual menu 502 provides for clinical decision support tying together localization and an exemplary holographic user interface in accordance with an exemplary embodiment. During intra-procedural use, the shape tracked instrument (128), e.g., a catheter, can be navigated to the anatomy of interest (504) and the virtual menu 502 can pop up automatically for each region, or the trigger point 504 may be activated by placing the object tip into the region of the trigger point 504 or otherwise activating the trigger point 504 (e.g., touching it, etc.). An implant or other device may be selected, which is then introduced to allow for device sizing/selection to be performed in the virtual holography space (e.g., within or in close proximity to the holographic display).
Referring to FIG. 6, according to another exemplary embodiment, a 3D holographic display 158 or hologram 124 may be employed during surgery to interact with a device 602 inside the patient. Robotics via a master/slave configuration can be used, where a shape sensed analog 604 of the device 602 moving within the display 158 is employed to actuate the motion of the actual device 602 within a target region 606. A practitioner's (surgeon's, physician's, etc.) hands 610 or voice can be tracked by sensor-based and/or voice-based techniques, such as by, e.g., tracking a physician's hands using a shape-sensing device 608 and shape sensing system 614 in the 3D holographic display. Accordingly, a practitioner's movements (including, e.g., (re)positioning, orientation, etc. of their hands) performed in the holographic display 158 can be transmitted to the device 602, such as a robot 612 (e.g., robotically controlled instruments) inside the patient to replicate such movements within the patient's body, and thereby perform the actual surgery, procedure or task inside the patient's body. Thus, a surgeon can see, touch and feel a 3D holographic display of an organ and perform a procedure thereon (i.e., within the 3D holographic display), causing such procedure to be performed inside the patient on the actual organ via, e.g., robotically controlled instruments, or simply moving such instruments within the patient.
The movement of the physician creates sensing signals using sensor 608 (and/or sensors in device 604), which are adapted into control signals by the system 100 for controlling the robot or other device 602. The signals may be stored in memory (116) for delayed execution if needed or desired. The actual procedure may be performed in real-time (or near real-time), e.g., a robot 612 performs a specific movement within a patient's body concurrently with a surgeon's performance within the 3D holographic display 158. The actual procedure may be performed by, e.g., a robot (only) after a surgeon completes a certain task/movement (or series of tasks or movements), and/or the surgeon confirms that the robot should proceed (e.g., after certain predefined criteria and/or procedural milestone(s) are reached). Such a delay (e.g., between the virtual performance of a task or movement within the 3D holographic display to the actual performance within a patient's body) can help to prevent any movements/tasks being performed within the patient incorrectly and ensure that each such movement/task is performed accurately and precisely by providing the surgeon an opportunity to confirm a movement/task after it has been performed virtually within the 3D holographic display 158 before it is actually performed within a patient's body 150 by the robot 612.
Further, a practitioner could opt to redo a specific movement/task that is virtually performed within the 3D holographic display 158 if the practitioner is not satisfied with such movement or task (for any reason). Thus, for example, if a surgeon were to inadvertently move too far in any particular direction when virtually performing a movement or task in the 3D holographic display, the surgeon could opt to redo such virtual movement or task (as many times as desired or may be necessary) until it is performed correctly, after which the actual movement or task can be performed by a robot inside the patient, with or without dynamic adaptation of the task to adjust for changes in the target or therapy instrument on-the-fly (e.g., dynamically, on a continuous basis, in real-time, etc.).
Referring to FIG. 7, another exemplary embodiment includes haptic feedback, which can be incorporated by using, e.g., ultrasound to generate vibrations in the air. A haptic device 710 may take many forms. In one embodiment, a set of ultrasonic probes 702 can be used to send customized waves towards a 3D holographic display 704 to give the user (e.g., a practitioner) a sense of feeling of structures being displayed and represented by the 3D holographic display 704. For example, such waves can be tailored or customized to represent hard or stiff materials of bony structures, while other tissues, such as of the liver and/or vessels, can be represented with a relatively softer "feel" by waves which are tailored or configured accordingly. Such encoding can be realized by, e.g., modulation of the frequency, amplitude, phase, or other spatiotemporal modulation of the excitation imparted by a haptic device 710 to the observer.
In another embodiment, the haptic feedback device 710 may be employed to, e.g., represent physical resistance of a particular structure or tissue in response to a movement or task performed by a practitioner. Thus, for example, when a surgeon 714 virtually performs a task within the 3D holographic display 704, using haptic feedback, it is possible for such task to be felt by the surgeon as if the surgeon were actually performing the task within the patient's body. This can be realized using a haptic device 712, such as, e.g., a glove(s), bracelet, or other garments or accessories having actuators or other vibratory elements.
According to one exemplary embodiment, the exemplary 3D holographic display 158 can be seen from (virtually) any angle, e.g., so that all users can interact with the same understanding and information. This is the case for an in-air hologram; however, other display techniques may be employed to provide multiple individuals with a same view.
Referring to FIG. 8, display information may be provided to different users positioned in a room or area, by displaying the same or different information on a geometrical structure 802 (holographically), such as a multi-faceted holographic display where each face of the display (e.g., a cube or polyhedron) displays the information. This can be achieved by projecting multiple 2D video streams 804 on the geometrical structure 802 (e.g., side by side, or partially overlapping) rendered within a holographic output 806. For example, a holographic "cube" display in 3D can show/display on one cube face information (e.g., a 2D live x-ray image) in one particular direction (e.g., the direction of a first practitioner 808), while another cube face of the same "cube" display can show/display another type of information (e.g., an ultrasound image) to a second practitioner 810 positioned elsewhere in the room (e.g., diametrically opposite the display from the first practitioner).
One having ordinary skill in the art will appreciate in view of the teachings provided herein that such exemplary display can be configured at will depending on, e.g., the number of users in the room. It is also possible that the position of each
user/practitioner can be tracked in the room and that each individual's display information follows each user's viewing perspective as the user moves (e.g., during a procedure). For example, one particular user (doctor, nurse, etc.) can be provided with the specific information that the user needs regardless of where in the room such particular user moves during a procedure. Further, it is also possible that each user is provided with a unique display, which can be a 2D cube face, such as described above, or a 3D holographic display customized or tailored for such user, and that such a unique display can "follow" the user as the user moves around a room.
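A minimal sketch of such a "follow-me" per-user display: each tracked user keeps a role-specific content template, and the anchor of that user's personal display is re-placed whenever the localization system reports a new position. The roles, templates, and the fixed viewing offset are illustrative assumptions:

```python
# Illustrative sketch: a per-user display whose content is chosen by
# role and whose anchor follows the user's tracked position.
# Roles, template entries, and the offset are assumed for demonstration.

TEMPLATES = {
    "surgeon": ["3d_anatomy", "instrument_track"],
    "nurse": ["vitals", "checklist"],
}

class FollowingDisplay:
    def __init__(self, role: str):
        self.content = TEMPLATES[role]
        self.anchor = (0.0, 0.0, 0.0)

    def on_position_update(self, xyz):
        """Re-anchor the personal display at an assumed fixed offset
        in front of the user's tracked position."""
        offset = (0.0, 0.5, 0.0)  # assumed viewing offset, meters
        self.anchor = tuple(p + o for p, o in zip(xyz, offset))

d = FollowingDisplay("nurse")
d.on_position_update((2.0, 1.0, 1.5))
```

The same callback would be driven by whatever tracking modality (electromagnetic, optical, etc.) localizes the user in the room.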
Multiple combinations of displays in accordance with this and other exemplary embodiments described herein are possible, providing, e.g., for individual users to have their own unique display and/or be presented with the same information as other users, regardless of the movement and location of a user within a room or elsewhere (e.g., outside of the room, off-site, etc.). Additionally, a user may initially select, and change at any time during a procedure, what information is displayed to them by, e.g., selecting from predefined templates, selecting specific informational fields, selecting the display of another particular user, etc.
Note that text is an inherently 2D mode of communication. The system may display shapes/symbols identifiable from multiple viewpoints, or represent the text oriented towards the viewer. In the case of multiple viewers, the oriented text may be shown in multiple directions simultaneously, or to each viewer independently in different frames.
Referring to FIG. 9, in another exemplary embodiment, a remote system 900 may include at least some of the capabilities of system 100 (FIG. 1) but is remotely disposed relative to a patient 902 and data collection instruments. A user (practitioner, surgeon, fellow, etc.) may conduct a procedure remotely (e.g., with the user being physically located off-site from the location where the subject/patient 902 is located) or assist or provide guidance remotely to the procedure. For example, a user can perform a procedure/task on an exemplary holographic display 904 located at their location. In one embodiment, the display 904 is connected (e.g., via the Internet or other network 910 (wired or wireless)) to the system 100 co-located with the patient 902. System 100 can be in continuous
communication with the remote system 900 (e.g., where the user is located) so that the holographic display 904 is continually updated in (near) real-time. Additionally, the system 100 may include robotically controlled instruments 906 (e.g., inside of a patient) which are controlled via commands provided (e.g., via the Internet) by the remote system 900, as described above. These commands are generated based on the user's interaction with the holographic display 904. Holographic displays 158 and 904 may include the same subject matter at one or more locations so that the same information is conveyed at each location. For example, this embodiment may include, e.g., providing guidance or assistance to another doctor around the globe, peer-to-peer review, expert assistance, or a virtual classroom where many students could attend a live case from different locations throughout the world.
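The remote-control path can be sketched as a serialize/apply round trip: an interaction at the remote holographic display is encoded as a command message, sent over the network, and applied to instrument state at the patient's location, with the updated state echoed back to refresh the hologram. The JSON message format, instrument name, and motion encoding below are assumptions for illustration:

```python
import json

# Illustrative sketch of the remote-control path in FIG. 9: display
# interactions become serialized commands for the robotically
# controlled instruments. The message schema is assumed.

def encode_command(instrument_id: str, motion) -> str:
    """Serialize a relative motion for one instrument."""
    return json.dumps({"instrument": instrument_id,
                       "delta_xyz": list(motion)})

def apply_command(state: dict, message: str) -> dict:
    """Decode a command at the patient side and update instrument state."""
    cmd = json.loads(message)
    x, y, z = state[cmd["instrument"]]
    dx, dy, dz = cmd["delta_xyz"]
    state[cmd["instrument"]] = (x + dx, y + dy, z + dz)
    return state  # in practice, echoed back to refresh display 904

robot_state = {"catheter": (0.0, 0.0, 0.0)}
msg = encode_command("catheter", (1.0, 0.0, -0.5))
robot_state = apply_command(robot_state, msg)
```

Using a self-describing text format here is a sketch-level choice; a real tele-surgical link would add transport, latency compensation, and safety interlocks that are out of scope for the disclosure's description.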
Some or all of the exemplary embodiments and features described herein can also be used (at least in part) in conjunction or combination with any other embodiments described herein.
Referring to FIG. 10, a method for interacting with a holographic display is shown in accordance with illustrative embodiments. In block 1002, a holographically rendered anatomical image is generated and displayed. The image may include one or more organs or anatomical regions. The holographically rendered anatomical image may be generated in-air.
In block 1004, a monitored space is localized on or around the holographically rendered anatomical image to define a region for interaction. The localization system may include one or more of a fiber optic shape sensing system, an electromagnetic tracking system, a light sensor array and/or other sensing modality. The position and orientation of the monitored space and the one or more monitored objects are preferably determined in a same coordinate system. The one or more monitored objects may include a medical instrument, an anatomical feature of a user, a virtual object, etc.
In block 1006, a position and orientation of one or more monitored objects is monitored by the localization system. In block 1008, coincidence of spatial points is determined between the monitored space and the one or more monitored objects. In block 1010, if coincidence is determined, a response is triggered in the holographically rendered anatomical image. In block 1012, the response may include moving the holographically rendered anatomical image (e.g., 6DOF) or changing its appearance. In block 1014, the response may include adjusting a zoom (magnification) or other optical characteristics of the holographically rendered anatomical image. In block 1016, the holographically rendered anatomical image may be marked, tagged, targeted, etc. In block 1018, camera viewpoints can be assigned (for other viewers or displays). In block 1020, feedback may be generated to a user. The feedback may include haptic feedback (vibrating device or air), optical feedback (visual or color differences), acoustic feedback (verbal, alarms), etc.
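The coincidence test of blocks 1004-1010 can be sketched with the monitored space modeled as an axis-aligned box in the shared coordinate system: a response fires when any tracked point of a monitored object falls inside the box. The box extents, point values, and response payload are illustrative assumptions:

```python
# Illustrative sketch of blocks 1004-1010: the monitored space is an
# axis-aligned box around the rendered image, tracked object points are
# reported in the same coordinate system, and a response is triggered
# when any point coincides with the box. Geometry is assumed.

def inside(box_min, box_max, point):
    """True if the point lies within the box on every axis."""
    return all(lo <= p <= hi for lo, p, hi in zip(box_min, point, box_max))

def check_coincidence(box_min, box_max, tracked_points):
    """Return the response to trigger, or None if no point coincides."""
    for point in tracked_points:
        if inside(box_min, box_max, point):
            return {"response": "highlight", "at": point}
    return None

# monitored space around the hologram (assumed extents, meters)
hit = check_coincidence((0, 0, 0), (1, 1, 1),
                        [(2.0, 2.0, 2.0), (0.5, 0.5, 0.5)])
miss = check_coincidence((0, 0, 0), (1, 1, 1), [(2.0, 2.0, 2.0)])
```

A deployed system would replace the box with the actual rendered geometry and dispatch to the block 1012-1020 responses (move, zoom, mark, feedback) instead of the single placeholder response here.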
In block 1022, a response region may be provided and monitored by the localization system such that upon activating the response region a display event occurs. The display event may include generating a help menu in block 1024; generating a menu of virtual objects to be included in the holographically rendered anatomical image upon selection in block 1026; and generating information to be displayed in block 1028.
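The response-region behavior of blocks 1022-1028 amounts to a dispatch table from activated regions to display events. The region names and event payloads below are assumed for illustration:

```python
# Illustrative sketch of blocks 1022-1028: activating a named response
# region dispatches the corresponding display event (help menu, virtual
# object menu, information panel). Names and payloads are assumed.

DISPLAY_EVENTS = {
    "help_region": lambda: {"event": "help_menu"},
    "objects_region": lambda: {"event": "virtual_object_menu"},
    "info_region": lambda: {"event": "info_panel"},
}

def activate(region: str):
    """Run the display event for the activated region, if one is defined."""
    handler = DISPLAY_EVENTS.get(region)
    return handler() if handler else None

help_event = activate("help_region")
```

The activation itself would come from the coincidence test of the preceding blocks, with the response region being one more monitored volume known to the localization system.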
In block 1030, the holographically rendered anatomical image may be generated with superimposed medical data mapped to positions on the holographically rendered anatomical image. In block 1032, the response that is triggered may include generating control signals for operating robotically controlled instruments. The control signals may enable remote operations to be performed.
In interpreting the appended claims, it should be understood that:
a) the word "comprising" does not exclude the presence of other elements or acts than those listed in a given claim;
b) the word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements;
c) any reference signs in the claims do not limit their scope;
d) several "means" may be represented by the same item or hardware or software implemented structure or function; and
e) no specific sequence of acts is intended to be required unless specifically indicated.
Having described preferred embodiments for holographic user interfaces for medical procedures (which are intended to be illustrative and not limiting), it is noted that modifications and variations can be made by persons skilled in the art in light of the above teachings. It is therefore to be understood that changes may be made in the particular embodiments of the disclosure disclosed which are within the scope of the embodiments disclosed herein as outlined by the appended claims. Having thus described the details and particularity required by the patent laws, what is claimed and desired protected by Letters Patent is set forth in the appended claims.

CLAIMS:
1. An interactive holographic display system, comprising:
a holographic generation module (115) configured to display a holographically rendered anatomical image (124);
a localization system (120) configured to define a monitored space (126) on or around the holographically rendered anatomical image; and
one or more monitored objects (128) having their position and orientation monitored by the localization system such that coincidence of spatial points between the monitored space and the one or more monitored objects triggers a response in the holographically rendered anatomical image.
2. The system as recited in claim 1, wherein the holographically rendered anatomical image (124) is generated in-air.
3. The system as recited in claim 1, wherein the localization system (120) includes one or more of a fiber optic shape sensing system, an electromagnetic tracking system, a light sensor array, and a sensing device to determine the position and orientation of the monitored space and the one or more monitored objects in a same coordinate system.
4. The system as recited in claim 1, wherein the one or more monitored objects (128) include a medical instrument, an anatomical feature of a user and a virtual object.
5. The system as recited in claim 1, wherein the response in the holographically rendered anatomical image (124) includes one or more of: translation or rotation of the holographically rendered anatomical image, magnification adjustment of the holographically rendered anatomical image, marking of the holographically rendered anatomical image and feedback generation.
6. The system as recited in claim 5, wherein the response includes haptic feedback to a user.
7. The system as recited in claim 1, wherein the holographically rendered anatomical image (124) includes a response region (504) monitored by the localization system such that upon activating the response region a display event (502) occurs.
8. The system as recited in claim 7, wherein the display event (502) includes a generated help menu, a generated menu of virtual objects to be included in the holographically rendered anatomical image upon selection and a generated information display.
9. The system as recited in claim 1, wherein the holographically rendered anatomical image (124) displays superimposed medical data mapped to positions thereon.
10. The system as recited in claim 1, wherein the response in the holographically rendered anatomical image (124) generates control signals for operating robotically controlled instruments (602).
11. The system as recited in claim 1, wherein the response in the holographically rendered anatomical image includes seed points (162) placed to direct virtual camera angles for an additional display.
12. The system as recited in claim 1, wherein the interactive holographic display system is remotely disposed from a patient location and connected to the patient location over a communication network (910).
13. The system as recited in claim 1, wherein the interactive holographic display system is remotely disposed from a patient location and connected to the patient location over a communication network (910) such that the holographically rendered anatomical image is employed to remotely control instruments (906) at the patient's location.
14. The system as recited in claim 1, further comprising a speech recognition engine (166) configured to convert speech commands into commands for altering an appearance of the holographically rendered anatomical image.
15. An interactive holographic display system, comprising:
a processor (114);
memory (116) coupled to the processor;
a holographic generation module (115) included in the memory and configured to display a holographically rendered anatomical image (124) as an in-air hologram or on a holographic display;
a localization system (120) configured to define a monitored space (126) on or around the holographically rendered anatomical image; and
one or more monitored objects (128) having their position and orientation monitored by the localization system such that coincidence of spatial points between the monitored space and the one or more monitored objects triggers a response in the holographically rendered anatomical image wherein the response in the holographically rendered anatomical image includes one or more of: translation or rotation of the holographically rendered anatomical image, magnification adjustment of the holographically rendered anatomical image, marking of the holographically rendered anatomical image and feedback generation.
16. The system as recited in claim 15, wherein the localization system (120) includes one or more of a fiber optic shape sensing system, an electromagnetic tracking system, a light sensor array and a sensing device to determine the position and orientation of the monitored space and the one or more monitored objects in a same coordinate system.
17. The system as recited in claim 15, wherein the one or more monitored objects (128) include a medical instrument, an anatomical feature of a user and a virtual object.
18. The system as recited in claim 15, wherein the response includes haptic feedback to a user.
19. The system as recited in claim 15, wherein the holographically rendered anatomical image (124) includes a response region (504) monitored by the localization system such that upon activating the response region a display event (502) occurs.
20. The system as recited in claim 19, wherein the display event (502) includes a generated help menu, a generated menu of virtual objects to be included in the holographically rendered anatomical image upon selection and a generated information display.
21. The system as recited in claim 15, wherein the holographically rendered anatomical image (124) displays superimposed medical data mapped to positions thereon.
22. The system as recited in claim 15, wherein the response in the holographically rendered anatomical image generates control signals for operating robotically controlled instruments (602).
23. The system as recited in claim 15, wherein the response in the holographically rendered anatomical image includes seed points (162) placed to direct virtual camera angles for an additional display.
24. The system as recited in claim 15, wherein the interactive holographic display system is remotely disposed from a patient location and connected to the patient location over a communication network (910).
25. The system as recited in claim 15, wherein the interactive holographic display system is remotely disposed from a patient location and connected to the patient location over a communication network (910) such that the holographically rendered anatomical image is employed to remotely control instruments (906) at the patient's location.
26. The system as recited in claim 15, further comprising a speech recognition engine (166) configured to convert speech commands into commands for altering an appearance of the holographically rendered anatomical image.
27. A method for interacting with a holographic display, comprising:
displaying (1002) a holographically rendered anatomical image;
localizing (1004) a monitored space on or around the holographically rendered anatomical image to define a region for interaction;
monitoring (1006) a position and orientation of one or more monitored objects by the localization system;
determining (1008) coincidence of spatial points between the monitored space and the one or more monitored objects; and
if coincidence is determined, triggering (1010) a response in the holographically rendered anatomical image.
28. The method as recited in claim 27, wherein displaying includes generating (1002) the holographically rendered anatomical image in-air.
29. The method as recited in claim 27, wherein the localization system includes one or more of a fiber optic shape sensing system, an electromagnetic tracking system, a light sensor array and a sensor device, and the method further comprises determining (1004) the position and orientation of the monitored space and the one or more monitored objects (1006) in a same coordinate system.
30. The method as recited in claim 27, wherein the one or more monitored objects include a medical instrument, an anatomical feature of a user and a virtual object.
31. The method as recited in claim 27, wherein triggering a response includes one or more of: moving (1012) the holographically rendered anatomical image; adjusting (1014) zoom of the holographically rendered anatomical image; marking (1016) the holographically rendered anatomical image; and generating (1020) feedback to a user.
32. The method as recited in claim 31, wherein generating feedback includes generating haptic feedback for a user.
33. The method as recited in claim 27, wherein the holographically rendered anatomical image includes a response region (1022) monitored by the localization system such that upon activating the response region a display event occurs.
34. The method as recited in claim 33, wherein the display event includes generating (1024) a help menu; generating (1026) a menu of virtual objects to be included in the holographically rendered anatomical image upon selection; and generating (1028) information to be displayed.
35. The method as recited in claim 27, further comprising rendering (1030) the holographically rendered anatomical image with superimposed medical data mapped to positions on the holographically rendered anatomical image.
36. The method as recited in claim 27, further comprising generating (1032) control signals for operating robotically controlled instruments.
EP12798835.0A 2011-10-20 2012-10-15 Holographic user interfaces for medical procedures Not-in-force EP2769270B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201161549273P 2011-10-20 2011-10-20
PCT/IB2012/055595 WO2013057649A1 (en) 2011-10-20 2012-10-15 Holographic user interfaces for medical procedures

Publications (2)

Publication Number Publication Date
EP2769270A1 true EP2769270A1 (en) 2014-08-27
EP2769270B1 EP2769270B1 (en) 2018-09-19

Family

ID=47326233

Family Applications (1)

Application Number Title Priority Date Filing Date
EP12798835.0A Not-in-force EP2769270B1 (en) 2011-10-20 2012-10-15 Holographic user interfaces for medical procedures

Country Status (8)

Country Link
US (1) US20140282008A1 (en)
EP (1) EP2769270B1 (en)
JP (1) JP6157486B2 (en)
CN (1) CN103959179B (en)
BR (1) BR112014009129A2 (en)
IN (1) IN2014CN03103A (en)
RU (1) RU2608322C2 (en)
WO (1) WO2013057649A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017165301A1 (en) 2016-03-21 2017-09-28 Washington University Virtual reality or augmented reality visualization of 3d medical images
US10242643B2 (en) 2016-07-18 2019-03-26 Microsoft Technology Licensing, Llc Constrained head-mounted display communication
US10663922B2 (en) 2016-04-14 2020-05-26 Boe Technology Group Co., Ltd. Image display system and image display method
US20220262079A1 (en) * 2021-02-17 2022-08-18 Arm Limited Foveation for a holographic imaging system
EP4134974A1 (en) 2021-08-12 2023-02-15 Koninklijke Philips N.V. Dynamic care assistance mechanism

Also Published As

Publication number Publication date
EP2769270B1 (en) 2018-09-19
IN2014CN03103A (en) 2015-07-03
WO2013057649A1 (en) 2013-04-25
CN103959179B (en) 2017-09-15
JP2015504320A (en) 2015-02-12
US20140282008A1 (en) 2014-09-18
RU2608322C2 (en) 2017-01-17
JP6157486B2 (en) 2017-07-05
BR112014009129A2 (en) 2017-04-18
RU2014120182A (en) 2015-11-27
CN103959179A (en) 2014-07-30

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20140520

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20151130

RIC1 Information provided on ipc code assigned before grant

Ipc: G03H 1/00 20060101AFI20180223BHEP

Ipc: G06T 19/20 20110101ALN20180223BHEP

Ipc: G06F 3/16 20060101ALN20180223BHEP

Ipc: G03H 1/22 20060101ALI20180223BHEP

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

RIC1 Information provided on ipc code assigned before grant

Ipc: G03H 1/22 20060101ALI20180313BHEP

Ipc: G06T 19/20 20110101ALN20180313BHEP

Ipc: G03H 1/00 20060101AFI20180313BHEP

Ipc: G06F 3/16 20060101ALN20180313BHEP

INTG Intention to grant announced

Effective date: 20180409

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1043951

Country of ref document: AT

Kind code of ref document: T

Effective date: 20181015

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602012051337

Country of ref document: DE

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 7

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20180919

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181219

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181219

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20181220

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1043951

Country of ref document: AT

Kind code of ref document: T

Effective date: 20180919

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190119

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190119

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602012051337

Country of ref document: DE

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20181031

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20181015

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

26N No opposition filed

Effective date: 20190620

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20181031

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20181031

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20181031

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20181015

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20181015

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20191129

Year of fee payment: 8

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20191025

Year of fee payment: 8

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20191029

Year of fee payment: 8

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20121015

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20180919

Ref country code: MK

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180919

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 602012051337

Country of ref document: DE

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20201015

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210501

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20201031

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20201015