EP2769270A1 - Holographic user interfaces for medical procedures - Google Patents
Holographic user interfaces for medical procedures
- Publication number
- EP2769270A1
- Authority
- EP
- European Patent Office
- Legal status
- Granted
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/04—Processes or apparatus for producing holograms
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/0005—Adaptation of holography to specific applications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/22—Processes or apparatus for obtaining an optical image from holograms
- G03H1/2202—Reconstruction geometries or arrangements
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/0005—Adaptation of holography to specific applications
- G03H2001/0061—Adaptation of holography to specific applications in haptic applications when the observer interacts with the holobject
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H2210/00—Object characteristics
- G03H2210/30—3D object
- G03H2210/33—3D/2D, i.e. the object is formed of stratified 2D planes, e.g. tomographic data
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H2226/00—Electro-optic or electronic components relating to digital holography
- G03H2226/04—Transmission or communication means, e.g. internet protocol
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
Definitions
- The present disclosure relates to medical systems, devices and methods, and more particularly to systems, devices and methods pertaining to integration of holographic image data with other information to improve accuracy and effectiveness in medical applications.
- An interactive holographic display system includes a holographic generation module configured to display a holographically rendered anatomical image.
- A localization system is configured to define a monitored space on or around the holographically rendered anatomical image.
- One or more monitored objects have their position and orientation monitored by the localization system such that coincidence of spatial points between the monitored space and the one or more monitored objects triggers a response in the holographically rendered anatomical image.
- Another interactive holographic display system includes a processor and memory coupled to the processor.
- A holographic generation module is included in the memory and configured to display a holographically rendered anatomical image as an in-air hologram or on a holographic display.
- A localization system is configured to define a monitored space on or around the holographically rendered anatomical image.
- One or more monitored objects have their position and orientation monitored by the localization system such that coincidence of spatial points between the monitored space and the one or more monitored objects triggers a response in the holographically rendered anatomical image, wherein the response in the holographically rendered anatomical image includes one or more of: translation or rotation of the holographically rendered anatomical image, and magnification adjustment of the holographically rendered anatomical image.
- A method for interacting with a holographic display includes: displaying a holographically rendered anatomical image; localizing a monitored space on or around the holographically rendered anatomical image to define a region for interaction; monitoring a position and orientation of one or more monitored objects by the localization system; and triggering a response in the holographically rendered anatomical image when spatial points of the monitored space and the one or more monitored objects coincide.
- FIG. 1 is a block/flow diagram showing a system for interfacing with holograms in accordance with exemplary embodiments
- FIG. 2 is a perspective view of a hologram rendered with a data map or overlay thereon in accordance with an illustrative embodiment
- FIG. 3 is a block diagram showing an illustrative process flow for displaying a data map or overlay in a holographic image in accordance with an illustrative embodiment
- FIG. 4 is a block diagram showing an illustrative system and process flow for displaying static or animated objects in a holographic image in accordance with an illustrative embodiment
- FIG. 5 is a diagram showing an illustrative image for displaying an objects menu for selecting virtual objects during a procedure for display in a holographic image in accordance with an illustrative embodiment
- FIG. 6 is a block diagram showing an illustrative system for controlling a robot using a holographic image in accordance with an illustrative embodiment
- FIG. 7 is a block diagram showing an illustrative system which employs haptic feedback with a holographic image in accordance with an illustrative embodiment
- FIG. 8 is a diagram showing multiple views provided to different perspectives in an illustrative system for displaying a holographic image or the like in accordance with one embodiment
- FIG. 9 is a block diagram showing an illustrative system for controlling a robot remotely over a network using a holographic image in accordance with an illustrative embodiment.
- FIG. 10 is a flow diagram showing a method for interfacing with a hologram in accordance with an illustrative embodiment.
- In accordance with the present principles, systems, devices and methods are provided which leverage holographic display technology for medical procedures.
- This can be done using 3D holographic technologies (e.g., in-air holograms) and real-time 3D input sensing methods such as optical shape sensing to provide a greater degree of human-data interaction during a procedure.
- Employing holographic technology with other technologies potentially simplifies procedure workflow, instrument selection, and manipulation within the anatomy of interest.
- Such exemplary embodiments described herein can utilize 3D holographic displays for real-time visualization of volumetric datasets with exemplary localization methods for sensing movements in free space during a clinical procedure, thereby providing new methods of human-data interaction in the interventional suite.
- 3D holography may be used to fuse anatomical data with functional imaging and "sensing" information.
- A fourth dimension (e.g., time, color, texture, etc.) may also be represented in the display.
- A display can be in (near) real-time and use color-coded visual information and/or haptic feedback/tactile information, for example, to convey different effects or states of the holographically displayed object of interest.
- Such information can include morphological information about the target and functional information about the object of interest.
- The exemplary 3D holographic display can be seen from (virtually) any angle/direction so that, e.g., multiple users can simultaneously interact with the same understanding and information.
- Such "touch” can also be used to, e.g., rotate the virtual organ, zoom, tag points in 3D, draw a path and trajectory plan (e.g., for treatment, targeting, etc.), select critical zones to avoid, create alarms, and drop virtual objects (e.g., implants) in 3D in the displayed 3D anatomy.
- Exemplary embodiments according to the present disclosure can also be used to facilitate a remote procedure (e.g., where the practitioner "acts" on the virtual organ and a robot simultaneously or subsequently performs the procedure on the actual organ), to practice a procedure before performing the actual procedure in a training or simulation setting, and/or to review/study/teach a procedure after it has been performed (e.g., through data recording, storage, and playback of the 3D holographic display and any associated multimodality signals relevant to the clinical procedure).
- Exemplary embodiments according to the present disclosure are further described herein below with reference to the appended figures. While such exemplary embodiments are largely described separately from one another (e.g., for ease of presentation and understanding), one having ordinary skill in the art shall appreciate in view of the teachings herein that such exemplary embodiments can be used independently and/or in combination with each other. Indeed, the implementation and use of the exemplary embodiments described herein, including combinations and variations thereof, all of which are considered a part of the present disclosure, can depend on, e.g., the particular laboratory or clinical use/application, integration with other related technologies, available resources, etc.
- A real-time 3D holographic display in accordance with the present principles may include a real-time six degree of freedom (DOF) input via localization technology embedded into a data interaction device (e.g., a haptic device for sensory feedback).
- An imaging / monitoring system for multidimensional data acquisition may also be employed. Datalinks between the holography display, localization system / interaction device, and imaging / monitoring system may be provided for communication between these systems.
- The display, feedback devices, localization devices, and measurement devices may be employed with or integrated with a computational workstation for decision support and with data libraries of case information that can be dynamically updated/recalled during a live case for training/teaching/procedure guidance purposes (e.g., for similar archived clinical cases relative to the procedure and patient undergoing treatment).
- The present invention will be described in terms of medical instruments; however, the teachings of the present invention are much broader and are applicable to any systems that can benefit from holographic visualization.
- In some embodiments, the present principles are employed in tracking or analyzing complex biological or mechanical systems.
- In particular, the present principles are applicable to internal tracking procedures of biological systems and to procedures in all areas of the body such as the lungs, gastrointestinal tract, excretory organs, blood vessels, etc.
- The elements depicted in the FIGS. may be implemented in various combinations of hardware and software and provide functions which may be combined in a single element or multiple elements.
- The functions of the various elements, including any labeled as a "processor", can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software.
- The functions can be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which can be shared.
- Explicit use of the term "processor" or "controller" should not be construed to refer exclusively to hardware capable of executing software, and can implicitly include, without limitation, digital signal processor ("DSP") hardware, read-only memory ("ROM") for storing software, random access memory ("RAM"), non-volatile storage, etc.
- Embodiments of the present invention can take the form of a computer program product accessible from a computer-usable or computer-readable storage medium providing program code for use by or in connection with a computer or any instruction execution system.
- A computer-usable or computer-readable storage medium can be any apparatus that may include, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
- The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium.
- Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk.
- Current examples of optical disks include compact disk - read only memory (CD-ROM), compact disk - read/write (CD-R/W), Blu-Ray™ and DVD.
- System 100 may include a workstation or console 112 from which a procedure is supervised and/or managed.
- Workstation 112 preferably includes one or more processors 114 and memory 116 for storing programs and applications.
- Memory 116 may store a holographic generation module 115 configured to render a holographic image on a display 158 or in-air depending on the application.
- The holographic generation module 115 codes image data to generate a three dimensional hologram.
- The coding may provide the hologram on a 2D display, in 3D media or on a 3D display.
- Data from 3D imaging (e.g., computed tomography, ultrasound, magnetic resonance) may be transformed into a hologram using spatial distribution and light intensity to render the hologram.
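The disclosure does not recite a specific coding scheme for transforming 3D imaging data into a hologram. One common computer-generated-holography approach consistent with the "spatial distribution and light intensity" description above is the point-source (Fresnel) method, sketched below; the function name and parameters are illustrative, not taken from the patent.

```python
import numpy as np

def point_cloud_hologram(points, amplitudes, X, Y, wavelength=633e-9):
    """Record a fringe (intensity) pattern on the z=0 hologram plane by
    superposing spherical wavefronts from each 3D object point and
    interfering the sum with an on-axis, unit-amplitude plane reference
    wave. X, Y are meshgrid coordinates of the hologram-plane pixels."""
    k = 2.0 * np.pi / wavelength
    field = np.zeros_like(X, dtype=complex)
    for (px, py, pz), a in zip(points, amplitudes):
        # spherical wave emitted by the object point, sampled on the plane
        r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
        field += a * np.exp(1j * k * r) / r
    reference = 1.0  # on-axis plane reference wave
    return np.abs(field + reference) ** 2  # recorded intensity
```

Each voxel (or surface point) of the 3D dataset contributes one spherical wavefront, so intensity values from CT or ultrasound map directly onto the per-point amplitudes.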
- A localization system 120 includes a coordinate system 122 to which a holographic image or hologram 124 is registered.
- The localization system 120 may also be employed to register a monitored object 128, which may include virtual instruments, which are separately created and controlled, real instruments, a physician's hands, fingers or other anatomical parts, etc.
- The localization system 120 may include an electromagnetic tracking system, a shape sensing system, such as a fiber optic based shape sensing system, an optical sensing system, including light sensors and arrays, or other sensing modality, etc.
- The localization system 120 is employed to define spatial regions in and around the hologram or the holographic image 124 to enable triggering of different functions or actions as a result of movement in the area of the hologram 124.
- For example, dynamic locations of a physician's hands may be tracked using a fiber optic shape sensing device.
- When the hands enter the monitored space, the intensity of the hologram may be increased.
- The physician's hand movements may also be employed to spatially alter the position or orientation of the hologram 124 or to otherwise interact with the hologram 124.
- A monitored object or sensing system 128 may be spatially monitored relative to the hologram 124 or the space 126 around the hologram 124.
- The monitored object 128 may include the physician's hands, a real or a virtual tool, another hologram, etc.
- The monitored object 128 may include a sensor or sensors 132 adapted to monitor the position of the monitored object 128 such that when a position of the object or a portion thereof is within the hologram 124 or the space 126 around the hologram 124, a reaction occurs that is consistent with the type of the monitored object 128 and the action performed or to be performed by the monitored object 128.
- The sensor or sensors 132 may include EM sensors, fiber optic shape sensors, etc.
- In one embodiment, the sensors 132 include fiber optic shape sensors.
- A sensor interpretation module 134 may be employed to interpret feedback signals from a shape sensing device or system (132).
- Interpretation module 134 is configured to use the signal feedback (and any other feedback, e.g., optical, electromagnetic (EM) tracking, etc.) to reconstruct motion, deflection and other changes associated with the monitored object 128, which may include a medical device or instrument, virtual tools, human anatomical features, etc.
- The medical device may include a catheter, a guidewire, a probe, an endoscope, a robot, an electrode, a filter device, a balloon device, or other medical component, etc.
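The spatial-coincidence triggering described above (a reaction when a tracked point of a monitored object enters the hologram 124 or the surrounding space 126) reduces to a containment test in the localization coordinate system. The sketch below uses an axis-aligned box and hypothetical names (`MonitoredSpace`, `check_triggers`); a real localization system would register arbitrary geometry, not just a box.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class MonitoredSpace:
    """Axis-aligned region around the rendered hologram, expressed in the
    localization system's coordinate frame (a deliberate simplification)."""
    lo: np.ndarray  # minimum corner (x, y, z)
    hi: np.ndarray  # maximum corner (x, y, z)

    def contains(self, p):
        return bool(np.all(p >= self.lo) and np.all(p <= self.hi))

def check_triggers(space, tracked_points, on_coincide):
    """Fire the response callback for every tracked point (e.g., a
    shape-sensed fingertip or instrument tip) inside the monitored space."""
    fired = []
    for name, p in tracked_points.items():
        if space.contains(np.asarray(p, dtype=float)):
            on_coincide(name)  # e.g., rotate/brighten the hologram
            fired.append(name)
    return fired
```

In the system of FIG. 1, the callback would correspond to the comparison module 142 informing the holographic generation module 115 that a change is needed.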
- The shape sensing system (132) may include one or more optical fibers which are coupled to the monitored object 128 in a set pattern or patterns.
- The optical fibers connect to the workstation 112 through cabling 127.
- The cabling 127 may include fiber optics, electrical connections, other instrumentation, etc., as needed.
- Shape sensing system (132) may be based on fiber optic Bragg grating sensors.
- A fiber optic Bragg grating (FBG) is a short segment of optical fiber that reflects particular wavelengths of light and transmits all others. This is achieved by adding a periodic variation of the refractive index in the fiber core, which generates a wavelength-specific dielectric mirror.
- A fiber Bragg grating can therefore be used as an inline optical filter to block certain wavelengths, or as a wavelength-specific reflector.
- A fundamental principle behind the operation of a fiber Bragg grating is Fresnel reflection at each of the interfaces where the refractive index is changing. For some wavelengths, the reflected light of the various periods is in phase so that constructive interference exists for reflection and, consequently, destructive interference for transmission.
- The Bragg wavelength is sensitive to strain as well as to temperature. This means that Bragg gratings can be used as sensing elements in fiber optical sensors. In an FBG sensor, the measurand (e.g., strain) causes a shift in the Bragg wavelength.
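The sensitivity just described follows the standard FBG relations (background physics, not language recited in the claims):

$$\lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda$$

$$\frac{\Delta\lambda_B}{\lambda_B} = (1 - p_e)\,\varepsilon + (\alpha + \xi)\,\Delta T$$

where $n_{\mathrm{eff}}$ is the effective refractive index of the core, $\Lambda$ the grating period, $p_e \approx 0.22$ the effective photo-elastic coefficient of silica, $\varepsilon$ the axial strain, $\alpha$ the thermal expansion coefficient and $\xi$ the thermo-optic coefficient. Measuring the shift $\Delta\lambda_B$ thus yields the local strain once temperature is compensated.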
- One advantage of this technique is that various sensor elements can be distributed over the length of a fiber. Incorporating three or more cores with various sensors (gauges) along the length of a fiber that is embedded in a structure permits a three dimensional form of such a structure to be precisely determined, typically with better than 1 mm accuracy.
- Along the length of the fiber, at various positions, a multitude of FBG sensors can be located (e.g., in 3 or more fiber sensing cores). From the strain measurement of each FBG, the curvature of the structure can be inferred at that position. From the multitude of measured positions, the total three-dimensional form is determined.
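The strain-to-curvature-to-shape chain described above can be sketched as follows, assuming three sensing cores at 120° spacing at radial offset `r` from the fiber axis. The strain model ε_i = -r(κ_x cos aᵢ + κ_y sin aᵢ) and the planar dead-reckoning integration are standard simplifications for illustration, not details recited in the patent.

```python
import numpy as np

def curvature_from_strains(strains, core_angles, r):
    """Least-squares fit of the bend curvature components (kx, ky) from
    per-core axial strains at one position along the fiber, using the
    model strain_i = -r * (kx*cos(a_i) + ky*sin(a_i))."""
    A = -r * np.column_stack([np.cos(core_angles), np.sin(core_angles)])
    kxy, *_ = np.linalg.lstsq(A, strains, rcond=None)
    return kxy  # curvature components in 1/m

def integrate_planar_shape(kappa, ds):
    """Planar dead-reckoning: accumulate heading from per-segment curvature,
    then step positions along the heading (3D uses the same idea with a
    rotating frame)."""
    theta, x, y = 0.0, 0.0, 0.0
    pts = [(x, y)]
    for k in kappa:
        theta += k * ds
        x += np.cos(theta) * ds
        y += np.sin(theta) * ds
        pts.append((x, y))
    return np.array(pts)
```

With, say, a 35 µm core offset, the per-grating strains recover the local curvature vector, and chaining the per-segment curvatures reproduces the fiber's form at roughly millimeter accuracy, as stated above.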
- Workstation 112 includes an image generation module 148 configured to receive feedback from the shape sensing system 132 or other sensor to sense interactions with the hologram 124.
- A position and status of the hologram 124 and its surrounding space 126 are known to the localization system 120.
- A comparison module 142 determines whether an action is triggered depending on a type of motion, a type of monitored object 128, a type of procedure or activity and/or any other criteria.
- If an action is triggered, the comparison module 142 informs the holographic generation module 115 that a change is needed.
- The holographic generation module 115 recodes the image data, which is processed and output to the image generation module 148, which updates the hologram 124 in accordance with set criteria.
- The hologram 124 may include an internal organ rendered based on 3D images 152 of a patient or subject 150.
- The images 152 may be collected from the patient 150 preoperatively using an imaging system 110.
- The imaging system 110 and the patient 150 need not be present to employ the present principles, as the system 100 may be employed for training, analysis or other purposes at any time.
- In one illustrative example, a physician employs a pair of gloves having sensors 132 disposed thereon.
- When the gloves/sensors 132 enter the space 126 and coincide with the hologram 124, the physician is able to rotate or translate the hologram 124.
- In one embodiment, the gloves include a haptic device 156 that provides tactile feedback depending on a position of the gloves.
- The haptic feedback is indicative of the tissue type corresponding with the hologram 124 and its representation.
- The haptic device or system 156 may include ultrasound sources, speakers or other vibratory sources to convey differences in state of the hologram 124 using vibrations or sound.
- A display 118 and/or display 158 may also permit a user to interact with the workstation 112, the hologram 124 and its components and functions, or any other element within the system 100. This is further facilitated by an interface 130 which may include a keyboard, a mouse, a joystick, a haptic device, or any other peripheral or control to permit user feedback from and interaction with the workstation 112.
- A user can touch (or otherwise interact with) a specific region of interest (ROI) 154 within the 3D holographic display 158 or the hologram 124 within the 3D holographic display (or elsewhere) to display additional information related to the selected specific region of interest, e.g., tissue characteristics, such as temperature, chemical content, genetic signature, pressure, calcification percent, etc.
- An overlay of information can be displayed or presented on a separate exemplary 2D display (118), whereby parts of the 2D display can be transparent, for example, for better viewing of displayed information.
- The exemplary 2D display 118 presents or displays other graphics and text in high resolution (e.g., in exemplary embodiments where the 3D display may be of relatively low or limited resolution).
- Zones or regions of interest 154 can be automatically highlighted and/or outlined within the 3D holographic display 158 or hologram 124.
- Such other zones of interest can be, e.g., zones which have similar characteristics as the selected zone of interest and/or zones which are otherwise related.
- The 3D holographic display 158 or hologram 124 may be employed with six degrees of freedom (6DOF) user tracking, e.g., with shape enabled instruments 132 and/or with camera based sensors 137, allowing for use as a user interface in 3D and real-time 6DOF user interaction.
- A user (e.g., a practitioner) can rotate the displayed volume and zoom in/out (e.g., changing the magnification).
- Seed points 162 may be created and dropped into the 3D holographic display 158 or hologram 124 by touching (and/or tapping, holding, etc.) a portion of the display 158 or the hologram 124.
- The seed points 162 may be employed for, e.g., activation of virtual cameras which can provide individually customized viewing perspectives (e.g., orientation, zoom, resolution, etc.) which can be streamed (or otherwise transmitted) onto a separate high resolution 2D display 118.
- The touch feature can be employed to create or drop virtual seed points 162 into the 3D display 158 for a plurality of tasks, e.g., initialization of segmentation, modeling, registration or other computation, visualization, planning steps, etc.
- The display can also be used to display buttons, drop-down menus, pointers/trackers, optional functions, etc., allowing users to interact and give commands to the system and/or any computer included therein or connected thereto (e.g., directly connected or via the Internet or other network).
- a microphone 164 may be employed to receive verbal information to connect, control, interact, etc. with the exemplary 3D holographic display 158 or hologram 124 via voice-controlled commands.
- a speech recognition engine 166 may be provided to convert speech commands into program commands to allow a user (e.g., surgeon) to interact with the 3D holographic display 158 or hologram 124 without having to use their hands. For example, a user could say "SHOW LAO FORTY", and the volume displayed within the holographic image would rotate to the proper angle to provide the user with the desired view.
- commands can range from relatively simple ones, such as "ZOOM" followed by a specific amount (e.g., "3 times") or a request to display particular (additional) information, to more complex commands, e.g., commands related to a specific task or procedure.
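As a rough sketch of how such speech commands could be mapped to program commands, consider the following. The command grammar, the word-number table and the LAO/RAO sign convention below are illustrative assumptions, not the system's actual implementation:

```python
import re

# Hypothetical word-number table; a real engine would use the speech
# recognizer's own number parsing.
_NUMBERS = {"TEN": 10, "TWENTY": 20, "THIRTY": 30, "FORTY": 40, "FIFTY": 50}

def parse_command(utterance):
    """Map a recognized utterance to a (program command, argument) pair."""
    words = utterance.upper().split()
    if len(words) == 3 and words[0] == "SHOW" and words[1] in ("LAO", "RAO"):
        angle = _NUMBERS.get(words[2])
        if angle is not None:
            # Assume LAO rotates one way about the craniocaudal axis, RAO the other.
            return ("ROTATE", angle if words[1] == "LAO" else -angle)
    m = re.match(r"ZOOM (\d+(?:\.\d+)?) TIMES", utterance.upper())
    if m:
        return ("ZOOM", float(m.group(1)))
    return ("UNKNOWN", None)

print(parse_command("show LAO forty"))   # → ('ROTATE', 40)
print(parse_command("ZOOM 3 TIMES"))     # → ('ZOOM', 3.0)
```

The returned pairs would then drive the hologram rendering, e.g., rotating the displayed volume to the requested viewing angle.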
- a recording mode can be provided in memory 116 and made available to, e.g., play back a case on a same device for full 3D replay and/or on conventional (2D or 3D) viewing devices with automatic conversion of recorded 3D scenes into multiple 2D viewing perspectives (or rotating 3D models, e.g., in virtual reality modeling language (VRML)).
- Data connections between the holographic display 158 and recordings archived in a library/database 168 such as a picture archiving and communication system (PACS), Radiology Information Systems (RIS) or other electronic medical record system can be used to facilitate, e.g., visualization and diagnostic interpretation/data mining.
- Recordings can be replayed and used for, e.g., teaching and training purposes, such as to teach or train others in an individual setting (e.g., when a user wants to review a recorded procedure they performed), a small group environment (e.g., with peers and/or management), a relatively large class, lecture, etc.
- Such exemplary recordings may also be used for marketing presentations, research environments, etc. and may also be employed for quality and regulatory assessment, e.g., process evaluation or procedure assessment by hospital administrators, third-party insurers, investors, the Food and Drug Administration (FDA) and/or other regulatory bodies.
- Virtual cameras may be employed to capture or record multiple viewpoints/angles and generate multiple 2D outputs for, e.g., video capture or simultaneous display of images on different 2D television screens or monitors (or sections thereof).
- three-dimensional (3D) holography may be used to display volumetric data of an anatomy (e.g., from a 3D CT scan), for example, to fuse anatomical with functional imaging and "sensing" information, as well as temporal (time-related) information.
- the information may be employed to create (generate, produce, display, etc.) a dynamic 3D multimodality representation 202 (e.g., a hologram) of an object (e.g., organ) and a status thereof using visual indicators 204, 206, such as colors, contrast levels and patterns from a display 210.
- the object 202 (e.g., hologram 124) may show different regions 204, 206 to indicate useful data on the object 202.
- epicardial and/or endocardial mapping data can be used to, e.g., display electrical activity data on a heart image during an electrophysiology procedure, superimposed with the anatomical imaging data of the heart (e.g., coming from CT, XperCT or MR).
- Another example is the display of temperature maps which can be provided by MR during ablation, or magnetic resonance high-intensity focused ultrasound (MR-HIFU) 4D (four-dimensional) information during an intervention (e.g., using MR digital data transfer systems and procedures).
- Another example is information associated with a real-time radiation dose distribution map superimposed over the anatomical target during a radiation oncology treatment (e.g., Linac, brachytherapy, etc.).
- Other embodiments are also contemplated.
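The superimposition of functional data as visual indicators (e.g., colors, per regions 204, 206) can be illustrated by a minimal color-mapping sketch; the blue-to-red ramp and the function names here are assumptions for illustration only:

```python
def scalar_to_rgb(value, vmin, vmax, cold=(0, 0, 255), hot=(255, 0, 0)):
    """Linearly blend between two display colors for a functional value
    (e.g., an activation time or a temperature sample)."""
    t = (value - vmin) / float(vmax - vmin)
    t = min(1.0, max(0.0, t))          # clamp out-of-range samples
    return tuple(round(c + t * (h - c)) for c, h in zip(cold, hot))

def colorize(surface_points, values, vmin, vmax):
    """Pair each anatomical surface point with its superimposed color."""
    return [(p, scalar_to_rgb(v, vmin, vmax)) for p, v in zip(surface_points, values)]

# A mid-range value lands halfway between the two colors.
print(scalar_to_rgb(50.0, 0.0, 100.0))  # → (128, 0, 128)
```

A renderer would then paint each hologram surface point with its paired color, fusing the functional map onto the anatomy.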
- a volumetric image 302 of a heart is acquired and may be segmented to reduce computational space and to determine anatomical features of the heart as opposed to other portions of the image. This results in a segmented image 304.
- Functional or device data is acquired by performing measurements or tests in block 306 on the heart or other anatomical feature.
- an electroanatomical map or other map is generated corresponding with the heart or organ. The map is registered to the segmented image 304 to provide a registered image 310 that may be generated and displayed as a hologram.
- data may be collected from within or about the heart using a localization technique (shape sensing, etc.).
- Data traces of catheter positions or other related data (treatment locations, etc.) may be rendered in a holographic image 312 which includes both the anatomical data (e.g., segmented hologram) and the device data (e.g., catheter data).
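The registration of a map to a segmented image could, for example, use a landmark-based least-squares rigid fit (the Kabsch algorithm); this is a generic sketch with toy landmarks, not necessarily the registration method of the disclosed system:

```python
import numpy as np

def rigid_register(moving, fixed):
    """Least-squares rigid transform (rotation R, translation t) taking
    `moving` landmark points onto `fixed` ones (Kabsch algorithm)."""
    mc, fc = moving.mean(axis=0), fixed.mean(axis=0)
    H = (moving - mc).T @ (fixed - fc)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = fc - R @ mc
    return R, t

# Corresponding landmarks picked on the map and on the segmented image.
map_pts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
# Same landmarks shifted by (10, 0, 0): registration should recover the shift.
img_pts = map_pts + np.array([10.0, 0.0, 0.0])
R, t = rigid_register(map_pts, img_pts)
registered = map_pts @ R.T + t                  # now overlays img_pts
```

Once registered, map samples and catheter traces can be rendered in the same holographic coordinate frame as the segmented anatomy.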
- Another exemplary embodiment according to the present disclosure includes the acquisition of incomplete data (e.g., projections rather than full 3D images).
- This may include, for example, data in Fourier (frequency) space where intermittent or incomplete images are acquired.
- undersampled image data in the frequency domain are collected.
- it is possible to construct (generate, produce, display, etc.) a 3D holographic image display with relatively less or a reduced amount of input data, and thus a relatively less or reduced amount of associated acquisition and computation.
- the resultant 3D holographic image may be constructed/displayed with (some) limitations.
- Such exemplary embodiments can help achieve real-time or near-real-time dynamic displays with significantly less radiation exposure (e.g., in the case of live X-ray imaging) as well as computational overhead, which benefits can be considered (e.g., balanced, weighed against) in view of the potential limitations associated with this exemplary embodiment.
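A minimal illustration of reconstructing from incomplete frequency-space data is zero-filled inverse-FFT reconstruction, shown below on a toy image; the undersampling pattern is an assumption for demonstration:

```python
import numpy as np

def zero_filled_recon(kspace, mask):
    """Reconstruct an image from undersampled frequency-domain data by
    zero-filling the missing samples before the inverse FFT."""
    return np.abs(np.fft.ifft2(np.where(mask, kspace, 0.0)))

# Toy 2D "anatomy" and its full k-space.
image = np.zeros((32, 32))
image[12:20, 12:20] = 1.0
kspace = np.fft.fft2(image)

# Keep only every other phase-encode line (half the data).
mask = np.zeros_like(image, dtype=bool)
mask[::2, :] = True
recon = zero_filled_recon(kspace, mask)
```

Keeping only half the lines halves the acquired data but introduces aliasing (here the bright block splits into two half-intensity copies), the kind of limitation the embodiment trades against radiation dose and computational overhead.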
- another exemplary embodiment includes inputting virtual instruments or objects into a holographic display.
- objects 402 may be digitized or otherwise rendered into a virtual environment 404 and displayed.
- the objects 402 may be drawn or loaded into the workstation 112 as object data 405 and may be coded into the display 158 and concurrently rendered with the hologram 124.
- a static image of the object 402 may appear in the hologram 124 and may be separately manipulated with the hologram 124 (and/or on the display 158).
- the static image may be employed for size comparisons or measurements between the object 402 and the hologram 124.
- a converter box 406 may be included to employ a standardization protocol to provide for a "video-ready" interface to the 3D holographic display 158.
- the converter box 406 can format the x, y, z coordinates from each localized instrument or object 402 (catheter, implant, etc.) into a space readable by the holographic display 158 (e.g., rasterized/scan converted voxel space, vector space, etc.). This can be performed using the workstation 112 in FIG. 1.
- the 3D format should at least support voxels (for volumes) and graphical elements/primitives, e.g., meshes (a virtual catheter can be displayed as a tube) and lines in 3D (to encode, e.g., traces of tracked positions).
- the 3D format can be varied in accordance with the present disclosure based on, e.g., particular laboratory or clinical use or applications, integration with other related technologies, available resources, environmental conditions, etc.
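A converter of the kind described (formatting localized x, y, z coordinates into a voxel space readable by the display) might look like the following sketch; the origin, spacing and grid size are hypothetical parameters:

```python
import numpy as np

def to_voxels(points_mm, origin_mm, spacing_mm, shape):
    """Scan-convert localized instrument coordinates (mm) into a voxel
    grid of the kind a holographic display could consume."""
    grid = np.zeros(shape, dtype=np.uint8)
    idx = np.round((np.asarray(points_mm) - origin_mm) / spacing_mm).astype(int)
    for i, j, k in idx:
        # Drop samples falling outside the display volume.
        if 0 <= i < shape[0] and 0 <= j < shape[1] and 0 <= k < shape[2]:
            grid[i, j, k] = 1          # mark the catheter sample as occupied
    return grid

# Catheter tip samples in patient space (mm); numbers are invented.
tip_trace = [(5.0, 5.0, 0.0), (6.0, 5.0, 0.0), (7.0, 5.0, 1.0)]
vox = to_voxels(tip_trace, origin_mm=np.zeros(3), spacing_mm=1.0, shape=(16, 16, 16))
```

A vector-space variant would instead emit the polyline itself and leave rasterization to the display.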
- the object 402 (e.g., a computer aided design rendering, model, scan, etc. of an instrument, medical device, implant, etc.) may be independently manipulated relative to the hologram 124 on the display 158 or in the air.
- the object 402 can be placed in or around the hologram 124 to determine whether the object will fit within a portion of the hologram 124, etc.
- an implant may be placed through a blood vessel to test the fit visually. It is also contemplated that other feedback may be employed.
- a comparison module may be capable of determining interference between the hologram 124 and the objects 402 to enable, e.g., haptic feedback to indicate that a clearance for the implant is not possible.
- Other applications are also contemplated.
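A voxel-based interference check of the kind such a comparison module might perform is sketched below with a toy vessel lumen; the masks and shapes are illustrative assumptions:

```python
import numpy as np

def clearance_ok(anatomy_free, implant_mask):
    """True when the implant occupies only free (non-tissue) voxels; an
    overlap would trigger 'no clearance' haptic feedback."""
    return not np.any(implant_mask & ~anatomy_free)

# Toy vessel: a free lumen running along one axis of the volume.
free = np.zeros((8, 8, 8), dtype=bool)
free[:, 3:5, 3:5] = True

fits = np.zeros_like(free)
fits[2:6, 3:5, 3:4] = True      # implant entirely inside the lumen

too_big = np.zeros_like(free)
too_big[2:6, 2:6, 2:6] = True   # implant intersecting the vessel wall
```

In practice both masks would come from the segmented hologram and the registered virtual object rather than hand-built arrays.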
- the system 100 of FIG. 1 and/or FIG. 4 may be employed as an education and/or training tool.
- a practitioner (e.g., a surgeon, physician, fellow, or doctor) could practice (perform virtually) a procedure (surgery, case, etc.).
- a tracked input device, e.g., an instrument tracked with shape sensing, electromagnetic tracking, acoustic tracking, or machine vision based optical tracking (time-of-flight cameras or structured light cameras), may be employed in conjunction with the display 158 to access a virtual help mode trigger point 504 (or other functions) in the hologram 124 and generated by the image generation module 148.
- the virtual help trigger point 504 may include pixel regions within the display or hologram.
- the region or trigger point 504 may be selected (e.g., virtually selected and displayed) by using the tip of the tracked virtual tool (402) (or using the monitored object 128), which is automatically registered with the hologram 124 in the image.
- the trigger points 504 are selected in the hologram 124 and a menu 502 or other interactive space may open to permit further selections.
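A trigger point activated by the tip of a tracked tool can be sketched as a simple proximity test; the coordinates, radius and menu action below are hypothetical:

```python
import math

class TriggerPoint:
    """A help-mode trigger region inside the hologram: activates when a
    tracked tool tip comes within `radius` of its center."""
    def __init__(self, center, radius, action):
        self.center, self.radius, self.action = center, radius, action

    def try_activate(self, tip):
        if math.dist(tip, self.center) <= self.radius:
            return self.action()        # e.g., open an interactive menu
        return None

menu_opened = []
help_point = TriggerPoint((10.0, 0.0, 5.0), 2.0,
                          lambda: menu_opened.append("HIP IMPLANTS menu"))
help_point.try_activate((20.0, 0.0, 0.0))   # far away: nothing happens
help_point.try_activate((10.5, 0.0, 5.0))   # inside the region: menu opens
```

The same test works whether the tip comes from a shape-sensed instrument or a camera-tracked virtual tool, as long as both are registered to the hologram's coordinate system.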
- a fellow/practitioner could first select a program called "HIP" by activating a trigger point 504 to display a 3D CT image of a patient's (subject's) hip, and then select different "HIP IMPLANTS" from different manufacturers to see and "feel" which implant would fit best for the particular patient. It is also possible to use (e.g., physically hold and manipulate) the actual implant in the air and position it within the 3D holographic display to see, feel and assess fit (e.g., if and how well such implant may fit the particular patient).
- FIG. 5 shows the virtual menu 502 that may be provided in the holographic display 158 or other display 118 to permit the selection of a stent 508.
- the virtual menu 502 can be called using the display 158, the hologram 124 or by employing interface 130.
- a virtual model is rendered (see FIG. 4) in the display 158 or hologram 124 to permit manipulation, measurement, etc.
- the virtual menu 502 provides for clinical decision support tying together localization and an exemplary holographic user interface in accordance with an exemplary embodiment.
- as the shape tracked instrument (128) (e.g., a catheter) is navigated, the virtual menu 502 can pop up automatically for each region, or the trigger point 504 may be activated by placing the object tip into the region of the trigger point 504 or otherwise activating it (e.g., touching it, etc.).
- An implant or other device may be selected, which is then introduced to allow for device sizing/selection to be performed in the virtual holography space (e.g., within or in close proximity to the holographic display).
- a 3D holographic display 158 or hologram 124 may be employed during surgery to interact with a device 602 inside the patient.
- Robotics via a master/slave configuration can be used, where a shape sensed analog 604 of the device 602 moving within the display 158 is employed to actuate the motion of the actual device 602 within a target region 606.
- a practitioner's (surgeon's, physician's, etc.) hands 610 or voice can be tracked by sensor-based and/or voice-based techniques, such as by, e.g., tracking a physician's hands using a shape-sensing device 608 and shape sensing system 614 in the 3D holographic display.
- a practitioner's movements (including, e.g., the (re)positioning, orientation, etc. of their hands) can be used to actuate the device 602, such as a robot 612 (e.g., robotically controlled instruments), inside the patient to replicate such movements within the patient's body, and thereby perform the actual surgery, procedure or task inside the patient's body.
- a surgeon can see, touch and feel a 3D holographic display of an organ and perform a procedure thereon (i.e., within the 3D holographic display), causing such procedure to be performed inside of a patient on the actual organ, or simply move instruments, e.g., robotically controlled instruments.
- the movement of the physician creates sensing signals using sensor 608 (and/or sensors in device 604), which are converted to control signals by the system 100 for controlling the robot or other device 602.
- the signals may be stored in memory (116) for delayed execution if needed or desired.
- the actual procedure may be performed in real-time (or near real-time), e.g., a robot 612 performs a specific movement within a patient's body concurrently with a surgeon's performance within the 3D holographic display 158.
- the actual procedure may be performed by, e.g., a robot (only) after a surgeon completes a certain task/movement (or series of tasks or movements), and/or the surgeon confirms that the robot should proceed (e.g., after certain predefined criteria and/or procedural milestone(s) are reached).
- a delay may be provided, e.g., between the virtual performance of a task or movement within the 3D holographic display and the actual performance within a patient's body.
- a practitioner could opt to redo a specific movement/task that is virtually performed within the 3D holographic display 158 if the practitioner is not satisfied with such movement or task (for any reason).
- the surgeon could opt to redo such virtual movement or task (as many times as desired or may be necessary) until it is performed correctly.
- the actual movement or task can be performed by a robot inside of the patient with or without dynamic adaptation of the task to adjust for changes in target or therapy instrument on-the-fly (e.g., dynamically, on a continuous basis, in real-time, etc.).
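The real-time versus delayed/approved execution modes described above can be sketched as a small command queue; the class and method names are illustrative, not the system's API:

```python
from collections import deque

class MasterSlaveController:
    """Turns tracked hand/analog motions into robot commands, either
    streamed immediately or queued for execution after surgeon approval."""
    def __init__(self, immediate=True):
        self.immediate = immediate
        self.pending = deque()
        self.executed = []

    def on_motion(self, delta_xyz):
        cmd = ("MOVE", delta_xyz)
        if self.immediate:
            self.executed.append(cmd)   # real-time replication in the patient
        else:
            self.pending.append(cmd)    # stored for delayed execution

    def discard_pending(self):
        self.pending.clear()            # surgeon redoes the virtual task

    def approve(self):
        while self.pending:             # replay once the surgeon confirms
            self.executed.append(self.pending.popleft())

ctrl = MasterSlaveController(immediate=False)
ctrl.on_motion((0.0, 1.0, 0.0))
ctrl.on_motion((0.0, 2.0, 0.0))
ctrl.discard_pending()                  # not satisfied: redo the movement
ctrl.on_motion((0.0, 1.5, 0.0))
ctrl.approve()                          # robot performs only the approved motion
```

The `immediate` flag distinguishes the concurrent mode from the confirm-then-execute mode; a real controller would also handle milestones and dynamic adaptation.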
- a haptic device 710 may take many forms.
- a set of ultrasonic probes 702 can be used to send customized waves towards a 3D holographic display 704 to give the user (e.g., a practitioner) a sense of feeling of structures being displayed and represented by the 3D holographic display 704.
- such waves can be tailored or customized to represent hard or stiff materials of bony structures, while other tissues, such as of the liver and/or vessels, can be represented with a relatively softer "feel" by waves which are tailored or configured accordingly.
- Such encoding can be realized by, e.g., modulation of the frequency, amplitude, phase, or other spatiotemporal modulation of the excitation imparted by a haptic device 710 to the observer.
- the haptic feedback device 710 may be employed to, e.g., represent physical resistance of a particular structure or tissue in response to a movement or task performed by a practitioner.
- a haptic device 712 may take the form of, e.g., a glove(s), bracelet, or other garments or accessories having actuators or other vibratory elements.
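Encoding tissue "feel" by modulating the excitation could be sketched as a lookup from displayed tissue type to amplitude and frequency; the stiffness values and frequency range below are invented for illustration:

```python
# Hypothetical relative stiffness table used to modulate the ultrasonic
# excitation: stiffer structures get a stronger, higher-frequency wave.
STIFFNESS = {"bone": 1.0, "liver": 0.2, "vessel": 0.1}

def haptic_params(tissue, base_amplitude=1.0, f_min=50.0, f_max=300.0):
    """Map a displayed tissue type to (amplitude, frequency in Hz) for the
    haptic transducers."""
    s = STIFFNESS.get(tissue, 0.0)      # unknown tissue: no resistance
    return (base_amplitude * s, f_min + s * (f_max - f_min))

print(haptic_params("bone"))    # hard structure: full amplitude, top frequency
print(haptic_params("liver"))   # soft structure: weak, low-frequency feel
```

Phase and spatial patterning (the other modulations mentioned above) would be added per-transducer in a real array driver.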
- the exemplary 3D holographic display 158 can be seen from (virtually) any angle, e.g., so that all users can interact with the same understanding and information. This is the case for an in-air hologram; however, other display techniques may be employed to provide multiple individuals with a same view.
- display information may be provided to different users positioned in a room or area, by displaying the same or different information on a geometrical structure 802 (holographically), such as a multi-faceted holographic display where each face of the display (e.g., a cube or polyhedron) displays the information.
- This can be achieved by projecting multiple 2D video streams 804 on the geometrical structure 802 (e.g., side by side, or partially overlapping) rendered within a holographic output 806.
- a holographic "cube" display in 3D can show/display on one cube face information (e.g., a 2D live x-ray image) in one particular direction (e.g., the direction of a first practitioner 808), while another cube face of the same "cube" display can show/display another type of information (e.g., an ultrasound image) to a second practitioner 810 positioned elsewhere in the room (e.g., diametrically opposite the display from the first practitioner).
- Such exemplary display can be configured at will depending on, e.g., the number of users in the room. It is also possible that the position (in the room) of each user/practitioner can be tracked and that each individual's display information follows each user's viewing perspective as the user moves (e.g., during a procedure). For example, one particular user (doctor, nurse, etc.) can be provided with the specific information that the user needs regardless of where in the room such particular user moves during a procedure. Further, it is also possible that each user is provided with a unique display, which can be a 2D cube face, such as described above, or a 3D holographic display customized or tailored for such user, and that such a unique display can "follow" the user as the user moves around a room.
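Assigning each tracked user the cube face oriented toward them can be sketched with a simple nearest-face test in the horizontal plane; the positions and stream names are hypothetical:

```python
def nearest_face(user_xy, display_xy=(0.0, 0.0)):
    """Pick the cube face (+x, -x, +y, -y) that points toward a tracked
    user, so their personal stream follows them around the room."""
    dx = user_xy[0] - display_xy[0]
    dy = user_xy[1] - display_xy[1]
    if abs(dx) >= abs(dy):
        return "+x" if dx >= 0 else "-x"
    return "+y" if dy >= 0 else "-y"

streams = {"dr_a": "live x-ray", "dr_b": "ultrasound"}
positions = {"dr_a": (3.0, 0.5), "dr_b": (-2.0, -0.2)}   # opposite sides
faces = {u: nearest_face(p) for u, p in positions.items()}
# Each face now carries that user's stream; re-running this as positions
# update makes the display "follow" each user.
assignment = {faces[u]: streams[u] for u in streams}
```

A full system would re-evaluate this per tracking frame and resolve conflicts when two users approach the same face.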
- text is an inherently 2D mode of communication.
- the system may display shapes/symbols identifiable from multiple viewpoints, or represent the text oriented towards the viewer.
- the oriented text may be shown in multiple directions simultaneously or to each viewer independently in different frames.
- a remote system 900 may include at least some of the capabilities of system 100 (FIG. 1) but is remotely disposed relative to a patient 902 and data collection instruments.
- a user may conduct a procedure remotely (e.g., with the user being physically located off-site from the location where the subject/patient 902 is located) or assist or provide guidance remotely to the procedure.
- a user can perform a procedure/task on an exemplary holographic display 904 located at their location.
- the display 904 is connected (e.g., via the Internet or other network 910 (wired or wireless)) to the system 100 co-located with the patient 902.
- System 100 can be in continuous communication with the remote system 900 (e.g., over the network 910).
- the system 100 may include robotically controlled instruments 906 (e.g., inside of a patient) which are controlled via commands provided (e.g., via the Internet) by the remote system 900, as described above. These commands are generated based on the user's interaction with the holographic display 904.
- Holographic displays 158 and 904 may include the same subject matter at one or more locations so that the same information is conveyed at each location. For example, this embodiment may include, e.g., providing guidance or assistance to another doctor around the globe, for peer-to-peer review, expert assistance or a virtual class room where many students could attend a live case from different locations throughout the world.
- a method for interacting with a holographic display is shown in accordance with illustrative embodiments.
- a holographically rendered anatomical image is generated and displayed.
- the image may include one or more organs or anatomical regions.
- the holographically rendered anatomical image may be generated in-air.
- a monitored space is localized on or around the holographically rendered anatomical image to define a region for interaction.
- the localization system may include one or more of a fiber optic shape sensing system, an electromagnetic tracking system, a light sensor array and/or other sensing modality.
- the position and orientation of the monitored space and the one or more monitored objects are preferably determined in a same coordinate system.
- the one or more monitored objects may include a medical instrument, an anatomical feature of a user, a virtual object, etc.
- a position and orientation of one or more monitored objects is monitored by the localization system.
- coincidence of spatial points is determined between the monitored space and the one or more monitored objects.
- a response is triggered in the holographically rendered anatomical image.
- the response may include moving the holographically rendered anatomical image (e.g. 6DOF) or changing its appearance.
- the response may include adjusting a zoom (magnification) or other optical characteristics of the holographically rendered anatomical image.
- the holographically rendered anatomical image may be marked, tagged, targeted, etc.
- camera viewpoints can be assigned (for other viewers or displays).
- feedback may be generated to a user.
- the feedback may include haptic feedback (vibrating device or air), optical feedback (visual or color differences), acoustic feedback (verbal, alarms), etc.
- a response region may be provided and monitored by the localization system such that upon activating the response region a display event occurs.
- the display event may include generating a help menu in block 1024; generating a menu of virtual objects to be included in the holographically rendered anatomical image upon selection in block 1026; and generating information to be displayed in block 1028.
- the holographically rendered anatomical image may be generated with superimposed medical data mapped to positions on the holographically rendered anatomical image.
- the response that is triggered may include generating control signals for operating robotically controlled instruments. The control signals may enable remote operations to be performed.
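The method blocks above (localize a monitored space, monitor object positions, test for coincidence of spatial points, trigger a response) can be sketched end to end as follows; the axis-aligned region and the "tagging" response are simplifying assumptions:

```python
def coincident(space_min, space_max, point):
    """Axis-aligned test: is the monitored object's point inside the
    monitored interaction region (all in one shared coordinate system)?"""
    return all(lo <= c <= hi for lo, c, hi in zip(space_min, point, space_max))

def interface_step(space_min, space_max, tool_tip, hologram_pose):
    """One cycle of the method: test coincidence between the monitored
    space and a monitored object, and trigger a response (here, tagging
    the hologram's pose)."""
    if coincident(space_min, space_max, tool_tip):
        hologram_pose = dict(hologram_pose, tagged=True)  # triggered response
    return hologram_pose

pose = {"rotation_deg": 0.0, "zoom": 1.0, "tagged": False}
pose = interface_step((0, 0, 0), (10, 10, 10), (5.0, 5.0, 5.0), pose)
```

Real localization (fiber optic shape sensing, electromagnetic tracking, light sensor arrays) would supply `tool_tip` each frame, and the response could equally be a 6DOF move, a zoom change, feedback, or robot control signals as listed above.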
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161549273P | 2011-10-20 | 2011-10-20 | |
PCT/IB2012/055595 WO2013057649A1 (en) | 2011-10-20 | 2012-10-15 | Holographic user interfaces for medical procedures |
Publications (2)
Publication Number | Publication Date |
---|---|
EP2769270A1 (en) | 2014-08-27 |
EP2769270B1 (en) | 2018-09-19 |
Family
ID=47326233
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP12798835.0A Not-in-force EP2769270B1 (en) | 2011-10-20 | 2012-10-15 | Holographic user interfaces for medical procedures |
Country Status (8)
Country | Link |
---|---|
US (1) | US20140282008A1 (en) |
EP (1) | EP2769270B1 (en) |
JP (1) | JP6157486B2 (en) |
CN (1) | CN103959179B (en) |
BR (1) | BR112014009129A2 (en) |
IN (1) | IN2014CN03103A (en) |
RU (1) | RU2608322C2 (en) |
WO (1) | WO2013057649A1 (en) |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4001643B2 (en) * | 1993-10-05 | 2007-10-31 | スナップ−オン・テクノロジイズ・インク | Two-hand open type car maintenance equipment |
US20020036617A1 (en) * | 1998-08-21 | 2002-03-28 | Timothy R. Pryor | Novel man machine interfaces and applications |
DE10130278B4 (en) * | 2001-06-26 | 2005-11-03 | Carl Zeiss Meditec Ag | Method and device for representing an operating area during laser operations |
US6975994B2 (en) * | 2001-09-12 | 2005-12-13 | Technology Innovations, Llc | Device for providing speech driven control of a media presentation |
US7831292B2 (en) * | 2002-03-06 | 2010-11-09 | Mako Surgical Corp. | Guidance system and method for surgical procedures with improved feedback |
US7190496B2 (en) * | 2003-07-24 | 2007-03-13 | Zebra Imaging, Inc. | Enhanced environment visualization using holographic stereograms |
JP2007530123A (en) * | 2004-03-26 | 2007-11-01 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Control by unskilled MR device |
US20070182812A1 (en) * | 2004-05-19 | 2007-08-09 | Ritchey Kurtis J | Panoramic image-based virtual reality/telepresence audio-visual system and method |
US7824328B2 (en) * | 2006-09-18 | 2010-11-02 | Stryker Corporation | Method and apparatus for tracking a surgical instrument during surgery |
US20080231926A1 (en) * | 2007-03-19 | 2008-09-25 | Klug Michael A | Systems and Methods for Updating Dynamic Three-Dimensional Displays with User Input |
DE102008015312A1 (en) * | 2008-03-20 | 2009-10-01 | Siemens Aktiengesellschaft | Display system for displaying medical holograms |
US20090309874A1 (en) * | 2008-06-11 | 2009-12-17 | Siemens Medical Solutions Usa, Inc. | Method for Display of Pre-Rendered Computer Aided Diagnosis Results |
US7720322B2 (en) * | 2008-06-30 | 2010-05-18 | Intuitive Surgical, Inc. | Fiber optic shape sensor |
EP2304491A1 (en) * | 2008-07-10 | 2011-04-06 | Real View Imaging Ltd. | Broad viewing angle displays and user interfaces |
DE102008034686A1 (en) * | 2008-07-25 | 2010-02-04 | Siemens Aktiengesellschaft | A method of displaying interventional instruments in a 3-D dataset of an anatomy to be treated, and a display system for performing the method |
WO2010092533A1 (en) * | 2009-02-13 | 2010-08-19 | Ecole Polytechnique Federale De Lausanne (Epfl) | Method and apparatus for 3d object shape and surface topology measurements by contour depth extraction acquired in a single shot |
US10004387B2 (en) * | 2009-03-26 | 2018-06-26 | Intuitive Surgical Operations, Inc. | Method and system for assisting an operator in endoscopic navigation |
KR101114750B1 (en) * | 2010-01-29 | 2012-03-05 | 주식회사 팬택 | User Interface Using Hologram |
- 2012-10-15 RU RU2014120182A patent/RU2608322C2/en not_active IP Right Cessation
- 2012-10-15 JP JP2014536380A patent/JP6157486B2/en not_active Expired - Fee Related
- 2012-10-15 IN IN3103CHN2014 patent/IN2014CN03103A/en unknown
- 2012-10-15 US US14/352,409 patent/US20140282008A1/en not_active Abandoned
- 2012-10-15 BR BR112014009129A patent/BR112014009129A2/en not_active Application Discontinuation
- 2012-10-15 CN CN201280051749.8A patent/CN103959179B/en not_active Expired - Fee Related
- 2012-10-15 EP EP12798835.0A patent/EP2769270B1/en not_active Not-in-force
- 2012-10-15 WO PCT/IB2012/055595 patent/WO2013057649A1/en active Application Filing
Non-Patent Citations (1)
Title |
---|
See references of WO2013057649A1 * |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017165301A1 (en) | 2016-03-21 | 2017-09-28 | Washington University | Virtual reality or augmented reality visualization of 3d medical images |
EP3432780A4 (en) * | 2016-03-21 | 2019-10-23 | Washington University | Virtual reality or augmented reality visualization of 3d medical images |
US11771520B2 (en) | 2016-03-21 | 2023-10-03 | Washington University | System and method for virtual reality data integration and visualization for 3D imaging and instrument position data |
US10663922B2 (en) | 2016-04-14 | 2020-05-26 | Boe Technology Group Co., Ltd. | Image display system and image display method |
US10242643B2 (en) | 2016-07-18 | 2019-03-26 | Microsoft Technology Licensing, Llc | Constrained head-mounted display communication |
US20220262079A1 (en) * | 2021-02-17 | 2022-08-18 | Arm Limited | Foveation for a holographic imaging system |
US11948255B2 (en) * | 2021-02-17 | 2024-04-02 | Arm Limited | Foveation for a holographic imaging system |
EP4134974A1 (en) | 2021-08-12 | 2023-02-15 | Koninklijke Philips N.V. | Dynamic care assistance mechanism |
Also Published As
Publication number | Publication date |
---|---|
EP2769270B1 (en) | 2018-09-19 |
IN2014CN03103A (en) | 2015-07-03 |
WO2013057649A1 (en) | 2013-04-25 |
CN103959179B (en) | 2017-09-15 |
JP2015504320A (en) | 2015-02-12 |
US20140282008A1 (en) | 2014-09-18 |
RU2608322C2 (en) | 2017-01-17 |
JP6157486B2 (en) | 2017-07-05 |
BR112014009129A2 (en) | 2017-04-18 |
RU2014120182A (en) | 2015-11-27 |
CN103959179A (en) | 2014-07-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2769270B1 (en) | Holographic user interfaces for medical procedures | |
US20190231436A1 (en) | Anatomical model for position planning and tool guidance of a medical tool | |
CN105992996B (en) | Dynamic and interactive navigation in surgical environment | |
EP2677937B1 (en) | Non-rigid-body morphing of vessel image using intravascular device shape | |
US11617623B2 (en) | Virtual image with optical shape sensing device perspective | |
Kunz et al. | Infrared marker tracking with the HoloLens for neurosurgical interventions | |
JP6568084B2 (en) | Robot control to image devices using optical shape detection | |
JP2018534011A (en) | Augmented reality surgical navigation | |
CN102918567A (en) | System and method for performing a computerized simulation of a medical procedure | |
JP6706576B2 (en) | Shape-Sensitive Robotic Ultrasound for Minimally Invasive Interventions | |
EP2904601B1 (en) | Clinical decision support and training system using device shape sensing | |
Yin et al. | VR and AR in human performance research―An NUS experience | |
Krapichler et al. | VR interaction techniques for medical imaging applications | |
US11406278B2 (en) | Non-rigid-body morphing of vessel image using intravascular device shape | |
Behringer et al. | Some usability issues of augmented and mixed reality for e-health applications in the medical domain | |
Liao | 3D medical imaging and augmented reality for image-guided surgery | |
Porro et al. | An integrated environment for plastic surgery support: building virtual patients, simulating interventions, and supporting intraoperative decisions | |
US20210358220A1 (en) | Adapting an augmented and/or virtual reality | |
De Paolis et al. | Advanced visualization and interaction systems for surgical pre-operative planning | |
EP3944254A1 (en) | System for displaying an augmented reality and method for generating an augmented reality | |
Pednekar et al. | Applications of virtual reality in surgery | |
Krapichler et al. | Human-machine interface for a VR-based medical imaging environment | |
Mangalote et al. | A comprehensive study to learn the impact of augmented reality and haptic interaction in ultrasound-guided percutaneous liver biopsy training and education | |
Allen et al. | Towards virtual displays in the interventional radiology suite: a feasibility study | |
Sam et al. | Augmented Reality in Surgical Procedures |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20140520 |
|
AK | Designated contracting states |
Kind code of ref document: A1
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
DAX | Request for extension of the european patent (deleted) | ||
17Q | First examination report despatched |
Effective date: 20151130 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G03H 1/00 20060101AFI20180223BHEP Ipc: G06T 19/20 20110101ALN20180223BHEP Ipc: G06F 3/16 20060101ALN20180223BHEP Ipc: G03H 1/22 20060101ALI20180223BHEP |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G03H 1/22 20060101ALI20180313BHEP Ipc: G06T 19/20 20110101ALN20180313BHEP Ipc: G03H 1/00 20060101AFI20180313BHEP Ipc: G06F 3/16 20060101ALN20180313BHEP |
|
INTG | Intention to grant announced |
Effective date: 20180409 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
AK | Designated contracting states |
Kind code of ref document: B1
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: AT
Ref legal event code: REF
Ref document number: 1043951
Country of ref document: AT
Kind code of ref document: T
Effective date: 20181015 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: DE
Ref legal event code: R096
Ref document number: 602012051337
Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: FR
Ref legal event code: PLFP
Year of fee payment: 7 |
|
REG | Reference to a national code |
Ref country code: NL
Ref legal event code: MP
Effective date: 20180919 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NO
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20181219
Ref country code: BG
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20181219
Ref country code: LT
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919
Ref country code: GR
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20181220
Ref country code: RS
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919
Ref country code: FI
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919
Ref country code: SE
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919 |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG4D |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AL
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919
Ref country code: HR
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919
Ref country code: LV
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919 |
|
REG | Reference to a national code |
Ref country code: AT
Ref legal event code: MK05
Ref document number: 1043951
Country of ref document: AT
Kind code of ref document: T
Effective date: 20180919 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919
Ref country code: AT
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919
Ref country code: IS
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190119
Ref country code: PL
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919
Ref country code: EE
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919
Ref country code: CZ
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919
Ref country code: IT
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919
Ref country code: ES
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919
Ref country code: RO
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PT
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20190119
Ref country code: SM
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919
Ref country code: SK
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602012051337 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: BE
Ref legal event code: MM
Effective date: 20181031 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU
Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
Effective date: 20181015 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: MM4A |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DK
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919
Ref country code: MC
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919 |
|
26N | No opposition filed |
Effective date: 20190620 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CH
Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
Effective date: 20181031
Ref country code: LI
Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
Effective date: 20181031
Ref country code: BE
Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
Effective date: 20181031 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE
Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
Effective date: 20181015
Ref country code: SI
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT
Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
Effective date: 20181015 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE
Payment date: 20191129
Year of fee payment: 8 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR
Payment date: 20191025
Year of fee payment: 8 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB
Payment date: 20191029
Year of fee payment: 8 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO
Effective date: 20121015
Ref country code: CY
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Effective date: 20180919
Ref country code: MK
Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
Effective date: 20180919 |
|
REG | Reference to a national code |
Ref country code: DE
Ref legal event code: R119
Ref document number: 602012051337
Country of ref document: DE |
|
GBPC | Gb: european patent ceased through non-payment of renewal fee |
Effective date: 20201015 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DE
Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
Effective date: 20210501
Ref country code: FR
Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
Effective date: 20201031 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GB Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20201015 |