US20070019936A1 - Stereoscopic visualization device for patient image data and video images
- Publication number: US20070019936A1 (application US 11/426,176)
- Authority: US (United States)
- Legal status: Granted
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B41/00—Special techniques not covered by groups G03B31/00 - G03B39/00; Apparatus therefor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/46—Arrangements for interfacing with the operator or the patient
- A61B6/461—Displaying means of special interest
- A61B6/462—Displaying means of special interest characterised by constructional features of the display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5229—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
- A61B6/5235—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT
Abstract
Description
- This application is a continuation-in-part of U.S. Non-Provisional application Ser. No. 10/830,963 filed on Apr. 23, 2004, which claims priority to U.S. Provisional Application No. 60/489,750 filed on Jul. 24, 2003, both of which are incorporated herein by reference in their entirety.
- The present invention relates to medical video imaging and, more particularly, to a portable device for visually combining patient image data from transillumination and/or tomographic imaging methods and/or object image data with video images.
- EP 1 321 105 B1 discloses a portable screen-camera unit that includes a camera mounted on a rear side of a screen. A physician can hold the portable screen together with the corresponding camera in front of a patient (e.g., in front of a particular body part). By displaying patient image data from transillumination and/or tomographic imaging methods in combination with video images, the physician can view the exterior area of the patient's body part and can simultaneously obtain superimposed images of interior structures of the patient's body part.
- A disadvantage of this portable screen-camera unit is that the physician and/or any observer is only provided with a flat projection, i.e., a two-dimensional view. Depth information is difficult to estimate or infer from the unit.
- A device for displaying images includes an image display device, a camera device and a computer-assisted navigation system. The navigation system can detect a spatial position of the image display device and/or the camera device as well as the spatial position of a patient's body part via tracking means attached thereto.
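The positional assignment described above amounts to chaining rigid transforms measured by the tracking system: the tracker reports the pose of the display unit's reference array and of the patient's reference array, and the patient's pose in the display's (camera) frame follows by composition. A minimal sketch, assuming the frame names and the `rigid_transform` helper are illustrative (not the patent's notation):

```python
import numpy as np

def rigid_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and a translation t."""
    T = np.eye(4)
    T[:3, :3] = np.asarray(R, dtype=float)
    T[:3, 3] = np.asarray(t, dtype=float)
    return T

def patient_in_camera(T_trk_disp, T_trk_pat, T_cam_disp=np.eye(4)):
    """Pose of the patient reference array in camera coordinates.

    T_trk_disp: display-reference pose in tracker coordinates
    T_trk_pat:  patient-reference pose in tracker coordinates
    T_cam_disp: fixed calibration from display frame to camera frame
    """
    # chain: camera <- display <- tracker <- patient
    return T_cam_disp @ np.linalg.inv(T_trk_disp) @ T_trk_pat
```

With `T_cam_disp` left at the identity, a display reference 100 mm to the tracker's right and a patient reference 500 mm in front of it yields a patient pose 500 mm along the camera's axis.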
- Two-dimensional representations of the image display device can be extended to a three-dimensional representation, wherein the image display device utilizes the fact that the patient image data are in many cases already available as spatial and three-dimensional data (e.g., as image data that originate from transillumination and/or tomographic imaging methods). If, for example, a CT recording is obtained, three-dimensional or spatial patient image data are available from the tomographs. In conventional screen-camera units, this fact has only been utilized to the extent that it has been possible to display the structural features of the body part correctly projected in the viewing direction, though only in two dimensions on the screen.
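The conventional two-dimensional display of such spatial data corresponds to collapsing the volume along the viewing direction. A maximum-intensity projection is one common such rendering; this is a generic sketch, not the specific projection the cited units use:

```python
import numpy as np

def max_intensity_projection(volume, axis=0):
    """Collapse a 3-D CT volume to a 2-D image by keeping, for each ray
    along the viewing axis, only the brightest voxel."""
    return np.asarray(volume).max(axis=axis)
```

The result is a flat image: the depth at which each bright structure sits is discarded, which is exactly the information an auto-stereoscopic presentation preserves.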
- It is possible to visually provide the entire informational content of the three-dimensional patient image data sets. To this end, an auto-stereoscopic monitor can be used that provides the observer with a three-dimensional image, without requiring additional aids such as special spectacles, for example. The auto-stereoscopic monitor can be provided with the patient image data in a suitable way, such that the patient image data appear three-dimensional to the observer.
- If the camera means is a stereoscopic camera means, the data captured by the camera means of the patient's exterior also can be manipulated such that the monitor, provided with said data, generates a stereoscopic and/or three-dimensional video image of the patient's body part. Because the data can be positionally assigned by means of the navigation system, the two images can be superimposed such that a three-dimensional view is created on the monitor, which allows the observer to simultaneously observe the patient's body part by way of exterior and interior features and, thus, to also obtain depth information.
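Once both sources are registered into the same coordinate frame, the superimposition itself can be as simple as per-eye alpha blending of the video view with the rendered interior structures. A minimal sketch; the function name and the fixed blending weight are assumptions:

```python
import numpy as np

def blend_stereo_pair(video_lr, rendered_lr, alpha=0.5):
    """Blend a stereo video pair (left, right) with a stereo rendering of
    the interior structures, returning the combined (left, right) pair."""
    return tuple(
        (1.0 - alpha) * np.asarray(v, dtype=float) + alpha * np.asarray(r, dtype=float)
        for v, r in zip(video_lr, rendered_lr)
    )
```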
- Auto-stereoscopic monitors are available and can be adapted to the requirements of the application. So-called parallax displays are based on a two-dimensional image plane in which diffuse light is actively focused and aligned. So-called parallax barrier systems use a kind of aperture mask, e.g., an opaque layer in front of the image surface that is interrupted by regular slits, such that a defined image region is presented depending on the angle of view. Other systems also can be used with the present invention, including lens-based systems in which the views are separated in front of the screen by lens elements. Round lenses (full parallax) or semi-cylindrical lenses can be used; semi-cylindrical lenses that are aligned obliquely and driven with sub-pixel control (slanted lenticular sheets) provide improved resolution. Further, oblique sub-pixel arrays avoid moiré effects and black transitions. It is also possible to use alternative parallax systems, e.g., prism systems or polarizing filters.
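For a simple two-view parallax-barrier or lenticular display, driving the monitor means interleaving the two views across pixel columns so the barrier directs even columns to one eye and odd columns to the other. A sketch of the principle only; real panels interleave at sub-pixel granularity and with slanted patterns, which this ignores:

```python
import numpy as np

def interleave_two_views(left, right):
    """Column-interleave left and right views: even pixel columns carry the
    left image, odd columns the right image."""
    left = np.asarray(left)
    right = np.asarray(right)
    assert left.shape == right.shape
    out = left.copy()
    out[:, 1::2] = right[:, 1::2]  # odd columns take the right view
    return out
```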
- The camera means can include a single camera. In one embodiment, the camera means includes at least two cameras that are arranged on a rear or back side of the image display device (e.g., on the opposite side to the screen) and wherein a distance between the two cameras can be adjusted to a predetermined distance range between the observer and the monitor. The camera means also can include movable cameras. The movable cameras enable one to set an intersecting region of the fields of vision of the cameras in accordance with the distance between the cameras and the object observed and/or in accordance with the distance between the observer and the monitor. Thus, the entire array, as well as the software, can be set to a typical scenario prior to use, wherein corresponding predetermined fixed values or ranges of values can be used. In this way, for example, image rendering can be simplified and quickly calculated. Further, the movable cameras increase flexibility with respect to the distance between the portable image display device and the patient.
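The relationship between camera spacing and working distance that such a predetermined setup encodes is simple geometry: for the two optical axes to intersect at the observed object, each camera is rotated inward by a "toe-in" angle. A sketch with assumed parameter names:

```python
import math

def toe_in_angle_deg(baseline_mm, working_distance_mm):
    """Inward rotation, in degrees, for each of two cameras whose optical
    axes should cross at the working distance (symmetric toe-in)."""
    return math.degrees(math.atan2(baseline_mm / 2.0, working_distance_mm))
```

For example, a 65 mm baseline converging at 650 mm needs roughly 2.9 degrees of toe-in per camera; fixing such values in advance is what lets the rendering be precomputed and simplified as described.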
- The monitor can be controlled by an image processing unit that spatially assigns and combines the stereoscopic video images and three-dimensionally calculated patient body structures from the transillumination images and/or tomographs, generating a combined stereo image. As already indicated above, the monitor can be an aperture mask or lens monitor having one or more stereo observation zones, e.g., single-user and multi-user configurations can be used. In accordance with one embodiment, one or more observation zones can be arranged stationary relative to the monitor, e.g., a so-called passive system can be provided wherein the observer views a clear stereoscopic image when he is standing in front of the monitor in one or more predetermined observation zones.
- Further, an observer tracking unit can be assigned to the monitor to create a so-called active system that tracks the observation zone(s) of the observer. The image processing unit can display an image on the monitor that develops its observation zone(s) at each observer location or at each tracked observer location. Such an “active” system that monitors a location of the observer provides the observer with the best stereoscopic image at the observer's location.
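In such an active system, the tracked head position steers where the observation zone forms. Under a thin-barrier, similar-triangles approximation (all parameter names here are assumptions for illustration), the interleaved image is shifted laterally in proportion to the observer's lateral offset:

```python
def image_shift_pixels(head_x_mm, viewing_distance_mm, barrier_gap_mm, pixel_pitch_mm):
    """Lateral shift of the interleaved image, in pixels, that re-centers a
    parallax-barrier observation zone on an observer at lateral offset
    head_x_mm (similar-triangles approximation)."""
    shift_mm = head_x_mm * barrier_gap_mm / viewing_distance_mm
    return shift_mm / pixel_pitch_mm
```

A centered observer needs no shift; an observer 70 mm off-axis at 700 mm, with a 1 mm barrier gap and 0.1 mm pixel pitch, needs a one-pixel shift.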
- The observer tracking unit can be a tracking unit of the navigation system or can be directly assigned to the navigation system. In another embodiment, the observer tracking unit can be a video tracking unit with a separate video camera that identifies and tracks the position of the head or the eyes of the observer by means of image recognition.
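A video tracking unit of this kind reduces, at its simplest, to locating a distinctive feature in each frame. As a toy sketch, assuming the observer wears a bright (e.g., retroreflective) marker that dominates a grayscale frame; real head or eye tracking uses full image recognition rather than a threshold:

```python
import numpy as np

def marker_centroid(frame, threshold=200):
    """Return the (row, col) centroid of pixels at or above `threshold`
    in a grayscale frame, or None if no such pixels exist."""
    frame = np.asarray(frame)
    mask = frame >= threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())
```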
- It is possible to use the image processing unit of the navigation system as the image processing unit, or an image processing unit that is directly assigned to the navigation system, e.g. an image processing unit in the portable screen unit tracked by the navigation system. Image processing also can be performed separately by a separate image processing unit, the data then being transferred via a wired or wireless link to the monitor.
- In accordance with another embodiment, the device also includes a user input device, by means of which the user can toggle between a stereoscopic and normal image observation mode. Also, the user can toggle between displaying one or more observation zones. The device also can include inputs for enabling image-assisted treatment or treatment planning, and for controlling the movable cameras.
- In the following, the invention is explained in more detail on the basis of an embodiment, wherein reference is made to the enclosed figures. The invention can be implemented with any of its described features, individually or in combination, and also includes the procedural embodiment of the features described on the basis of the device.
- FIG. 1 illustrates a front view of an exemplary screen-camera unit and navigation system in accordance with the invention.
- FIG. 2 illustrates a rear view of an exemplary screen-camera unit including two rear-side cameras.
- FIG. 3 illustrates a rear view of an exemplary screen-camera unit including one rear-side camera.
- FIG. 4 is a block diagram of an exemplary computational unit that can be used to implement the method of the present invention.
- Referring to FIG. 1, an image display unit 1 is shown in a perspective view from the front. The image display unit includes a casing 2, which is fitted on a front side with a screen 3. The screen 3 is an auto-stereoscopic screen or monitor, and in the present case a partially covered patient's back is shown on its display with the spinal column beneath.
- A reference star 4 with markers 4a is shown on top of the casing 2 and enables a position of the image display unit 1 in a localizing space of a navigation system 5 to be established. Thus, a position of the cameras 7 and 8 (shown in FIG. 2 on the rear side of the image display unit 1) also can be established in the navigation system's localizing space. More particularly, since the patient 30 is also tracked by the navigation system 5 via reference star 32 with markers 32a, the video image of the cameras 7 and 8 can be spatially assigned by the navigation system 5 and displayed in the correct spatial positional relationship to the interior patient structures.
- Since the two cameras 7 and 8 capture a stereoscopic video image, since the screen 3 is an auto-stereoscopic screen, and since the data on the interior patient structure (e.g., the spinal column) are additionally provided as spatial data (e.g., from a CT recording), all the image information can be reproduced in a single combined and superimposed stereo image, such that an image depth effect is created for the observer 34. In the present case, the tracking means (e.g., the reference stars 4 and 32 and the markers 4a and 32a) are detected by the navigation system 5. The navigation system 5 includes, for example, two spaced infrared cameras 5a and an infrared light emitter 5b. The navigation system 5 also includes a computational unit 5c, such as a computer or the like. Other navigation and/or tracking systems (magnetic or actively emitting marker or reference arrays) also can be used without departing from the scope of the invention. Additionally, the navigation system 5, via integral or separate cameras 5d, also may operate as an observer tracking unit that can track observation zones 36 of an observer 34.
- The observer 34 can view a three-dimensional representation of the patient 30 and look into the patient, so to speak, such that a stereo image or three-dimensional image of a "glass patient" is created.
- A fixing device 6 also is attached to the image display unit 1 in a lower part of the casing 2, for example. With the aid of the fixing device 6, the image display unit 1, which is embodied as a portable unit, can be temporarily fixed to a mounting (not shown) if the intention is to view onto/into the patient from the same direction over a longer period of time. Furthermore, the screen 3 also includes input buttons 7, one of which is shown on the right of the casing 2. With the aid of the input buttons 7 (a touch screen or other input device also can be used), it is possible to toggle between a stereoscopic and normal image observation mode, for example. Furthermore, it is possible to toggle between displaying with one or more observation zones 36, and/or enabling inputs for image-assisted treatment or treatment planning. If the cameras are movable cameras, their alignment also can be controlled via the input buttons 7. Preferably, a menu control is provided that can be accessed via the buttons 7.
- Moving to FIG. 4, the computational unit 5c of the navigation system 5 is illustrated in block diagram form. The computational unit 5c includes a computer 10 for processing data, and a display 12 (e.g., a Cathode Ray Tube, Liquid Crystal Display, or the like) for viewing system information. A keyboard 14 and pointing device 16 may be used for data entry, data display, screen navigation, etc. The keyboard 14 and pointing device 16 may be separate from the computer 10 or they may be integral to it. A computer mouse or other device that points to or otherwise identifies a location, action, etc. (e.g., by a point-and-click method) is an example of a pointing device. Alternatively, a touch screen (not shown) may be used in place of the keyboard 14 and pointing device 16. Touch screens may be beneficial when the available space for a keyboard 14 and/or a pointing device 16 is limited.
- Included in the computer 10 is a storage medium 18 for storing information, such as application data, screen information, programs, etc. The storage medium 18 may be a hard drive, an optical drive, or the like. A processor 20, such as an AMD Athlon 64™ processor or an Intel Pentium IV® processor, combined with a memory 22 and the storage medium 18, executes programs to perform various functions, such as data entry, numerical calculations, screen display, system setup, etc. The processor 20 also may operate as an image processing unit, for example, so as to process graphical data for display on the image display unit 1 and/or the display 12. A network interface card (NIC) 24 allows the computer 10 to communicate with devices external to the computational unit 5c.
- The actual code for performing the functions described herein can be readily programmed by a person having ordinary skill in the art of computer programming in any of a number of conventional programming languages based on the disclosure herein. Consequently, further detail as to the particular code itself has been omitted for the sake of brevity.
- Although the invention has been shown and described with respect to a certain preferred embodiment or embodiments, it is obvious that equivalent alterations and modifications will occur to others skilled in the art upon the reading and understanding of this specification and the annexed drawings. In particular regard to the various functions performed by the above described elements (components, assemblies, devices, compositions, etc.), the terms (including a reference to a “means”) used to describe such elements are intended to correspond, unless otherwise indicated, to any element which performs the specified function of the described element (i.e., that is functionally equivalent), even though not structurally equivalent to the disclosed structure which performs the function in the herein illustrated exemplary embodiment or embodiments of the invention. In addition, while a particular feature of the invention may have been described above with respect to only one or more of several illustrated embodiments, such feature may be combined with one or more other features of the other embodiments, as may be desired and advantageous for any given or particular application.
Claims (13)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/426,176 US7463823B2 (en) | 2003-07-24 | 2006-06-23 | Stereoscopic visualization device for patient image data and video images |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US48975003P | 2003-07-24 | 2003-07-24 | |
US10/830,963 US7203277B2 (en) | 2003-04-25 | 2004-04-23 | Visualization device and method for combined patient and object image data |
US11/426,176 US7463823B2 (en) | 2003-07-24 | 2006-06-23 | Stereoscopic visualization device for patient image data and video images |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/830,963 Continuation-In-Part US7203277B2 (en) | 2003-04-25 | 2004-04-23 | Visualization device and method for combined patient and object image data |
Publications (2)
Publication Number | Publication Date |
---|---|
US20070019936A1 true US20070019936A1 (en) | 2007-01-25 |
US7463823B2 US7463823B2 (en) | 2008-12-09 |
Family
ID=37679126
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/426,176 Active 2025-03-07 US7463823B2 (en) | 2003-07-24 | 2006-06-23 | Stereoscopic visualization device for patient image data and video images |
Country Status (1)
Country | Link |
---|---|
US (1) | US7463823B2 (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100039506A1 (en) * | 2008-08-15 | 2010-02-18 | Amir Sarvestani | System for and method of visualizing an interior of body |
WO2012000536A1 (en) * | 2010-06-28 | 2012-01-05 | Brainlab | Generating images for at least two displays in image-guided surgery |
US20120027257A1 (en) * | 2010-07-29 | 2012-02-02 | Lg Electronics Inc. | Method and an apparatus for displaying a 3-dimensional image |
US20120032952A1 (en) * | 2010-08-09 | 2012-02-09 | Lee Kyoungil | System, apparatus, and method for displaying 3-dimensional image and location tracking device |
US20120045122A1 (en) * | 2010-08-19 | 2012-02-23 | Shinichiro Gomi | Image Processing Device, Method, and Program |
US20170215971A1 (en) * | 2006-03-24 | 2017-08-03 | Abhishek Gattani | System and method for 3-d tracking of surgical instrument in relation to patient body |
EP3195823A4 (en) * | 2014-09-19 | 2017-09-20 | Koh Young Technology Inc. | Optical tracking system and coordinate matching method for optical tracking system |
WO2019182523A1 (en) * | 2018-03-22 | 2019-09-26 | Intometer J.S.A. | A multi-functional 3d imaging device for ct and mr images |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2974143C (en) | 2004-02-20 | 2020-11-10 | University Of Florida Research Foundation, Inc. | System for delivering conformal radiation therapy while simultaneously imaging soft tissue |
US8560047B2 (en) | 2006-06-16 | 2013-10-15 | Board Of Regents Of The University Of Nebraska | Method and apparatus for computer aided surgery |
WO2011020505A1 (en) | 2009-08-20 | 2011-02-24 | Brainlab Ag | Integrated surgical device combining instrument; tracking system and navigation system |
FR2963693B1 (en) | 2010-08-04 | 2013-05-03 | Medtech | PROCESS FOR AUTOMATED ACQUISITION AND ASSISTED ANATOMICAL SURFACES |
US8810640B2 (en) | 2011-05-16 | 2014-08-19 | Ut-Battelle, Llc | Intrinsic feature-based pose measurement for imaging motion compensation |
US9498231B2 (en) | 2011-06-27 | 2016-11-22 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
US11911117B2 (en) | 2011-06-27 | 2024-02-27 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
JP6259757B2 (en) | 2011-06-27 | 2018-01-10 | ボード オブ リージェンツ オブ ザ ユニバーシティ オブ ネブラスカ | On-board instrument tracking system for computer-assisted surgery |
FR2983059B1 (en) | 2011-11-30 | 2014-11-28 | Medtech | ROBOTIC-ASSISTED METHOD OF POSITIONING A SURGICAL INSTRUMENT IN RELATION TO THE BODY OF A PATIENT AND DEVICE FOR CARRYING OUT SAID METHOD |
US10561861B2 (en) | 2012-05-02 | 2020-02-18 | Viewray Technologies, Inc. | Videographic display of real-time medical treatment |
KR20150080527A (en) | 2012-10-26 | 2015-07-09 | 뷰레이 인코포레이티드 | Assessment and improvement of treatment using imaging of physiological responses to radiation therapy |
US10105149B2 (en) | 2013-03-15 | 2018-10-23 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
US9446263B2 (en) | 2013-03-15 | 2016-09-20 | Viewray Technologies, Inc. | Systems and methods for linear accelerator radiotherapy with magnetic resonance imaging |
WO2017091621A1 (en) | 2015-11-24 | 2017-06-01 | Viewray Technologies, Inc. | Radiation beam collimating systems and methods |
US10413751B2 (en) | 2016-03-02 | 2019-09-17 | Viewray Technologies, Inc. | Particle therapy with magnetic resonance imaging |
CA3028716C (en) | 2016-06-22 | 2024-02-13 | Viewray Technologies, Inc. | Magnetic resonance imaging at low field strength |
KR20190092530A (en) | 2016-12-13 | 2019-08-07 | 뷰레이 테크놀로지스 인크. | Radiation Therapy Systems and Methods |
JP2020531965A (en) * | 2017-08-30 | 2020-11-05 | コンペディア ソフトウェア アンド ハードウェア デベロップメント リミテッドCompedia Software And Hardware Development Ltd. | Assisted augmented reality |
CA2983780C (en) * | 2017-10-25 | 2020-07-14 | Synaptive Medical (Barbados) Inc. | Surgical imaging sensor and display unit, and surgical navigation system associated therewith |
CN111712298B (en) | 2017-12-06 | 2023-04-04 | 优瑞技术公司 | Radiation therapy system |
US11209509B2 (en) | 2018-05-16 | 2021-12-28 | Viewray Technologies, Inc. | Resistive electromagnet systems and methods |
Citations (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5526812A (en) * | 1993-06-21 | 1996-06-18 | General Electric Company | Display system for enhancing visualization of body structures during medical procedures |
US5694142A (en) * | 1993-06-21 | 1997-12-02 | General Electric Company | Interactive digital arrow (d'arrow) three-dimensional (3D) pointing |
US5715836A (en) * | 1993-02-16 | 1998-02-10 | Kliegis; Ulrich | Method and apparatus for planning and monitoring a surgical operation |
US5765561A (en) * | 1994-10-07 | 1998-06-16 | Medical Media Systems | Video-based surgical targeting system |
US5961456A (en) * | 1993-05-12 | 1999-10-05 | Gildenberg; Philip L. | System and method for displaying concurrent video and reconstructed surgical views |
US5978143A (en) * | 1997-09-19 | 1999-11-02 | Carl-Zeiss-Stiftung | Stereoscopic recording and display system |
US6038467A (en) * | 1997-01-24 | 2000-03-14 | U.S. Philips Corporation | Image display system and image guided surgery system |
US20010035871A1 (en) * | 2000-03-30 | 2001-11-01 | Johannes Bieger | System and method for generating an image |
US20020082498A1 (en) * | 2000-10-05 | 2002-06-27 | Siemens Corporate Research, Inc. | Intra-operative image-guided neurosurgery with augmented reality visualization |
US6414708B1 (en) * | 1997-02-03 | 2002-07-02 | Dentop Systems, Ltd. | Video system for three dimensional imaging and photogrammetry |
US20020140694A1 (en) * | 2001-03-27 | 2002-10-03 | Frank Sauer | Augmented reality guided instrument positioning with guiding graphics |
US6477400B1 (en) * | 1998-08-20 | 2002-11-05 | Sofamor Danek Holdings, Inc. | Fluoroscopic image guided orthopaedic surgery system with intraoperative registration |
US20020163499A1 (en) * | 2001-03-29 | 2002-11-07 | Frank Sauer | Method and apparatus for augmented reality visualization |
US6490467B1 (en) * | 1990-10-19 | 2002-12-03 | Surgical Navigation Technologies, Inc. | Surgical navigation systems including reference and localization frames |
US6640128B2 (en) * | 2000-12-19 | 2003-10-28 | Brainlab Ag | Method and device for the navigation-assisted dental treatment |
US6644852B2 (en) * | 2001-11-15 | 2003-11-11 | Ge Medical Systems Global Technology | Automatically reconfigurable x-ray positioner |
US20050020909A1 (en) * | 2003-07-10 | 2005-01-27 | Moctezuma De La Barrera Jose Luis | Display device for surgery and method for using the same |
US7050845B2 (en) * | 2001-12-18 | 2006-05-23 | Brainlab Ag | Projecting patient image data from radioscopic imaging methods and/or tomographic imaging methods onto video images |
US7084838B2 (en) * | 2001-08-17 | 2006-08-01 | Geo-Rae, Co., Ltd. | Method and system for controlling the motion of stereoscopic cameras using a three-dimensional mouse |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB9405299D0 (en) | 1994-03-17 | 1994-04-27 | Roke Manor Research | Improvements in or relating to video-based systems for computer assisted surgery and localisation |
US6483948B1 (en) | 1994-12-23 | 2002-11-19 | Leica Ag | Microscope, in particular a stereomicroscope, and a method of superimposing two images |
GB2324428A (en) | 1997-04-17 | 1998-10-21 | Sharp Kk | Image tracking; observer tracking stereoscopic display |
GB9827546D0 (en) | 1998-12-15 | 1999-02-10 | Street Graham S B | Apparatus and method for image control |
WO2003002011A1 (en) | 2001-06-28 | 2003-01-09 | Surgyvision Ltd. | Stereoscopic video magnification and navigation system |
- 2006-06-23: US application 11/426,176 filed (granted as US7463823B2, status: active)
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170215971A1 (en) * | 2006-03-24 | 2017-08-03 | Abhishek Gattani | System and method for 3-d tracking of surgical instrument in relation to patient body |
US9248000B2 (en) * | 2008-08-15 | 2016-02-02 | Stryker European Holdings I, Llc | System for and method of visualizing an interior of body |
US20100039506A1 (en) * | 2008-08-15 | 2010-02-18 | Amir Sarvestani | System for and method of visualizing an interior of body |
US9907622B2 (en) | 2010-06-28 | 2018-03-06 | Brainlab Ag | Generating images for at least two displays in image-guided surgery |
US9775684B2 (en) | 2010-06-28 | 2017-10-03 | Brainlab Ag | Generating images for at least two displays in image-guided surgery |
EP4026508A1 (en) * | 2010-06-28 | 2022-07-13 | Brainlab AG | Generating images for at least two displays in image-guided surgery |
US9907623B2 (en) | 2010-06-28 | 2018-03-06 | Brainlab Ag | Generating images for at least two displays in image-guided surgery |
WO2012000536A1 (en) * | 2010-06-28 | 2012-01-05 | Brainlab | Generating images for at least two displays in image-guided surgery |
US8942427B2 (en) * | 2010-07-29 | 2015-01-27 | Lg Electronics Inc. | Method and an apparatus for displaying a 3-dimensional image |
US20120027257A1 (en) * | 2010-07-29 | 2012-02-02 | Lg Electronics Inc. | Method and an apparatus for displaying a 3-dimensional image |
US9197884B2 (en) * | 2010-08-09 | 2015-11-24 | Lg Electronics Inc. | System, apparatus, and method for displaying 3-dimensional image and location tracking device |
US20120032952A1 (en) * | 2010-08-09 | 2012-02-09 | Lee Kyoungil | System, apparatus, and method for displaying 3-dimensional image and location tracking device |
US20120045122A1 (en) * | 2010-08-19 | 2012-02-23 | Shinichiro Gomi | Image Processing Device, Method, and Program |
US8737768B2 (en) * | 2010-08-19 | 2014-05-27 | Sony Corporation | Image processing device, method, and program |
EP3195823A4 (en) * | 2014-09-19 | 2017-09-20 | Koh Young Technology Inc. | Optical tracking system and coordinate matching method for optical tracking system |
US11206998B2 (en) | 2014-09-19 | 2021-12-28 | Koh Young Technology Inc. | Optical tracking system for tracking a patient and a surgical instrument with a reference marker and shape measurement device via coordinate transformation |
WO2019182523A1 (en) * | 2018-03-22 | 2019-09-26 | Intometer J.S.A. | A multi-functional 3d imaging device for ct and mr images |
Also Published As
Publication number | Publication date |
---|---|
US7463823B2 (en) | 2008-12-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7463823B2 (en) | Stereoscopic visualization device for patient image data and video images | |
US20240080433A1 (en) | Systems and methods for mediated-reality surgical visualization | |
US11176750B2 (en) | Surgeon head-mounted display apparatuses | |
US20230122367A1 (en) | Surgical visualization systems and displays | |
ES2899353T3 (en) | Digital system for capturing and visualizing surgical video | |
US6891518B2 (en) | Augmented reality visualization device | |
US9330477B2 (en) | Surgical stereo vision systems and methods for microsurgery | |
US9138135B2 (en) | System, a method and a computer program for inspection of a three-dimensional environment by a user | |
US6836286B1 (en) | Method and apparatus for producing images in a virtual space, and image pickup system for use therein | |
Parsons et al. | A non-intrusive display technique for providing real-time data within a surgeon's critical area of interest | |
Rolland et al. | Optical versus video see-through head-mounted displays | |
CN105493153A (en) | Method for displaying on a screen an object shown in a 3D data set | |
JP2020202499A (en) | Image observation system | |
Grossmann | A new AS-display as part of the MIRO lightweight robot for surgical applications | |
EP1621153B1 (en) | Stereoscopic visualisation apparatus for the combination of scanned and video images | |
Parsons et al. | A non-intrusive display technique for providing real-time data within a surgeon's critical area of interest. In: Medicine Meets Virtual Reality, JD Westwood, HM Hoffman, D. Stredney, and SJ Weghorst (Eds.), IOS Press and Ohmsha, 1998, p. 246 | |
WO2019204012A1 (en) | Compensation for observer movement in robotic surgical systems having stereoscopic displays |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: BRAINLAB AG, GERMANY. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BIRKENBACH, RAINER;SCHMIDT, ROBERT;WOHLGEMUTH, RICHARD;AND OTHERS;REEL/FRAME:018370/0216;SIGNING DATES FROM 20060822 TO 20060831 |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| FEPP | Fee payment procedure | Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| FPAY | Fee payment | Year of fee payment: 4 |
| FPAY | Fee payment | Year of fee payment: 8 |
| MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553). Year of fee payment: 12 |