WO2003002011A1 - Stereoscopic video magnification and navigation system - Google Patents
Stereoscopic video magnification and navigation system
- Publication number
- WO2003002011A1 WO2003002011A1 PCT/IL2001/000598 IL0100598W WO03002011A1 WO 2003002011 A1 WO2003002011 A1 WO 2003002011A1 IL 0100598 W IL0100598 W IL 0100598W WO 03002011 A1 WO03002011 A1 WO 03002011A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- images
- operator
- camera
- video
- stereoscopic
- Prior art date
Links
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/20—Surgical microscopes characterised by non-optical aspects
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B34/74—Manipulators with manual electric input means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/194—Transmission of image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/296—Synchronisation thereof; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B2017/00017—Electrical control of surgical instruments
- A61B2017/00203—Electrical control of surgical instruments with speech control or speech recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B2017/00017—Electrical control of surgical instruments
- A61B2017/00221—Electrical control of surgical instruments with wireless transmission of data, e.g. by infrared radiation or radiowaves
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2072—Reference field transducer attached to an instrument or patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/371—Surgical systems with images on a monitor during operation with simultaneous use of two cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/50—Supports for surgical instruments, e.g. articulated arms
- A61B2090/502—Headgear, e.g. helmet, spectacles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/189—Recording image signals; Reproducing recorded image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/398—Synchronisation thereof; Control thereof
Definitions
- the present invention relates generally to a stereoscopic observation system, and more particularly to a stereoscopic video observation and magnification system integrated with an imaging guided system to be utilized in surgery and other medical applications.
- Magnification observation systems are known in the art as optical instruments comprising magnifying lenses for the magnification of the view of small objects.
- such instruments typically include simple hand-held loupes, wearable binocular loupes with or without headlight, or, in the case of microsurgery, surgical microscopes.
- surgical microscopes are used by a single surgeon, leaving the other members of the surgical team with a minimal view of the operating field.
- Surgical microscopes are large, cumbersome and hard to manipulate.
- the magnification range of the wide array of surgical microscopes is from 2 (where 1 is the unaided eye) to about 12.
- loupe-aided observation provides magnifications in the range of 2 to 3 (larger magnifications are available, but the equipment is unwieldy and less frequently used), whereas the magnification of a microscope is typically 8 or above.
- Stereoscopic observation systems are known in the art as optical devices comprising two magnifying lenses having two different viewpoints from two adjacent angles of view and providing two images such as to reproduce the characteristics of human binocular vision.
- the images provided by the optical devices are combined in a user's brain thereby enabling the user to perceive depth.
- Current surgical microscopes are typically stereoscopic, but only one operator can observe the operating field in stereoscopy, while additional observers are provided with a monocular view only.
- a stereoscopic video magnifying system is known in the art as a mechanism that utilizes two video cameras typically mounted on an observer's head.
- the cameras record two video images of an object, magnify the images, and forward the magnified images to an observer.
- the magnification provided is typically in the range of 1 to more than 8.
- the stereoscopic video magnification system having the cameras mounted on the observer's head is very useful for the performance of surgical procedures in field conditions where proper mounting devices for the cameras may be unavailable.
- the efficiency of the system is limited: wearing the cameras is uncomfortable and causes fatigue during prolonged operations.
- Such a system provides a full range of magnifications, making it suitable for many different types of operations and procedures, and also includes a useful "see through" capability that enables the surgeon to work continuously and with minimum head movement.
- the "see through" capability refers to the attribute of the system that allows a surgeon to observe the operating field directly through the LCD glasses when not observing the video image of the magnifying cameras.
- the system also provides more than one surgeon with a stereoscopic view of the operating field.
- Navigation systems in the medical field are a group of devices brought to use together in order to allow a surgeon to pinpoint the location of a probe with high accuracy.
- An object located within the operating field, for example the brain in neurosurgery, is scanned, typically by an MRI or a CT scanner, prior to an operation.
- sets of markers are attached to the object and located around the operating field. The markers can be seen on the scanning image and can also be detected thereafter by a sensor device.
- Prior to the surgical procedure the scanning information is fed to a computer.
- the markers can be detected by a special camera, typically an infrared camera.
- a pinpoint location on the object within the operating field is achieved by fixing at least three reference points in space.
- Two markers around the operative field and a special surgery probe utilized by the surgeon constitute the three reference points designed to locate a desired location.
- the location is displayed on a separate view screen with reference to the pre-operative scanning images.
- existing systems operative in the typical neurosurgery procedure assist the surgeon in locating a desired area on the object, but the surgeon must remove his eyes from the operative microscope as well as utilize a special probe instead of a surgical tool in order to view the desired location.
- the system relieves the surgeon from the necessity of removing his eyes from the operating field and from the necessity of manually manipulating a probe.
- the surgeon is free to operate and observe in three-dimensional view the magnified operative field as well as pinpointing an exact location of a target point.
- Such location can be displayed to the surgeon's eyes as a two or three-dimensional image.
- it is also a purpose of the present invention to improve the ergonomics of the stereoscopic video magnification system, to allow a surgeon further freedom of movement, and to provide a stereoscopic view of the operating field simultaneously to more than one member of the surgical team. It is another object of the present invention to provide new and improved means for physically supporting the video cameras.
- the supporting apparatus provides means for stable, smooth and easy placement of the video magnification system. Furthermore, said apparatus can be automatically controlled and can provide space for extra equipment.
- One aspect of the present invention regards an apparatus for providing stereoscopic magnified observation enabling an operator to perform surgical procedures without having to remove his eyes from the operating field.
- the apparatus comprises a head-mounted display for providing the operator with stereoscopic magnified images in an operating field.
- the camera module can further comprise a camera mount for mounting at least two video cameras, and at least two video cameras attached to the camera mount for acquiring video images of the operating field. It can also include a converging system for interconnecting and adjusting the at least two video cameras with respect to each other and obtaining a focal point associated with a point in the operating field resulting in obtaining stereoscopic images.
- the interface-processing unit can further include a central processing unit for processing stereoscopic images and for calculating the convergence angle required at a specific focal distance for generating stereoscopic images. It can also include at least one memory device, a camera control unit for controlling at least the focus level, focus distance and distance angle and for synchronizing the values received from the at least two video cameras, and a camera convergence unit for controlling a convergence system. In addition, it can include an image sampler unit for sampling the video images received from the at least two video cameras and forwarding the sampled video images to the central processing unit, a video switcher unit for processing video images obtained from external video sources and converting the images so that the head mounted display can display them, and an on screen display control unit for each head mounted display for sending operational information to that head mounted display.
- a head mounted display driver for translating video image signal into information and commands fed to the head mounted display.
- a head mounted display-inverting driver for inverting the video images and translating the video images into information fed to an inverting head mounted display.
- the interface processing unit may also include a display control unit for controlling the information sent to HMD, and an external display driver for controlling information and format to be displayed onto the external display.
- the interface-processing unit can further comprise an operator controller interface unit for handling the control commands input by the operator sent to the interface-processing unit.
- a second aspect of the present invention regards an apparatus for providing stereoscopic magnified observation and precise imaging location of a point in an operating field, enabling an operator to perform surgical procedures without having to remove his eyes from the operating field.
- the apparatus includes at least one interface-processing unit for processing and transmitting data and for controlling peripheral devices.
- a display device for displaying the stereoscopic magnification images, information and precise location of a point in an operating field.
- An input device for selecting and inputting data and commands into the interface processing unit.
- a camera module comprising at least two cameras for acquiring magnified stereoscopic images from the operating field, for localizing markers around the operative field, and for obtaining focal point distance information.
- the interface processing unit processes and dynamically presents the stereoscopic magnified images, calculates the precise location of a point in the operating field, and plots said point on the display.
- the interface processing unit can also include a central processing unit, a memory device, a communication device and a navigation system for obtaining imaging data and for localizing a point in the operative field and display such point on a three dimensional image representation displayed on a display device.
- a third aspect of the present invention regards an apparatus for providing stereoscopic magnified observation and precise imaging location of a point in an operating field.
- the apparatus comprises a camera module having one or more infrared cameras. The cameras observe the operative field, and the camera module has two or more cameras focused on a focal point at a focal distance.
- a focal distance point is superimposed on the head-mounted display.
- a head mounted display is worn by the operator.
- An interface-processing unit receives information, which is processed, sent and stored.
- a fourth aspect of the present invention regards a method for providing stereoscopic magnified observation and precise imaging location of a point in an operating field, enabling an operator to perform surgical procedures. The method steps are as follows: receiving the distance and attitude of two or more markers located about the operating field from the infrared camera; receiving the distance and attitude of one or more markers located on a camera module from the infrared camera; and calculating the distance of the one or more markers located on the camera module from a baseline between two cameras within the camera module.
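The patent does not give formulas for the last step, but one illustrative reading is a point-to-line distance from a camera-module marker to the baseline joining the two camera centres. The sketch below is a hypothetical geometric helper, not the claimed method.

```python
import numpy as np

def distance_to_baseline(marker: np.ndarray,
                         cam_left: np.ndarray,
                         cam_right: np.ndarray) -> float:
    """Perpendicular distance from `marker` to the line through the two
    camera centres; all inputs are 3-D points in a common coordinate frame."""
    baseline = cam_right - cam_left
    return float(np.linalg.norm(np.cross(marker - cam_left, baseline))
                 / np.linalg.norm(baseline))
```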
- a camera support apparatus can support the camera module and markers.
- the infrared camera and the interface processing unit can be mounted on the camera support apparatus.
- Such camera support apparatus can be manually and automatically controlled.
- a fifth aspect of the present invention regards a method for providing stereoscopic magnified observation images and precise imaging location of a point in an operating field and information, enabling an operator to perform surgical procedures without having to remove his eyes from the operating field.
- the method comprises the steps of: displaying images from stereoscopic magnifying video cameras to the eyes of an operator; obtaining and storing imaging images; displaying the imaging images to the eyes of the operator; plotting focal point location information on the imaging images and displaying the plotted imaging images to the eyes of the operator; and selecting the information to be displayed to the operator.
- FIG. 1 is a block diagram illustrating the main components of a stereoscopic video magnification system, in accordance with a preferred embodiment of the present invention.
- Fig. 2 is a block diagram illustration of the preferred embodiment.
- Fig. 3A is a schematic illustration of the operation sequence of the stereoscopic video magnification system in concert with the navigational data and system.
- Fig. 3B is a schematic illustration of the camera module.
- Fig. 3C is a schematic illustration of the targeting crosshair marker displayed on the head mounted display (HMD) screen.
- Fig. 4A is a schematic illustration of the floor stand for the stereoscopic video magnification and navigation system and of the swivel arm supporting the camera module.
- the present invention overcomes the disadvantages of the prior art by providing a novel method and system that enhance and add to the capabilities of a stereoscopic video magnification system.
- the following description is presented in order to enable any person skilled in the art to make and use the invention.
- specific terminology is set forth to provide a thorough understanding of the invention.
- descriptions of a specific application are provided only as an example.
- Various modifications to the preferred embodiment will be readily apparent to those skilled in the art and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the invention.
- the present invention is not limited to the embodiment shown, but it is to be accorded the widest scope consistent with the principles and features disclosed herein.
- Some of the elements presented in the following description are computer programs implemented in a computing device.
- the device presented in the following description contains computer software instructions specifically developed for the practice of the present invention.
- the software in the presented computing device causes the device to perform the various functions described herein, although it should be noted that it is possible to use dedicated electronic hardware to perform all the functionality described herein.
- the application can be implemented as hardware by embedding the predetermined program instructions and/or appropriate control information within suitable electronic hardware devices containing application-specific integrated circuits.
- PCT application number IL00/00398 dated 1 January 2001, assigned to Surgivision Ltd., is incorporated herein by reference.
- the present invention provides a novel and useful apparatus for stereoscopic magnified observation and precise location of a surgical or operating field by enabling an operator, typically a surgeon, to observe, to magnify, and to locate a point in the operating field while performing a surgical procedure, without having to remove his eyes from the operating field or substantially moving his head.
- the present invention can also provide a view substantially similar to the view provided to the surgeon to the other members of the surgical team simultaneously.
- Fig. 1 is a schematic illustration of the configuration and the related operation of a stereoscopic video magnification system, generally referenced as 100, in accordance with a preferred embodiment of the present invention.
- System 100 includes an Interface Processing Unit (IPU) 102, a Head Mounted Display (HMD) 104, an External Display Unit 106, a Camera Module 108 and an Operator Control Unit 110.
- the camera module 108 includes a camera mount 112, two video cameras 114, a converging system 116, and suitable connecting cables with associated input and output sockets.
- module 108 includes two analog-to-digital converters 118.
- IPU 102 includes a Central Processing Unit (CPU) with an internal memory device 120, a flash memory device 122, a camera control unit 124, a camera convergence unit 126, an image sampler unit 128, an operator controller interface unit 132, a video switcher unit 134, an On Screen Display (OSD) unit 148 for each HMD connected to the system, a HMD driver 136, a HMD inverting driver 138, a test connection interface 140, a display control unit 130, and an external display driver 144, all of which are connected by main bus 142.
- the camera mount 112 typically is attached to the head of the surgeon by a headband, headrest or a helmet.
- Cameras 114 are lightweight video cameras such as Model EVI 370DG from Sony, Japan or the like.
- the cameras 114 are interconnected by a convergence system 116 allowing them to be adjusted in respect to each other.
- the adjustment effects motion of the cameras with respect to each other along the longitudinal axis.
- the aforementioned motion permits the positioning of both cameras at such an angle that a certain target's focal point may fall on the cameras' charge-coupled devices, thereby generating a stereoscopic view.
- the analog-to-digital (A/D) converter 118 converts an analog visual signal captured by cameras 114 to digital signals to be fed to IPU 102.
- when digital cameras are used, suitable analog-to-digital converters are supplied with and situated within the cameras. In contrast, when an analog camera is used, suitable A/D converters are acquired separately and placed within IPU 102.
- Camera control unit 124 receives information regarding focus level and distance, distance angle and the like for the cameras 114 and automatically synchronizes the cameras 114 according to the received values. Focus level and distance, distance angle and other information relating to the cameras 114 are fed to CPU 120 of IPU 102.
- CPU 120 is programmed with specifically developed software instructions that perform suitable calculation concerning the convergence angle required at a specific focal distance in order to generate proper stereoscopic vision.
- the required convergence angle is fed to the camera convergence unit 126 that performs the necessary correction in the disposition of the cameras 114 by sending suitable instructions to the mechanical convergence system 116 via electrical connections.
- the process is designed to be automatic but can be also induced by requests of the operator of the system (not shown) such as a surgeon, or the like.
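As a rough illustration only (the patent does not disclose the exact formula), the total convergence angle for two cameras separated by a known baseline and aimed at a target at a given focal distance follows from simple trigonometry; the function name and values below are assumptions for the sketch.

```python
import math

def convergence_angle(baseline_mm: float, focal_distance_mm: float) -> float:
    """Approximate total convergence angle (degrees) for two cameras separated
    by `baseline_mm`, both aimed at a point `focal_distance_mm` away on the
    perpendicular bisector of the baseline."""
    half_angle = math.atan((baseline_mm / 2.0) / focal_distance_mm)
    return math.degrees(2.0 * half_angle)

# Cameras 65 mm apart converging on a point 300 mm away need roughly
# 12.4 degrees of total convergence.
print(round(convergence_angle(65, 300), 1))
```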
- Video images captured by the cameras 114 are fed to the image sampler 128 of IPU 102 via A/D converters 118.
- Image sampler unit 128 samples the video images received from the cameras 114 by periodically obtaining specific values associated with the signal representing the video images, and forwards the sampled images to the CPU 120.
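One simple way to picture the sampler's periodic behaviour is a loop that forwards only every N-th frame to the CPU; this is a minimal sketch under that assumption, not the unit's actual implementation.

```python
def sample_frames(frame_source, period: int = 5):
    """Yield every `period`-th frame from an iterable of video frames,
    a crude stand-in for periodically sampling the video signal."""
    for index, frame in enumerate(frame_source):
        if index % period == 0:
            yield frame
```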
- CPU 120 processes the received images. The processing involves the examination of the images obtained by the cameras 114 and the appropriate correction of any stereoscopically essential differences between the cameras 114. As a result of the processing, appropriate control information is sent to the camera convergence unit 126.
- Camera convergence unit 126 sends correction instruction signals to the convergence system 116. In accordance with the received instruction convergence system 116 mechanically changes the convergence angle between the cameras 114.
- Video images obtained by the camera module 108 are also sent to HMD 104 via video switcher unit 134, via On Screen Display (OSD) unit 148, and via HMD driver 136 or HMD inverting driver 138.
- Video switcher unit 134 processes video images from different sources.
- Unit 134 receives video signals such as digital signals directly from A/D converters 118, video image files from the flash memory 122, compressed video files such as MPEG files from external video source 146, and the like.
- Video switcher unit 134 converts the received video image signals to a format appropriate for display to HMD drivers 136 and 138.
- On Screen Display (OSD) unit 148 sends operational information to the HMD 104, such as zoom level values, and the like.
- the HMD driver 136 translates the different video images signals into information and commands fed to the HMD 104.
- HMD inverting driver 138 first inverts the video image then translates the video signals into information and commands fed to Inverting HMD 104.
- Inverting HMD driver 138 thus creates an opposite point of view that is displayed to an additional user such as a second surgeon donning an inverting HMD 104.
- Inverting HMD 104 is a head mounted display receiving information and commands from inverting HMD driver 138.
- two users (such as two surgeons) are situated on the opposite sides of an operating field.
- Camera module 108 sends video images that allow the first surgeon to observe the scene from his point of view.
- the second surgeon receives the same view, but as a result of being located on the opposite side of the operating field the video image received is not the proper representation of his natural view of the field.
- the image received is inverted relative to the point of view of the second surgeon.
- the inverted view may confuse the second surgeon as to the spatial location of left and right. Inverting driver 138 thus allows both surgeons to have a directionally correct view of the operating field each from his respective point of view.
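One plausible way to realize this correction in software (the patent describes the inverting driver only functionally) is a 180-degree rotation of each frame before it is sent to the second HMD; the helper below is an assumed sketch, not the driver's actual implementation.

```python
import numpy as np

def invert_for_opposite_viewer(frame: np.ndarray) -> np.ndarray:
    """Rotate a video frame by 180 degrees (flip vertically and horizontally)
    so an observer standing across the operating field sees up/down and
    left/right matching their own natural point of view."""
    return np.flipud(np.fliplr(frame))
```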
- HMD 104 is typically a pair of head mounted display units, such as LCD glasses model LDI-D100BE from Sony, Japan.
- Display control unit 130 controls the information sent to HMD 104. Such information can include direct video images feed, commands, stored video images feed, patient data, operational information, other data and the like.
- Display control unit 130 receives orders from CPU 120 through operator controller unit 110 as well as through other input devices such as a keyboard, a touch screen, a pointing device, or the like.
- Operator controller unit 110 is connected to the IPU 102 via the operator controller interface 132.
- the operator controller interface 132 handles the control commands input by the operator sent to the interface processing unit.
- Operator controller unit 110 is used by the user (not shown) to transmit suitable instructions to IPU 102 by suitable manipulation of the unit 110.
- the instructions transmitted by the operator controller unit via the operator controller interface unit 132 can include focusing the cameras 114, changing the zoom values, activating and deactivating the "see-through" capability, initializing the system and readjusting the parameters of the system such as luminance, contrast, and the like.
- IPU 102 executes commands received from the operator by reading the introduced command parameters from display control unit 130. Consequently IPU 102 calculates the required parameters needed to be set and sends the instructions involving the processed command information to the camera control unit 124, and to the camera convergence unit 126.
- Camera control unit 124 and camera convergence unit 126 send the required set of instructions to the cameras 114 and the convergence system 116 for execution.
- the operator controller unit 110 is a specially designed hardware control board that can be attached to the user's arm, placed on a special pedestal near the operating field, or the like.
- the IPU 102 tests all the connections and the settings of the system and receives test and software updates through test connection interface 140.
- External display driver 144 translates different image signals into information and commands fed to external display 106.
- External display 106 can be a computer screen, a TV set, a hand-held device screen, an LCD screen and the like.
- IPU 200 receives, processes, and transmits data. IPU 200 also controls the peripheral devices. IPU 200 comprises central processing unit (CPU) 230, memory device 231, communication device 233 and navigation system 244. Peripheral devices 270 communicate with IPU 200 via CPU 230. Peripheral devices 270 comprise display devices 250, input devices 220, cameras 210 and camera support apparatus 240.
- Display devices 250 can be a head mounted display (HMD) 254 as described in Fig. 1 and in PCT application number IL00/00398 dated 18 January 2001 assigned to Surgivision Ltd., an external display 256 such as a TV screen or a computer screen such as a 17" AE-1769 from AST Vision, a printer device 258 such as a Phaser 2135 from Xerox, and the like.
- Display device 250 provides the operator with the stereoscopic magnified images, imaging data, precise location and localization data, patient related data, surgery and surgical procedure related data as well as any other data and information required by the operator during surgical sessions, and the like.
- Input device 220 typically comprises Operator Controller (OC) 222 such as operator controller 110 of Fig. 1.
- Input device 220 can also comprise a pointing device 226 such as a PS/2 mouse from Microsoft™, a keyboard such as a BTC 8110M ergonomic keyboard with touch pad, a microphone such as a VR250BT Unidirectional Electret Microphone with Single Earphone Speaker, and the like. Input devices 220 are used for selecting and inputting data and commands into the interface processing unit.
- Cameras 210 typically comprise two types of cameras. Cam module 208 is as described in Fig. 1 and in PCT application number IL00/00398 dated 18 January 2001 assigned to Surgivision Ltd.
- IR camera 206, EM camera 204 and matching markers are typically supplied with Navigation system 244.
- Camera supporting apparatus 240 can contain Supporting arm 216, Arm control panel 214 both of which are described here and below in Fig 4.
- Cam module 208 is designed to capture and transmit preferably magnified video images from the observed field of operation (not shown). Camera module 208 is thus placed within camera support apparatus 240. Camera module 208 is positioned and moved via camera support apparatus 240 by support arm 216 and is controlled by arm control panel 214 as well as by input devices 220.
- the positioning can be automatic or manual.
- Camera support apparatus 240 is further described in Fig. 4.
- Magnified video images obtained by cam module 208 are transmitted to CPU 230 of IPU 200 as well as to HMD 254 of display devices 250. Said video images can also be transmitted for display on external display 256 or for printing as a hard copy from printer 258.
- An operator (not shown) using the system can use input device 220 to instruct CPU 230 of IPU 200 to perform certain operations.
- Command data can be fed to IPU 200 by touch, pointing device click, finger tap, voice, or by any other suitable means suitable for a human operator.
- Such command data can be sent to IPU 200 via hard cables, radio frequency wireless transmission, Infrared transmission, and the like.
- Navigation system 244 of IPU 200 is designed to receive imaging data 242 of operative field and the surrounding areas (not shown).
- Imaging data 242 can be a MRI scan, a CT scan, Ultrasound data and the like.
- Imaging data 242 can be acquired pre-operatively or intra-operatively and is fed to navigation system 244 directly, such as via a hard copy imaging file, or via network 234.
- Network 234 typically comprises a hospital network connecting computers 235, the Internet network 236 and the like.
- Communication device 233 of IPU 200 is designed to receive and transmit data between network 234 and CPU 230 of IPU 200. Such data can be made available for navigation system 244 as well as for memory device 231 for storage, as well as for display on display devices 250.
- Communication device can be a X2 modem device from US Robotics and the like.
- Memory device 231 can be a flash memory device, or any other memory device, and the like.
- Navigation system 244 such as VectorVision from BrainLabTM, IR camera 206 or EM camera 204 as well as a set of markers (not shown) are used to localize a point in space preferably in the operative field (not shown) and display such point on a three dimensional image representation reconstructed from imaging data 242 obtained in advance. Such three dimensional image is displayed on display device 250 such as HMD 254 and the like.
- Fig. 3 illustrates the operational sequence of the stereoscopic video magnification system in association with the navigational data and system.
- In order for a navigational system to locate a point in space, it must have at least three coordinates in space. Markers placed near and around the operative field provide such coordinates. Using multiple markers enhances the accuracy of the system.
- a plurality of active or passive markers located near and around the operative field are used as reference points. The reference points are used to establish the location of a target probe. Typically the operator manually manipulates the target probe by physically placing it on a target point.
- the navigation system calculates the location of the target point and displays the location thus calculated on a display monitor where the indicated location is superimposed on an imaging image, such as a MRI two dimensional image.
- the target probe is unnecessary as will be clearly shown from the following description.
- the target point is superimposed on a three dimensional imaging image constructed from imaging data and displayed on visualization devices such as HMD.
- In Figs. 3A, 3B and 3C the same numbers relate to like components. The following discussion relates to said figures as a whole.
- Fig. 3A illustrates the preferred embodiment of the present invention, in which cam module 208, external display 256, keyboard 224, infrared (IR) camera 344 and IPU 200 are all mounted on camera support apparatus 240.
- Camera support apparatus 240 is situated preferably close to operative field 300 such that cam module 208 can be situated in a position permitting cameras 314 direct visualization of operative field 300.
- IR camera 344 is mounted on camera support apparatus 240 such that both operative field 300 and the markers mounted on cam module 208 are directly visible.
- The user usually dons HMD 254 during the operative session.
- HMD 254 is connected to IPU 200 via suitable conduits.
- Camera module 208, best seen in Fig. 3B, comprises camera mount 312, two cameras 314, convergence system 316 and markers 302. Cameras 314 typically observe operative field 300 such that both share a single focal point 308, illustrated as a cross and best seen in Fig. 3A.
- Preferably, a magnified stereoscopic video image of operative field 300 is displayed stereoscopically on HMD 254.
- Focal point 308, marked as a cross in Fig. 3C, is superimposed onto HMD display 306 via a crosshair generator such as the FK 1/F crosshair generator from KAPPA Miltram Industries Ltd.
- Focal point 308 is thus visible to the operator (not shown) at all times and is represented as a crosshair on HMD display 306 of HMD 254.
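For illustration only, a software equivalent of superimposing the crosshair on each frame before it reaches the HMD might look like the sketch below; the real system uses a hardware crosshair generator, so the function and parameters are assumptions.

```python
import numpy as np

def draw_crosshair(frame: np.ndarray, cx: int, cy: int,
                   size: int = 10, value: int = 255) -> np.ndarray:
    """Return a copy of a grayscale frame with a crosshair centred at (cx, cy)."""
    out = frame.copy()
    h, w = out.shape[:2]
    out[max(0, cy - size):min(h, cy + size + 1), cx] = value  # vertical bar
    out[cy, max(0, cx - size):min(w, cx + size + 1)] = value  # horizontal bar
    return out
```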
- an imaging machine such as a MRI scanner or CT scanner scans the operating field 300 and its surroundings.
- Markers 302 surround operative field 300. Markers 302 are made of a substance perceived by imaging scanners and will appear as dark or light spots on the imaging data 242 which is fed to IPU 200. The process by which Navigation system 244 determines the exact location of the focal point 308 is now discussed in further detail.
- Focal point 308 is a virtual target probe used in place of the traditional physical target probe. Markers 302 can be passive Markers such as IR or light reflectors or active markers such as IR emitting markers or electromagnetic radiation emitting markers and the like. Markers 302 are situated on camera mount 312 as well as around the operating field 300 in the same location as during the imaging session.
- markers 302 emit IR radiation, as seen by parallel lines in Fig 3A.
- the IR radiation is perceived by IR camera 344 of navigating system 244.
- the analog information representing the IR radiation is converted to digital information by navigation system 244 of Fig. 2, located within IPU 200. Consequently, CPU 230 of IPU 200 performs the suitable calculation regarding the location of focal point 308 with respect to markers 302 mounted on cam module 208.
- The location of focal point 308 in space is then compared to the spatial location of markers 302 located around operative field 300. The calculated location of focal point 308 is plotted on pre- or intra-operative imaging data.
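A common way to express this step (not necessarily the exact computation used by navigation system 244) is to estimate a rigid transform from the marker positions seen by the IR camera to the same markers' positions in the scan, then apply it to the focal point. The sketch below uses the Kabsch algorithm with hypothetical coordinates.

```python
import numpy as np

def rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Estimate rotation R and translation t mapping Nx3 points src onto dst
    (Kabsch algorithm)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical marker coordinates: as seen by the IR camera (tracker frame)
# and as located in the pre-operative scan (image frame).
markers_tracker = np.array([[0.0, 0.0, 0.0], [100.0, 0.0, 0.0], [0.0, 80.0, 0.0]])
markers_scan = np.array([[10.0, 5.0, 2.0], [110.0, 5.0, 2.0], [10.0, 85.0, 2.0]])
R, t = rigid_transform(markers_tracker, markers_scan)

focal_point_tracker = np.array([40.0, 30.0, 120.0])  # hypothetical focal point
focal_point_scan = R @ focal_point_tracker + t       # position to plot on the scan
```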
- the plotted imaging data 242 is sent via suitable conduits to output unit 250 of Fig. 2 for display on HMD 254 and external display 256, and for transmission to remote units via communication device 233 of Fig. 2.
- Focal point location 308 superimposed on imaging data 242 is displayed preferably as a three dimensional reconstruction of the operative field and surroundings.
- the user can choose to observe said imaging data 242 with superimposed focal point 308 on display devices 250 of Fig 2 by introducing suitable requests through the appropriate manipulation of input devices 220 of Fig 2 such as operator controller 222 of Fig 2, keyboard 224 and the like.
- when electromagnetic probes 302 such as magnets are used, IR camera 344 is not needed and a special crown (not shown) is placed around the magnetic probes. The crown is constructed to measure signals from each probe.
- Fig. 4A describes camera support apparatus 400 in further detail.
- Apparatus 400 typically consists of a floor-stand support 410 or a ceiling mount (not shown), a handling apparatus 420, and the camera module 430.
- Camera support apparatus 400 is intended to provide easy, effortless, positioning of the camera module 430.
- Camera module 430 is positioned above or around operative field 500 such that the cameras 422 capture a clear and uninterrupted view of operative field 500. Said view is transmitted to the user 600.
- Handling apparatus 420 includes a set of arms, marked 406, 408 and 412, permitting easy and smooth movement of camera module 430 to a stable but adjustable position during an operating session.
- Handling apparatus 420 also provides suitable tubing within said arms for routing cables from cameras 210 of Fig. 2, such as camera module 430, from display devices 250 of Fig. 2, and from command board 414 to IPU 200 of Fig. 3A.
- said cables are suitable conduits for the power supply, data transfer and control of certain aspects related to said devices, mounted on camera support apparatus 400.
- Floor-stand 410 is intended to provide support, stability, mobility, cable routing, storage place, and the like to camera module 430.
- Floor-stand support 410 comprises a base 402, a controlling handle 403, and a set of wheels 404.
- the base 402 can consist of a rectangular box or a round flat plate with multiple protrusions, or the like, fitted with wheels 404 at its bottom side. Wheels 404 are typically made from hard plastic but may also be pneumatic. Wheels 404 can be fitted with a braking mechanism (not shown) for static positioning of floor-stand 410. Camera support apparatus 400 is easily maneuverable with the help of controlling handle 403.
- Floor-stand support 410 can be of varied forms according to the operating room type and available space.
- Floor-stand 410 is designed to fit the stereoscopic video magnifying and navigation systems requirements.
- Said box type base 402 may contain within, IPU 200 of Fig 3A as well as other components related to the system.
- Handling apparatus 420 comprising a stand arm 406, a reaching arm 408 and a swivel arm 412 is intended for easy and versatile handling of the camera module 430.
- Handling apparatus 420 is preferably made from light components such as composite metals, plastics and the like, and is suitable for conducting electrical and data wires within tubes there within.
- Stand arm 406, which is a vertical beam, reaching arm 408, which is a horizontal arm, and swivel arm 412, which is a special support arm for camera module 430, are interconnected arms set with oil-damping bearings to allow operator 600 easy manipulation of camera module 430.
- the camera module 430 is attached to the end of the swivel arm 412 via the camera mount 416.
- Camera mount 416 is connected to swivel arm 412 by a ball and socket type joint 435 allowing movement in any direction.
- Joint 435 is fitted with vertical and horizontal-locking screws (not shown) for the fixation of camera mount 416 in any desired position.
- Arm 412 is joined with horizontal arm 408 via a rotational joint 445 allowing horizontal rotation of arm 412 around axis 480.
- Joint 445 is fitted with locking screws (not shown) permitting fixation of arm 412 in any desired location around axis 480.
- Horizontal arm 408 is interconnected with vertical beam 406 via joint 455.
- Joint 455 is such that horizontal arm 408 can be elevated and descended along axis 490 as well as rotated around axis 490.
- Joint 455 is also fitted with two locking screws (not shown).
- Said locking screws permit the fixation of arm 408 at any point along and around axis 490.
- Vertical beam 406 is fitted with a stopper (not shown) at beam 406 upper margin, such that arm 408 is prevented from slipping over beam 406.
- Camera mount 416, swivel arm 412 and horizontal arm 408 can be displaced manually by operator 600 or others. Such displacement can also be achieved by an electrical motor system (not shown) controlled by command board 414 of handling apparatus 420.
- Board 414 can allow operator 600 to set fixed, memorized positions of the handling apparatus as well as control other aspects of the camera apparatus, such as activating the braking mechanism for wheels 404, activating lighting apparatus 426 on camera module 430, and the like.
- Floor-stand support 410 and handling apparatus 420 typically convey within their apparatuses, hidden from external observation, cable connections operative in the activation of camera module 430. It should be easily appreciated that handling apparatus 420 can be attached to a ceiling mount (not shown) and function such as to allow easy and precise handling of camera module 430.
- Camera module 430 consists of camera mount 416, cameras 422, convergence system 424, a pair of handles 418, a lighting apparatus 426, and a set of at least three probes 428.
- Camera mount 416 is linked via connecting cables inserted via a specialized aperture (not shown) to the power supply as well as to the cameras and the other components of the system for data and command transfer.
- Camera mount 416 is fitted with two handles to allow operator 600 ready, precise positioning of the camera module 430 above or around the operating field 500.
- Camera mount 416 has at least three fitted probes 428 for the transmission or the reflection of electromagnetic or infrared radiation for the purpose of spatial localization of the camera module 430.
- camera mount 416 contains two cameras 422 attached to each other via a convergence system 424, as well as a lighting apparatus 426 such as a Tl-25 3 mm white LED lamp from The Led Light, and the like.
- Cameras 422, convergence system 424 and lighting apparatus 426 are fitted within camera mount 416 in such a manner as to allow unobstructed motion of the cameras 422 and the convergence system 424.
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Heart & Thoracic Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Pathology (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Robotics (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IL2001/000598 WO2003002011A1 (fr) | 2001-06-28 | 2001-06-28 | Systeme video stereoscopique d'agrandissement et de navigation |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IL2001/000598 WO2003002011A1 (fr) | 2001-06-28 | 2001-06-28 | Systeme video stereoscopique d'agrandissement et de navigation |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2003002011A1 true WO2003002011A1 (fr) | 2003-01-09 |
Family
ID=11043065
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IL2001/000598 WO2003002011A1 (fr) | 2001-06-28 | 2001-06-28 | Systeme video stereoscopique d'agrandissement et de navigation |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2003002011A1 (fr) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1470791A1 (fr) * | 2003-04-25 | 2004-10-27 | BrainLAB AG | Appareil de visualisation de données d'imagerie médicale et d'images vidéo combinées, équipé de moyens de saisie d'entrées, et procédé de visualisation |
WO2004100815A2 (fr) * | 2003-05-16 | 2004-11-25 | Carl Zeiss | Dispositif d'eclairage d'un champ operatoire |
EP1621153A1 (fr) * | 2004-07-28 | 2006-02-01 | BrainLAB AG | Appareil de visualisation stéréoscopique de données d'imagerie médicale et d'images vidéo combinées |
US7203277B2 (en) | 2003-04-25 | 2007-04-10 | Brainlab Ag | Visualization device and method for combined patient and object image data |
EP1969416A2 (fr) * | 2005-12-12 | 2008-09-17 | Universidade Federal de Sao Paulo - UNIFESP | Systeme de visualisation elargie de la realite avec informatique omnipresente |
US7463823B2 (en) | 2003-07-24 | 2008-12-09 | Brainlab Ag | Stereoscopic visualization device for patient image data and video images |
CN103211655A (zh) * | 2013-04-11 | 2013-07-24 | 深圳先进技术研究院 | 一种骨科手术导航系统及导航方法 |
WO2015151447A1 (fr) * | 2014-03-31 | 2015-10-08 | Sony Corporation | Dispositif de commande chirurgical, procédé de commande, et système de commande d'imagerie |
EP3096065A1 (fr) * | 2015-05-21 | 2016-11-23 | Euromedis Groupe | Système vidéo comprenant une armature articulée |
CN107865702A (zh) * | 2016-09-28 | 2018-04-03 | 李健 | 一种医用智能手术显微系统 |
CN109715107A (zh) * | 2016-09-23 | 2019-05-03 | 索尼奥林巴斯医疗解决方案公司 | 医疗观察装置和医疗观察系统 |
AT521076A1 (de) * | 2018-03-26 | 2019-10-15 | Bhs Tech Gmbh | Stereomikroskop zur Verwendung bei mikrochirurgischen Eingriffen am Patienten und Verfahren zur Steuerung des Stereomikroskops |
US12126916B2 (en) | 2021-08-30 | 2024-10-22 | Proprio, Inc. | Camera array for a mediated-reality system |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4395731A (en) * | 1981-10-16 | 1983-07-26 | Arnold Schoolman | Television microscope surgical method and apparatus therefor |
EP0629963A2 (fr) * | 1993-06-21 | 1994-12-21 | General Electric Company | Système d'affichage pour la visualisation de parties du corps lors des procédés médicaux |
WO1999038449A1 (fr) * | 1998-01-28 | 1999-08-05 | Cosman Eric R | Systeme de suivi d'objets optiques |
US5961456A (en) * | 1993-05-12 | 1999-10-05 | Gildenberg; Philip L. | System and method for displaying concurrent video and reconstructed surgical views |
US6006126A (en) * | 1991-01-28 | 1999-12-21 | Cosman; Eric R. | System and method for stereotactic registration of image scan data |
-
2001
- 2001-06-28 WO PCT/IL2001/000598 patent/WO2003002011A1/fr active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4395731A (en) * | 1981-10-16 | 1983-07-26 | Arnold Schoolman | Television microscope surgical method and apparatus therefor |
US6006126A (en) * | 1991-01-28 | 1999-12-21 | Cosman; Eric R. | System and method for stereotactic registration of image scan data |
US5961456A (en) * | 1993-05-12 | 1999-10-05 | Gildenberg; Philip L. | System and method for displaying concurrent video and reconstructed surgical views |
EP0629963A2 (fr) * | 1993-06-21 | 1994-12-21 | General Electric Company | Système d'affichage pour la visualisation de parties du corps lors des procédés médicaux |
WO1999038449A1 (fr) * | 1998-01-28 | 1999-08-05 | Cosman Eric R | Systeme de suivi d'objets optiques |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7203277B2 (en) | 2003-04-25 | 2007-04-10 | Brainlab Ag | Visualization device and method for combined patient and object image data |
EP1470791A1 (fr) * | 2003-04-25 | 2004-10-27 | BrainLAB AG | Appareil de visualisation de données d'imagerie médicale et d'images vidéo combinées, équipé de moyens de saisie d'entrées, et procédé de visualisation |
WO2004100815A2 (fr) * | 2003-05-16 | 2004-11-25 | Carl Zeiss | Dispositif d'eclairage d'un champ operatoire |
WO2004100815A3 (fr) * | 2003-05-16 | 2005-02-10 | Zeiss Carl | Dispositif d'eclairage d'un champ operatoire |
US7463823B2 (en) | 2003-07-24 | 2008-12-09 | Brainlab Ag | Stereoscopic visualization device for patient image data and video images |
EP1621153A1 (fr) * | 2004-07-28 | 2006-02-01 | BrainLAB AG | Appareil de visualisation stéréoscopique de données d'imagerie médicale et d'images vidéo combinées |
EP1969416A2 (fr) * | 2005-12-12 | 2008-09-17 | Universidade Federal de Sao Paulo - UNIFESP | Systeme de visualisation elargie de la realite avec informatique omnipresente |
EP1969416A4 (fr) * | 2005-12-12 | 2010-03-03 | Univ Fed Sao Paulo Unifesp | Systeme de visualisation elargie de la realite avec informatique omnipresente |
CN103211655A (zh) * | 2013-04-11 | 2013-07-24 | 深圳先进技术研究院 | 一种骨科手术导航系统及导航方法 |
US10571671B2 (en) | 2014-03-31 | 2020-02-25 | Sony Corporation | Surgical control device, control method, and imaging control system |
WO2015151447A1 (fr) * | 2014-03-31 | 2015-10-08 | Sony Corporation | Dispositif de commande chirurgical, procédé de commande, et système de commande d'imagerie |
EP3096065A1 (fr) * | 2015-05-21 | 2016-11-23 | Euromedis Groupe | Système vidéo comprenant une armature articulée |
FR3036458A1 (fr) * | 2015-05-21 | 2016-11-25 | Euromedis Groupe | Systeme video comprenant une armature articulee |
CN109715107A (zh) * | 2016-09-23 | 2019-05-03 | 索尼奥林巴斯医疗解决方案公司 | 医疗观察装置和医疗观察系统 |
EP3517070A4 (fr) * | 2016-09-23 | 2019-09-04 | Sony Olympus Medical Solutions Inc. | Dispositif d'observation médicale et système d'observation médicale |
US11432899B2 (en) | 2016-09-23 | 2022-09-06 | Sony Olympus Medical Solutions Inc. | Medical observation device and medical observation system |
CN107865702A (zh) * | 2016-09-28 | 2018-04-03 | 李健 | 一种医用智能手术显微系统 |
AT521076A1 (de) * | 2018-03-26 | 2019-10-15 | Bhs Tech Gmbh | Stereomikroskop zur Verwendung bei mikrochirurgischen Eingriffen am Patienten und Verfahren zur Steuerung des Stereomikroskops |
AT521076B1 (de) * | 2018-03-26 | 2020-11-15 | Bhs Tech Gmbh | Stereomikroskop zur Verwendung bei mikrochirurgischen Eingriffen am Patienten und Verfahren zur Steuerung des Stereomikroskops |
US11516437B2 (en) | 2018-03-26 | 2022-11-29 | Bhs Technologies Gmbh | Stereo microscope for use in microsurgical operations on a patient and method for controlling the stereo microscope |
US12126916B2 (en) | 2021-08-30 | 2024-10-22 | Proprio, Inc. | Camera array for a mediated-reality system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20050090730A1 (en) | Stereoscopic video magnification and navigation system | |
US6919867B2 (en) | Method and apparatus for augmented reality visualization | |
AU2019261643B2 (en) | Stereoscopic visualization camera and integrated robotics platform | |
US11336804B2 (en) | Stereoscopic visualization camera and integrated robotics platform | |
US9766441B2 (en) | Surgical stereo vision systems and methods for microsurgery | |
US11147443B2 (en) | Surgical visualization systems and displays | |
US6891518B2 (en) | Augmented reality visualization device | |
EP2903551B1 (fr) | Système numérique pour la capture et l'affichage d'une vidéo chirurgicale | |
US7907166B2 (en) | Stereo telestration for robotic surgery | |
JP2575586B2 (ja) | 外科用装置位置付けシステム | |
US10028651B2 (en) | Surgical visualization systems and displays | |
US9330477B2 (en) | Surgical stereo vision systems and methods for microsurgery | |
EP3912588B1 (fr) | Système d'imagerie pour robot chirurgical et robot chirurgical | |
US20060176242A1 (en) | Augmented reality device and method | |
WO2003002011A1 (fr) | Systeme video stereoscopique d'agrandissement et de navigation | |
CN1894618A (zh) | 用于立体观察实时或静态图像的系统 | |
EP4221581A1 (fr) | Microscope chirurgical numérique à auto-navigation | |
RU2802453C1 (ru) | Хирургическая система стереовидения | |
CN116568219A (zh) | 自动导航数字手术显微镜 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
122 | Ep: pct application non-entry in european phase | ||
NENP | Non-entry into the national phase |
Ref country code: JP |