EP1240418A1 - Method for the reliable automatic tracking of an endoscope and tracking of a surgical instrument with an electrically driven and controlled endoscope guidance system (EFS) in minimally invasive surgery - Google Patents

Method for the reliable automatic tracking of an endoscope and tracking of a surgical instrument with an electrically driven and controlled endoscope guidance system (EFS) in minimally invasive surgery

Info

Publication number
EP1240418A1
Authority
EP
European Patent Office
Prior art keywords
instrument
endoscope
tracking
image
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP00977518A
Other languages
German (de)
English (en)
Inventor
Wolfgang Eppler
Ralf Mikut
Udo Voges
Rainer Stotzka
Helmut Breitwieser
Reinhold Oberle
Harald Fischer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Forschungszentrum Karlsruhe GmbH
Original Assignee
Forschungszentrum Karlsruhe GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Forschungszentrum Karlsruhe GmbH filed Critical Forschungszentrum Karlsruhe GmbH
Publication of EP1240418A1
Legal status: Ceased

Links

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/313 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for introducing through surgical openings, e.g. laparoscopes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00147 Holding or positioning arrangements
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/50 Supports for surgical instruments, e.g. articulated arms
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A61B 2034/301 Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/367 Correlation of different images or relation of image positions in respect to the body creating a 3D dataset from 2D images using position information
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/361 Image-producing devices, e.g. surgical cameras

Definitions

  • The invention relates to a method for the safe automatic tracking of an endoscope and tracking of a surgical instrument with an electrically driven and controlled endoscope guidance system (EFS) for minimally invasive surgery.
  • EFS: endoscope guidance system.
  • In minimally invasive surgery, the surgeon works from a monitor image (original monitor).
  • An endoscope with a camera and the instruments necessary for the operation are inserted through trocars into the patient's body cavity.
  • Both the endoscope and the camera are often still guided manually.
  • The surgeon who guides the instruments instructs an assistant to adjust the endoscope with the camera so that the instrument remains visible in the image.
  • The advantages of this procedure are that the assistant guiding the endoscope avoids dangerous situations, recognizes errors, communicates with the surgeon and repositions the endoscope only when necessary. Disadvantages are the increased personnel cost compared with conventional operations and the assistant's unavoidable hand tremor.
  • Such an endoscope guide system for guiding an endoscopic camera unit is electrically driven and can be attached to any operating table.
  • Remote control is performed via an operating component, usually a joystick attached to the working instrument, or via voice input.
  • The inserted endoscope and the separately inserted instruments each have a point that is invariant with respect to their movement: the trocar puncture site on or in the patient's body wall. It allows these devices to be swiveled and tilted without injuring the patient beyond the puncture itself.
  • The camera of the endoscope guidance system is guided and mounted so that the lower edge of the image runs parallel to the patient support and the image does not appear upside down (see, for example, DE 196 09 034). Twisting the camera is possible but makes spatial orientation difficult.
  • An endoscope of such an endoscope guide system that projects into the patient's body has several degrees of freedom.
  • The EFS in DE 196 09 034 has four degrees of freedom of movement: rotation about a first axis perpendicular to the operating table through the puncture site on the body, rotation about a second axis perpendicular to the first and perpendicular to the puncturing direction, translation along a third axis, the trocar axis, and rotation about this latter axis.
  • The first three degrees of freedom are limited by limit switches.
  • Via the control component, e.g. on the handle of the instrument operated by the surgeon, the endoscope camera is steered in its viewing direction.
  • Each of the four degrees of freedom can thus be changed at a safety-limited speed.
  • An automatic tracking system can be built on the basis of such an endoscope control.
  • Such a control system is known from US 5,820,545.
  • In that system, the instrument tip is tracked continuously with every movement, which makes the image restless for the viewer. It requires custom-made electronics and therefore a considerable economic outlay. If the third dimension is to be captured, a corresponding 3-D camera device must be provided, which increases the equipment cost further. Error handling, as is necessary for reflections or changing lighting, is not provided.
  • The image section around the current instrument tip is tracked, and the operating surgeon sees two different images. Color, geometry or brightness coding of the instrument and position detection via magnetic probes on the working instrument are described.
  • Tracking refers to instruments or organs marked by color or geometry. Multi-colored markings for switching between tracking targets and for increasing safety through redundancy are mentioned.
  • The controlled element is the camera zoom, the position of the CCD chips in the camera, or an electronically implemented image selection on the monitor. Throughout, the system uses special cameras.
  • The object of the invention is to provide a fast, fault-tolerant and inexpensive method for automatically tracking an instrument tip with an endoscope that is moved only sparingly.
  • This object is achieved by a method with the features of claim 1; the aim is to retain the advantages of manual endoscope guidance even with automatic tracking.
  • The image-processing and endoscope-control part is strictly separated from the surgeon's original monitor, so errors in these parts do not affect the procedure he is following on it.
  • The detection of the instrument tip and the control of the endoscope with its axes, together with the zoom control, are treated as a unit, since the safety concept implemented in this way can detect errors both in the image recognition and in the assignment of the manipulated variables with high reliability. Identifiable error states, such as no instrument or several instruments in the image, are described below.
  • The endoscope setting is changed only if the instrument tip leaves a defined frame around the image center of the original monitor (the permissible range). The picture therefore remains calm for the surgeon as long as the instrument moves within this frame near the image center.
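As a minimal illustration of this permissible-range test (not taken from the patent; the image size and frame fraction in the example are assumed values), the endoscope would be repositioned only when the detected tip position leaves a centred frame:

```python
def outside_permissible_range(tip_x, tip_y, img_w, img_h, frame_frac=0.4):
    """Return True if the instrument tip has left the centred frame (the
    permissible range), i.e. the endoscope setting needs to be changed.
    frame_frac is the assumed fraction of the image width/height that the
    permissible range occupies around the image centre."""
    half_w = 0.5 * frame_frac * img_w
    half_h = 0.5 * frame_frac * img_h
    cx, cy = img_w / 2.0, img_h / 2.0
    return abs(tip_x - cx) > half_w or abs(tip_y - cy) > half_h

# Example: a tip at (500, 300) in a 768x576 frame is still inside the frame,
# so no repositioning command is issued and the image stays calm.
if outside_permissible_range(500, 300, 768, 576):
    pass  # issue an endoscope repositioning command here
```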
  • The instrument tip is marked by shape or color, or is recognized directly by its characteristic shape, in order to achieve fast recognition. Nevertheless, it cannot be avoided that these features vary between instruments. For this reason, an online adaptation of the characteristic marking properties with neural or statistical learning methods leads to reliable and flexible instrument recognition.
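One conceivable statistical adaptation, sketched here only for illustration (the patent does not prescribe a specific algorithm; class name, colour feature and adaptation rate are assumptions), is an exponentially weighted colour model of the marking that is updated whenever the tip has been detected with high confidence:

```python
import numpy as np

class AdaptiveColorModel:
    """Simple statistical feature adaptation (illustrative only): keeps an
    exponentially weighted mean and variance of the marker colour and scores
    new pixels by their variance-normalised squared distance."""

    def __init__(self, init_mean, init_var, alpha=0.05):
        self.mean = np.asarray(init_mean, dtype=float)   # e.g. RGB mean of the marking
        self.var = np.asarray(init_var, dtype=float)
        self.alpha = alpha                               # adaptation rate

    def update(self, observed_color):
        """Adapt the model with a confidently detected marker colour."""
        c = np.asarray(observed_color, dtype=float)
        self.mean = (1 - self.alpha) * self.mean + self.alpha * c
        self.var = (1 - self.alpha) * self.var + self.alpha * (c - self.mean) ** 2

    def score(self, pixel_color):
        """Smaller score = more likely to belong to the marking."""
        d = (np.asarray(pixel_color, dtype=float) - self.mean) ** 2
        return float(np.sum(d / (self.var + 1e-6)))
```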
  • Standard components such as computers, operating systems and cameras are completely sufficient to carry out all of these procedural steps.
  • The system requires observation with only a single camera, a 2-D camera, and performs the tracking on the basis of two-dimensional image information.
  • Compared with a 3-D camera, the use of a single video channel is therefore sufficient (claim 9), which reduces the hardware outlay for image processing.
  • The instrument tip is to be held in the center of the image on the original monitor; movements perpendicular to the image plane are therefore ignored. If they are nevertheless to be detected, for example for zoom control or for camera movement perpendicular to the image plane, further measures must be taken.
  • One is an additional sensor on the trocar of the instrument, which measures the insertion depth (claim 7), thus reducing the two-channel image processing required for 3-D acquisition to a single channel as for 2-D acquisition.
  • Another is to estimate roughly the distance between the endoscope and the instrument tip from the perspective distortion of the parallel edges of the instrument. This presupposes that the focal length of the camera and the width and length dimensions of the instrument are known.
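Under the usual pinhole-camera assumption, such a rough distance estimate reduces to a single proportionality. The sketch below (with illustrative, assumed numbers) shows the relation between known instrument width, apparent width in the image and focal length:

```python
def estimate_depth_mm(instrument_width_mm, apparent_width_px, focal_length_px):
    """Rough distance between endoscope and instrument tip from the pinhole
    relation  Z = f * W / w  (f expressed in pixels, W in millimetres).

    instrument_width_mm : known physical width of the instrument shaft
    apparent_width_px   : measured width of the shaft in the image
    focal_length_px     : camera focal length expressed in pixels
    """
    if apparent_width_px <= 0:
        raise ValueError("apparent width must be positive")
    return focal_length_px * instrument_width_mm / apparent_width_px

# Example with assumed numbers: a 10 mm shaft imaged 80 px wide by a camera
# with f = 600 px lies roughly 75 mm in front of the endoscope.
print(estimate_depth_mm(10.0, 80.0, 600.0))
```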
  • The top priority remains with the operating surgeon, who can intervene in the endoscope control at any time and cancel the tracking.
  • The setting work is preceded by the central division of the monitor area, which is carried out before the operation during the functional test.
  • The endoscope setting is changed automatically only when the tip of the instrument leaves the permissible range (claim 2), so the image remains pleasantly calm. To make this possible, the region of the instrument tip is mapped in the computer and a model sufficient for its identification is created (claim 3).
  • One method of doing this is specified in claim 4 and consists of generating a gradient image, segmenting the object edges and determining the third dimension by calculating the edge line using linear regression.
  • The gradient image can be generated with a Sobel filter (claim 5).
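A minimal sketch of such a pipeline, assuming OpenCV and NumPy are available (the threshold value and the simple y = a·x + b line model are illustrative choices, not taken from the patent):

```python
import cv2
import numpy as np

def edge_line_from_gradient(gray, grad_thresh=80.0):
    """Illustrative pipeline: 3x3 Sobel gradient image, simple threshold
    segmentation of strong edges, and a least-squares (linear regression)
    fit of the dominant edge line  y = a*x + b."""
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    magnitude = np.hypot(gx, gy)

    ys, xs = np.nonzero(magnitude > grad_thresh)   # candidate edge pixels
    if xs.size < 2:
        return None                                # no usable edge found
    a, b = np.polyfit(xs, ys, deg=1)               # linear regression of the edge line
    # (a nearly vertical edge would need x as a function of y instead)
    return a, b

# Usage sketch (file name assumed):
# gray = cv2.cvtColor(cv2.imread("frame.png"), cv2.COLOR_BGR2GRAY)
# line = edge_line_from_gradient(gray)
```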
  • The advantage of redundancy is that image processing and the redundant sensors have complementary strengths and weaknesses.
  • Image processing is sensitive to occlusion of the instrument tip and to contamination of the optics.
  • Position sensors on the instrument guidance system can supply incorrect information in the event of electromagnetic interference in the operating room, inaccuracies due to the different lengths of the instruments used or inaccuracies in the determination of the reference coordinate systems between endoscope and instrument guidance, or they can fail during the operation.
  • If both image processing and position sensors on the instrument guidance are used, their results can be compared and checked for consistency. From the way the errors develop, it can in many cases be concluded which of the sensor signals reproduces the current situation without error.
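A simple form of such a consistency check could look like the following sketch (illustrative only; the tolerance value and the assumption that both positions are expressed in image coordinates are not taken from the patent):

```python
import math

def consistent(image_xy, sensor_xy, tol_px=40.0):
    """Plausibility check between the tip position found by image processing
    and the position predicted from the instrument-guidance sensors, both
    given in image coordinates. Returns True if they agree within the
    assumed tolerance."""
    dx = image_xy[0] - sensor_xy[0]
    dy = image_xy[1] - sensor_xy[1]
    return math.hypot(dx, dy) <= tol_px

# If the two channels disagree, tracking can be stopped, or the channel whose
# error history makes it the likelier culprit can be ignored temporarily.
```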
  • The degree of redundancy of the degrees of freedom of the endoscope guidance system is determined by the number of surplus axes that are not directly necessary for centering the object in the original-monitor image. These can be extracorporeal axes of the EFS (rotation about the vertical axis, rotation about the horizontal axis, and rotation about as well as translation along the trocar axis), but also further degrees of freedom that result from the use of endoscopes with flexible, pivotable distal sections, i.e. so-called intracorporeal axes or degrees of freedom (claim 8).
  • This process concept results in a very high level of safety and great fault tolerance.
  • In simple detection situations the method works at increased processing speed, particularly in the image processing, and in complicated detection situations, such as unfavorable lighting or similarity between instrument tip and surroundings, it is still able to track at reduced speed.
  • The tracking of the endoscope always remains at least fast enough that no impatience is provoked in the operating surgeon.
  • The method optionally allows additional sensor information to be integrated, such as that of magnetic probes on the guidance system of the working instrument or the measurement of the insertion depth at the trocar, in order to compensate for the temporary failure of individual sensors in the multi-sensor environment (for example due to contamination of the instrument tip during optical measurement), to check the plausibility of the evaluated sensor information and thus, finally, to increase safety.
  • The system is made up of commercially available components or subsystems and can therefore be implemented in an economically acceptable manner.
  • FIG. 1: hierarchy of the method
  • FIG. 3: state graph of the automatic tracking
  • FIG. 4: image areas on the original monitor
  • FIG. 6: schematic view of the endoscope guidance system
  • The safety standard for medical devices is very high. The core of the automatic endoscope tracking is therefore the fault-tolerant procedure, which works with multiple redundancy and thus guarantees the required safety. Additional safety is gained by relieving the surgeon of technical steps wherever possible. Different levels of automatic tracking provide support as needed, so that the doctor can operate the instruments necessary for the operation intuitively and confidently. This is ensured by the calm path guidance, the speed limit on the tracking and the voice output; via the output medium (MMI monitor, LCD display or voice output) the doctor is informed about errors and critical states of the system, such as a dirty endoscope.
  • Sovereignty also means that the surgeon works with a monitor that is independent of the tracking system, the original monitor, and always has the hierarchically superior option of switching off the tracking system.
  • This structure of requirements is shown in FIG. 1, which depicts the hierarchy starting from the central requirement of safety.
  • Fault tolerance is achieved by one or more of the following measures: treating object detection and control as a unit; multiple handling of possible error states, both by individual components of the image processing and control and by a higher-level monitoring unit; a multi-sensor concept; adaptive feature adaptation; and 3-D reconstruction.
  • The advantage of handling object detection and control as a unit is that it allows conclusions to be drawn about the causes of errors. If, for example, the last positioning actions are known, the probable positions of the instrument marking can be inferred with greater accuracy, and thus a higher degree of recognition reliability can be achieved. In addition to improved communication with the surgeon, determining the cause of an error has the advantage that adequate system reactions can be chosen.
  • An example system configuration of the endoscope guidance system is shown schematically by the system structure in Figure 2 and consists of the following blocks connected by cable: the basic EFS with four degrees of freedom (left/right, up/down, turn and in/out), including the electronic control and the limit switches on the corresponding axes of the degrees of freedom; the 2-D video endoscope with video output (red/green/blue, RGB), original monitor and light source;
  • the MMI (man-machine interface) monitor;
  • the TTL logic interface;
  • the user interface in the form of a hand switch with the joystick for manual operation;
  • the tracking control. The input variables of the tracking control consist of the binary input (BI) "tracking stop" and the video signal with three channels (RGB) and synchronization.
  • The output variables are the manipulated variables passed to the endoscope control.
  • The main task of the automatic tracking function is to keep the currently relevant instrument tip in the central region (see Figure 4).
  • The necessary control sequence is shown in the state graph according to FIG. 3.
  • The release circuit for automatic tracking is initiated within the system.
  • The automatic tracking is released by the operating surgeon via the ring switch on the control unit (see Figure 6) and remains active until it is stopped by pressing the stop button, by operating the joystick, or automatically.
  • The tracking is stopped automatically, for example in the error cases described below.
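The state graph itself (FIG. 3) is not reproduced here; purely as an illustration, the start/stop behaviour described above can be reduced to a small state machine (the state and event names are assumptions, not taken from the patent):

```python
from enum import Enum, auto

class TrackingState(Enum):
    IDLE = auto()       # automatic tracking not released
    TRACKING = auto()   # released via the ring switch and active
    STOPPED = auto()    # stopped by the surgeon or automatically

def next_state(state, event):
    """Hypothetical reduction of the state graph (FIG. 3) to three states."""
    if state is TrackingState.IDLE and event == "ring_switch":
        return TrackingState.TRACKING
    if state is TrackingState.TRACKING and event in ("stop_button", "joystick", "error"):
        return TrackingState.STOPPED
    if state is TrackingState.STOPPED and event == "ring_switch":
        return TrackingState.TRACKING
    return state
```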
  • The automatic tracking works with limited positioning speeds of up to 10 cm/s or 30°/s, which can also be restricted or adapted depending on the application (abdominal, lung or cardiac surgery, for example) and on an individual basis, so that the surgeon can react in time to unwanted situations.
  • There is an adjustment limit for the axis positions which limits tilting and swiveling, limits the translatory movement along the trocar axis and does not allow full rotation about the shaft axis (see FIG. 7).
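A minimal sketch of how such rate and range limits could be enforced per axis; apart from the 10 cm/s and 30°/s caps named above, the concrete range values in the example are assumptions:

```python
def limit_command(value, rate, max_rate, min_pos, max_pos, dt):
    """Clip a commanded axis rate to its safety limit (e.g. 10 cm/s or 30 deg/s)
    and keep the resulting position inside the permitted adjustment range."""
    rate = max(-max_rate, min(max_rate, rate))
    new_value = value + rate * dt
    return max(min_pos, min(max_pos, new_value))

# Example with assumed ranges: trocar-axis translation capped at 100 mm/s (10 cm/s),
# rotation about the shaft axis capped at 30 deg/s and kept short of a full turn.
pos_mm   = limit_command(value=42.0, rate=150.0, max_rate=100.0, min_pos=0.0,    max_pos=80.0,  dt=0.02)
roll_deg = limit_command(value=10.0, rate=45.0,  max_rate=30.0,  min_pos=-170.0, max_pos=170.0, dt=0.02)
```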
  • The instrument tip, which may carry an additional marking, is recognized automatically by means of its image stored in the computer; its center position (x and y position in the two-dimensional camera image), the detection reliability, the size of the identified instrument tip and further information for error detection are passed on to the controller.
  • The detection of the instrument tip runs on its own and is independent of the release of the tracking.
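These quantities could be grouped, for example, in a small record handed from the image processing to the tracking controller (the field names are illustrative, not taken from the patent):

```python
from dataclasses import dataclass, field

@dataclass
class TipDetection:
    """Result handed from image processing to the tracking controller."""
    x: float                 # tip centre, image x coordinate (pixels)
    y: float                 # tip centre, image y coordinate (pixels)
    reliability: float       # detection reliability in [0, 1]
    size_px: float           # apparent size of the identified tip
    errors: list = field(default_factory=list)  # e.g. ["no_instrument"], ["multiple_instruments"]
```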
  • The image processing (FIG. 2) recognizes errors that occur, such as no instrument in the image or several instruments in the image, and stops the automatic tracking in these cases.
  • The automatic tracking changes the position of the endoscope until the instrument tip is again in the central region.
  • This task is solved by the path control (see FIG. 2), which continuously processes the measured position of the instrument tip in the camera image.
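One possible control step of such a path control, kept deliberately simple as a sketch (a proportional law; the gain, the central-region threshold and the sign convention are assumptions):

```python
def path_control_step(tip_x, tip_y, reliability, img_w, img_h, gain=0.005):
    """One cycle of a simple proportional path control (illustrative only):
    the pixel error between the tip and the image centre is mapped to
    pan/tilt rate commands; the rates are zero while the tip is inside the
    central region, so the image stays calm."""
    if reliability < 0.5:
        return 0.0, 0.0                       # stop on unreliable detection
    ex = tip_x - img_w / 2.0
    ey = tip_y - img_h / 2.0
    if abs(ex) < 0.2 * img_w and abs(ey) < 0.2 * img_h:
        return 0.0, 0.0                       # inside central region: no movement
    return -gain * ex, -gain * ey             # pan rate, tilt rate (sign convention assumed)
```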
  • The status of the automatic tracking and any error messages are shown on the MMI monitor, so that there is no need to interfere with the transmission of the camera image between the camera and the original monitor.
  • With a known focal length of the endoscope, the third dimension can be estimated with sufficient accuracy.
  • The most important task in depth estimation is to determine the size of the object in the image.
  • "Object" here can also mean a marking with sharp edges that is easy to recognize on the instrument.
  • The simplest method of detection is to determine the diameter of the segmented marking region. This proves to be imprecise, since the different orientations of the endoscope and the properties of the central projection lead to deformations that do not allow an exact determination of the object width.
  • A better method for determining the instrument width at the tip segments the edges of the object and then determines the distances of the edge points to the calculated center of gravity. This has the advantage that the width of the object is determined largely independently of the orientation and largely unaffected by the projection.
  • A filter, for example a 3x3 Sobel filter, is applied to the transformed grayscale image before an edge-tracking algorithm is started.
  • The edges found in this way have the disadvantage that their width can vary widely. What is required is a thin edge line with a consistent width of one pixel, so that the distances, and hence the edges, can be determined more precisely.
  • The accuracy of the distance determination depends essentially on the quality of the edge extraction.
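As one possible reading of this width determination (a sketch only; the principal-axis construction is a choice made here to obtain orientation independence, not a step prescribed by the patent), the apparent width can be taken as the spread of the thinned edge pixels perpendicular to the shaft axis:

```python
import numpy as np

def instrument_width_px(edge_mask):
    """Estimate the apparent instrument width from a thin edge mask:
    fit the principal (shaft) axis through the edge pixels around their
    centre of gravity and measure the spread perpendicular to it, so the
    result is largely independent of the instrument orientation."""
    ys, xs = np.nonzero(edge_mask)
    if xs.size < 2:
        return None
    pts = np.column_stack((xs, ys)).astype(float)
    pts -= pts.mean(axis=0)                   # centre of gravity at the origin
    _, vecs = np.linalg.eigh(np.cov(pts.T))   # eigenvectors of the 2x2 covariance
    minor_axis = vecs[:, 0]                   # direction of smallest spread
    d = pts @ minor_axis                      # perpendicular offsets from the shaft axis
    return float(d.max() - d.min())           # apparent width in pixels
```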

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Robotics (AREA)
  • Endoscopes (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)

Abstract

The invention relates to a method for the reliable automatic tracking of an endoscope and tracking of a surgical instrument with an electrically driven and controlled endoscope guidance system (EFS) in minimally invasive surgery. The method rests on three fundamental elements: computer-assisted fault-tolerance handling, intuitive use by the operating surgeon, and the surgeon's sovereignty. Together these guarantee high reliability during the operation and mean that the surgeon and the surgical team are relieved of momentary, non-priority tasks and manipulations that demand a great deal of concentration.
EP00977518A 1999-12-22 2000-11-09 Procede de poursuite automatique fiable d'un endoscope et pistage (tracking) d'un instrument chirurgical avec un systeme de guidage d'endoscope (efs) a entrainement et a commande electriques, en chirurgie a effraction minimale Ceased EP1240418A1 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE19961971 1999-12-22
DE19961971A DE19961971B4 (de) 1999-12-22 1999-12-22 Vorrichtung zum sicheren automatischen Nachführen eines Endoskops und Verfolgen eines Instruments
PCT/EP2000/011062 WO2001046577A2 (fr) 1999-12-22 2000-11-09 Procede de poursuite automatique fiable d'un endoscope et pistage (tracking) d'un instrument chirurgical avec un systeme de guidage d'endoscope (efs) a entrainement et a commande electriques, en chirurgie a effraction minimale

Publications (1)

Publication Number Publication Date
EP1240418A1 true EP1240418A1 (fr) 2002-09-18

Family

ID=7933779

Family Applications (1)

Application Number Title Priority Date Filing Date
EP00977518A Ceased EP1240418A1 (fr) 1999-12-22 2000-11-09 Procede de poursuite automatique fiable d'un endoscope et pistage (tracking) d'un instrument chirurgical avec un systeme de guidage d'endoscope (efs) a entrainement et a commande electriques, en chirurgie a effraction minimale

Country Status (4)

Country Link
US (1) US20020156345A1 (fr)
EP (1) EP1240418A1 (fr)
DE (1) DE19961971B4 (fr)
WO (1) WO2001046577A2 (fr)

Families Citing this family (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8944070B2 (en) 1999-04-07 2015-02-03 Intuitive Surgical Operations, Inc. Non-force reflecting method for providing tool force information to a user of a telesurgical system
WO2003030763A1 (fr) * 2001-10-05 2003-04-17 Boston Innovative Optics, Inc. Systeme et procede permettant d'obtenir de la documentation visuelle au cours d'une intervention chirurgicale
WO2003092498A1 (fr) * 2002-05-02 2003-11-13 Medigus Ltd. Lumiere d'entree pour endoscopes et laparoscopes
US8211010B2 (en) * 2002-10-29 2012-07-03 Olympus Corporation Endoscope information processor and processing method
DE10313829B4 (de) * 2003-03-21 2005-06-09 Aesculap Ag & Co. Kg Verfahren und Vorrichtung zur Auswahl eines Bildausschnittes aus einem Operationsgebiet
DE102004011888A1 (de) * 2003-09-29 2005-05-04 Fraunhofer Ges Forschung Vorrichtung zur virtuellen Lagebetrachtung wenigstens eines in einen Körper intrakorporal eingebrachten medizinischen Instruments
US9943372B2 (en) * 2005-04-18 2018-04-17 M.S.T. Medical Surgery Technologies Ltd. Device having a wearable interface for improving laparoscopic surgery and methods for use thereof
US9789608B2 (en) 2006-06-29 2017-10-17 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical robot
US10555775B2 (en) 2005-05-16 2020-02-11 Intuitive Surgical Operations, Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US7962195B2 (en) 2006-06-01 2011-06-14 Biosense Webster, Inc. Model-based correction of position measurements
US8062211B2 (en) * 2006-06-13 2011-11-22 Intuitive Surgical Operations, Inc. Retrograde instrument
US10258425B2 (en) 2008-06-27 2019-04-16 Intuitive Surgical Operations, Inc. Medical robotic system providing an auxiliary view of articulatable instruments extending out of a distal end of an entry guide
US10008017B2 (en) 2006-06-29 2018-06-26 Intuitive Surgical Operations, Inc. Rendering tool information as graphic overlays on displayed images of tools
US9718190B2 (en) 2006-06-29 2017-08-01 Intuitive Surgical Operations, Inc. Tool position and identification indicator displayed in a boundary area of a computer display screen
US20090192523A1 (en) 2006-06-29 2009-07-30 Intuitive Surgical, Inc. Synthetic representation of a surgical instrument
US20080004610A1 (en) * 2006-06-30 2008-01-03 David Miller System for calculating IOL power
US9089256B2 (en) 2008-06-27 2015-07-28 Intuitive Surgical Operations, Inc. Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide
US9469034B2 (en) 2007-06-13 2016-10-18 Intuitive Surgical Operations, Inc. Method and system for switching modes of a robotic system
US8620473B2 (en) * 2007-06-13 2013-12-31 Intuitive Surgical Operations, Inc. Medical robotic system with coupled control modes
US9138129B2 (en) 2007-06-13 2015-09-22 Intuitive Surgical Operations, Inc. Method and system for moving a plurality of articulated instruments in tandem back towards an entry guide
US9084623B2 (en) 2009-08-15 2015-07-21 Intuitive Surgical Operations, Inc. Controller assisted reconfiguration of an articulated instrument during movement into and out of an entry guide
GB2454017A (en) * 2007-10-26 2009-04-29 Prosurgics Ltd A control assembly
US9168173B2 (en) * 2008-04-04 2015-10-27 Truevision Systems, Inc. Apparatus and methods for performing enhanced visually directed procedures under low ambient light conditions
US8864652B2 (en) * 2008-06-27 2014-10-21 Intuitive Surgical Operations, Inc. Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the positioning and orienting of its tip
US10117721B2 (en) * 2008-10-10 2018-11-06 Truevision Systems, Inc. Real-time surgical reference guides and methods for surgical applications
US9226798B2 (en) * 2008-10-10 2016-01-05 Truevision Systems, Inc. Real-time surgical reference indicium apparatus and methods for surgical applications
US9173717B2 (en) 2009-02-20 2015-11-03 Truevision Systems, Inc. Real-time surgical reference indicium apparatus and methods for intraocular lens implantation
DE102009010263B4 (de) * 2009-02-24 2011-01-20 Reiner Kunz Verfahren zur Navigation eines endoskopischen Instruments bei der technischen Endoskopie und zugehörige Vorrichtung
WO2010125481A1 (fr) * 2009-04-29 2010-11-04 Koninklijke Philips Electronics, N.V. Estimation de la profondeur en temps réel à partir d'images endoscopiques monoculaires
US8918211B2 (en) 2010-02-12 2014-12-23 Intuitive Surgical Operations, Inc. Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument
US9492927B2 (en) 2009-08-15 2016-11-15 Intuitive Surgical Operations, Inc. Application of force feedback on an input device to urge its operator to command an articulated instrument to a preferred pose
US8784443B2 (en) * 2009-10-20 2014-07-22 Truevision Systems, Inc. Real-time surgical reference indicium apparatus and methods for astigmatism correction
US20110213342A1 (en) * 2010-02-26 2011-09-01 Ashok Burton Tripathi Real-time Virtual Indicium Apparatus and Methods for Guiding an Implant into an Eye
DE102010029275A1 (de) * 2010-05-25 2011-12-01 Siemens Aktiengesellschaft Verfahren zum Bewegen eines Instrumentenarms eines Laparoskopierobotors in einer vorgebbare Relativlage zu einem Trokar
US9452276B2 (en) 2011-10-14 2016-09-27 Intuitive Surgical Operations, Inc. Catheter with removable vision probe
US10238837B2 (en) 2011-10-14 2019-03-26 Intuitive Surgical Operations, Inc. Catheters with control modes for interchangeable probes
US20130303944A1 (en) 2012-05-14 2013-11-14 Intuitive Surgical Operations, Inc. Off-axis electromagnetic sensor
KR101876386B1 (ko) * 2011-12-29 2018-07-11 삼성전자주식회사 의료용 로봇 시스템 및 그 제어 방법
TWI517828B (zh) * 2012-06-27 2016-01-21 國立交通大學 影像追蹤系統及其影像追蹤方法
CA2883498C (fr) 2012-08-30 2022-05-31 Truevision Systems, Inc. Systeme d'imagerie et procedes affichant une image reconstituee multidimensionnelle et fusionnee
US10507066B2 (en) 2013-02-15 2019-12-17 Intuitive Surgical Operations, Inc. Providing information of tools by filtering image areas adjacent to or on displayed images of the tools
DE102013108228A1 (de) 2013-07-31 2015-02-05 MAQUET GmbH Assistenzeinrichtung zur bildgebenden Unterstützung eines Operateurs während eines chirurgischen Eingriffs
US10744646B2 (en) 2013-08-29 2020-08-18 Wayne State University Camera control system and method
DE102013109677A1 (de) 2013-09-05 2015-03-05 MAQUET GmbH Assistenzeinrichtung zur bildgebenden Unterstützung eines Operateurs während eines chirurgischen Eingriffs
DE102014118962A1 (de) * 2014-12-18 2016-06-23 Karl Storz Gmbh & Co. Kg Lagebestimmung eines minimalinvasiven Instrumentes
DE102015100927A1 (de) 2015-01-22 2016-07-28 MAQUET GmbH Assistenzeinrichtung und Verfahren zur bildgebenden Unterstützung eines Operateurs während eines chirurgischen Eingriffs unter Verwendung mindestens eines medizinischen Instrumentes
JP6275335B2 (ja) * 2015-05-28 2018-02-07 オリンパス株式会社 内視鏡システム
WO2018159155A1 (fr) * 2017-02-28 2018-09-07 ソニー株式会社 Système d'observation médicale, dispositif de commande et procédé de commande
US10917543B2 (en) 2017-04-24 2021-02-09 Alcon Inc. Stereoscopic visualization camera and integrated robotics platform
US11083537B2 (en) 2017-04-24 2021-08-10 Alcon Inc. Stereoscopic camera with fluorescence visualization
US10299880B2 (en) 2017-04-24 2019-05-28 Truevision Systems, Inc. Stereoscopic visualization camera and platform
WO2018225132A1 (fr) * 2017-06-05 2018-12-13 オリンパス株式会社 Système médical et procédé pour faire fonctionner le système médical
DE102021204031A1 (de) 2021-04-22 2022-10-27 Carl Zeiss Meditec Ag Verfahren zum Betreiben eines Operationsmikroskops und Operationsmikroskop
CN114191099B (zh) * 2022-01-14 2023-12-01 山东威高手术机器人有限公司 微创手术机器人主从跟踪延时测试方法
DE102022118328A1 (de) 2022-07-21 2024-02-01 Karl Storz Se & Co. Kg Steuervorrichtung und System
CN117953043B (zh) * 2024-03-26 2024-06-21 北京云力境安科技有限公司 一种基于内镜图像的区域测量方法、装置及存储介质
CN118319213B (zh) * 2024-06-11 2024-09-27 湖南省华芯医疗器械有限公司 一种引导鞘管、内窥镜组件及检测方法

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5279309A (en) * 1991-06-13 1994-01-18 International Business Machines Corporation Signaling device and method for monitoring positions in a surgical operation
US5417210A (en) * 1992-05-27 1995-05-23 International Business Machines Corporation System and method for augmentation of endoscopic surgery
US5631973A (en) * 1994-05-05 1997-05-20 Sri International Method for telemanipulation with telepresence
AT399647B (de) * 1992-07-31 1995-06-26 Truppe Michael Anordnung zur darstellung des inneren von körpern
US5829444A (en) * 1994-09-15 1998-11-03 Visualization Technology, Inc. Position tracking and imaging system for use in medical applications
US5836869A (en) * 1994-12-13 1998-11-17 Olympus Optical Co., Ltd. Image tracking endoscope system
WO1996028107A1 (fr) * 1995-03-10 1996-09-19 Forschungszentrum Karlsruhe Gmbh Dispositif de guidage d'instruments chirurgicaux destines a la chirurgie endoscopique
US5887121A (en) * 1995-04-21 1999-03-23 International Business Machines Corporation Method of constrained Cartesian control of robotic mechanisms with active and passive joints
DE19529950C1 (de) * 1995-08-14 1996-11-14 Deutsche Forsch Luft Raumfahrt Verfahren zum Nachführen eines Stereo-Laparoskops in der minimalinvasiven Chirurgie
US6671058B1 (en) * 1998-03-23 2003-12-30 Leica Geosystems Ag Method for determining the position and rotational position of an object
US6546277B1 (en) * 1998-04-21 2003-04-08 Neutar L.L.C. Instrument guidance system for spinal and other surgery

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO0146577A2 *

Also Published As

Publication number Publication date
US20020156345A1 (en) 2002-10-24
WO2001046577A8 (fr) 2008-01-17
DE19961971A1 (de) 2001-07-26
DE19961971B4 (de) 2009-10-22
WO2001046577A2 (fr) 2001-06-28

Similar Documents

Publication Publication Date Title
EP1240418A1 (fr) Procede de poursuite automatique fiable d'un endoscope et pistage (tracking) d'un instrument chirurgical avec un systeme de guidage d'endoscope (efs) a entrainement et a commande electriques, en chirurgie a effraction minimale
DE69322202T2 (de) System und Verfahren zur Verbesserung von endoskopischer Chirurgie
EP3363358B1 (fr) Dispositif de détermination et recouvrement d'un point de référence lors d'une intervention chirurgicale
EP2449997B1 (fr) Poste de travail médical
DE102018206406B3 (de) Mikroskopiesystem und Verfahren zum Betrieb eines Mikroskopiesystems
DE102014016823B4 (de) Verfahren zum Steuern eines an einem Haltearm gekoppelten mechatronischen Assistenzsystems
EP1638064B1 (fr) Simulateur pour l'entrainement aux techniques opératoires minimalement invasive
WO2008058520A2 (fr) Dispositif de génération d'images pour un opérateur
EP1284673A1 (fr) Guidage de camera automatique robotise a l'aide de detecteurs de position destine a des interventions laparoscopiques
WO2015014669A1 (fr) Procédé et dispositif pour définir une zone de travail d'un robot
DE10249786A1 (de) Referenzierung eines Roboters zu einem Werkstück und Vorrichtung hierfür
DE202013012276U1 (de) Vorrichtung zum unterstützen eines laparoskopischeneingriffs - lenk- und manövriergelenkwerkzeug
DE102018125592A1 (de) Steuerungsanordnung, Verfahren zur Steuerung einer Bewegung eines Roboterarms und Behandlungsvorrichtung mit Steuerungsanordnung
DE4412073A1 (de) Operationsmikroskop-Einheit
DE102015216573A1 (de) Digitales Operationsmikroskopiesystem
DE102020215559B4 (de) Verfahren zum Betreiben eines Visualisierungssystems bei einer chirurgischen Anwendung und Visualisierungssystem für eine chirurgische Anwendung
DE102014210056A1 (de) Verfahren zur Ansteuerung eines chirurgischen Geräts sowie chirurgisches Gerät
DE102004052753A1 (de) Verfahren und Operations-Assistenz-System zur Steuerung der Nachführung zumindest eines Hilfsinstrumentes bei einem medizinisch minimal-invasiven Eingriff
EP4284290A1 (fr) Système d'assistance chirurgical à microscope opératoire et caméra et procédé de visualisation
DE102018206405B3 (de) Mikroskopiesystem sowie Verfahren zum Betrieb eines Mikroskopiesystems
DE102020126029A1 (de) Chirurgisches Assistenzsystem und Darstellungsverfahren
DE102020205546A1 (de) Überwachungsverfahren und medizinisches System
DE102020114416A1 (de) System zur Überwachung einer Operationsleuchtenanordnung
DE102008046344A1 (de) Vorrichtung zum Überwachen eines räumlichen Bereichs, insbesondere des Umfelds eines medizinischen Geräts
DE10235795A1 (de) Medizinische Vorrichtung

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20020326

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR

RIN1 Information on inventor provided before grant (corrected)

Inventor name: FISCHER, HARALD

Inventor name: MIKUT, RALF

Inventor name: VOGES, UDO

Inventor name: OBERLE, REINHOLD

Inventor name: EPPLER, WOLFGANG

Inventor name: BREITWIESER, HELMUT

Inventor name: STOTZKA, RAINER

17Q First examination report despatched

Effective date: 20030918

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20040207