WO2017054818A1 - Endoscopic image processing system for insertion into a data line - Google Patents
Endoscopic image processing system for insertion into a data line
- Publication number
- WO2017054818A1 (application PCT/DE2016/200451, DE2016200451W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- output
- information
- input
- camera
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
Definitions
- the invention relates to an image processing system of the type specified in the preamble of claim 1.
- Such image processing systems are widely used today in minimally invasive surgery. Due to the short focal length of the cameras used, they have a relatively large depth of field, which is also necessary so that the operator has a good overview of the work area and the objects viewed do not get out of the focus area with each movement of the endoscope.
- the corresponding cameras usually have a fixed distance setting that is adapted to the work area.
- the depth of field in known systems may cover, for example, a range from 1 mm to infinity. Since the objects of a work area to be observed can thus be located at different distances from the camera optics, the size displayed on a reproduction monitor cannot be used as a reference for other elements or implants to be introduced into the work area. The true size of objects and the distances between them can therefore only be estimated with great difficulty under endoscopic observation.
- the surgical measuring technique used in neurology relies on additional imaging systems with different physical bases in order to merge image data recorded before the operation into the running image of the endoscope camera, so as to give the surgeon an overview of the anatomical conditions.
- the shape of at least a part of the auxiliary instrument in the optical imaging system is determined, and from this a factor for the image scale is calculated with respect to a reference point, which is preferably located at a point of the instrument that touches an organ first when contact is made.
- the invention has the object of providing an image processing system of the type mentioned in which essential parts of the existing device can be retained, despite the provision of means that make it possible to perform measurements in the current image and to display them.
- the invention is based on the recognition that, despite the provision of evaluation devices which take measurements in the current image and display them on the monitor, these calculations and the processing of the image can be performed by a device inserted into the monitor line, provided the calculations are performed practically in real time, that is, without significant time delay.
- the input line has means for connecting to the video output of a system for processing video signals recorded by an endoscope camera and intended for display on a monitor.
- the device shown here is suitable for connection to any device having a monitor output for displaying video images recorded via an endoscope camera.
- the output line has means for connection to a monitor which serves for image reproduction during a - especially medical - operation, so that the device can be inserted into a conventional device arrangement, provided only that the plug standards and the video format are matched to each other.
- the processing part provided between the input and output data lines has a data and video processor which adds image contents to the video input signal that have been generated by processing from the current video input signal.
- the added contents are the representation of survey information, in analog or digital form, which can be evaluated via its optical information as distance information recorded in the detection range of the endoscope camera of the upstream system. This may be based on a scaling device visible in the image. Both the imaging properties of the endoscope camera and those of the scaling device must be known.
- At least one graphic element is then generated by the data and video processor and superimposed on the input signal directly and in real time via summing means, so that it is visible in the video signal that reaches the output data line, and thus the monitor, from the output of the summation circuit.
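In software terms, this summation of a generated graphic onto the live video frame can be modeled as a masked pixel overwrite. The following Python sketch is an illustrative model of the overlay step only, not the patent's summation circuit; all names and the crosshair example are invented for the sketch.

```python
import numpy as np

def overlay_graphic(frame, graphic, mask):
    """Superimpose a generated graphic onto a video frame.

    frame, graphic: HxWx3 uint8 arrays; mask: HxW bool array marking
    pixels where the graphic should cover the camera image.
    """
    out = frame.copy()
    out[mask] = graphic[mask]
    return out

# Toy example: draw a white crosshair at a detected reference point.
frame = np.zeros((480, 640, 3), dtype=np.uint8)       # stand-in camera image
graphic = np.full_like(frame, 255)                    # all-white graphic layer
mask = np.zeros(frame.shape[:2], dtype=bool)
mask[240, 310:330] = True   # horizontal bar of the crosshair
mask[230:250, 320] = True   # vertical bar of the crosshair
composited = overlay_graphic(frame, graphic, mask)
```

The input frame is left untouched, mirroring the requirement that the original video signal pass through unaltered outside the overlaid regions.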
- a bridging circuit is provided for the connection between the input and output data lines, which is designed as a normally closed switch and, in the currentless case, galvanically bridges the input and output data lines.
- the processing unit itself is galvanically isolated at the same time as the input and output data lines are galvanically bridged, so that galvanic coupling of fault currents into the external video reproduction line is prevented for any type of electrical malfunction.
- a detection circuit for various malfunctions, such as a fault in the power supply or the response of an over-temperature detector, is provided, which activates the bypass circuit in response to an error signal.
- error bridging is ensured in any case, because the relays in question, in their rest position, galvanically close the video line and otherwise separate it from the processing unit.
- the processing unit is activated only if no criterion is met that indicates a temporary impairment of the data evaluation, such as distortion of the image information by a camera movement, overexposure, or color falsification.
- if the processing unit is an integral part of the monitor, a particularly compact design results, and some assemblies that would otherwise be provided separately can be omitted.
- if the processing unit is provided as a self-contained device with its own power supply, there is the advantage of increased reliability and largely universal adaptability to a variety of industrially available endoscope systems.
- a detection circuit is provided by means of which the processing unit can recognize the connected video camera from its specific data, such as line count, frame rate, or pixel errors, in order to activate a stored camera-specific equalization.
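Such camera recognition amounts to looking up a signal fingerprint in a stored table. The sketch below illustrates the idea; the camera fingerprints, profile names, and coefficient values are all hypothetical placeholders, not data from the patent.

```python
# Hypothetical fingerprint table: (line count, frame rate) -> profile id.
KNOWN_CAMERAS = {
    (1080, 50): "cam_a",
    (720, 60): "cam_b",
}

# Illustrative camera-specific equalization parameters (invented values).
EQUALIZATION_PROFILES = {
    "cam_a": {"k1": -0.12, "k2": 0.03},
    "cam_b": {"k1": -0.05, "k2": 0.01},
}

def identify_camera(lines, fps):
    """Return the stored equalization profile for a recognized camera,
    or None if the fingerprint is unknown."""
    profile_id = KNOWN_CAMERAS.get((lines, fps))
    if profile_id is None:
        return None
    return EQUALIZATION_PROFILES[profile_id]
```

An unknown fingerprint yields None, which a real system could treat as a cue to fall back to transparent pass-through.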
- a stenciled area in the image of the input video information is excluded from the overlay, so that system displays of the main system are not overlaid.
- FIG. 1 shows the principle of an image processing unit for the embodiment of the endoscope system according to the invention shown in FIG. 4,
- FIG. 2 shows a surgical instrument for use with the image processing unit according to FIG. 1,
- FIG. 3 shows a graphic construct as a graphical simplification of the instrument according to FIG. 2 for evaluation with the image processing unit according to FIG. 1, and
- FIG. 4 shows an exemplary embodiment of the system according to the invention for interacting with the image processing unit according to FIG. 1 and an instrument according to FIGS. 2 and 3.
- an endoscope camera 1 is provided with an optical system 2 having a polarization filter 2a; its output video signal passes through a motion discriminator 3 and an addition circuit 4, both described in greater detail below.
- the function of the polarization filter 2a is described in more detail below.
- the output of the addition circuit 4 is connected to a monitor 5, which is set up for the reproduction of the image taken by the endoscope camera 1. This monitor 5 is used by the treating surgeon to monitor the operating area in minimally invasive technique.
- the output signal of the endoscope camera 1 reaches a frame memory 6, which is designed to hold the respective current image from the video stream of the output signal of the endoscope camera 1.
- the endoscope camera transmits, with the video stream, an identification signal which contains characteristic values of the optics 2 and of the image converter chip contained in the camera 1, or a coding characterizing these.
- This identification signal is evaluated in a camera recognition unit 7, which activates a geometry equalization circuit 8. This circuit determines the correction values for the individual image portions, indicating the amount by which they are to be shifted in order to produce a geometrically "pure" image, free from influences caused by individual errors or distortions of the camera optics.
- the correction stage for the imaging geometry 9 thus generates an image which represents the objects to be reproduced in the image plane as the geometric image transformation through the lens would occur according to the lens equation.
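The patent does not specify the correction model; a common choice for such geometry equalization is a two-term radial distortion polynomial, sketched below. The coefficients k1 and k2 are assumed known from a camera-specific calibration, as activated by the recognition unit 7.

```python
import numpy as np

def undistort_points(points, center, k1, k2):
    """Correct radially distorted image points using a simple two-term
    polynomial model: r_corrected = r * (1 + k1*r^2 + k2*r^4).

    points: Nx2 array of pixel coordinates; center: distortion center.
    k1, k2: camera-specific coefficients (assumed known from calibration).
    """
    pts = np.asarray(points, dtype=float) - center
    r2 = (pts ** 2).sum(axis=1, keepdims=True)   # squared radius per point
    factor = 1.0 + k1 * r2 + k2 * r2 ** 2
    return pts * factor + center

# With zero coefficients the image is already "pure": points are unchanged.
center = np.array([320.0, 240.0])
pts = undistort_points([[100.0, 120.0]], center, k1=0.0, k2=0.0)
# With a small positive k1, points move outward from the distortion center,
# as expected when compensating barrel distortion.
pts2 = undistort_points([[100.0, 120.0]], center, k1=1e-7, k2=0.0)
```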
- the initial evaluation is usually done at low resolution with a large image window; after an edge is found (as described below), the resolution is increased to improve the measurement accuracy and the image area is reduced to the region of interest (ROI).
- Downstream of the image field and quality selection stage 10 is a circuit 11 for improving the image representation, in the sense of making evaluation by the subsequent stages easier.
- This improvement consists in amplifying, as far as possible in amplitude or effect, those portions of the picture content which contribute to the contour recognition described below, while reducing as far as possible the effect of those portions which do not.
- the image information, geometrically corrected and improved in its quality of reproduction, reaches an image analysis unit 13, which contains the stages provided for actually finding the information of interest and evaluating it.
- the first stage of the image analysis unit 13 is an edge detection unit 14. This involves the detection of the edge contours of the cylindrical area of the surgical instrument, which are rectilinear in the planar representation.
- an algorithm suitable for the detection of straight lines in the image after differentiation is the algorithm according to Hough (the so-called Hough transformation), which finds collinear points in the image.
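The Hough transformation votes each edge point into a (theta, rho) parameter grid; collinear points accumulate in one cell. This minimal accumulator is a didactic sketch of the technique named in the text, with invented grid sizes, not an implementation claimed by the patent.

```python
import numpy as np

def hough_lines(edge_points, n_theta=180, n_rho=200, rho_max=800):
    """Minimal Hough transform: vote each edge point into (theta, rho)
    bins and return the best-supported line as (theta, rho) in normal
    form x*cos(theta) + y*sin(theta) = rho."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((n_theta, n_rho), dtype=int)
    for x, y in edge_points:
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        bins = np.clip(((rhos + rho_max) / (2 * rho_max) * n_rho).astype(int),
                       0, n_rho - 1)
        acc[np.arange(n_theta), bins] += 1           # one vote per theta
    t, r = np.unravel_index(acc.argmax(), acc.shape)
    rho = (r + 0.5) / n_rho * 2 * rho_max - rho_max  # bin center back to rho
    return thetas[t], rho

# Collinear points on the vertical line x = 50 vote into a common bin.
pts = [(50, y) for y in range(0, 100, 5)]
theta, rho = hough_lines(pts)
```

The recovered parameters are quantized to the bin width; real implementations refine the peak, which matches the coarse-to-fine strategy described above.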
- an image window (image segment) is thus selected for further processing, which comprises the region of the straight line.
- the image field selected to find the straight lines does not cover the entire image, but only the edge area, since the instruments used during the operation always protrude into the image from the outside when they are displayed, so that they can be reliably detected in this way.
- this image part is fed, in high resolution after raising the image quality, to the edge detection stage 14 again and then to a stage for end region detection 15, the processing now being of high quality.
- the edge detection is performed again with high precision (at high resolution) and, to facilitate the geometric processing, the central axis of the two obtained straight lines is calculated and added.
- an edge is determined transversely to the previously recognized lateral boundaries of the shaft part of the surgical instrument, namely at the end remote from the edge of the image.
- Another criterion used is that the straight line must run perpendicular to the two lateral boundary lines of the representation. This boundary is clearly emphasized on the shaft part (see also FIG. 1b and the associated description) and is likewise determined after differentiation with the Hough transformation.
- the results of stages 14 and 15 are combined in the form of a contour, which together form a "U" (see FIG. 3), or its axis of symmetry as the central axis of the cylinder together with the characteristic point located on that axis at the distal end of the shaft area. (Both constructs are equivalent to one another and can be used according to circumstances.)
- This planar contour representation is transferred to a corresponding contour memory 16, together with an indication of the approximate position in the image. Due to the perspective imaging of the shaft, the lateral boundary lines do not run parallel. The deviation from parallelism characterizes the inclination of the shaft about an axis perpendicular to the viewing direction of the endoscope optics.
- the U-shaped contour or its axis of symmetry and the point marking the distal end of the shaft with their characteristic data in pixel or vector representation reach the input of an optical computer 17, which practically forms an optical lens simulator, and executes the lens equation as a transformation.
- it does not calculate the image of a given object for a given optic, but calculates the position and orientation in space of an object of known contour - here the distal shaft end of a surgical instrument, a forceps - whose imaged contour is known.
- This is particularly easy because the lens equation allows plane objects to be transferred directly into the image plane. Since the image structure found is plane, a corresponding contour can be defined directly for the real object, which is essentially cylindrical and thus clearly defined.
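In the simplest pinhole reading of the lens equation, an object of known real width that appears a certain number of pixels wide directly yields its distance and the local image scale. This numeric sketch uses invented example values (5 mm shaft, 800 px focal length) and is an illustration of the principle, not the optics computer 17 itself.

```python
def distance_from_apparent_width(real_width_mm, apparent_width_px, focal_px):
    """Pinhole-camera estimate: an object of known real width W appearing
    w pixels wide lies at distance f * W / w in front of the lens."""
    return focal_px * real_width_mm / apparent_width_px

def local_scale(real_width_mm, apparent_width_px):
    """Image scale (pixels per mm) at the object's distance."""
    return apparent_width_px / real_width_mm

# Illustrative numbers: a 5 mm shaft imaged 100 px wide by an optic with
# a focal length of 800 px lies at 40 mm, at a scale of 20 px/mm.
d = distance_from_apparent_width(5.0, 100.0, 800.0)
s = local_scale(5.0, 100.0)
```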
- both the edge detection unit 14 and the end detection unit 15 each determine a spatial reference value, composed of the direction of the central axis of the cylindrical shaft and the direction of the edge connecting the end points of the contour of the cylindrical shell of the shaft. The resulting intersection in the middle of the crossbar of the "U" thus forms a reference for the transformation that finds the position and orientation of the real instrument from the known contour of a part of its image.
- the optics calculator 17 is supplied with characteristic data of the optics - essentially the focal length - and the dimensions of the real instrument from corresponding memories 18 and 19 for the calculation to be carried out.
- (the optics computer 17 can also be designed as an addressable look-up memory, as described with reference to FIG. 2. This allows lengthy calculation times to be shortened, which is very advantageous for the real-time representation.)
- the orientation of the auxiliary tool is not important per se; what matters is the position of a protruding reference point for the measurement at the distal end of the auxiliary tool, which first comes into contact with the tissue during an operation and therefore serves as a pointing instrument.
- the coordinates of this reference point are obtained by extending the central axis of the cylindrical shaft beyond the point of intersection with the transverse bar of the U-shaped contour (corresponding to the point defining the end of the shaft on the central axis of the cylinder) towards the distal end of the instrument, by a distance value that can be taken from the dimensional data of the instrument; a possible eccentricity of the point must be taken into account.
- this eccentricity requires knowledge of the orientation of the instrument with respect to rotation about its central axis.
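Reduced to the image plane, this extension is simple vector arithmetic: extend along the central axis, then offset perpendicular to it by the projected eccentricity, with the roll angle assumed known from the shaft coding. The following 2D sketch is a simplified model with invented example values; the real computation happens in 3D in the optics computer.

```python
import numpy as np

def reference_point(shaft_end, axis_dir, extension_mm, scale_px_per_mm,
                    eccentricity_mm=0.0, roll_deg=0.0):
    """Project the instrument's reference point into the image.

    shaft_end: detected distal end of the shaft on the central axis (px).
    axis_dir: direction of the central axis, pointing distally.
    extension_mm: known distance from shaft end to reference point.
    eccentricity_mm / roll_deg: offset of the point from the axis and the
    instrument's rotation about it (read from the shaft coding).
    """
    axis = np.asarray(axis_dir, dtype=float)
    axis = axis / np.linalg.norm(axis)
    normal = np.array([-axis[1], axis[0]])   # in-image perpendicular
    offset = eccentricity_mm * np.cos(np.radians(roll_deg))
    return (np.asarray(shaft_end, dtype=float)
            + scale_px_per_mm * (extension_mm * axis + offset * normal))

# Centered tip, axis pointing right: point is simply extended along x.
p = reference_point([100.0, 100.0], [1.0, 0.0], extension_mm=10.0,
                    scale_px_per_mm=20.0)
# Eccentric tip at roll 0: the full eccentricity appears as lateral offset.
p2 = reference_point([100.0, 100.0], [1.0, 0.0], extension_mm=10.0,
                     scale_px_per_mm=20.0, eccentricity_mm=1.0, roll_deg=0.0)
```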
- a number of markings are applied which are optically readable in the manner of digital codes and carry digitally coded information of the QR-code type. (This will be explained in more detail with reference to FIG. 2.) It is important that the coding is designed so that the coded information differs when viewed from different directions, so that the radial orientation of the instrument can be read unambiguously from each viewing direction. If, as here, a viewing direction is specified, the orientation of the instrument can therefore be unambiguously determined from this viewing direction.
- the identification of the optically readable data takes place in an OCR module 20 within the image analysis unit 13.
- the data of the corresponding ring area facing the camera are transmitted to the optics computer 17, which also performs the data analysis and comparison with a record stored in the instrument data memory 19; an angle value is output together with a fixed value for the amount of eccentricity, which is likewise stored in the instrument data memory 19 specifically for the relevant instrument.
- even for a reference point located eccentric to the central axis, its position is thus clearly defined by the optical detection steps illustrated for the surgical instrument, although only a portion of this instrument was available for evaluation by optical observation.
- the spatial position of a correspondingly selected reference data system is stored in the position memory together with the scale factor for the imaging at this point - relative to a plane perpendicular to the camera viewing direction.
- the reference point at the extremity is available as a pointer with a starting point for the measurement within the surgical field (Situs). By calculating in real time, this reference point can be chosen freely and form the basis for the features shown below.
- These graphic auxiliary representations, which will also be described below for facilitating the measurement, are contained in a memory for graphics 22 to be displayed virtually in the image and can optionally be activated via a selection circuit 23. These include, for example, a marking of the reference point or a virtual measuring rod, which can be attached to the reference point of the instrument and moved with it in the image.
- the superposition stage 4 displays the virtual graphic, guided with the instrument, in the current camera image for display on the monitor.
- the motion detection stage 3 monitors the optical flow in the image.
- the function is similar to that of an optical computer mouse, whose processor responds to the camera image as a whole and emits a signal when the movement speed exceeds a predetermined threshold.
- the output signal of the motion detection stage 3 blocks the output of the position data memory 21, so that no survey information appears.
- the activation of the surveying function requires a resting state of camera and tool, which is also a prerequisite for the reading of measured values. With normal activity in the situs, as it accompanies the execution of operative manipulations, the survey information is switched off and therefore does not appear disturbing.
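The gating behavior can be modeled with a crude motion measure standing in for the optical flow: the overlay is enabled only while the scene is calm. This sketch uses mean absolute frame difference and an invented threshold; the patent's discriminator evaluates optical flow proper.

```python
import numpy as np

def motion_magnitude(prev_frame, frame):
    """Crude stand-in for optical flow: mean absolute frame difference."""
    return float(np.abs(frame.astype(int) - prev_frame.astype(int)).mean())

def survey_enabled(prev_frame, frame, threshold=2.0):
    """Gate the measurement overlay: active only while the scene is calm."""
    return motion_magnitude(prev_frame, frame) < threshold

still = np.zeros((8, 8), dtype=np.uint8)       # two identical calm frames
moved = np.full((8, 8), 50, dtype=np.uint8)    # a frame with large change
```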
- An improvement in the accuracy of the measurement of the position of the auxiliary tool can be achieved by providing circumferential rings on the shaft of the auxiliary tool, preferably in the end region, which can be detected in the same way as the circumferential end edge of the shaft portion. At least two parallel ring parts, or one in addition to the annular edge at the end of the cylindrical shaft portion, which are in the field of view of the camera, then form in their orientation one or more rungs within the "U" or on the central axis, and by their spacing provide a measure of the inclination of the shaft, in addition to the deviation of the vertical lines of the "U" from parallelism.
- a delayed signal cancellation circuit 25 receives a logical input signal from the moving-image recognition circuit 3 when substantial image parts are in motion, as a result of the evaluation of the optical flow as differentiation of parts of the image content.
- a signal indicating this state appears at the output of stage 3, which prevents the output of the current position of the instrument and of the associated scale value for its reference point via the reset input of position data circuit 21.
- the cancellation of the reset signal is delayed by the delay circuit 25, which maintains the reset signal for a period of up to about one second even after the image has calmed, so that signal evaluation and display of surveying data take place only when the camera and the instrument are deliberately kept still.
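The delayed release behaves like a retriggerable hold-off counter: each motion event reloads the counter, and the output stays blocked until it has counted down. This per-frame model of circuit 25 is a sketch; the hold length in frames is an assumption (e.g. 25 frames at 25 fps for about one second).

```python
class DelayedRelease:
    """Hold the 'blocked' state for hold_frames frames after the last
    motion event, mirroring the delay of circuit 25."""

    def __init__(self, hold_frames):
        self.hold_frames = hold_frames
        self.counter = 0

    def update(self, motion_detected):
        """Feed one frame's motion flag; return True while output is blocked."""
        if motion_detected:
            self.counter = self.hold_frames   # retrigger the hold-off
        elif self.counter > 0:
            self.counter -= 1
        return self.counter > 0

# Short hold of 3 frames for demonstration: one motion event blocks the
# survey output for the event frame plus two calm frames.
gate = DelayedRelease(hold_frames=3)
states = [gate.update(m) for m in [True, False, False, False, False]]
```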
- FIG. 1 shows the image processing, for clarity, in the form of blocks, and its description is oriented rather toward a hardware solution.
- the practical implementation nowadays is performed using processors, here a combination of CPU and GPU in cooperation or corresponding task sharing, whereby the functions described in connection with the individual blocks are executed in real time by the combination of processors with the corresponding software.
- FIG. 2 shows a side view of a surgical instrument 100 according to the invention in the form of a forceps. It has a cylindrical shaft portion 101 with a matte dark surface provided with a nano-coating which prevents adhesion of liquids or small particles. The distal end of the shaft region is marked by a contrasting ring 102, followed by a coding region 103 which has digital or analog markings that are optically detectable and differ depending on the direction from which the shaft is viewed.
- a reference point 106 is located at the end of the instrument which, when used as a pointing instrument, first comes into contact with the body tissue in the situs region. It is the reference point for surveying or for surveying instruments or data displayed on the screen.
- the shaft of the instrument extends from the edge region 107 of the image into the image part in which a position evaluation of the instrument 100 is to take place.
- the exact local imaging scale can be calculated from the plane U-shaped construct shown in FIG. 3, which consists of the constructive end point of the shaft as reference point 117 and the parallel lines delimiting the contour, which run at a spacing equal to the shaft diameter, so that the local magnification results from the respective spacing in the image.
- the construct according to FIG. 3 is defined by only three points, namely the reference point for the shank end 117 and two points of one of the straight lines 108 or 109, which in fact run symmetrically to the central axis 110 of the shank.
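A minimal numeric sketch of this three-point construct, under the stated assumption that the two bounding straights run symmetrically about the central axis at a spacing equal to the shaft diameter; the example coordinates and the 5 mm diameter are invented for illustration.

```python
import numpy as np

def u_construct(shaft_end, p1, p2, shaft_diameter_mm):
    """Recover the central-axis direction and local image scale from three
    points: the constructive shaft end and two points on one bounding
    straight line.

    Returns (unit axis direction, scale in px per mm). The perpendicular
    distance from the shaft end (on the axis) to the bounding straight is
    half the imaged shaft diameter.
    """
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    end = np.asarray(shaft_end, float)
    axis = p2 - p1
    axis = axis / np.linalg.norm(axis)
    to_end = end - p1
    # 2D cross product gives the perpendicular distance to the line.
    half_width_px = abs(to_end[0] * axis[1] - to_end[1] * axis[0])
    scale = 2.0 * half_width_px / shaft_diameter_mm
    return axis, scale

# Axis along x, bounding line at y = 90, shaft end on the axis at y = 100:
# the half-width is 10 px, so a 5 mm shaft gives a scale of 4 px/mm.
axis, scale = u_construct([200.0, 100.0], [0.0, 90.0], [100.0, 90.0],
                          shaft_diameter_mm=5.0)
```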
- a (supplementary) system unit 403 contains the components which the camera requires for operation and for processing the video data for a monitor 404. These include the power supply, a lighting unit and the controls that are necessary for the operation of the conventional system.
- inserted into the monitor line is a processing system 406, via interfaces 407 and 408, which are electrically separable.
- this is the socket of the conventional system for a monitor 404, for which the system unit 403 has a standard plug; together they form the interface 406.
- the interface 407 is constructed in reverse.
- the system unit 403 has at the output a socket into which the plug of the monitor is inserted.
- the supplementary system unit 403 can be easily inserted into an existing conventional system when the reproduction formats are uniform.
- the components of the system unit 403 are outlined by a dashed line.
- the input signal from the interface 405 is branched and arrives at a galvanic break switch 409, whose number of poles is adapted to the number of poles of the video line to be switched and which, in the idle state, is connected through to the interface 404 at the output.
- This allows bridging in case of malfunction or power failure, so that uninterrupted operation of the external conventional system is ensured.
- the system unit simply turns off and the conventional system continues to operate, but without the additional functions of the system unit.
- the circuit breaker 409 is controlled via a logic unit 410, which implements an AND function with an inverting input.
- the non-inverting input is driven by a signal indicating the presence of the supply voltage, while the second input is driven via the inverting input as long as a self-test unit in the processor unit 411 does not output, via a corresponding line 412, a logic signal indicating an error.
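As a truth function, the logic unit 410 is simply supply-present AND NOT self-test-error; the bypass is its complement, since the relays fall back to the closed through-connection whenever the condition fails. A small sketch of this logic:

```python
def processing_inserted(supply_ok, self_test_error):
    """Logic unit 410: AND with one inverting input. The on-switches
    (413, 414) insert the processing unit only when supply voltage is
    present and the self-test reports no error."""
    return supply_ok and not self_test_error

def bypass_active(supply_ok, self_test_error):
    """Complementary state: the normally closed bypass keeps the external
    video line galvanically connected in every other case."""
    return not processing_inserted(supply_ok, self_test_error)
```

Note that a power failure (supply_ok False) and a self-test error both lead to the same safe state: the bypass is active and the conventional system keeps running without the additional functions.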
- a galvanic first on-switch 413 and a galvanic second on-switch 414 are driven via the output signal of the logic unit 410, so that the system unit 405 is inserted into the line between the conventional system 403 and the monitor 404.
- the processor unit 411 is preceded by a so-called frame grabber 415, which decomposes the video signal intended for the monitor output into its individual image components and thus prepares them for further processing.
- the output signal of the processor unit 411 is superimposed in an addition circuit 416 on the input signal from the output of the first switch 413, the superimposed signal with the survey data to be displayed covering the original signal.
- a memory circuit is also provided, which makes it possible to record the complete signal reproduced on the monitor 404 during the operation; for this purpose, preferably a USB stick is used, which is easy to handle for this task.
Claims (12)
- An endoscopic image processing system comprising an input and an output data line for video signals, with means which process geometric distance information, recorded in the detection range of an endoscope camera in the form of optical information contained together with at least one image (or partial image) in the running image information recorded by a camera, and which detect this information, separate it from the image information and process it further in order to obtain survey information from the image content acquired by the digital camera, characterized in that the input line has means for connecting to the video output of a system for processing video signals which are recorded by an endoscope camera and intended for reproduction on a monitor; that the output line has means for connection to a monitor which serves for image reproduction during an operation; that a processing unit is provided between the input and the output data line; that the processing unit has a data and video processor which adds to the video input signal image contents generated by processing from the current video input signal, these being the representation of survey information in analog or digital form based on the geometric distance information recorded, in the form of optical information, in the detection range of the endoscope camera of the upstream system by imaging a scale-giving device; that the data characterizing the imaging properties of the endoscope camera and those of the scale-giving device are provided; and that summing means are provided which superimpose the output signal of the data processing part directly on its input signal, the output of the summation circuit being connected to the output data line.
- An endoscopic image processing system according to claim 1, characterized in that a bridging circuit is provided for the connection between the input and the output data line, which is designed as a normally closed switch and, in the currentless case, galvanically bridges the input and output data lines.
- An endoscopic image processing system according to claim 2, characterized in that, with the galvanic bridging of the input and output data lines, the processing unit is simultaneously galvanically isolated.
- An endoscopic image processing system according to claim 2 or 3, characterized in that a detection circuit is provided for a possible major malfunction, such as a fault in the power supply or the response of an over-temperature detector, which activates the bridging circuit in response to an error signal.
- An endoscopic image processing system according to one of the preceding claims, characterized in that the output signal of the processing unit is superimposed on the input signal when no criterion is met that indicates a temporary impairment of the data evaluation, such as a corruption of the image information by a camera movement, overexposure or color falsification.
- An endoscopic image processing system according to one of the preceding claims, characterized in that the processing unit is an integral part of the monitor.
- An endoscopic image processing system according to one of the preceding claims, characterized in that the processing unit is provided as a self-contained device with its own power supply.
- An endoscopic image processing system according to one of the preceding claims, characterized in that a recognition unit is provided which, after connection, autonomously determines operating properties of the connected camera, such as pixel count, frame rate and distortion, and emits signals which configure subsequent circuit parts accordingly.
- An endoscopic image processing system according to one of the preceding claims, characterized in that the insertion of the data in the superposition unit takes place by means of a stencil region which excludes those regions of the input video information from the superposition in which system information of the main system appears.
- An endoscopic image processing system according to claim 9, characterized in that the stencil regions are edge regions of the input image in which system displays of the main system appear.
- An endoscopic image processing system according to one of the preceding claims, characterized in that storage means are provided for recording the signal appearing at the output after the superposition of survey data.
- An endoscopic image processing system according to claim 11, characterized in that the storage means is a USB memory or another solid-state memory connectable via a plug connection.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| DE112016004501.8T DE112016004501A5 (de) | 2015-10-01 | 2016-09-27 | Endoskopisches bildverarbeitungssystem zur einfügung in eine datenleitung |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| DE102015013037.4 | 2015-10-01 | | |
| DE102015013037 | 2015-10-01 | | |
Publications (1)
| Publication Number | Publication Date |
| --- | --- |
| WO2017054818A1 (de) | 2017-04-06 |
Family
ID=57286177
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| PCT/DE2016/200451 WO2017054818A1 (de) | Endoskopisches bildverarbeitungssystem zur einfügung in eine datenleitung | 2015-10-01 | 2016-09-27 |
Country Status (2)
Country | Link |
---|---|
DE (1) | DE112016004501A5 (de) |
WO (1) | WO2017054818A1 (de) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5436655A (en) * | 1991-08-09 | 1995-07-25 | Olympus Optical Co., Ltd. | Endoscope apparatus for three dimensional measurement for scanning spot light to execute three dimensional measurement |
US6459481B1 (en) | 1999-05-06 | 2002-10-01 | David F. Schaack | Simple system for endoscopic non-contact three-dimentional measurement |
WO2010061293A2 (en) | 2008-11-26 | 2010-06-03 | Haptica Limited | System and method for measuring objects viewed through a camera |
DE102009038021A1 (de) | 2009-08-18 | 2011-02-24 | Olaf Dipl.-Ing. Christiansen | Bildverarbeitungssystem mit einer zusätzlichen zusammen mit der Bildinformation zu verarbeitenden Maßstabsinformation |
WO2012107041A1 (de) | 2011-02-11 | 2012-08-16 | Olaf Christiansen | Endoskopisches bildverarbeitungssystem mit mitteln, welche im erfassungsbereich einer optischen digitalkamera eine geometrische vermessungsinformation erzeugen |
2016
- 2016-09-27: DE application DE112016004501.8T (publication DE112016004501A5), not active, withdrawn
- 2016-09-27: WO application PCT/DE2016/200451 (publication WO2017054818A1), active, application filing
Also Published As
Publication number | Publication date |
---|---|
DE112016004501A5 (de) | 2018-07-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017054817A1 (de) | Endoskopisches bildverarbeitungssystem für die chirurgie mit mitteln, welche im erfassungsbereich einer optischen digitalkamera eine geometrische distanzinformation erzeugen | |
DE102011078212B4 (de) | Verfahren und Vorrichtung zum Darstellen eines Objektes | |
DE10215808B4 (de) | Verfahren zur Registrierung für navigationsgeführte Eingriffe | |
DE69432961T2 (de) | Anordnung zur Bestimmung der gegenseitigen Lage von Körpern | |
WO2012107041A1 (de) | Endoskopisches bildverarbeitungssystem mit mitteln, welche im erfassungsbereich einer optischen digitalkamera eine geometrische vermessungsinformation erzeugen | |
EP1517647A1 (de) | Verfahren und instrument zur chirurgischen navigation | |
EP3500152B1 (de) | Detektionssystem zur automatischen detektion chirurgischer instrumente | |
EP3332730B1 (de) | Verfahren und trackingsystem zum nachverfolgen eines medizinischen objekts | |
DE102011012460A1 (de) | Chirurgisches Instrument mit integrierter Navigationskontrolle | |
DE102005041602A1 (de) | Verfahren zur Darstellung eines medizinischen Implantats in einem Bild sowie medizinisches bildgebendes System | |
EP2923334A1 (de) | Bestimmen der räumlichen lage und orientierung der wirbelkörper der wirbelsäule | |
DE102007054450A1 (de) | Vorrichtung zur Bereitstellung von Bildern für einen Operateur | |
DE102011005259A1 (de) | Verfahren und System zur Darstellung videoendoskopischer Bilddaten eines Videoendoskops | |
DE112021003948T5 (de) | Bildrekonstruktion und endoskopische verfolgung | |
DE10359981A1 (de) | System und Verfahren zur In Vivo Positions- und Orientierungsbestimmung einer Endoskopie-Kapsel bzw. eines Endoroboters im Rahmen einer kabellosen Endoskopie | |
DE102018119343B4 (de) | Verfahren zur Kalibrierung von Objekten in einem Referenzkoordinatensystem und Verfahren zum Tracking von Objekten | |
DE102010041847A1 (de) | Sensoreinheit für ein Stereoendoskop und Stereoendoskopiesystem | |
DE102011006537B4 (de) | Verfahren zur Registrierung eines ersten Koordinatensystems einer ersten medizinischen Bildgebungseinrichtung mit einem zweiten Koordinatensystem einer zweiten medizinischen Bildgebungseinrichtung und/oder einem dritten Koordinatensystem eines medizinischen Instruments, welches durch Marker einer medizinischen Navigationseinrichtung definiert ist, und medizinisches Untersuchungs- und/oder Behandlungssystem | |
DE19648338C2 (de) | Vorrichtung zur Nachführung einer Zielmarke für ein Therapiegerät | |
DE102017219621A1 (de) | Visualisierungssystem mit einem Beobachtungsgerät und einem Endoskop | |
DE102009009165B4 (de) | Verfahren und Vorrichtung zur Bestimmung eines von einer Endoskopiekapsel in einem Patienten zurückgelegten Weges | |
DE102016105962A1 (de) | Positionsermittlungsanordnung für Intraokularlinse | |
EP3598948A1 (de) | Bildgebendes system und verfahren zum erzeugen einer stereoskopischen darstellung, computerprogramm und datenspeicher | |
WO2017054818A1 (de) | Endoskopisches bildverarbeitungssystem zur einfügung in eine datenleitung | |
EP3626176B1 (de) | Verfahren zum unterstützen eines anwenders, computerprogrammprodukt, datenträger und bildgebendes system |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16794506; Country: EP; Kind code: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 112016004501; Country: DE |
| REG | Reference to national code | Country: DE; Legal event code: R225; Ref document number: 112016004501 |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 16794506; Country: EP; Kind code: A1 |