EP2672915A1 - Système endoscopique de traitement d'image pourvu de moyens générant des informations de mesure géométriques dans la plage de prise de vues d'une caméra numérique optique - Google Patents

Système endoscopique de traitement d'image pourvu de moyens générant des informations de mesure géométriques dans la plage de prise de vues d'une caméra numérique optique

Info

Publication number
EP2672915A1
EP2672915A1 (Application EP12717164.3A)
Authority
EP
European Patent Office
Prior art keywords
image
instrument
camera
information
auxiliary instrument
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP12717164.3A
Other languages
German (de)
English (en)
Inventor
Olaf Christiansen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of EP2672915A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1076Measuring physical dimensions, e.g. size of the entire body or parts thereof for measuring dimensions inside body cavities, e.g. using catheters
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/06Measuring instruments not otherwise provided for
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361Image-producing devices, e.g. surgical cameras
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/90Identification means for patients or instruments, e.g. tags
    • A61B90/94Identification means for patients or instruments, e.g. tags coded with symbols, e.g. text
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/90Identification means for patients or instruments, e.g. tags
    • A61B90/98Identification means for patients or instruments, e.g. tags using electromagnetic means, e.g. transponders
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/58Means for changing the camera field of view without moving the camera body, e.g. nutating or panning of optics or image sensors
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00004Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00057Operational features of endoscopes provided with means for testing or calibration
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101Computer-aided simulation of surgical operations
    • A61B2034/102Modelling of surgical devices, implants or prosthesis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/06Measuring instruments not otherwise provided for
    • A61B2090/061Measuring instruments not otherwise provided for for measuring dimensions, e.g. length
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F04POSITIVE - DISPLACEMENT MACHINES FOR LIQUIDS; PUMPS FOR LIQUIDS OR ELASTIC FLUIDS
    • F04CROTARY-PISTON, OR OSCILLATING-PISTON, POSITIVE-DISPLACEMENT MACHINES FOR LIQUIDS; ROTARY-PISTON, OR OSCILLATING-PISTON, POSITIVE-DISPLACEMENT PUMPS
    • F04C2270/00Control; Monitoring or safety arrangements
    • F04C2270/04Force
    • F04C2270/041Controlled or regulated
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10068Endoscopic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30021Catheter; Guide wire
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/555Constructional details for picking-up images in sites, inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes

Definitions

  • the invention relates to an image processing system specified in the preamble of claim 1, which is particularly useful for medical purposes.
  • Such image processing systems are used today, for example in the form of digital endoscope cameras both in the general technology - in hard to reach repair sites - as well as in minimally invasive surgery. Due to the short focal length of the cameras used, they have a relatively large depth of field, which is also necessary so that the operator has a good overview of the work area and the objects viewed do not get out of the focus area with each movement of the endoscope.
  • the corresponding cameras have a fixed distance setting that is adapted to the work area.
  • the depth of field may in known systems include, for example, a range of 1 mm to infinity.
  • the size displayed on a display monitor cannot be used as a reference for other elements or implants to be introduced into the work area.
  • the true size of objects and their distances between them can therefore only be estimated with great difficulty when viewed endoscopically.
  • a multi-point laser projection method is known, with the aid of which laser markings are generated on an object from the surroundings of the optics; these markings are recorded together with the image information, filtered out of the object image in a postprocessing step on the finished image, and assigned geometric information on the basis of their arrangement in the image.
  • statements can be made about the distances of the laser markings from each other or their distance from the camera optics.
  • This system can also be used accordingly in medical image reproduction, as described in http://www.egms.de/de/meetings/hnod2009/09hnod324.shtml . The volume of the object is then inferred manually from the distances determined by means of the laser points.
  • US 6891148B1 also discloses a system for generating parallel laser beams for the scaling of photographic images. Measures to influence the imaging scale are also not provided.
  • a disadvantage is that the evaluation is not done in real time, so that the results are not immediately available.
  • although the endoscope need not be pulled out of the access opening for the evaluation, there is still a significant interruption of work, which cannot be tolerated, for example, in minimally invasive operations in the medical field.
  • a special endoscope camera with means for generating laser markings is required, which appear at fixed points.
  • US Pat. No. 7206006B2 also discloses an image processing system in which a distance information, determined by a distance measurement between the camera and the object, is applied to the scale of an image to be reproduced, so that the object is reproduced at its original size. In this case, the difficulty is to include in the measurement precisely those object parts whose reproduction at the original scale is essential. In addition, a rangefinder is not useful in endoscopic image processing systems.
  • the invention has for its object to provide an image processing system of the type mentioned above with which an operator can, during observation (especially endoscopic observation) using only a normal endoscope camera and a display monitor, make estimates in the operating area with great accuracy, and also carry out measurements with immediate processing, without manual measurements inside the body using mechanical or electronic scales or measuring aids. The measurement should also be possible immediately during the ongoing observation, so that it is not necessary to select and edit a still image for subsequent measurement. This is particularly advantageous for minimally invasive operations carried out under endoscopic observation, in which the burden on the patient is to be kept low by keeping the operating time as short as possible. The measurement should readily take into account the local scale of the image information recorded by the camera at the measuring point, without the need for later conversions.
  • the invention is based on the recognition that an instrument used in an endoscopic examination, which is located in the field of view of the endoscope camera, can be used for automatically executing geometric calculations by means of data processing if it is possible to automatically detect and normalize the proportions of the instrument, such that the image scale of objects near the instrument is known and can be used to automatically perform geometric measurements and calculations.
  • the geometric distance information forms the basis for the scale information, which in turn serves as a local scale factor for converting the local length dimensions in the image into real dimensions.
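The conversion described above can be sketched as follows. This is an illustrative example, not taken from the patent; the instrument width and pixel values are assumed:

```python
# Hypothetical sketch of the local scale factor: a feature of the instrument
# with known real size is measured in pixels, and the resulting mm-per-pixel
# factor converts nearby image lengths into real dimensions.

def local_scale(known_width_mm: float, imaged_width_px: float) -> float:
    """Scale factor in mm per pixel, valid near the instrument."""
    return known_width_mm / imaged_width_px

def to_real_length(pixel_length: float, scale_mm_per_px: float) -> float:
    """Convert a length measured in the image into millimetres."""
    return pixel_length * scale_mm_per_px

# Example: a 5 mm instrument shaft appears 50 px wide -> 0.1 mm/px,
# so an object spanning 120 px next to it measures about 12 mm.
s = local_scale(5.0, 50.0)
print(round(to_real_length(120.0, s), 6))
```

The factor is only valid locally, since the scale changes strongly with the distance from the camera, as the description emphasizes.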
  • the comparison information consists of images of the distal part of the auxiliary instrument from different directions and forms the reference base for the size adjustment of the auxiliary instrument to determine its magnification and spatial orientation in the image.
  • the geometric reference information defines the relationship between a reference location and the stored image, so that the position of a geometric reference location (the optical identification of the geometric location) of the imaged instrument can be defined with respect to the image origin of the stored image.
  • the cursor forms the optical identification of the associated geometric reference location of the instrument in the image when the stored image of the instrument has been aligned with the image representation and the reference location associated with the image is transferred to the screen.
  • surveying information includes the combination of scale and reference location.
  • such measurements may be used in estimating size dimensions of objects or even motions, including those of the instrument itself.
  • without auxiliary energy in the observation space, it is thus possible to measure the image scale and the distance from the camera at at least one image position by using an auxiliary instrument as a reference. From the size of the image of the distal region of an otherwise common instrument, or of an instrument part, in the current reproduction, a known distance information is determined from the image, and the local scale of the reproduction of the surroundings is inferred from it by conversion into survey information. In other endoscopic applications, a working instrument can often be introduced into the field of view of the camera through an additionally available working channel.
  • This survey information is the reference for the image scale in the distal area of the auxiliary instrument.
  • the image scale of the adjacent object area can be determined, so that distances can be measured directly from the image. It is also possible to visually attach a virtual measuring rod to the image of the distal end of the instrument, so that it can be moved with the instrument to take measurements on the object, provided the instrument recognition in the image is continuously updated with the corresponding evaluation in real time.
  • a geometric distance information is generated from the image of a part of the auxiliary instrument introduced into the detection area of the optical digital camera within the observation space; the scale is generated by comparison with the stored image of the relevant part of the auxiliary instrument. It is not necessary for the geometric distance information to be directly readable on the surface of the instrument or otherwise taken from it; the distance information can also be obtained implicitly by matching the camera image with the stored comparison image of the instrument.
  • the scale factor in this case is not determined from a dimension taken from the image information, but directly from the factor to be set during the size adjustment.
  • the auxiliary instrument acts as a "pointer".
  • Memory means are in particular provided for storing an image of the relevant part of the auxiliary instrument as comparison information, with views from different directions in coarse and fine representation, for comparison with the current camera image, or means for the corresponding on-demand reduction of the image information of the stored image.
  • any data representation is suitable which is able to find a reference for finding the image of the auxiliary instrument or its relevant part in the recorded camera image.
  • This may in particular be an all-round representation of the surface of the end region or a corresponding three-dimensional representation, this being preferably matched to the comparison method used.
  • pixel or vector representations are suitable, as they can also be used for methods of content-based image search with the corresponding mathematical methods.
  • geometrical information is generated in the detection range of the optical digital camera, which is processed together with the image information and then detected and used to generate additional geometric data concerning the image content as survey information.
  • the survey information is thereby obtained from the image of the auxiliary instrument, which may be formed as a normal surgical instrument, wherein the image of a part of the auxiliary instrument is extracted from the camera image and serves as a reference for the magnification.
  • an additional geometric information is then inserted true to scale, in fixed relation to the auxiliary instrument, into the current camera image as a virtual measuring or display element. Normally this will be a cursor indicating that the auxiliary instrument has been captured and used for scaling. It can also be, for example, a measuring rod that appears in the image attached to the auxiliary instrument and can be "operated" with it.
  • the storage means also includes a geometric reference point in fixed relative geometric association with the image stored in the first storage means. This virtual reference point is superimposed after the detection of the position of the auxiliary instrument and its scale in the image as a reference position as a cursor-like optical marking in the current camera image to obtain a defined reference point for the positioning of the instrument.
  • This reference defines the starting point for measurements and instrument movements to be recorded. Preferably, it is located in a projecting portion of the instrument, which comes directly into contact with the body tissue, so that it also forms the calculation point for the local image scale, which is thus as closely as possible attached to an image adjacent to the instrument part object.
  • the current position of the instrument in the camera image is recognized by a first rough comparison between the current image and the stored image.
  • the further fine-adjustment detector means adjust the size and orientation of the stored partial image of the auxiliary instrument relative to the corresponding image of the auxiliary instrument in the camera image, taking its perspective distortion into account, in order to obtain a congruent position. By (iteratively) selecting one of several views from different directions, displacing it in the camera image in different coordinate directions and varying the magnification, the exact position, orientation and size of the auxiliary instrument are determined, so that the position of the instrument part of interest is accurately detected.
  • a "virtual" graphic to be superimposed into the current image is generated, which is adapted in size and orientation to the controlling auxiliary instrument and can interact with the remaining image content for the viewer. This is useful, for example, for measurements or even more complex calculations, which can be carried out depending on the position of the virtual graphic. It is important that the graphic depends on the position and orientation of the relevant part of the auxiliary instrument, so that it fits into the image to scale and that the measurements can also be made true to scale.
  • the graphic represents a ruler emanating from the end of the instrument, which is directed parallel to the image plane (i.e. perpendicular to the optical axis).
  • the local scale is determined and a corresponding length division (in a freely selectable unit of length) is represented on the ruler, so that the user can move this ruler by a corresponding movement of the instrument and use it like a measuring rod.
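Placing the tick marks of such a virtual ruler in image coordinates can be sketched as follows; this is an assumed illustration (the 2 mm division and all other values are example inputs, not the patent's implementation):

```python
# Hypothetical sketch: tick positions of a virtual ruler that starts at the
# reference point and runs along a direction parallel to the image plane.
# The local scale (px per mm) makes the division true to scale in the image.

import math

def ruler_ticks(origin_px, direction_deg, length_mm, tick_mm, scale_px_per_mm):
    """Return (x, y) pixel positions of the tick marks along the ruler."""
    dx = math.cos(math.radians(direction_deg))
    dy = math.sin(math.radians(direction_deg))
    ticks = []
    n = int(length_mm / tick_mm)
    for i in range(n + 1):
        d_px = i * tick_mm * scale_px_per_mm   # distance of tick i in pixels
        ticks.append((origin_px[0] + dx * d_px, origin_px[1] + dy * d_px))
    return ticks

# A 10 mm ruler with 2 mm divisions at 10 px/mm, pointing along +x:
ticks = ruler_ticks((100.0, 200.0), 0.0, 10.0, 2.0, 10.0)
print(len(ticks))  # 6 ticks: at 0, 2, 4, 6, 8 and 10 mm
```

Because the scale factor is local, the tick spacing in pixels shrinks as the instrument end moves away from the camera, matching the behaviour described above.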
  • a large amount of distance information that is not available on direct viewing can be obtained by optically evaluating the image of an auxiliary instrument introduced through an additional opening into the field of view of the endoscope camera, and by subsequently processing the recorded image.
  • the online evaluation in real time also allows the resulting data to be inserted directly into the camera image, thereby enabling the operator (in medical applications, the surgeon) to incorporate the insights gained directly into his current working method.
  • a size estimation, adaptation and selection of implants can be made directly with the existing presentation means in the sterile area of the operating room.
  • This also includes ongoing information about the position of the instrument itself, so that the data obtained can also be the basis for a tracking system (datalogger).
  • the length of the connecting line is determined from the local image positions of the distal ends of the two instruments on the basis of the determined local scale and displayed digitally in the image. If the local scales at the two instrument ends are different, this can be taken into account in the calculation by averaging.
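The averaging step described above can be sketched as follows; all numeric values are assumed for illustration and are not taken from the patent:

```python
# Hypothetical sketch: the pixel distance between the distal ends of two
# instruments is converted into a real length using the mean of the two
# local scale factors, as suggested when the local scales differ.

import math

def tip_distance_mm(p1_px, p2_px, scale1_mm_per_px, scale2_mm_per_px):
    """Real distance between two instrument tips, averaging the local scales."""
    pixel_dist = math.dist(p1_px, p2_px)
    mean_scale = (scale1_mm_per_px + scale2_mm_per_px) / 2.0
    return pixel_dist * mean_scale

# Tips 300 px apart, local scales 0.10 and 0.12 mm/px -> about 33 mm.
d = tip_distance_mm((0.0, 0.0), (300.0, 0.0), 0.10, 0.12)
print(round(d, 2))  # 33.0
```

Averaging is a first-order correction; it assumes the connecting line runs between regions whose scales do not differ drastically.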
  • instead of two instruments, the distal ends of the two jaws of a pair of forceps can also be used.
  • a line generated by a laser source, which is projected obliquely onto the object by the auxiliary instrument, can form, from the camera's point of view, a contour line that enables a judgment of the topology of the object captured by the camera, also from the lateral direction.
  • This is particularly favorable for gaps or cracks in the object, which are to be measured to scale for repair purposes.
  • spatial unevenness that leads to different magnifications can be recognized, so that it can be specifically taken into account when selecting the areas for local measurement.
  • the system according to the invention is suitable not only for endoscopic applications, but also for open repairs in which the work area of the workpiece is not accessible for direct adjustments, whether because it is too sensitive for repeated manipulation or because its surface properties only become recognizable to the human observer through additional image processing. In the medical field, these would be open surgeries in which implants are to be fitted without repeated trying-on, while avoiding direct contact with the patient's body.
  • the auxiliary instrument can also be used as a pointer (cursor) or to trigger the function.
  • the instrument can, as a "mouse replacement", effect function selection and control by means of menu selection and/or gestures, with auxiliary geometric variables for orientation in the image. It is important that the reference distances of these auxiliary variables for gesture orientation in the image are adapted to the current image scale, so that the movement strokes to be carried out with the instrument and monitored on the screen remain constant. This ensures that small and constant movement strokes, corresponding to the free space available during the operation, are reliably detected and processed at the respective image scale.
  • adjusting means are provided for changing the image size or comparison distances reproduced on the monitor according to the local scale factor, in the manner of a digital zoom. The geometric distance value forms an input for adjusting the image size on the monitor according to a predetermined scale factor, which is included as a multiplier in the information representing a given geometric distance, such that an object contained in the image content whose geometric dimension corresponds to the predetermined geometric distance is displayed on the monitor with a corresponding dimension multiplied by the predetermined scale factor.
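The digital-zoom adjustment can be sketched as below. This is an assumed illustration: the monitor resolution parameter and the function name are not from the patent:

```python
# Hypothetical sketch: choose a digital zoom so that a reference feature of
# known real size appears on the monitor at a predetermined scale factor
# (e.g. life size). The geometric distance value enters as the real size.

def zoom_factor(real_size_mm, imaged_size_px, monitor_px_per_mm, scale_factor=1.0):
    """Digital zoom needed to show the object at scale_factor times life size."""
    target_px = real_size_mm * monitor_px_per_mm * scale_factor  # desired on-screen size
    return target_px / imaged_size_px

# A 10 mm feature imaged at 40 px, on a monitor with 4 px/mm, shown life size:
print(zoom_factor(10.0, 40.0, 4.0))  # 1.0 (no zoom needed)
```

With `scale_factor=2.0` the same feature would be shown at twice life size, which corresponds to the multiplier described above.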
  • the comparison with the instrument in its current position in the image thus not only makes it possible to survey the depicted object in real time, but also allows further objects or distances shown in the image to be related, during the movement of real objects, to the local image scale at which the object is located, so that objects or events in the current image can be processed geometrically correctly.
  • measured values can also be read from an instrument serving as a measuring instrument whose geometric surface changes in proportion to a physical quantity.
  • An example of this is a spring balance whose extension length is proportional to the force acting on it.
  • FIG. 1 shows the principle of a first embodiment of the system according to the invention with a surgical instrument and an endoscope camera
  • FIG. 1a shows the visible in the screen display in the configuration of FIG. 1,
  • FIG. 2 shows a block diagram as an exemplary embodiment of an evaluation circuit according to the invention
  • FIG. 3 is an expanded block diagram as an embodiment of the invention with further possibilities of signal processing
  • FIGS. 4 and 4a show the principle of an embodiment of the system according to the invention with two surgical instruments and an endoscope camera in a block diagram
  • FIG. 5 is a block diagram of the embodiment of FIG. 4,
  • Figs. 6, 6a and 6b show three illustrations of embodiments of auxiliary instruments according to the invention
  • FIGS. 7 and 7a show a further variant of an auxiliary instrument according to the invention
  • FIG. 8 shows a further block diagram for signal processing according to an embodiment of the invention
  • Fig. 9 shows an embodiment of a surgical auxiliary instrument with means for detecting tensile or compressive forces
  • FIG. 10 shows an embodiment of an arrangement for controlling functions directly through the auxiliary instrument by means of gestures or menu selection, wherein the movement distances in the image are adapted to the size of the representation of the auxiliary instrument.
  • an object 1 is provided in the region of the receiving area of an endoscope camera 2 and is, for the sake of simplicity, shown here as two families of intersecting straight lines.
  • the camera has an optical axis 21, and its receiving area is bounded by a cone, which is represented in the figure by its boundary line 22. Due to the usually short focal length of an endoscope camera, an imaged object shrinks sharply with increasing distance from the camera, which is associated with a significant change in scale, so that accurate measurements using the recorded image information are not readily possible.
  • an auxiliary instrument 3 forming a forceps has a shaft 31, an insert 32 and a (distal) working end 33, which may for example form the jaws of the surgical forceps.
  • Fig. 1a shows approximately the associated image that a surgeon sees on the monitor.
  • the object 1 is correspondingly seen in plan view from the camera perspective, together with the instrument 3, the shaft 31 and the insert 32 as the carrier of the (distal) working end (forceps jaws 33).
  • a virtual reference point 101 is located near the end of the instrument, at the real point where the instrument would touch the object 1 in its normal posture.
  • a measuring beam 102 emanates from the reference point 101 and is provided with scale marks 103.
  • This measuring beam 102 extends, in the simplest case, in the image plane in continuation of the direction of the jaws shown, so that its direction can be controlled by orienting the real jaws.
  • the graduation on the measuring beam is divided, for example, into lengths of two millimeters and adjusted so that it is adapted, in the image reproduction, to the scale of the object at the reference point 101.
  • This adaptation is, as described in more detail with reference to Fig. 2, derived from the image of the distal region (jaws 33) of the auxiliary instrument 3.
  • the surgeon can make length measurements, the scale decreasing with the distance of the distal end of the auxiliary instrument from the camera lens. In this way, precise information about the size of surgical objects can be obtained so that on the one hand a precise logging is possible and on the other hand also implants etc. can be precisely selected.
  • an additional display is virtually inserted, which is based on a distance information that is derived from the image of the auxiliary instrument in the camera image.
  • the appearance of the virtual overlay in the image is at the same time the confirmation that the electronic evaluation (to be described below) was performed properly, so that an immediate check of the correct operation of the calculation process is provided.
  • the surgeon can handle the virtual scale like a real measuring rod attached to his instrument, but without the hindrance that a real one would cause in his way of working. He can switch the virtual measuring rod on and off as needed. It is electronically generated according to the location and orientation of the instrument and precisely displays the length to be measured on the measurement object in the selected unit of measurement.
  • the image information forming the output signal of the endoscope camera 2 is transmitted continuously to the monitor 100, which displays the current video information without interruption in real time.
  • the signal passes to a frame memory unit 101, in each of which a frame from the current video signal is held for processing. This serves as the basis for the adjustment operations to be described below.
  • the detection of the reference-forming part of the auxiliary instrument is significantly accelerated if, once it has been found for the first time and its location in the image has been captured, only the differences due to recent movements need to be evaluated in subsequent recalculations, so that the movement of the instrument is easily tracked after the initial detection.
  • the individual image captured in the frame memory unit 101 is examined in a first comparison unit 102 with (coarse) detection means for the presence of an image portion which is identical to the image of the distal end 32 of the auxiliary instrument stored in the first memory unit 103.
  • This comparison image is located in the memory segment 103a.
  • a search algorithm is used of the kind employed for content-based image search. Initially the search criterion is set coarse, i.e. out of focus, and is preferably directed to an easily recognizable segment of the instrument 3, such as the shaft.
  • the information of the memory unit 103 is guided, in the manner of a "template", by continuous row and column offsets over the image content in the memory 101, and after subtraction of the respective image components to be compared, a kind of difference image is obtained, which is fed to a subsequent evaluation unit 104 serving as a detector means. If a match criterion, which is determined by an inverse integration over the difference image obtained for the selected part of the instrument, exceeds a predetermined threshold value, a trigger signal 105 is output together with the reference location of the find in the evaluated camera image. The match criterion is met where the difference image is weakest. Corresponding correlation methods are also suitable.
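The row/column template sweep with a difference criterion can be sketched in a few lines. This is a minimal assumed illustration on tiny integer arrays, not the patent's implementation; a sum of absolute differences stands in for the difference image, and the weakest (lowest) score plays the role of the match criterion:

```python
# Hypothetical sketch of the coarse comparison: shift the stored template over
# the frame by row and column offsets, compute a sum-of-absolute-differences
# (SAD) score at each offset, and report the best match and its location
# (the reference location of the find).

def match_template(frame, template):
    """Return (best_row, best_col, best_sad) of the template in the frame."""
    fh, fw = len(frame), len(frame[0])
    th, tw = len(template), len(template[0])
    best = None
    for r in range(fh - th + 1):          # row offset
        for c in range(fw - tw + 1):      # column offset
            sad = sum(abs(frame[r + i][c + j] - template[i][j])
                      for i in range(th) for j in range(tw))
            if best is None or sad < best[2]:
                best = (r, c, sad)
    return best

frame = [[0, 0, 0, 0],
         [0, 9, 8, 0],
         [0, 7, 6, 0],
         [0, 0, 0, 0]]
template = [[9, 8],
            [7, 6]]
print(match_template(frame, template))  # (1, 1, 0): exact match at row 1, col 1
```

A real system would threshold the score before triggering and restrict the sweep to the narrowed search field once the instrument has been found.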
  • a reference location which is communicated via the connection 107 to the comparison unit 102 in order to determine the narrowed search field is recorded in the memory 103 in association with the "search template" held in the memory part 103a.
  • the reference location here can also lie outside the comparison image recorded as a search template, as shown in FIG. 2a.
  • comparison images held in the memory 103 can also be the corresponding instrument outlines, which may be sufficient for finding the sought-after object in the original image.
  • an image detail unit 106 is actuated on the basis of the found reference location; it selects from the image content present in the memory 101 a smaller image detail surrounding the found reference location, in order to speed up the subsequent fine detection by reducing the amount of data. If, as in the example under consideration, the distal part of the auxiliary instrument used for the fine detection is not identical to the range used in the coarse detection, the selection of the image section includes a geometrical offset between the reference location of the detected first sub-element (shaft 31) and the distal element to be searched (the jaws), with which the fine adjustment is performed, corresponding to the real conditions. In the illustrated example this is a displacement of the reference location into a search area along the shaft carrying the jaws, towards the center of the image.
  • a fine adjustment of the distal instrument end is then carried out with further detail images recorded in the memory 103b, which are not shown in detail here.
  • during this adjustment, a rotation about the axis of the viewing direction (i.e., perpendicular to the viewing plane) and a shift in two coordinate directions also take place, so that the object to be searched is compared in all of its possible manifestations.
  • a correlation technique is suitable here, which outputs a signal dependent on the degree of the match found.
  • the image size applied in the process, the coordinate alignment and additionally the swivel angle about the axis of the line of sight are output.
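The fine adjustment just described varies the comparison image over shifts, in-plane rotation and size until the match is maximal, and outputs the winning parameters. The sketch below restricts rotation to 90° steps (`np.rot90`) and magnification to integer factors (`np.kron`) purely to keep the example short and dependency-free; a real system would interpolate intermediate angles and scales. All names are invented, not from the patent.

```python
# Simplified fine-adjustment search over shift x rotation x scale.
import numpy as np

def sad_score(window, template):
    """Negative sum of absolute differences: higher is a better match."""
    return -np.abs(window.astype(float) - template).sum()

def fine_match(frame, template, scales=(1, 2)):
    """Return (best_score, (row, col, quarter_turns, scale))."""
    best = (-np.inf, None)
    fh, fw = frame.shape
    for k in range(4):                        # in-plane rotation, 90-degree steps
        rot = np.rot90(template, k)
        for s in scales:                      # integer magnification by block repetition
            cand = np.kron(rot, np.ones((s, s)))
            th, tw = cand.shape
            for r in range(fh - th + 1):
                for c in range(fw - tw + 1):
                    score = sad_score(frame[r:r+th, c:c+tw], cand)
                    if score > best[0]:
                        best = (score, (r, c, k, s))
    return best
```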
  • the position and orientation of the relevant part of the auxiliary instrument in the captured camera image is thus clearly defined by a reference point of the comparison image selected by the matching criterion, and an associated orientation vector whose magnitude forms the scale information.
  • the orientation vector is determined by the viewing direction of the comparison image and its rotation in the image plane.
  • a reference (distance) information provided in the second memory means fixes, in the captured camera image, the relative geometric association between the starting point of the vector identifying the retrieved comparison image and the location which serves as the virtual reference point for the instrument part inserted into the image.
  • this assignment of the virtual reference point is predefined in each of the comparison images.
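The geometric association described in the last two bullets can be sketched as follows: the stored comparison image carries a fixed offset to its virtual reference point; after matching, that offset is rotated by the found in-plane angle and stretched by the scale factor before being added to the reference point found in the camera image. A minimal sketch under assumed 2D image coordinates; names are illustrative.

```python
# Place the virtual reference point from the match result (reference point,
# in-plane rotation angle, scale factor) and the stored offset.
import math

def virtual_reference_point(ref_xy, stored_offset_xy, angle_rad, scale):
    """Rotate the stored offset in the image plane, apply the scale factor,
    and add it to the matched reference point."""
    ox, oy = stored_offset_xy
    rx = math.cos(angle_rad) * ox - math.sin(angle_rad) * oy
    ry = math.sin(angle_rad) * ox + math.cos(angle_rad) * oy
    return (ref_xy[0] + scale * rx, ref_xy[1] + scale * ry)
```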
  • the reference point for the virtual overlay in the camera image results directly from the stored comparison image, provided that the position of maximum agreement has been correctly determined in a fine-detection unit 108 by image selection, size variation and shift, in accordance with the picture taken by the camera.
  • the point at which the instrument concerned touches an adjacent surface is thus fixed and can be superimposed directly into the current image due to the information held in memory 103c.
  • Fixedly connected to the found image is also a reference direction that corresponds to the orientation of the auxiliary instrument. Normally this is the orientation of the jaws.
  • a virtual information item is read from the second memory 103c, adapted in size by the scale factor, and displayed via the mixing unit 109 either directly in the current video image or, together with the captured still image, as picture-in-picture information in a corner of the current picture on the monitor 100.
  • Additional information, such as the radial measuring rod shown, is likewise reproduced to scale in the current picture from the corresponding memory 110 via a zoom stage 111 and an image synthesis stage 112.
  • the measurement plane runs parallel to the image plane, since the measurement beam is displayed to scale in it.
  • locating the relevant portion of the auxiliary instrument is facilitated if its position has been captured in a previous frame and, in the subsequent frames, only the neighborhood of the previous position in the camera frame is searched.
  • the finding of the instrument part can be improved by replacing the comparison images stored in the memory with current images of the relevant part from the recent history, or by partially overwriting them with these.
  • a coherent representation of a three-dimensional model can also be provided here, which is adapted, enlarged, reduced or shifted according to a CAD representation depending on the viewing direction.
  • the image data of the currently used instruments are expediently inserted or selected in the memory before the operation.
  • instead of the views used for comparison, only contour images may be stored and compared.
  • the system presented so far makes it possible to use a conventional surgical auxiliary instrument, without significant changes, to control a virtual measuring rod which performs measurements in the image plane starting from a reference point virtually associated with the auxiliary instrument in the surgical field; the measuring rod is adapted to the image scale that applies at the camera distance at which the virtual reference point of the auxiliary instrument is located.
  • the direction of the measuring rod follows the orientation of the instrument, resulting in a very simple handling.
  • An arrow 113 in FIG. 2 indicates that the coarse determination of the location of the instrument with the detector means 102 according to FIG. 2 does not always have to be based on the entire camera image. Once the desired part of the instrument has been found, it is sufficient, after a movement of the same, to search during the next cycle only the nearer surroundings of the place where the instrument part was previously located. In this way, the measuring rod (or other graphic information to be displayed, for example a numerical value) can be carried along continuously with the position of the instrument in the current image.
  • In FIG. 3 a more complex system is shown, illustrating how the information found with the means shown in Fig. 2 can be supplemented by further representations that expand the information available to the treating surgeon.
  • various graphics or measured variables obtained to scale from the current image are inserted, after selection and adjustment by appropriate actuating elements, into the current image or into an additional still image in the manner of a picture-in-picture presentation.
  • a trigger unit 201 is provided, which is started, for example, by an external actuating button attached to the camera unit together with other operating elements.
  • a still-picture memory unit 202 (similar to the unit 101 in Fig. 2) is provided, in which a single suitable picture is selected from the current video signal of the camera 2 and recorded for use in the further processing. This selection is assisted by a cursor input unit 203, with which additionally a local assignment can be made.
  • the outputs of the units 201-203 control a computing unit 204, which includes a graphics processor to implement the desired processing steps.
  • the processing part of the means shown in Fig. 2 is included in block II, so that true-to-scale graphic information relating to the position of one (or more) auxiliary instruments for further processing or display can be obtained.
  • the output of the computing unit controls, depending on the selected processing program, a graphics unit 205 in which graphical representations are stored that complement the elements supplied by the unit II (according to FIG. 2). These include menu selection boxes or evaluation windows for numeric measurement data.
  • the insertion into the still image stored in the unit 202 takes place in the mixing unit 206, followed by an adaptation to the output screen size in a subsequent zoom unit 207.
  • the display as sub-picture in the playback of the monitor takes place in a PiP mixing unit.
  • the generated image can also be displayed on a separate tablet monitor 208, which forms a separate playback device that can be freely moved to a desired display position as a WLAN-enabled wireless screen.
  • the control information for the special tablet monitor 208 is mixed in a unit 209, and in a further control unit 210 the data from a module VII, shown in more detail in Fig. 7, are mixed in as well. These are log data derived from the movement of the auxiliary instrument for logging purposes, and gesture data, likewise obtained from the movement of the auxiliary instrument but used to control the system, as described in more detail below.
  • a measurement by means of two instruments 3 and 4 is shown in principle.
  • the two instruments are identified by markings 34 and 44 attached to their inserts 32 and 42 in the form of circular rings of different numbers.
  • other markers may be attached to the instruments which can be identified by means of another physical signal transmission. These include, for example, RFID tags.
  • the camera 2 is equipped with an additional laser source, which emits a laser beam 48 that generates a laser mark 49 on the object 1.
  • Its distance to the intersection 50 of the optical axis with the object defines another reference for determining the local scale, as described in the applicant's earlier patent application.
  • the object 1 should consist in this embodiment of a plane inclined in space, which is not directed perpendicular to the optical axis 21 of the camera 2.
  • a measurement in the camera image shown in FIG. 1 is not readily possible, because the scale of the object reproduced on the monitor differs depending on the distance from the camera optics, i.e., in the respective areas of the plane of the object 1.
  • the image scales valid at these points can, however, also be calculated.
  • Fig. 4a shows the corresponding monitor image.
  • the laser marking 49 can be seen on the object, which is generated by a laser source attached to the shaft of the endoscope camera 2. Due to the known distance of the laser mark 49 in the monitor image 4a, which corresponds to the real distance of the laser source from the intersection 50 with the optical axis of the camera, the magnification for this imaging range can be calculated, as described in the earlier patent application of the applicant.
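The magnification computation referenced here can be sketched with a simple pinhole model: because the laser source is offset from the optical axis by a known real distance, the pixel distance between the laser mark and the image of the axis intersection yields the local scale, and together with the focal length the object distance. This is a generic illustration of the principle (the exact method is in the applicant's earlier application, not reproduced here); all parameter names are assumptions.

```python
# Pinhole-model sketch: laser beam parallel to the optical axis at a known
# lateral offset; the imaged offset shrinks with object distance.
def local_scale(pixel_offset, real_offset_mm):
    """Local scale (pixels per millimetre) in the plane of the laser mark."""
    return pixel_offset / real_offset_mm

def object_distance(focal_length_px, pixel_offset, real_offset_mm):
    """Pinhole relation Z = f * X / x for the parallel-offset beam."""
    return focal_length_px * real_offset_mm / pixel_offset
```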
  • Connected to the ends 33 and 43 are the virtual reference points 51 and 52 recognizable in FIG. 4a, which mark the endpoints of a distance 53 to be measured; this distance can be freely selected by appropriately positioning the ends 33 and 43 of the instruments on the object 1.
  • the distance 53 to be measured is displayed in the current image as a virtual connection of the points 51 and 52 associated with the instruments, so that it appears on the monitor as shown.
  • an additional digital display 54 is shown, which presents the length of the distance 53, currently calculated by the associated data system, in a selected unit of measure. Other arithmetic operations can also be selected; for example, the area of the triangle spanned by the points 49, 51 and 52 can be evaluated digitally and displayed accordingly.
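The triangle-area evaluation mentioned above is, once the three reference points are available in a common metric scale, a direct application of the shoelace formula. A minimal sketch; the assumption that rectified 2D coordinates are already available is mine.

```python
# Shoelace formula for the area of the triangle spanned by three points
# given in metrically rectified 2D coordinates.
def triangle_area(p1, p2, p3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2.0
```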
  • the screen display is corrected so that the object plane is displayed in an equalized top view, as if the camera were located vertically above it. This is achieved by applying a trapezoidal distortion to the image such that the surroundings of the reference points 50 to 52 are displayed on the same scale. This representation differs from that shown in Fig. 4a. The calculation method required for this will be described with reference to FIG. 5.
  • The reference scale is the local scale at the laser marking 49.
  • the image in the region of these points is correspondingly compressed or stretched, in each case in the direction of the opposite side of the dashed triangle with the points 49, 51 and 52 in Fig. 4a.
  • a corresponding keystone distortion is applied, which optically corresponds to an inclination of the plane about the triangle side opposite the respective point, but is mathematically achieved by an image distortion that increases or decreases in the direction of the perpendicular bisector of each triangle side.
  • the object 1 is linearized on the screen so that length measurements can be made correctly.
  • the scale is preferably set to 1: 1 by means of a corresponding zooming operation, so that in particular on the auxiliary screen 208 according to FIG. 3, immediate shape adjustments can be made.
  • the line between the virtual reference points 51 and 52 can also be measured without trapezoidal equalization and without a third reference point 49, if what matters is not the true-to-scale representation of the entire surface but only the length of the straight line.
  • the length of the straight line in the case of a spatially oblique object surface can be determined from the respective camera distances of the reference points 51 and 52, which can be calculated from the image size according to the intercept theorem.
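The length determination just described can be sketched with a pinhole model: each reference point's image position is back-projected to its known camera distance (obtained via the intercept theorem from the imaged instrument size), and the Euclidean distance between the two 3D points gives the line length. A sketch under an assumed centred pinhole camera; names are illustrative.

```python
# Back-project two image points to their known camera distances and measure
# the 3D segment between them.
import math

def backproject(px, py, depth, focal_px):
    """Pinhole back-projection of an image point (pixels from the principal
    point) at the given camera distance."""
    return (px * depth / focal_px, py * depth / focal_px, depth)

def segment_length(p1_img, z1, p2_img, z2, focal_px):
    x1 = backproject(*p1_img, z1, focal_px)
    x2 = backproject(*p2_img, z2, focal_px)
    return math.dist(x1, x2)
```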
  • the data processing required for this (as part of the zoom unit 207 in FIG. 3) is shown schematically in block form.
  • the local scales and image positions of the instruments or the laser markings 50, 51 and 49 are fed to the blocks 301, 302 and 303 as input variables. From this, in the arithmetic unit 304, as indicated above, the (double trapezoidal) distortion to be applied to the image representation is calculated.
  • the current graphical representation is transferred to the image memory 305 and subjected to the distortion detected in the arithmetic unit 304 in the distortion unit 306. The output then takes place via the further intermediate memory 307.
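The rectification performed in units 304/306 can be illustrated in simplified form: given the three reference points in the image and their true-scale target positions, a transform is fitted that displays all three neighborhoods at the same scale. The patent describes a (double trapezoidal, i.e. projective) distortion; the sketch below approximates this with an affine map fitted to the three points, which is the minimal exact solution for three correspondences. Names and the affine simplification are my assumptions.

```python
# Fit the 2x3 affine transform mapping the image-plane triangle of reference
# points onto its true-to-scale (metric) counterpart, then warp points with it.
import numpy as np

def rectifying_affine(img_pts, metric_pts):
    """Solve the six affine coefficients from three point correspondences."""
    A, b = [], []
    for (x, y), (u, v) in zip(img_pts, metric_pts):
        A.append([x, y, 1, 0, 0, 0]); b.append(u)
        A.append([0, 0, 0, x, y, 1]); b.append(v)
    coef = np.linalg.solve(np.array(A, float), np.array(b, float))
    return coef.reshape(2, 3)

def apply_affine(M, pt):
    x, y = pt
    return (M[0, 0] * x + M[0, 1] * y + M[0, 2],
            M[1, 0] * x + M[1, 1] * y + M[1, 2])
```

A full keystone (projective) correction would instead use four correspondences and a homography; the affine fit only equalizes the scales at the three reference points.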
  • predefined movements of the instrument (in free space), movements in association with selection menus shown on the screen (not shown here), the described change in shape, or a combination of these can be evaluated as gestures, as described in more detail below.
  • Fig. 6 another embodiment of the distal end of an auxiliary instrument is shown, the distal end 33 is formed by the jaws of a surgical forceps.
  • These forceps are characterized in that engagement means 60 are provided on the underside surface coming into contact with the object, which by means of a frictional connection with a particularly resilient surface enable a holding action, so that the end of the instrument is securely held in the desired position and secured against slipping.
  • the instrument can thus be kept in safe contact with an organic surface using light pressure, so that the measuring points selected with two instruments remain safely fixed until the moment at which the measurement takes place.
  • as engagement means, all measures that prevent unwanted displacement on the surface to be treated are suitable, such as roughening, knobs, toothing, profiling, corrugations or cords.
  • Fig. 6a shows a detail of the distal end of an embodiment of an auxiliary instrument for use with the invention, which is particularly suitable as a reference and pointing instrument.
  • the screen presentation is shown again.
  • the insert 32 is designed as a rod-shaped region, which carries two balls 61 and 62 of different diameters, which are arranged at a distance 66 from each other.
  • the outside ball 62 has a smaller diameter 67 than the inside ball 61 with the diameter 68.
  • the balls are particularly easy to identify in the camera image with digital evaluation, since from all spatial directions they each present the same outer contour (with different diameters). With known dimensions, their spatial positioning, the orientation of the instrument and the local magnification can thus be deduced from the image of the balls in the camera image.
  • the virtual reference point is the end 63 of the rod-shaped region 32, which also initially comes into contact with an adjacent object surface, so that the instrument shown in this figure is particularly suitable for accurately positioned selection of measurement points or for pointer applications.
  • the distance 69 is the internal reference information which relates the external reference point of the instrument to an internal reference base (here the center of the lower ball).
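The evaluation of the two-ball marker can be sketched as follows: each ball's apparent diameter against its known real diameter gives the local scale at its depth, the line through the two ball centres gives the instrument axis, and the virtual tip is extrapolated from one ball centre along that axis by the known internal reference distance. This is a sketch of the principle only; the direction of extrapolation and all names are my assumptions.

```python
# Local scale and tip extrapolation from the two-ball marker, in 2D image
# coordinates (ball centres assumed already detected).
def ball_scale(apparent_px, real_mm):
    """Local scale (pixels per millimetre) at the depth of the ball."""
    return apparent_px / real_mm

def tip_position(inner_c, outer_c, tip_dist_mm, scale_px_per_mm):
    """Extrapolate from the inner ball centre along the marker axis by the
    internal reference distance, converted to pixels via the local scale."""
    dx, dy = outer_c[0] - inner_c[0], outer_c[1] - inner_c[1]
    n = (dx * dx + dy * dy) ** 0.5
    ux, uy = dx / n, dy / n
    d = tip_dist_mm * scale_px_per_mm
    return (inner_c[0] + ux * d, inner_c[1] + uy * d)
```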
  • the cursor 63 is generated in the screen, indicating that the instrument 31 has been correctly recognized in its positioning.
  • the position of the cursor is relative to the outer frame 64 of the screen on which the display is made.
  • the optical evaluation by matching according to FIG. 2 is particularly simple, because the balls 61 and 62 appear the same from all sides, so that the axial alignment is irrelevant and therefore only one spatial direction must be taken into account.
  • the design can therefore be used under difficult conditions or for calibrating a system according to the invention.
  • FIG. 6b shows that not only the end 33 of an instrument can be used to determine a position and a local scale; a signaling in the manner of a gesture is also possible with the instrument, if not only its position in the camera image but also its shape is evaluated.
  • This change in shape is effected in the illustrated example by spreading the two jaws 64 and 65, which can be triggered from outside the observation space via the proximal end of the instrument. (The changed shape is recognized by the above-mentioned detection and evaluation means as a separate instrument figure.)
  • a line-shaped laser marking 75 is generated in a region 74 of varying height by the radiation 72, 73 of a laser line source 71, which is mounted on the insert 32 adjacent to the lower end of the shaft 31.
  • the unevenness 74 of the surface of the object 1 can be recognized in perspective from the imaging of the laser line 75, because of its oblique incidence relative to the optical axis of the camera.
  • if a second instrument 4 is held such that the connecting path 78 formed by the virtual reference points is crossed by the laser line 75, not only can the straight line 78 be displayed for measurement on the screen, but the laser line 75 can also, on the basis of the known dimensional relationships, be converted to scale into a profile to be displayed on the screen, as shown in Fig. 7a. Accordingly, the perspective view shown in Fig. 7 is provided with displayed figures.
  • Since the laser line is projected obliquely onto the object (represented here in a stepped manner), it forms, as seen from the camera, a contour line that allows an assessment of the topology of the object recorded by the camera. This is particularly useful for gaps or cracks in the object, which should be measured accurately for repair purposes. Even with a two-dimensional camera system, a three-dimensional overview of the object can be obtained in this way without lengthy subsequent evaluations.
  • the topographical course of the surface is determined by evaluating the course of the laser line in the camera image on the basis of the known mathematical relationships, and is superimposed in the running image as a contour line. In this way, for example, the depth of cracks can be determined by measuring exactly where they are crossed by the laser line. This allows the user, in the event of defects, a precise assessment of the measures to be initiated.
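The laser-line triangulation underlying this passage can be sketched with the simplest geometry: when the line is projected at a known oblique angle, a lateral shift of the line in the (rectified) image is proportional to the local surface height, with h = shift / tan(angle). This flat-geometry model is my simplification of the "known mathematical relationships" the patent refers to; names are illustrative.

```python
# Height profile from laser-line displacement, assuming an orthographic view
# and a laser plane inclined at a known angle to the surface normal.
import math

def height_from_shift(shift_mm, incidence_angle_rad):
    """Surface height causing the observed lateral shift of the laser line."""
    return shift_mm / math.tan(incidence_angle_rad)

def profile(shifts_mm, incidence_angle_rad):
    """Convert a sampled line displacement into a height profile."""
    return [height_from_shift(s, incidence_angle_rad) for s in shifts_mm]
```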
  • In Fig. 8 it is shown schematically, by means of a block diagram, how logging of the instrument movements can be used on the one hand for recording the operation history and on the other hand for evaluating instrument movements to control connected devices or to trigger other signals in the sense of gesture analysis. In both cases this involves the sequential storage of instrument positions, orientations and shape states. These states are logged with the associated timestamps in the memory 81. The storage optionally takes place in context and spatial allocation to an on-screen menu (not shown here).
  • the stored state data from block 81 pass to an evaluation unit 82, where, by comparison with predetermined time and location conditions, the determined sequence of instrument positions and orientations is evaluated as a gesture for controlling predefined processes, which are addressable in a memory 83 and are passed on to a control unit 85 for execution.
  • alternatively, the designation provided for this purpose is stored in a memory for the record of the surgical procedure in progress and transferred to the output unit 86.
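The logging and gesture path of Fig. 8 can be sketched as a timestamped state log plus a matcher that compares the most recent states against a predefined sequence within a time window. A minimal illustration; the state tuple, the matching rule and all thresholds are my assumptions, not the patent's conditions.

```python
# Timestamped state log (block 81) and a simple sequence-based gesture
# detector (block 82/83).
def log_state(log, t, position, orientation, shape):
    """Append one instrument state with its timestamp."""
    log.append((t, position, orientation, shape))

def detect_gesture(log, pattern, max_span):
    """True if the last len(pattern) logged positions equal `pattern` and
    occurred within `max_span` time units."""
    if len(log) < len(pattern):
        return False
    tail = log[-len(pattern):]
    if tail[-1][0] - tail[0][0] > max_span:
        return False
    return [p for _, p, _, _ in tail] == list(pattern)
```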
  • an auxiliary instrument 3 is shown in section, which can be used for measurement in such a way that a change visible on its surface, i.e., a change of the instrument shape, transmits a corresponding measured value to the outside.
  • the insert is designed in the manner of a spring balance.
  • a resiliently extending portion 91 is provided at the end with a hook 92 which engages in a flexible portion 93 of the object.
  • the spring-extendable area is pulled out from the outer end via a manual tensile load of the instrument until a required nominal load is reached.
  • the evaluation can take place in accordance with the representation in FIG. 8, whereby different instrument shapes are evaluated instead of different instrument positions; these are assigned to the different evaluation values in the addressed evaluation memory, so that the measured values can be transmitted into the system without auxiliary power or cabling, via the connected camera.
  • an arrangement for controlling functions directly by the auxiliary instrument by means of gesture or menu selection is provided. Functions of the system can be triggered directly by appropriate movements of the distal end of the instrument. Thus, it is no longer necessary to deposit the instrument for controlling the system.
  • the tip of the instrument only needs to be located in a free space, which makes it possible to carry out a relatively small movement in the image plane.
  • the movement distances required for triggering in the image reproduction are adapted to the scale of the size of the representation of the auxiliary instrument. In this way it is ensured that the actual movements of the instrument are independent of the size of the image in the screen, although the motion evaluation is based on the digital data of the image reproduction.
  • a section 120 of the reproduced image the scene visible to the user is shown.
  • the auxiliary instrument 3 is guided with its distal end away from the object, the details of which are not shown in the cutout 120.
  • the instrument 3 is detected by the arrangement shown in FIG. 2 with the camera 2 reproduced there.
  • the image data are processed accordingly so that the image scale in the area of the instrument 3 is available as scale information and the position of the instrument tip as survey information besides the image data.
  • a cursor 121 is reproduced in the region of the distal end of the instrument 3, which moves in synchronism with the instrument tip and displays the determined position of the instrument tip in the image as the surveying information.
  • the image of the cursor is synthetically generated and forms a virtual mark on the distal end of the instrument. Its appearance forms the optical feedback for the correct detection and processing of the image of the instrument by the recording camera 2 according to FIG. 2.
  • selection fields 122 to 125 with the designations A to D are shown in the image.
  • the letters A to D are associated with actuating functions of the system, which can now be triggered directly by a movement of the instrument 3, in such a way that the image of the instrument tip with the cursor 121 reaches one of the selection fields 122 to 125 and at least partially overlays it.
  • An evaluation logic to be described below triggers the assigned function in this case. Examples of assigned functions include: saving a still image, starting or stopping a video recording, saving a timestamp as part of a log, or invoking a submenu.
  • the representation of the selection fields takes place at a distance x from the starting point, which is determined by the position of the cursor 121 in a momentary rest position of the instrument 3.
  • the set of marks 122 to 125, distributed around the current position of the cursor 121, then appears in the image, indicating that a function selection may occur. Since the distance x of the marks from the starting position depends on the scale factor of the image of the instrument 3 (x increases with increasing size of the instrument representation), it is ensured that the paths to be performed by the instrument in the image plane remain substantially the same, regardless of the camera distance and the resulting image display scale.
  • An evaluation of the movements of the instrument in the image plane would also be possible without the illustrated marks, which serve only for orientation (pure gestures). In the sense of a complete menu navigation, more detailed labels can also be displayed in the image or submenus can be called up.
  • the image information from the camera reaches an image mixing part 127, by means of which the synthetically generated virtual overlays are superimposed on the recorded image information.
  • the data characterizing the position of the instrument derived from the block diagram according to FIG. 2 are transferred to a memory 128 for the cursor position via its input 129.
  • the memory 130 for the scale (of the instrument display) receives the corresponding signal from the block circuit according to FIG. 2 via the input 131.
  • the scale is transmitted as a reciprocal value, the magnitude of which likewise increases with increasing size of the representation.
  • signals for generating the display of the markers 122 to 125 in the image are produced, their position being shifted by the distance x (controlled by the scale signal from the block 130) relative to the position of the cursor (controlled by the position signal from the block 128).
  • a timer 133 is used, which cyclically outputs clock signals at times t1 and t2. If the instrument remains at rest for a period greater than t1 (about 1 second), the output signal at t1 reaches the set input of a flip-flop 134, which via a switch 136 switches the signals for generating the markers 122 to 125 through to the image mixing section, so that the operator can recognize that the menu or gesture control is enabled.
  • the display positions of the marks 122 to 125 are now held for the subsequent period. A cursor movement caused by a movement of the instrument in the image plane that does not reach one of the markers 122 to 125 in time resets both circuits to their initial state, via the corresponding signal to the control circuit 132 and the reset input of the flip-flop 134, so that no further function is triggered. Small instrument movements thus remain without effect. When the position of the cursor connected to the instrument tip reaches the position of one of the markers 122 to 125, the marks whose position has not been reached are extinguished; only the selected marker remains displayed, as an acknowledgment of the successful selection.
  • the switch 135 is also turned on, which reduces the output signal for the display position of the selected marker to a logical switching level that triggers the associated function in the system with the pulse at time t2, via one of the outputs 138. This also clears the display of the selected and activated display element on the screen.
  • if a measurement is activated with the instrument, as described above, it may be useful to record the measured value in a still photo.
  • in this case, the instrument is not available for active function selection.
  • this still photo is instead triggered when the cursor simply remains immobile in the middle position until time t2; this position leads to a corresponding function triggering via the line "0" of the outputs 138.
  • the still photo can be prevented, however, if the cursor is merely moved away from the center by the instrument 3, without reaching any of the other markings.
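The timer/flip-flop selection logic of the last bullets can be condensed into a small state evaluation: markers appear around the rest position at a distance x that follows the instrument's display scale; at time t2, a cursor still at rest triggers function "0" (e.g. the still photo), a cursor on a marker triggers that marker's function, and any other movement cancels. A simplified sketch; `base_dist`, `tol` and the marker layout are illustrative assumptions, not values from the patent.

```python
# Simplified menu/gesture selection (units 132-136) evaluated at time t2.
def marker_positions(cursor, x):
    """Four markers A-D placed around the rest-position cursor at distance x."""
    cx, cy = cursor
    return {'A': (cx + x, cy), 'B': (cx - x, cy),
            'C': (cx, cy + x), 'D': (cx, cy - x)}

def select(cursor_at_t2, start_cursor, scale, base_dist=40.0, tol=5.0):
    """Return the triggered function name at t2, or None if cancelled."""
    x = base_dist * scale                      # marker distance follows image scale
    dx = cursor_at_t2[0] - start_cursor[0]
    dy = cursor_at_t2[1] - start_cursor[1]
    moved = (dx * dx + dy * dy) ** 0.5
    if moved <= tol:
        return '0'                             # rest position: e.g. still photo
    for name, (mx, my) in marker_positions(start_cursor, x).items():
        if abs(cursor_at_t2[0] - mx) <= tol and abs(cursor_at_t2[1] - my) <= tol:
            return name                        # marker reached: trigger function
    return None                                # other movement: cancel
```

Because x scales with the instrument's on-screen size, the physical movement required of the instrument stays roughly constant regardless of camera distance, matching the behavior described above.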

EP12717164.3A 2011-02-11 2012-02-11 Système endoscopique de traitement d'image pourvu de moyens générant des informations de mesure géométriques dans la plage de prise de vues d'une caméra numérique optique Withdrawn EP2672915A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102011011671 2011-02-11
PCT/DE2012/200007 WO2012107041A1 (fr) 2011-02-11 2012-02-11 Système endoscopique de traitement d'image pourvu de moyens générant des informations de mesure géométriques dans la plage de prise de vues d'une caméra numérique optique

Publications (1)

Publication Number Publication Date
EP2672915A1 true EP2672915A1 (fr) 2013-12-18

Family

ID=46017743

Family Applications (1)

Application Number Title Priority Date Filing Date
EP12717164.3A Withdrawn EP2672915A1 (fr) 2011-02-11 2012-02-11 Système endoscopique de traitement d'image pourvu de moyens générant des informations de mesure géométriques dans la plage de prise de vues d'une caméra numérique optique

Country Status (4)

Country Link
US (1) US9691162B2 (fr)
EP (1) EP2672915A1 (fr)
DE (1) DE112012000752A5 (fr)
WO (1) WO2012107041A1 (fr)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2879609B2 (fr) * 2012-08-02 2022-12-21 Koninklijke Philips N.V. Définition d'unité de commande d'un centre de mouvement à distance robotique
US10420608B2 (en) * 2014-05-20 2019-09-24 Verily Life Sciences Llc System for laser ablation surgery
FR3036195B1 (fr) * 2015-05-12 2018-05-25 Commissariat A L'energie Atomique Et Aux Energies Alternatives Device and method for observing an object, taking into account the distance between the device and the object.
WO2017054818A1 (fr) 2015-10-01 2017-04-06 Olaf Christiansen Endoscopic image processing system for insertion into a data line
DE112016004461A5 (de) 2015-10-01 2018-06-21 Olaf Christiansen Endoscopic image processing system for surgery with means which generate geometric distance information in the detection range of an optical digital camera
US10565733B1 (en) * 2016-02-28 2020-02-18 Alarm.Com Incorporated Virtual inductance loop
CN109310480B (zh) 2016-07-14 2021-11-05 直观外科手术操作公司 Systems and methods for on-screen menus in a teleoperational medical system
DE102017103198A1 (de) * 2017-02-16 2018-08-16 avateramedical GmBH Device for setting and retrieving a reference point during a surgical procedure
US10010379B1 (en) 2017-02-21 2018-07-03 Novarad Corporation Augmented reality viewing and tagging for medical procedures
WO2018180249A1 (fr) 2017-03-28 2018-10-04 富士フイルム株式会社 Measurement support device, endoscope system, and processor
EP3603478A4 (fr) 2017-03-28 2020-02-05 Fujifilm Corporation Measurement assistance device, endoscope system, and processor
EP3609425A4 (fr) * 2017-04-13 2021-03-03 V.T.M. (Virtual Tape Measure) Technologies Ltd. Endoscopic measurement methods and instruments
CN110799081B (zh) * 2017-07-18 2022-04-05 富士胶片株式会社 Endoscope device and measurement support method
WO2019045144A1 (fr) * 2017-08-31 2019-03-07 (주)레벨소프트 Medical image processing apparatus and method for a medical navigation device
EP3679853A4 (fr) * 2017-09-07 2020-10-14 Fujifilm Corporation Diagnosis support system, endoscope system, processor, and diagnosis support method
KR102102291B1 (ko) * 2017-12-20 2020-04-21 주식회사 고영테크놀러지 Optical tracking system and optical tracking method
JP7022154B2 (ja) * 2018-01-31 2022-02-17 富士フイルム株式会社 Acoustic wave device and method of operating an acoustic wave device
US11287874B2 (en) 2018-11-17 2022-03-29 Novarad Corporation Using optical codes with augmented reality displays
US11625825B2 (en) 2019-01-30 2023-04-11 Covidien Lp Method for displaying tumor location within endoscopic images
US11237627B2 (en) 2020-01-16 2022-02-01 Novarad Corporation Alignment of medical images in augmented reality displays
CN113079314B (zh) * 2021-03-04 2022-09-02 首都医科大学附属北京安贞医院 Visual guidance target for a surgical video acquisition system
US11948265B2 (en) 2021-11-27 2024-04-02 Novarad Corporation Image data set alignment for an AR headset using anatomic structures and data fitting

Family Cites Families (14)

Publication number Priority date Publication date Assignee Title
US5417210A (en) * 1992-05-27 1995-05-23 International Business Machines Corporation System and method for augmentation of endoscopic surgery
US5669871A (en) * 1994-02-21 1997-09-23 Olympus Optical Co., Ltd. Endoscope measurement apparatus for calculating approximate expression of line projected onto object to measure depth of recess or the like
JP4054104B2 (ja) * 1998-04-10 2008-02-27 オリンパス株式会社 Endoscope image processing device
US6806899B1 (en) * 1999-07-17 2004-10-19 David F. Schaack Focusing systems for perspective dimensional measurements and optical metrology
US20020026093A1 (en) * 2000-08-23 2002-02-28 Kabushiki Kaisha Toshiba Endscope system
US8211010B2 (en) * 2002-10-29 2012-07-03 Olympus Corporation Endoscope information processor and processing method
JP5199594B2 (ja) * 2006-03-24 2013-05-15 オリンパス株式会社 Image measurement device and method
JP4999046B2 (ja) * 2006-04-05 2012-08-15 Hoya株式会社 Confocal endoscope system
JP5073415B2 (ja) * 2006-08-28 2012-11-14 オリンパスメディカルシステムズ株式会社 Ultrasonic endoscope
US8248465B2 (en) * 2007-12-27 2012-08-21 Olympus Corporation Measuring endoscope apparatus and program
WO2010061293A2 (fr) * 2008-11-26 2010-06-03 Haptica Limited Système et procédé pour mesurer des objets vus à travers une caméra
JP5361592B2 (ja) * 2009-07-24 2013-12-04 オリンパス株式会社 Endoscope device, measurement method, and program
JP2011069965A (ja) * 2009-09-25 2011-04-07 Japan Atomic Energy Agency Imaging device, image display method, and recording medium on which an image display program is recorded
JP5535725B2 (ja) * 2010-03-31 2014-07-02 富士フイルム株式会社 Endoscope observation support system, endoscope observation support device, operating method therefor, and program

Non-Patent Citations (2)

Title
None *
See also references of WO2012107041A1 *

Also Published As

Publication number Publication date
DE112012000752A5 (de) 2013-11-21
US20150161802A1 (en) 2015-06-11
US9691162B2 (en) 2017-06-27
WO2012107041A1 (fr) 2012-08-16

Similar Documents

Publication Publication Date Title
WO2012107041A1 (fr) Endoscopic image processing system having means which generate geometric measurement information in the detection range of an optical digital camera
EP3076369B1 (fr) Method and device for displaying an object
DE102007033486B4 (de) Method and system for blending a virtual data model with an image generated by a camera or a display device
EP2926733B1 (fr) Triangulation-based surface and depth visualization
EP3363358B1 (fr) Device for defining and retrieving a reference point during a surgical procedure
DE102007054450A1 (de) Device for providing images to a surgeon
DE102014218558A1 (de) User interface and method for automated positioning of an examination table relative to a medical imaging system
DE102012220115A1 (de) Imaging system, surgical device with the imaging system, and imaging method
DE102010029275A1 (de) Method for moving an instrument arm of a laparoscopy robot into a predefinable position relative to a trocar
WO1996020421A1 (fr) Microscope, in particular a stereomicroscope, and method for superimposing two images
DE102014007909A1 (de) Surgical microscope
EP4213755B1 (fr) Surgical assistance system
DE102014102425B4 (de) Microscope system and microscopy method using digital markers
DE102011078405B4 (de) Method for endoscopy with a magnetically guided endoscope capsule and device therefor
WO2013144334A1 (fr) Medical navigation system with a wirelessly connected touchscreen
DE102018119343B4 (de) Method for calibrating objects in a reference coordinate system and method for tracking objects
EP3626176B1 (fr) Method for assisting a user, computer program product, data carrier, and imaging system
DE102010018291B4 (de) Navigation system and X-ray system
DE102020215559A1 (de) Method for operating a visualization system in a surgical application, and visualization system for a surgical application
DE10235795A1 (de) Medical device
EP4124283A1 (fr) Measuring method and measuring device
DE102021207950A1 (de) Method and system for determining the position of at least one object in an operating room
DE102020126029A1 (de) Surgical assistance system and display method
DE102012211396A1 (de) Endoscopy device and endoscopy method
DeLucia et al. Human-centered design of image-guided interventions for minimally-invasive surgeries: Toward a methodology

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20130910

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

17Q First examination report despatched

Effective date: 20170324

RIC1 Information provided on ipc code assigned before grant

Ipc: A61B 5/107 20060101ALI20180102BHEP

Ipc: A61B 1/00 20060101AFI20180102BHEP

Ipc: A61B 90/00 20160101ALI20180102BHEP

Ipc: A61B 1/04 20060101ALI20180102BHEP

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

INTG Intention to grant announced

Effective date: 20180213

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20180626

RIC1 Information provided on ipc code assigned before grant

Ipc: A61B 90/00 20160101ALI20180102BHEP

Ipc: A61B 1/00 20060101AFI20180102BHEP

Ipc: A61B 5/107 20060101ALI20180102BHEP

Ipc: A61B 1/04 20060101ALI20180102BHEP