EP2461759A1 - Procédé de superposition d'une image intra-opératoire instantanée d'un champ opératoire avec une image préopératoire du champ opératoire - Google Patents

Procédé de superposition d'une image intra-opératoire instantanée d'un champ opératoire avec une image préopératoire du champ opératoire

Info

Publication number
EP2461759A1
Authority
EP
European Patent Office
Prior art keywords
image
points
area
characteristic points
preoperative
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP10752557A
Other languages
German (de)
English (en)
Inventor
Thomas Wittenberg
Sven Friedl
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fraunhofer Gesellschaft zur Forderung der Angewandten Forschung eV
Original Assignee
Fraunhofer Gesellschaft zur Forderung der Angewandten Forschung eV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fraunhofer Gesellschaft zur Forderung der Angewandten Forschung eV filed Critical Fraunhofer Gesellschaft zur Forderung der Angewandten Forschung eV
Publication of EP2461759A1 publication Critical patent/EP2461759A1/fr
Withdrawn legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361 Image-producing devices, e.g. surgical cameras
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2065 Tracking using image or pattern recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365 Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/366 Correlation of different images or relation of image positions in respect to the body using projection of images directly onto the body

Definitions

  • The present invention is concerned with a concept for superimposing an intraoperative live image of an operating area with a preoperative image of the operating area, as can be used, for example, to support intraoperative navigation in surgery.
  • Coronary Artery Bypass Grafting is a common surgical procedure in cardiac surgery in developed countries, bypassing severely narrowed or completely occluded coronary arteries in order to restore adequate blood supply to the heart muscle. As with most interventions, preoperative planning is important and routinely done. Various imaging techniques are used to prepare for such an intervention and to assist a surgeon in locating the surgical site of interest.
  • In particular, coronary angiography is a suitable diagnostic procedure for coronary bypass operations.
  • Angiography is the imaging of blood vessels by means of diagnostic techniques, such as X-rays or magnetic resonance imaging (MRI).
  • For this purpose, a contrast agent, i.e. a substance which enhances the contrast in the image or is particularly well visible in the selected examination method, is injected into a blood vessel.
  • The vessel interior filled with the contrast agent then becomes visible.
  • Non-invasive tests cannot yet depict the extent or distribution of anatomical heart disease with sufficient accuracy.
  • Cardiac CT is an emerging and growing field, but is not yet widespread.
  • Angiographic images, i.e. angiograms, show stenoses, i.e. constrictions of blood vessels or other hollow organs, as well as potential positions of anastomoses, i.e. connections between two anatomical structures, such as blood vessels.
  • It is desirable to make preoperative aids, such as preoperative image data and planning data, usable at the point of care (PoC), i.e. at the surgical area, in combination with live images of the operating area.
  • A live image refers to an image recorded and transmitted directly, i.e. in real time. A live image captured with a camera or an endoscope is thus forwarded directly to a display device (e.g. a monitor).
  • Preoperatively obtained angiographic image data are used to plan an intervention. These image data are accessible at the operating table, but only in the form of external screens or printouts. The physician is therefore required to mentally transfer regions of interest from the preoperative image to the current view of the surgical site.
  • The fact that a cardiologist is responsible for the cardiac catheterization and the interpretation of the planning data, whereas a cardiac surgeon places the actual bypass, makes it clear that this can lead to complications.
  • This applies in particular to minimally invasive surgery (MIS), in which the view of the operating area is limited and the preoperative image data are usually obtained with different position and shape parameters than the live images.
  • Coronary vessels are usually visible in angiograms, whereas during surgery they are hidden under a layer of fat on the heart surface.
  • The visibility of anatomical structures is therefore clearly limited, which makes orientation on the heart surface, for example, a challenge.
  • Navigation systems for intraoperative support of the physician are established in the market. These are essentially based on easily recognizable markers or electromagnetic methods for determining position.
  • For this purpose, markers in the form of metallic or similar objects are attached to the patient and/or the surgical instruments; these are identifiable and traceable both in the preoperative, mostly radiologically obtained image and, by means of optical tracking systems, during the intervention itself.
  • Another approach uses an electromagnetic field to determine the current position of the surgical instrument by induction.
  • A superimposed view of the preoperative image data is thereby made possible by adapting them on the basis of the position data of the surgical instrument determined in this way.
  • Common to all previous approaches is that they rely on additional (electro-)mechanical aids. This results in an intervention on the patient that is not possible, or is undesirable, in various situations.
  • For example, the attachment of markers to the patient may have medical or diagnostic side effects.
  • The object of the present invention is to provide an improved concept for assisting intraoperative navigation. This object is achieved by a device having the features of patent claim 1 and a method according to claim 14.
  • The core idea of the present invention is to bring an intraoperative live image of an operating area, i.e. one obtained during a surgical intervention, into the best possible agreement with a preoperatively acquired image of the operating area, e.g. of the heart.
  • The two images to be registered, i.e. the intraoperative live image and the preoperative image, typically differ from each other because they were taken from different positions (i.e. from different perspectives), at different times and/or with different sensors.
  • The intraoperative live image is defined as a reference image. By interactively setting characteristic points or landmarks, an operator can mark specific pixels or image areas of the surgical area in the live image.
  • In the preoperative image, the surgeon can then define points corresponding to the characteristic points of the live image, thereby marking the corresponding points or regions of the surgical area. Based on the characteristic points in the live image and the corresponding points in the preoperative image, a transformation is then performed which adapts the preoperative image as closely as possible to the live image. After the transformation, the respective regions of interest of the live image and the corresponding regions of the preoperatively obtained image are superimposed. By this superimposed visualization, the surgeon can be provided, more or less in the sense of a virtual reality, with an extended view in the form of an image of the operating area combined from the live and the preoperative image.
  • Embodiments of the present invention provide a device for superimposing an intraoperative first live image of an operating area with a preoperative second image of the operating area to achieve the above object.
  • A device according to the invention comprises means for recording and providing the intraoperative live image of the surgical field and means for providing the preoperative image of the surgical field. Further, means are provided for defining characteristic points in the intraoperative first live image, or in a first image derived therefrom, and for defining points corresponding to the characteristic points in the preoperative image, such that the characteristic points and the corresponding points mark mutually corresponding pixels in the first and the second image.
  • A means for transforming transforms the first and/or the second image so that, after the transformation, the characteristic points of the first image and the corresponding points of the second image come to lie at least approximately on top of each other.
  • The apparatus also includes means for visualizing the second, preoperative image superimposed with the first image or with the surgical field.
  • A superimposed visualization of the two images can provide an operator with an "extended" view of the surgical field. According to one exemplary embodiment, this is achieved by overlaying image data of the first and second image after the transformation on a display device, such as a monitor, or, according to another embodiment, directly by projection of the transformed preoperative image onto the surgical area or situs.
  • The marked characteristic points (landmarks) can be followed in temporally successive live images by a suitable method.
  • Methods of motion tracking, in particular point tracking, can be used so that the movements, and thus also the characteristic points or landmarks, can be tracked over time.
  • The image registration or transformation and the resulting overlay are adjusted for each new live image according to the new positions of the characteristic points or landmarks.
  • The inventive concept thus allows, even in complex situations, an extension of the view of an operating area by superimposing or projecting adaptively transformed, preoperatively acquired images onto an intraoperative live image, without additionally using tools attached to the patient or to the surgical instruments. That is, it dispenses with additional technical aids that interfere with the patient.
  • By tracking the positions of the marked characteristic points or landmarks, a meaningful visualization can also be guaranteed on moving organs.
  • A device according to the invention can thus be used for expanded visualization in intraoperative navigation by interactive registration of preoperative image data and intraoperative live images.
  • FIG. 1 shows a schematic representation of a device for superimposing a first image with a second image according to an embodiment of the present invention;
  • FIG. 2 shows a cardiosurgical operating theater;
  • FIG. 3 shows interactively marked characteristic points in an intraoperative live image and corresponding points in a preoperative image;
  • FIG. 4 shows a combined image of a first image superimposed with a transformed second image;
  • FIG. 5 is an illustration of the tracking of characteristic points in successive frames of an intraoperative live image, up to a restriction of the view of the operating area by the intervention;
  • FIG. 6 shows a schematic illustration of a solution path according to an exemplary embodiment of the present invention;
  • FIG. 7 shows a flow diagram of a method for superimposing an intraoperative live image with a preoperative image of the operating area according to an embodiment of the present invention.
  • The device 100 comprises a device 110 for recording and providing the first image 112, i.e. the intraoperative live image, and a device 120 for providing the second image 114, i.e. the preoperatively obtained image.
  • A means 130 for defining characteristic points 131-1, 132-1, 133-1, 134-1 and 135-1 in the first image 112, and for defining points 131-2, 132-2, 133-2, 134-2 and 135-2 corresponding to the characteristic points 131-1 to 135-1 in the second image 114, is also provided.
  • The first image 112 and the second image 114 typically show the operation area 105 from different perspectives.
  • The characteristic points 131-1 to 135-1 and the corresponding points 131-2 to 135-2 mark mutually corresponding pixels or positions in the intraoperative live image 112 and in the preoperative image 114 of the surgical area.
  • The device 100 further comprises a device 140 for transforming the first and/or second image 112, 114, so that the characteristic points 131-1 to 135-1 of the first image 112 and the corresponding points 131-2 to 135-2 of the second image 114 come to lie at least approximately on top of each other after the transformation.
  • A device 150 visualizes, after the transformation, the second image 114 overlaid with the first image 112 or with the operating region 105, so that the overlaid visualization provides an extended or combined view 145 of the operating region.
  • The means 110 for recording and providing the first image 112 comprises a camera or an endoscope and a display device, in order to continuously record the operation area 105 from a first perspective by means of the camera or the endoscope and to display the resulting live image 112 on the display device.
  • In a cardiosurgical operating theater, as shown in FIG. 2, a monitor 210 is available which displays a live image of the intervention or of the surgical area.
  • For this purpose, a camera 215 may be mounted above the operating table, with the monitor 210 preferably placed near the surgeon's 220 site of action.
  • Such a monitor view of the operating region 105 is typically present anyway.
  • The displayed live image 112 of the operation area 105 can therefore be improved by means of an extended visualization of corresponding preoperatively acquired image data 114, such as angiograms or cardiac CT images.
  • According to embodiments of the present invention, the means 120 for providing the second, preoperative image of the operating area therefore comprises a digital memory, in which the preoperative second image 114 is stored and from which it can be retrieved.
  • The second, preoperative image 114 of the surgical area 105 is, for example, an angiography or cardiac CT scan of the surgical area 105 obtained before the operation, i.e. before the live image 112, where the first and second images 112, 114 may each have been recorded from different perspectives.
  • The surgeon 220 or an assistant 230 identifies and marks a plurality of visible and distinct characteristic points 131-1, 132-1, ... in the intraoperative live image 112 of the operating area 105, which may be, for example, the heart surface.
  • Likewise, a plurality of matching or corresponding points 131-2, 132-2, ... are identified and marked in the preoperative image 114.
  • The characteristic points 131-1, 132-1, ... and the corresponding points 131-2, 132-2, ... mark mutually corresponding positions or regions of the operating area 105 in both images.
  • This relationship is shown in FIG. 3. FIG. 3 shows on the left an intraoperative live image 112 of a heart whose wall is covered with a layer of fat, so that the coronary vessels are not visible.
  • The thin dashed lines drawn in represent only contour lines of the heart wall.
  • On the right, an angiogram 114 of the heart is shown, in which the coronary vessels (thick lines) are clearly visible.
  • In the live image 112, characteristic points 131-1 to 136-1 are defined, which mark certain positions of the heart, i.e. of the operation area 105.
  • The device 130 may comprise an electronic input device, such as a keyboard, a trackball, a computer mouse, a light pen or a touch-sensitive display.
  • In the angiogram 114, points or landmarks 131-2 to 136-2, which mark the same positions of the heart, are defined correspondingly to the characteristic points or landmarks 131-1 to 136-1 defined in the first image 112.
  • This definition of the characteristic points 131-1 to 136-1 in the live image 112 and of the corresponding points 131-2 to 136-2 in the preoperative image 114 should preferably be carried out interactively by an experienced surgeon or assistant, by marking the respective image positions of the characteristic points and of the corresponding points and logically linking them to one another.
  • Alternatively, the definition of the characteristic points 131-1 to 136-1 in the live image 112 and of the corresponding points 131-2 to 136-2 in the preoperative image 114 can also be carried out fully automatically, for example by powerful pattern or position detection methods adapted to the image data, by which the respective image positions of the characteristic points and of the corresponding points are identified, marked and logically linked to one another automatically, i.e. without human intervention.
  • The defined characteristic points 131-1 to 136-1 and the points 131-2 to 136-2 corresponding to them mark corresponding locations or positions in the operation area 105 and thus form a basis for a transformation between the two views or images 112, 114. Since the images 112, 114 to be registered have typically been taken from different positions, at different times and/or with different sensors, the images must be adapted to each other accordingly.
  • Image registration is a process for matching two or more images of the same scene or at least similar scenes in the best possible way.
  • Here, image registration is used to map an image of one modality (e.g. angiography) onto an image of a second modality (the live image or video) and, accordingly, to transform at least one of the two images 112, 114 in order to bring the two into agreement with each other.
  • Landmark-based image registration may be used for registration of images derived from different modalities, where image intensity information is difficult or impossible to use.
  • An interactively marked set of mutually corresponding landmarks 131-1 to 136-1 and 131-2 to 136-2 is used to define a transformation which, for example, maps the preoperative second image 114 onto the intraoperative live video image 112, or vice versa.
  • For this purpose, affine mappings or transformations, or alternatively rigid transformations restricted to rotation and translation, may be used.
  • An affine transformation is a mapping between two vector spaces that preserves collinearity and the ratios of distances along parallel lines; preserving collinearity means that points lying on a straight line, i.e. collinear points, are mapped onto a straight line again. Likewise, images of parallel lines are again parallel. Affine transformations are known to produce better matches between two images than rigid transformations. However, rigid transformations can be computed faster and require fewer characteristic and corresponding points 131-1, ... and 131-2, .... (A simple sketch of how such a transformation can be estimated from the landmark correspondences is given at the end of this section.)
  • The means 140 for transforming is adapted to map the second image 114, showing the operation region 105 from a second perspective, onto the first image 112, showing the operation region from the first perspective, by means of an affine image transformation, or vice versa.
  • In other words, the means 140 for transforming is configured to map the second image 114 onto the first image 112, or vice versa, by means of an image transformation based on the characteristic points 131-1, 132-1, ... and the corresponding points 131-2, 132-2, ....
  • According to one embodiment, the transformed or adapted preoperative image 114 can then be superimposed onto the intraoperative live image 112 or projected directly onto the open operation area 105.
  • For the latter purpose, a corresponding projector may be provided above the operating area 105.
  • The alpha channel or α channel is an additional channel in raster graphics which stores, in addition to the color information, the transparency of each pixel. Mixing a foreground image over a background according to this channel is referred to as alpha blending, which allows partial transparency and a kind of extended view of reality, as shown in FIG. 4. (A minimal alpha-blending sketch is given at the end of this section.)
  • FIG. 4 shows a superimposed or combined image 145 of the live image 112 shown in FIG. 3 and the preoperative image 114 transformed in perspective. That is to say, the transformed second image 114 is superimposed on the first image 112 in FIG. 4 with correct perspective, so that a doctor or surgeon 220 can recognize, for example, the coronary vessels 420 running under the adipose tissue 410, as a result of which more successful operations become possible.
  • A scene presented in the live video image 112 will never be entirely constant. For example, the heartbeat, camera movement and the surgical procedure itself permanently change the view of the operating area 105. Therefore, the characteristic points 131-1 to 136-1 and corresponding points 131-2 to 136-2 interactively selected by the surgeon may become invalid after a short period of time (e.g. within seconds). Constant re-selection of characteristic points and corresponding points would not be an efficient procedure. Rather, the mapping between the two images 112, 114 should be adapted to the current image scene and stabilized against movements for a sufficient period of time.
  • For this reason, the characteristic points 131-1, 132-1, ... defined in the first image 112 are traced over time from frame to frame of a live video of the operating region 105, which means that they are automatically re-identified in successive frames. That is, according to an exemplary embodiment of the present invention, the device 110 is configured to provide, as the first image 112, a plurality of temporally successive frames of the operation area 105, wherein tracking means are provided for determining the positions of the defined characteristic points in the temporally successive frames, and the means for transforming is arranged to perform a transformation for each of the plurality of frames based thereon.
  • Template matching refers to the comparison of a prototype of a pattern with a raster image to be examined. For each position in the raster image, a correlation between the prototype and the corresponding image area is determined. The acceptance or rejection of the pattern depends on the correlation, i.e. on the quality of the match. In principle, a window or template is selected in a first frame of the live image sequence for this purpose. Corresponding positions in subsequent frames are then found based on a similarity assumption.
  • The window or template is shifted step by step over the frame to be examined, and a measure of similarity is calculated at each position.
  • To limit the computational effort, the image section to be examined can be restricted. Assuming that the offset of a characteristic point 131-1, 132-1, ... between two consecutive frames is not too large, only a limited region around the original position of the characteristic pixel 131-1, 132-1, ... needs to be considered.
  • In addition, the window or template may be adapted over time to the displayed scene, thereby tolerating small, slow changes in the appearance of the landmark window. (A simplified tracking sketch based on these ideas is given at the end of this section.)
  • In this way, the landmarks 131-1 to 136-1 can be tracked during their motion as long as the change between successive frames is not too large and the view of the operation area 105 is not obscured (see Fig. 5).
  • The partial image at the top left of Fig. 5 shows newly defined characteristic points 131-1 to 136-1 in a first frame of a live video of an operation area 105.
  • The positions of the characteristic points 131-1 to 136-1 are tracked in subsequent frames by a tracking algorithm (see the partial images at the top right and bottom left) until the view of the surgical area 105 is restricted by the surgical procedure itself (see the partial image at the bottom right).
  • The tracked characteristic points 131-1 to 136-1 are used to adapt the mapping of the preoperative image 114 to the current operation scene according to the current frame.
  • This allows a continuous superimposition of the preoperative image 114 on the live image 112 for sufficiently long periods of time.
  • If the view of the operation area 105 is obscured, the similarity measure falls below a threshold, so that the tracking, and thus the alignment of the first and second images 112, 114 with one another, is stopped.
  • In this case, characteristic pixels 131-1, 132-1, ... are newly defined in the live image 112, and corresponding pixels 131-2, 132-2, ... are newly defined in the preoperative image 114, so that an overlay of the images after the transformation can again be represented by means of a combined image 145.
  • FIG. 6 shows an overview of the inventive concept.
  • An up-to-date view of the operating area 105 is recorded, for example, with the aid of a camera 215, so that an intra-operative live image 112 is created.
  • In open surgery, the camera 215 may be mounted above the operating table, for example in an operating room lamp.
  • Alternatively, the optical channel of an endoscope allows generation of the live image 112.
  • The current live image 112 can be made available to the operator 220 on a monitor 210 that is directly accessible to him.
  • He can load preoperatively acquired image data 114 and also display them by means of the monitor 210.
  • The operator 220 can mark landmarks of interest in the current view 112 of the operation area 105.
  • The preoperative image data 114 can then be suitably transformed (140) such that, after transformation and superimposition, the respective regions of interest and the associated regions of the images coincide.
  • The overlaid visualization (145) can provide the surgeon 220 with an improved or extended view of the operating area 105. This can be achieved either by overlaying the image data on the monitor 210 or directly by projecting the transformed preoperative image data onto the situs. In the latter case, the transformed preoperative image is projected directly onto the situs by means of a projector.
  • The corresponding landmarks are tracked with a suitable tracking method (610).
  • For this purpose, methods of motion tracking, in particular point tracking, are used.
  • In this way, the movements, and thus the landmarks, can be tracked over time.
  • The transformation and the resulting overlay are adjusted for each new frame of the live image 112 according to the new landmark positions.
  • In a first step, the first image 112, i.e. the intraoperative live image, is made available. Furthermore, in a second step 720, the second image 114, i.e. the preoperative image of the operating area, is provided.
  • Then, characteristic points 131-1, 132-1, ... are defined in the first image 112 and points 131-2, 132-2, ... corresponding to them are defined in the second image 114, so that the characteristic points and the corresponding points in the first and second image mark corresponding positions of the operation area 105.
  • Subsequently, the first and/or the second image is transformed so that the characteristic points 131-1, 132-1, ... of the first image 112 and the corresponding points 131-2, 132-2, ... of the second image 114 come to lie at least approximately on top of each other.
  • A step 750 of overlaying the second image with the first image or with the operation area follows, in order to obtain the overlaid view 145 of the operation area 105.
  • For this purpose, the second image 114 can be transformed and the first image 112 and the transformed second image 114 can then be superimposed in order to obtain a combined or superimposed image 145, which is displayed in a step 760, for example via a monitor or a projector. Alternatively, the live image 112 could be transformed instead, but this will generally be more complex. (A simplified end-to-end sketch of this procedure is given at the end of this section.)
  • Although aspects have been described herein in the context of a device for superimposing an intraoperative live image with a preoperative image, it will be understood that these aspects also constitute a description of the corresponding method for superimposing a preoperative image with an intraoperative live image, so that a block or component of the device is also to be understood as a corresponding method step or as a feature of a method step.
  • Analogously, aspects described in connection with or as a method step also represent a description of a corresponding block, detail or feature of a corresponding device.
  • Embodiments of the present invention may be implemented in hardware or in software.
  • The implementation may be performed using a digital storage medium, for example a floppy disk, a DVD, a Blu-ray disc, a CD, a ROM, a PROM, an EPROM, an EEPROM or flash memory, a hard disk or another magnetic or optical memory, on which electronically readable control signals are stored that can cooperate, or do cooperate, with a programmable computer system such that the respective method is performed.
  • The digital storage medium can be computer readable.
  • Some embodiments according to the invention comprise a data carrier having electronically readable control signals capable of interacting with a programmable computer system such that one of the methods described herein is performed.
  • Embodiments of the present invention may be implemented as a computer program product having a program code, wherein the program code is operative to perform one of the inventive methods when the computer program product runs on a computer.
  • The program code may, for example, also be stored on a machine-readable carrier.
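
As referenced above in the discussion of affine and rigid transformations, the landmark-based registration can be illustrated with a short sketch. The following Python code is a minimal example, not the implementation claimed in the patent: it estimates a 2-D affine transformation from a handful of corresponding points by linear least squares; the function names (estimate_affine, apply_affine) and the example coordinates are purely illustrative assumptions. In practice a library routine such as OpenCV's cv2.estimateAffine2D could serve the same purpose.

    import numpy as np

    def estimate_affine(src_pts, dst_pts):
        """Estimate a 2x3 affine matrix mapping src_pts -> dst_pts by least squares.

        Needs at least three non-collinear point pairs, e.g. corresponding
        landmarks of the preoperative image (src) and the live image (dst)."""
        src = np.asarray(src_pts, dtype=float)
        dst = np.asarray(dst_pts, dtype=float)
        n = src.shape[0]
        # Design matrix for x' = a*x + b*y + c and y' = d*x + e*y + f
        M = np.zeros((2 * n, 6))
        M[0::2, 0:2] = src
        M[0::2, 2] = 1.0
        M[1::2, 3:5] = src
        M[1::2, 5] = 1.0
        b = dst.reshape(-1)
        params, *_ = np.linalg.lstsq(M, b, rcond=None)
        return params.reshape(2, 3)

    def apply_affine(A, pts):
        """Apply the 2x3 affine matrix A to an (N, 2) array of points."""
        pts = np.asarray(pts, dtype=float)
        return pts @ A[:, :2].T + A[:, 2]

    if __name__ == "__main__":
        # Made-up pixel coordinates standing in for corresponding landmarks
        preop_pts = [(40, 50), (200, 60), (120, 180), (60, 220), (220, 210), (150, 90)]
        live_pts = [(55, 70), (210, 95), (140, 205), (80, 240), (235, 245), (165, 120)]
        A = estimate_affine(preop_pts, live_pts)
        print("affine matrix:\n", A)
        print("max residual:", np.abs(apply_affine(A, preop_pts) - np.array(live_pts)).max())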
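
The alpha blending mentioned in connection with FIG. 4 can be sketched in the same way. This is a minimal, hypothetical example: the perspective-corrected preoperative image is mixed into the live frame with a constant transparency, so that structures such as the coronary vessels would shine through the live view. The array names and the fixed alpha value are assumptions made for illustration only.

    import numpy as np

    def alpha_blend(live_frame, warped_preop, alpha=0.4):
        """Blend the warped preoperative image over the live frame.

        live_frame, warped_preop: uint8 arrays of identical shape (H, W, 3).
        alpha: opacity of the preoperative overlay (0 = invisible, 1 = opaque)."""
        live = live_frame.astype(np.float32)
        overlay = warped_preop.astype(np.float32)
        blended = (1.0 - alpha) * live + alpha * overlay
        return np.clip(blended, 0, 255).astype(np.uint8)

A per-pixel alpha mask, for example opaque only where vessels are visible in the preoperative image, would work analogously by replacing the scalar alpha with an (H, W, 1) array.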
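
The template-matching point tracking referenced above can be illustrated as well. The sketch below is a simplified, assumed implementation rather than the patented method: it cuts a small template around a landmark in the previous frame, searches only a restricted region around the last known position in the next frame using normalized cross-correlation (cv2.matchTemplate with cv2.TM_CCOEFF_NORMED), and returns a match score so that tracking can be stopped when the view becomes obscured. The function name and the window sizes are illustrative assumptions.

    import cv2

    def track_landmark(prev_frame, next_frame, pt, tmpl_half=15, search_half=40):
        """Track one landmark from prev_frame to next_frame.

        pt: (x, y) position of the landmark in prev_frame.
        tmpl_half: half size of the template window around the landmark.
        search_half: half size of the search region around the old position.
        Returns (new_pt, score); a low score suggests the view is obscured."""
        x, y = int(pt[0]), int(pt[1])
        h, w = prev_frame.shape[:2]

        # Template around the old landmark position
        t0, t1 = max(y - tmpl_half, 0), min(y + tmpl_half + 1, h)
        l0, l1 = max(x - tmpl_half, 0), min(x + tmpl_half + 1, w)
        template = prev_frame[t0:t1, l0:l1]

        # Restricted search region, assuming a small offset between frames
        s_t0, s_t1 = max(y - search_half, 0), min(y + search_half + 1, h)
        s_l0, s_l1 = max(x - search_half, 0), min(x + search_half + 1, w)
        search = next_frame[s_t0:s_t1, s_l0:s_l1]

        result = cv2.matchTemplate(search, template, cv2.TM_CCOEFF_NORMED)
        _, score, _, max_loc = cv2.minMaxLoc(result)

        # Best match converted back to full-image coordinates (template centre)
        new_x = s_l0 + max_loc[0] + (l1 - l0) // 2
        new_y = s_t0 + max_loc[1] + (t1 - t0) // 2
        return (new_x, new_y), score

If the returned score falls below a chosen threshold (for example 0.7), the landmark can be treated as lost and the characteristic and corresponding points would have to be re-defined, as described above.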
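
Finally, the overall procedure of FIG. 6 and FIG. 7 (provide the live image and the preoperative image, define corresponding points, transform, overlay, display, and track the landmarks from frame to frame) could be tied together roughly as in the following loop. This is an assumed, simplified structure that reuses the hypothetical helpers estimate_affine, alpha_blend and track_landmark sketched above; it is not the patent's reference implementation.

    import cv2

    def run_overlay(video_source, preop_image, live_pts, preop_pts, score_threshold=0.7):
        """Continuously overlay a preoperative image onto a live video stream.

        live_pts / preop_pts: corresponding landmarks defined interactively on
        the first live frame and on the preoperative image, respectively."""
        cap = cv2.VideoCapture(video_source)
        ok, prev_frame = cap.read()
        if not ok:
            raise RuntimeError("could not read from the live image source")

        pts = list(live_pts)
        while True:
            ok, frame = cap.read()
            if not ok:
                break

            # Track every landmark from the previous frame into the current one
            tracked, scores = [], []
            for p in pts:
                new_p, score = track_landmark(prev_frame, frame, p)
                tracked.append(new_p)
                scores.append(score)

            if min(scores) < score_threshold:
                # View obscured: show the plain live frame until the points
                # are re-defined (interactive re-definition omitted here)
                cv2.imshow("extended view", frame)
            else:
                pts = tracked
                # Re-estimate the preoperative -> live mapping for this frame
                A = estimate_affine(preop_pts, pts)
                h, w = frame.shape[:2]
                warped = cv2.warpAffine(preop_image, A, (w, h))
                cv2.imshow("extended view", alpha_blend(frame, warped))

            prev_frame = frame
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break

        cap.release()
        cv2.destroyAllWindows()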

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Robotics (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Image Processing (AREA)
  • Endoscopes (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The invention relates to a device (100) for superimposing a first intraoperative live image (112) of an operating area (105) with a second preoperative image (114) of the operating area (105). The device comprises means (110) for recording and providing the first image (112), means (120) for providing the second image (114), and means (130) for defining characteristic points (131-1; ...; 136-1) in the first image (112) and for defining, in the second image (114), points (131-2; ...; 136-2) corresponding to the characteristic points, such that the characteristic points and the points corresponding to them in the first and second images each mark corresponding positions of the operating area (105). The device further comprises means (140) for transforming the first and/or the second image such that the characteristic points (131-1; ...; 136-1) of the first image (112) and the points (131-2; ...; 136-2) corresponding to them in the second image (114) come to lie on top of each other, and means (150) for obtaining, after the transformation or superimposition of the second image with the first image or the operating area, a superimposed view (145) of the operating area (105).
EP10752557A 2009-09-07 2010-09-04 Procédé de superposition d'une image intra-opératoire instantanée d'un champ opératoire avec une image préopératoire du champ opératoire Withdrawn EP2461759A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102009040430A DE102009040430B4 (de) 2009-09-07 2009-09-07 Vorrichtung, Verfahren und Computerprogramm zur Überlagerung eines intraoperativen Livebildes eines Operationsgebiets oder des Operationsgebiets mit einem präoperativen Bild des Operationsgebiets
PCT/EP2010/063000 WO2011026958A1 (fr) 2009-09-07 2010-09-04 Procédé de superposition d'une image intra-opératoire instantanée d'un champ opératoire avec une image préopératoire du champ opératoire

Publications (1)

Publication Number Publication Date
EP2461759A1 true EP2461759A1 (fr) 2012-06-13

Family

ID=43243083

Family Applications (1)

Application Number Title Priority Date Filing Date
EP10752557A Withdrawn EP2461759A1 (fr) 2009-09-07 2010-09-04 Procédé de superposition d'une image intra-opératoire instantanée d'un champ opératoire avec une image préopératoire du champ opératoire

Country Status (4)

Country Link
US (1) US20120188352A1 (fr)
EP (1) EP2461759A1 (fr)
DE (1) DE102009040430B4 (fr)
WO (1) WO2011026958A1 (fr)

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2632382B2 (fr) * 2010-10-28 2024-06-26 Intersect ENT International GmbH Accessoire de navigation pour appareils optiques en médecine et procédé associé
US9510771B1 (en) 2011-10-28 2016-12-06 Nuvasive, Inc. Systems and methods for performing spine surgery
DE102011120937B4 (de) * 2011-12-14 2019-03-14 Carl Zeiss Meditec Ag Anordnung und Verfahren zur Registrierung von Gewebeverschiebungen
CN104411226B (zh) * 2012-06-28 2017-01-18 皇家飞利浦有限公司 使用以机器人的方式操纵的内窥镜的增强的血管可视化
DE102012220115A1 (de) * 2012-11-05 2014-05-22 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Bildgebendes System, Operationsvorrichtung mit dem bildgebenden System und Verfahren zur Bildgebung
US9848922B2 (en) 2013-10-09 2017-12-26 Nuvasive, Inc. Systems and methods for performing spine surgery
DE102013222230A1 (de) 2013-10-31 2015-04-30 Fiagon Gmbh Chirurgisches Instrument
NL2012416B1 (en) 2014-03-12 2015-11-26 Stichting Katholieke Univ Anatomical Image Projection System.
WO2016082017A1 (fr) * 2014-11-27 2016-06-02 Synaptive Medical (Barbados) Inc. Procédé, système et appareil pour enregistrement d'image chirurgicale quantitatif
JP2018516718A (ja) * 2015-03-01 2018-06-28 アリス エムディー, インコーポレイテッドARIS MD, Inc. 拡張リアリティの形態学的診断法
WO2017013986A1 (fr) * 2015-07-17 2017-01-26 シャープ株式会社 Dispositif de traitement d'informations, terminal, et système de communication à distance
CN105534536A (zh) * 2015-11-20 2016-05-04 江门市宏乔新材料科技有限公司江海区分公司 一种体表成像投影系统
AU2017227708A1 (en) 2016-03-01 2018-10-18 ARIS MD, Inc. Systems and methods for rendering immersive environments
US11705238B2 (en) * 2018-07-26 2023-07-18 Covidien Lp Systems and methods for providing assistance during surgery
EP3719749A1 (fr) 2019-04-03 2020-10-07 Fiagon AG Medical Technologies Procédé et configuration d'enregistrement
DE102019212103A1 (de) 2019-08-13 2021-02-18 Siemens Healthcare Gmbh Surrogatmarker basierend auf medizinischen Bilddaten
IT202000007252A1 (it) * 2020-04-06 2021-10-06 Artiness S R L Metodo di tracciamento di un dispositivo medico in tempo reale a partire da immagini ecocardiografiche per la supervisione olografica remota
WO2022055588A1 (fr) 2020-09-08 2022-03-17 Medtronic, Inc. Utilitaire de découverte d'imagerie pour augmenter la gestion d'image clinique
US20230053189A1 (en) * 2021-08-11 2023-02-16 Terumo Cardiovascular Systems Corporation Augmented-reality endoscopic vessel harvesting
CN114187335A (zh) * 2021-11-09 2022-03-15 北京东软医疗设备有限公司 多视图医学图像的配准方法、装置及设备

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4417944A1 (de) * 1994-05-21 1995-11-23 Zeiss Carl Fa Verfahren zum Korrelieren verschiedener Koordinatensysteme in der rechnergestützten, stereotaktischen Chirurgie
JP3568280B2 (ja) * 1995-07-12 2004-09-22 富士写真フイルム株式会社 外科手術支援システム
AU2665297A (en) * 1996-04-10 1997-10-29 Endoscopic Technologies, Inc. Instruments for cardiovascular surgery
US6856830B2 (en) * 2001-07-19 2005-02-15 Bin He Method and apparatus of three dimension electrocardiographic imaging
DE10210647A1 (de) * 2002-03-11 2003-10-02 Siemens Ag Verfahren zur Bilddarstellung eines in einen Untersuchungsbereich eines Patienten eingebrachten medizinischen Instruments
US7203277B2 (en) * 2003-04-25 2007-04-10 Brainlab Ag Visualization device and method for combined patient and object image data
FR2855292B1 (fr) * 2003-05-22 2005-12-09 Inst Nat Rech Inf Automat Dispositif et procede de recalage en temps reel de motifs sur des images, notamment pour le guidage par localisation
US7103399B2 (en) * 2003-09-08 2006-09-05 Vanderbilt University Apparatus and methods of cortical surface registration and deformation tracking for patient-to-image alignment in relation to image-guided surgery
DE10359317A1 (de) * 2003-12-17 2005-07-21 Siemens Ag Verfahren zur gezielten Navigation eines medizinischen Instruments, insbesondere eines Katheders
US20090163809A1 (en) * 2004-06-03 2009-06-25 Kane Scott D Medical method and associated apparatus utilizable in accessing internal organs through skin surface
US8989349B2 (en) * 2004-09-30 2015-03-24 Accuray, Inc. Dynamic tracking of moving targets
CA2600731A1 (fr) * 2005-03-11 2006-09-14 Bracco Imaging S.P.A. Methodes et appareil de navigation et visualisation a des fins chirurgicales a l'aide d'un microscope
JP3932303B2 (ja) * 2005-05-13 2007-06-20 独立行政法人放射線医学総合研究所 臓器動態の定量化方法、装置、臓器位置の予測方法、装置、放射線照射方法、装置及び臓器異常検出装置
EP1966764A1 (fr) * 2005-12-19 2008-09-10 Philips Intellectual Property & Standards GmbH Reconstruction iterative d'une image d'un objet mobile a partir de donnees de projection
CN101341516A (zh) * 2005-12-20 2009-01-07 皇家飞利浦电子股份有限公司 用于图像数据运动补偿的方法
WO2007115825A1 (fr) * 2006-04-12 2007-10-18 Nassir Navab Procédé et dispositif d'augmentation sans enregistrement
US20100210902A1 (en) * 2006-05-04 2010-08-19 Nassir Navab Virtual Penetrating Mirror Device and Method for Visualizing Virtual Objects in Endoscopic Applications
US20080161687A1 (en) * 2006-12-29 2008-07-03 Suri Jasjit S Repeat biopsy system
WO2008107905A2 (fr) * 2007-03-08 2008-09-12 Sync-Rx, Ltd. Imagerie et outils à utiliser avec des organes mobiles
EP2358269B1 (fr) * 2007-03-08 2019-04-10 Sync-RX, Ltd. Traitement d'image et activation d'outil pour procédures médicales
US8010177B2 (en) * 2007-04-24 2011-08-30 Medtronic, Inc. Intraoperative image registration
EP2303385B1 (fr) * 2008-06-19 2013-12-11 Sync-RX, Ltd. Avancement progressif d'un instrument médical

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2011026958A1 *

Also Published As

Publication number Publication date
DE102009040430A1 (de) 2011-03-17
WO2011026958A1 (fr) 2011-03-10
US20120188352A1 (en) 2012-07-26
DE102009040430B4 (de) 2013-03-07

Similar Documents

Publication Publication Date Title
DE102009040430B4 (de) Vorrichtung, Verfahren und Computerprogramm zur Überlagerung eines intraoperativen Livebildes eines Operationsgebiets oder des Operationsgebiets mit einem präoperativen Bild des Operationsgebiets
DE102005030646B4 (de) Verfahren zur Kontur-Visualisierung von zumindest einer interessierenden Region in 2D-Durchleuchtungsbildern
DE102005023167B4 (de) Verfahren und Vorrichtung zur Registrierung von 2D-Projektionsbildern relativ zu einem 3D-Bilddatensatz
DE10322739B4 (de) Verfahren zur markerlosen Navigation in präoperativen 3D-Bildern unter Verwendung eines intraoperativ gewonnenen 3D-C-Bogen-Bildes
DE102011079561B4 (de) Verfahren und Röntgengerät zum zeitlich aktuellen Darstellen eines bewegten Abschnitts eines Körpers, Computerprogramm und Datenträger
DE10333543A1 (de) Verfahren zur gekoppelten Darstellung intraoperativer sowie interaktiv und iteraktiv re-registrierter präoperativer Bilder in der medizinischen Bildgebung
DE10210646A1 (de) Verfahren zur Bilddarstellung eines in einen Untersuchungsbereich eines Patienten eingebrachten medizinischen Instruments
DE102016203857B4 (de) Verfahren zur Erfassung und Verarbeitung von Bilddaten eines Untersuchungsobjekts
DE10015826A1 (de) System und Verfahren zur Erzeugung eines Bildes
DE10323008A1 (de) Verfahren zur automatischen Fusion von 2D-Fluoro-C-Bogen-Bildern mit präoperativen 3D-Bildern unter einmaliger Verwendung von Navigationsmarken
DE102004004620A1 (de) Verfahren zur Registrierung und Überlagerung von Bilddaten bei Serienaufnahmen in der medizinischen Bildgebung
DE10210647A1 (de) Verfahren zur Bilddarstellung eines in einen Untersuchungsbereich eines Patienten eingebrachten medizinischen Instruments
DE10210650A1 (de) Verfahren zur dreidimensionalen Darstellung eines Untersuchungsbereichs eines Patienten in Form eines 3D-Rekonstruktionsbilds
EP1121900A2 (fr) Méthode pour déterminer la position d'un instrument médical
DE102006003126A1 (de) Verfahren und Vorrichtung zum Visualisieren von 3D-Objekten
DE102005027678A1 (de) Verfahren und Vorrichtung zur Markierung von dreidimensionalen Strukturen auf zweidimensionalen Projektionsbildern
DE102006026752A1 (de) Verfahren zur Registrierung von funktionellen MR-Bilddaten mit Röntgendurchleuchtung
EP1348394B1 (fr) Assistance à la planification ou navigation par des données génériques obtenues de patients avec adaptation bi-dimensionelle
DE102008045276B4 (de) Verfahren zur Ansteuerung einer medizintechnischen Anlage, medizintechnische Anlage und Computerprogramm
DE102019201227A1 (de) Bildgebendes Gerät und Verfahren zum Erzeugen eines bewegungskompensierten Bildes oder Videos, Computerprogrammprodukt und computerlesbares Speichermedium
EP1464285B1 (fr) Recalage en perspective et visualisation des régions corporelles internes
DE102007051479B4 (de) Verfahren und Vorrichtung zur Darstellung von Bilddaten mehrerer Bilddatensätze während einer medizinischen Intervention
DE102012200686A1 (de) Verfahren und Vorrichtung zur Positionierung einer Röntgenvorrichtung
DE102009024652A1 (de) Hochauflösende, dreidimensionale medizinische Bildgebung mit dynamischen Echtzeitinformationen
DE102008054298B4 (de) Verfahren und Vorrichtung zur 3D-Visualisierung eines Eingriffspfades eines medizinischen Instrumentes, eines medizinischen Instrumentes und/oder einer bestimmten Gewebestruktur eines Patienten

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20120305

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20130402