WO2008035271A2 - Device for registering a 3d model - Google Patents

Device for registering a 3d model

Info

Publication number
WO2008035271A2
WO2008035271A2 (PCT/IB2007/053740)
Authority
WO
WIPO (PCT)
Prior art keywords
model
spatial coordinates
coordinates
imaging device
examination apparatus
Application number
PCT/IB2007/053740
Other languages
French (fr)
Other versions
WO2008035271A3 (en)
Inventor
Heinrich Schulz
Jochen Kruecker
Original Assignee
Koninklijke Philips Electronics N.V.
Philips Intellectual Property & Standards GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Koninklijke Philips Electronics N.V. and Philips Intellectual Property & Standards GmbH
Publication of WO2008035271A2
Publication of WO2008035271A3


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/06 Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B5/061 Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
    • A61B5/062 Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body using magnetic field
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101 Computer-aided simulation of surgical operations
    • A61B2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2051 Electromagnetic tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2068 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25 User interfaces for surgical systems
    • A61B2034/252 User interfaces for surgical systems indicating steps of a surgical procedure
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/363 Use of fiducial points
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/378 Surgical systems with images on a monitor during operation using ultrasound
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3937 Visible markers
    • A61B2090/3945 Active visible markers, e.g. light emitting diodes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/397 Markers, e.g. radio-opaque or breast lesions markers electromagnetic other than visible, e.g. microwave
    • A61B2090/3975 Markers, e.g. radio-opaque or breast lesions markers electromagnetic other than visible, e.g. microwave active
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/397 Markers, e.g. radio-opaque or breast lesions markers electromagnetic other than visible, e.g. microwave
    • A61B2090/3975 Markers, e.g. radio-opaque or breast lesions markers electromagnetic other than visible, e.g. microwave active
    • A61B2090/3979 Markers, e.g. radio-opaque or breast lesions markers electromagnetic other than visible, e.g. microwave active infrared
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10072 Tomographic images
    • G06T2207/10081 Computed x-ray tomography [CT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10116 X-ray image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30101 Blood vessel; Artery; Vein; Vascular

Definitions

  • the invention relates to an examination apparatus, a method, and a record carrier that allow live sectional images of an object to be registered with a previously acquired 3D model.
  • the examination apparatus is intended for examining an object, particularly for (minimally invasive) diagnostic or therapeutic interventions on patients.
  • the apparatus comprises the following components: a) An imaging device for generating a sectional image of the object.
  • the image will typically be two-dimensional, but it may also be three-dimensional.
  • each pixel/voxel of a "sectional image" uniquely corresponds to one point of the object (in contrast to projections, where each pixel corresponds to a line through the object).
  • the imaging device may typically be a mobile, e.g. hand-held, device that allows a flexible selection of the plane of the generated sectional image with respect to the object.
  • a localization device for determining spatial coordinates of the imaging device with respect to a given reference frame.
  • the "spatial coordinates" may in general comprise the spatial position and/or orientation of selected points. In the case of the imaging device, the spatial coordinates will typically comprise the spatial positions of three selected points on the imaging device. If not otherwise stated, the term "spatial coordinates" will in the following always refer to the given reference frame.
  • a data processing device, e.g. a microcomputer or workstation, for registering the reference frame with a given 3D model of the object based on the spatial coordinates and model coordinates of at least one object feature ("anatomical marker") that appears in the sectional image and in the 3D model.
  • a "3D model” is understood as usual as a three-dimensional set of data, more precisely as a mapping of three-dimensional model coordinates x', y', z' to associated image values, for example grey values, color values, or membership values (e.g. describing if the point with the model coordinates belongs to bone, tissue or the like).
  • the described examination apparatus has the advantage that it allows an improved matching of measured spatial coordinates with a previously acquired 3D model as it exploits anatomical markers, which appear in the sectional image and which can be chosen from the immediate surroundings of the region of interest. It should be noted in this respect that the spatial coordinates of an object feature appearing in a sectional image can uniquely be determined (which is not the case with projection images). Therefore, each object feature provides a maximal amount of information for the desired registration between reference frame and 3D model.
  • the examination apparatus further comprises at least one marker that is attached to the object, wherein said marker is called “artificial marker” in the following to distinguish it from the mentioned object features that can be considered as “natural” or “anatomical” markers.
  • the artificial marker may for example be a piece of metal (e.g. in the shape of a circle or a cross) which can be attached to the skin of a patient and shows up in X-ray images with a high contrast.
  • typically, a set of several artificial markers is used.
  • the spatial coordinates of the artificial marker(s) with respect to a given reference frame can be determined with the localization device.
  • the registering of the reference frame with the given 3D model of the object in the data processing device can additionally be based on the spatial coordinates and model coordinates of the at least one artificial marker. This improves the accuracy of the registration procedure and provides initial values for a coarse registration as long as object features have not yet been registered.
  • the data processing device comprises a "landmark determination module" for determining the spatial coordinates of the at least one object feature based on the sectional image and on the spatial coordinates of the imaging device.
  • the monitoring of the spatial coordinates of the imaging device makes it possible to move the imaging device during an intervention while keeping full control over the spatial position of the generated sectional images. This fact is exploited by the landmark determination module for calculating the spatial coordinates of an identified object feature.
  • the object features that are used for the registration process may in principle be determined automatically by the data processing device.
  • the data processing device is however connected to an input unit, e.g. a keyboard, a mouse or another pointing device, for interactively determining the object feature(s) in the sectional image and/or in the 3D model.
  • the examination apparatus may further comprise an instrument that can be navigated within the object, wherein the spatial coordinates of said instrument can be determined by the localization device.
  • the navigation of an instrument like a needle or a catheter in the body of a patient is a typical task in minimally invasive surgery.
  • the position of the instrument can be identified and visualized in the 3D model, too, due to the available registration between reference frame and 3D model. It is therefore possible to track the movement of the instrument with high accuracy in the 3D model.
  • the localization device can in principle be realized by any suitable system that can measure the required spatial coordinates with sufficient accuracy. Suitable localization devices may operate for example based on magnetic, electromagnetic, optical or acoustical measurements. They often use "active" markers which not only appear passively in images (e.g. as a contrast in an X-ray projection) but also actively generate data or signals from which their spatial position and/or orientation can be determined. The active markers may particularly measure and/or emit signals themselves.
  • one example of an active marker is a magnetic field sensor that can measure the magnitude and orientation of an external (spatially or temporally inhomogeneous) magnetic field, wherein said measurements make it possible to infer the spatial position of the marker with respect to the generator of the magnetic field.
  • an active marker may be a source of electromagnetic and/or acoustical radiation, e.g. of near-infrared (NIR) light or ultrasound, wherein the position of this source can be determined by stereoscopic methods from the intersection of at least two independent lines of sight.
  • the imaging device may in principle be any device that can generate (in real time) sectional images of the object under examination.
  • the imaging device comprises a 2D or 3D ultrasonic scanner, which is a compact device that may readily be used by a physician during an intervention.
  • the 3D model which is registered with the reference frame may originate from any suitable source, for example from theoretical constructions or from statistical data pools.
  • the 3D model is however previously acquired from the particular object under examination by a further imaging device, e.g. a Computed Tomography (CT) scanner or a Magnetic Resonance Imaging (MRI) scanner. If an artificial marker is used, it should already be present at its final location on the object during the acquisition of the 3D model.
  • Generating 3D images of a body volume with a CT or MRI scanner is a typical diagnostic step prior to a surgical intervention, and therefore the associated 3D models are usually available without additional effort.
  • the examination apparatus may further optionally comprise a display device, e.g. a monitor, for displaying the sectional image generated by the imaging device, the 3D model and/or images derived therefrom (e.g. sections calculated from the 3D model or overlays of the sectional image and the 3D model).
  • the invention further relates to a method for registering a reference frame with a 3D model of an object, the method comprising the following steps: a) Generating a sectional image of the object with an imaging device, for example an ultrasonic scanner. b) Measuring, with respect to the reference frame, the spatial coordinates of the imaging device and optionally also of at least one artificial marker that is attached to the object.
  • the invention comprises in general form the steps that can be executed with an examination apparatus of the kind described above. Therefore, reference is made to the preceding description for more information on the details, advantages and improvements of that method.
  • the invention comprises a record carrier, for example a floppy disk, a hard disk, or a compact disc (CD), on which a computer program for registering a reference frame with a 3D model of an object is stored, wherein said program is adapted to execute a method of the aforementioned kind.
  • the Figure shows in particular a patient 2 lying on a table 1 during a minimally invasive intervention comprising the navigation of a catheter 6 through the vessel system of the patient 2.
  • a movable ultrasonic scanner 3 is used to generate a two-dimensional sectional image S of the body, wherein the plane of said sectional image is indicated by a dotted line and wherein the data of the generated sectional image S are communicated to a data processing device 10 (e.g. a workstation).
  • the examination apparatus further comprises a localization device 4 which can determine the spatial coordinates - with respect to a given reference frame x, y, z - of various components:
  • the spatial coordinates r_I (typically including both position and orientation) of at least one marker I on the imaging device 3, allowing the position and orientation of the imaging device, and thus of the generated sectional images, to be determined, provided that the imaging device 3 has been calibrated beforehand. If an ultrasonic scanner is used as imaging device 3, it is usually the head of this scanner that is tracked.
  • the spatial coordinates r_I, r_M, r_C measured by the localization device 4 are also transferred to the data processing device 10.
  • the data processing device 10 further comprises a storage 13 in which a 3D model V of the examined body region is stored.
  • Said 3D model V may for example have been generated pre-operatively with a CT scanner 30 or an MRI device.
  • the localization device 4 itself has limited accuracy and the accuracy may vary strongly within the region of interest.
  • the markers M should be positioned surrounding the region of interest, thereby averaging errors at the marker positions. Since markers are typically placed on the patient skin only, an ideal situation can however hardly be achieved.
  • the registration is determined between the reference frame x, y, z and the 3D model V, i.e. the model-based reference frame x', y', z'.
  • the Figure shows in this respect the bifurcation A of a vessel as an exemplary anatomical marker.
  • the data processing device 10 is preferably coupled to input devices like a keyboard 21 and a mouse 22 via which a physician can select object features as anatomical markers in the sectional image S and/or the 3D model V.
  • a landmark determination module 11 (which may be realized by dedicated hardware, by software, or by a mixture of both) can then calculate the spatial coordinates r_A of the anatomical marker(s) A with respect to the reference frame x, y, z by taking the actual coordinates r_I of the imaging device 3 into consideration.
  • the spatial coordinates ⁇ _ M of the artificial markers M can directly be measured by the localization device 4 (using the pointer 5).
  • a registration module 12 can therefore determine the required registration between the reference frame x, y, z and the 3D model V, i.e. the model-based coordinate frame x', y', z', or in other words the mapping between spatial coordinates r and model coordinates r'.
  • the spatial coordinates r_C of the tip of the catheter 6, which are directly measured by the localization device 4, may be visualized in a representation of the 3D model V on a monitor 23.
  • anatomical landmarks A identified in the live sectional images S can be used to find the corresponding locations in the pre-operative 3D model V, and these points can serve as additional markers for the registration.
  • a marker set adapted to the region of interest can be obtained in successive steps leading to improved registration accuracy.
  • using a spatially tracked live (real-time) imaging modality in addition to the pre-operative 3D image makes it possible to establish, improve, and/or verify the registration between physical space (localization-system space) and the pre-operative image.
  • the live image is used to identify the current physical location of common anatomical landmarks visible in both (live and pre-operative) images. These landmarks can either be used by themselves to establish a registration, or can be combined with the positions of external fiducial markers M to improve spatial registration accuracy of fiducial marker based registration procedures.
  • the workstation 10 should offer the possibility to navigate the captured images S and V, to identify points in both images, and to compute registration transformations mapping one set of coordinates onto another.
  • the workstation 10 will then allow the computation of an initial registration based on coordinates r'_M of artificial markers M identified in the pre-operative 3D image V and corresponding coordinates r_M identified with the pointer device 5.
  • the workstation 10 will further allow improvement or substitution of the initial registration by identifying landmarks A in the live image S, converting the live image coordinates of the identified landmarks to tracking-system coordinates r_A and mapping those to corresponding coordinates r'_A in the pre-operative 3D image V, in combination with or replacing the coordinates identified for the initial registration.
  • the workstation 10 should offer a side-by-side or overlay display of the live image S and the corresponding section of the pre-operative image V.
  • the invention can be applied in a variety of clinical procedures comprising minimally invasive surgery, interventional radiology, and catheter examinations.
  • needle-based procedures such as biopsies and ablations, laparoscopic procedures, EP procedures and stenting can be improved using the proposed registration method.
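The landmark-determination step described in the bullets above amounts to a chain of homogeneous transforms: a pixel selected in the live sectional image S is first mapped into the probe frame via a fixed ultrasound calibration, and then into the reference frame x, y, z via the tracked probe pose. A minimal sketch, assuming both transforms are available as 4x4 matrices (all function and parameter names here are hypothetical, not from the patent):

```python
import numpy as np

def landmark_world_coords(pixel_uv, pixel_spacing_mm,
                          T_image_to_probe, T_probe_to_tracker):
    """World (tracker-frame) coordinates of a landmark clicked in a 2D
    sectional image, given the tracked probe pose and a fixed image-to-probe
    calibration, both as 4x4 homogeneous matrices."""
    u, v = pixel_uv
    sx, sy = pixel_spacing_mm
    # The image plane is taken as z = 0 in the image frame (a common convention).
    p_image = np.array([u * sx, v * sy, 0.0, 1.0])
    p_world = T_probe_to_tracker @ T_image_to_probe @ p_image
    return p_world[:3]
```

With an identity calibration and a purely translational probe pose, the result is simply the scaled pixel position shifted by the probe translation.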

Abstract

The invention relates to an examination apparatus comprising an imaging device, for example an ultrasonic scanner (3), for generating sectional images (S) of a body volume, and a localization device (4) for determining the spatial coordinates (r_I) of the imaging device (3) with respect to a reference frame (x, y, z). Optionally, the spatial coordinates (r_M, r_C) of artificial markers (M) attached to the examined object (2) and/or of an instrument like a catheter (6) can also be determined. Object features (A) ('anatomical markers') can be identified in the sectional images (S), and a registration between the reference frame (x, y, z) and a previously acquired 3D model (V) of the examined body region can be determined by a data processing device (10) based on the acquired data.

Description

Device for registering a 3D model
The invention relates to an examination apparatus, a method, and a record carrier that allow live sectional images of an object to be registered with a previously acquired 3D model.
Surgery and minimally invasive therapy require reliable, precise navigation along predefined paths to predetermined target points. Many kinds of interventions are therefore guided based on pre-operative high-resolution 3D images taken from the region of interest. US 2005/0027193 A1 describes in this respect a method that allows the automatic merging of two-dimensional ("2D") X-ray projections generated with a C-arm X-ray device with pre-operatively generated three-dimensional ("3D") images. By fixing a tool plate to the C-arm and by localizing said plate, projections can be matched with the 3D images independently of the actual C-arm position and without repeated use of markers attached to the patient.
Based on this background, it was an object of the present invention to provide means for navigating with higher accuracy during an examination procedure using a pre-operatively generated 3D model.
This objective is achieved by an examination apparatus according to claim 1, by a method according to claim 10, and by a record carrier according to claim 11. Preferred embodiments are disclosed in the dependent claims.
The examination apparatus according to the present invention is intended for examining an object, particularly for (minimally invasive) diagnostic or therapeutic interventions on patients. The apparatus comprises the following components: a) An imaging device for generating a sectional image of the object. The image will typically be two-dimensional, but it may also be three-dimensional. By definition, each pixel/voxel of a "sectional image" uniquely corresponds to one point of the object (in contrast to projections, where each pixel corresponds to a line through the object). The imaging device may typically be a mobile, e.g. hand-held, device that allows a flexible selection of the plane of the generated sectional image with respect to the object. b) A localization device for determining spatial coordinates of the imaging device with respect to a given reference frame. The "spatial coordinates" may in general comprise the spatial position and/or orientation of selected points. In the case of the imaging device, the spatial coordinates will typically comprise the spatial positions of three selected points on the imaging device. If not otherwise stated, the term "spatial coordinates" will in the following always refer to the given reference frame. c) A data processing device, e.g. a microcomputer or workstation, for registering the reference frame with a given 3D model of the object based on the spatial coordinates and model coordinates of at least one object feature ("anatomical marker") that appears in the sectional image and in the 3D model. In this respect, a "3D model" is understood, as usual, as a three-dimensional set of data, more precisely as a mapping of three-dimensional model coordinates x', y', z' to associated image values, for example grey values, color values, or membership values (e.g. describing whether the point with the model coordinates belongs to bone, tissue or the like).
The described examination apparatus has the advantage that it allows an improved matching of measured spatial coordinates with a previously acquired 3D model as it exploits anatomical markers, which appear in the sectional image and which can be chosen from the immediate surroundings of the region of interest. It should be noted in this respect that the spatial coordinates of an object feature appearing in a sectional image can uniquely be determined (which is not the case with projection images). Therefore, each object feature provides a maximal amount of information for the desired registration between reference frame and 3D model.
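The text does not prescribe a particular registration algorithm. A common choice for computing a rigid mapping from matched point pairs (e.g. markers located both in the reference frame and in the 3D model) is the least-squares fit of Arun/Horn via singular value decomposition; the sketch below is an illustrative implementation of that standard method, not the patent's own:

```python
import numpy as np

def register_points(p_world, p_model):
    """Least-squares rigid registration (Arun/Horn SVD method).

    Finds the rotation R and translation t minimizing
    sum_i || R @ p_world[i] + t - p_model[i] ||^2.
    Both inputs have shape (N, 3) with matched rows, N >= 3, non-collinear.
    """
    cw, cm = p_world.mean(axis=0), p_model.mean(axis=0)
    H = (p_world - cw).T @ (p_model - cm)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cm - R @ cw
    return R, t

def fiducial_registration_error(R, t, p_world, p_model):
    """RMS residual distance of the fitted markers (the 'FRE')."""
    residual = (R @ p_world.T).T + t - p_model
    return float(np.sqrt((residual ** 2).sum(axis=1).mean()))
```

For noise-free matched points the recovered transform reproduces the true one and the FRE is essentially zero; with measurement noise the FRE gives a rough indication of registration quality.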
According to a preferred embodiment of the invention, the examination apparatus further comprises at least one marker that is attached to the object, wherein said marker is called "artificial marker" in the following to distinguish it from the mentioned object features, which can be considered "natural" or "anatomical" markers. The artificial marker may for example be a piece of metal (e.g. in the shape of a circle or a cross) which can be attached to the skin of a patient and shows up in X-ray images with high contrast. Typically, a set of several artificial markers is used. The spatial coordinates of the artificial marker(s) with respect to a given reference frame can be determined with the localization device. Moreover, the registering of the reference frame with the given 3D model of the object in the data processing device can additionally be based on the spatial coordinates and model coordinates of the at least one artificial marker. This improves the accuracy of the registration procedure and provides initial values for a coarse registration as long as object features have not yet been registered. In another preferred embodiment of the apparatus, the data processing device comprises a "landmark determination module" for determining the spatial coordinates of the at least one object feature based on the sectional image and on the spatial coordinates of the imaging device. The monitoring of the spatial coordinates of the imaging device makes it possible to move the imaging device during an intervention while keeping full control over the spatial position of the generated sectional images. This fact is exploited by the landmark determination module for calculating the spatial coordinates of an identified object feature. The object features that are used for the registration process may in principle be determined automatically by the data processing device.
In a preferred embodiment of the invention, the data processing device is however connected to an input unit, e.g. a keyboard, a mouse or another pointing device, for interactively determining the object feature(s) in the sectional image and/or in the 3D model. Thus the expert knowledge of a human operator can be exploited to identify and correctly locate characteristic object features (e.g. vessel bifurcations or bone structures) in the available images. The examination apparatus may further comprise an instrument that can be navigated within the object, wherein the spatial coordinates of said instrument can be determined by the localization device. The navigation of an instrument like a needle or a catheter in the body of a patient is a typical task in minimally invasive surgery. As the spatial coordinates of this instrument can be determined, the position of the instrument can be identified and visualized in the 3D model, too, due to the available registration between reference frame and 3D model. It is therefore possible to track the movement of the instrument with high accuracy in the 3D model.
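Once a rigid registration (R, t) between the reference frame and the model frame is available, tracking the instrument in the 3D model reduces to applying that transform to the measured tip coordinates. A minimal sketch, assuming the model is stored as a voxel volume with isotropic spacing (function names and the voxel-spacing convention are illustrative assumptions):

```python
import numpy as np

def instrument_tip_in_model(r_c, R, t):
    """Map the tracked instrument-tip position r_C (reference frame x, y, z)
    into model coordinates x', y', z' using the registration (R, t)."""
    return R @ np.asarray(r_c, dtype=float) + t

def tip_voxel_index(r_c, R, t, voxel_size_mm):
    """Nearest voxel index of the tip in the 3D model volume,
    assuming isotropic voxels of the given size."""
    r_model = instrument_tip_in_model(r_c, R, t)
    return tuple(np.round(r_model / voxel_size_mm).astype(int))
```

In a live system this mapping would be re-evaluated for every incoming localization sample so that the tip marker follows the catheter in the displayed model.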
The localization device can in principle be realized by any suitable system that can measure the required spatial coordinates with sufficient accuracy. Suitable localization devices may operate for example based on magnetic, electromagnetic, optical or acoustical measurements. They often use "active" markers which not only appear passively in images (e.g. as a contrast in an X-ray projection) but also actively generate data or signals from which their spatial position and/or orientation can be determined. The active markers may particularly measure and/or emit signals themselves. One example of an active marker is a magnetic field sensor that can measure the magnitude and orientation of an external (spatially or temporally inhomogeneous) magnetic field, wherein said measurements make it possible to infer the spatial position of the marker with respect to the generator of the magnetic field. In another embodiment, an active marker may be a source of electromagnetic and/or acoustical radiation, e.g. of near-infrared (NIR) light or ultrasound, wherein the position of this source can be determined by stereoscopic methods from the intersection of at least two independent lines of sight.
The imaging device may in principle be any device that can generate (in real time) sectional images of the object under examination. Preferably, the imaging device comprises a 2D or 3D ultrasonic scanner, which is a compact device that may readily be used by a physician during an intervention.
The 3D model which is registered with the reference frame may originate from any suitable source, for example from theoretical constructions or from statistical data pools.
Preferably, the 3D model is, however, previously acquired from the particular object under examination by a further imaging device, e.g. a Computed Tomography (CT) scanner or a Magnetic Resonance Imaging (MRI) scanner. If an artificial marker is used, it should already be present at its final location on the object during the acquisition of the 3D model.
Generating 3D images of a body volume with a CT or MRI scanner is a typical diagnostic step prior to a surgical intervention, and therefore the associated 3D models are usually available without additional effort.
The examination apparatus may further optionally comprise a display device, e.g. a monitor, for displaying the sectional image generated by the imaging device, the 3D model and/or images derived therefrom (e.g. sections calculated from the 3D model or overlays of the sectional image and the 3D model).
The invention further relates to a method for registering a reference frame with a 3D model of an object, the method comprising the following steps:
a) Generating a sectional image of the object with an imaging device, for example an ultrasonic scanner.
b) Measuring the spatial coordinates, with respect to the reference frame, of the imaging device and optionally also of at least one artificial marker that is attached to the object.
c) Determining the spatial coordinates of at least one object feature in the sectional image.
d) Identifying the model coordinates (i.e. coordinates with respect to a model based reference frame) of the object feature and, if applicable, of the artificial marker in the 3D model.
e) Calculating the desired registration between the reference frame and the 3D model based on the spatial and model coordinates of the at least one object feature and optionally of the artificial marker.
The method comprises in general form the steps that can be executed with an examination apparatus of the kind described above. Therefore, reference is made to the preceding description for more information on the details, advantages and improvements of that method.
Finally, the invention comprises a record carrier, for example a floppy disk, a hard disk, or a compact disc (CD), on which a computer program for registering a reference frame with a 3D model of an object is stored, wherein said program is adapted to execute a method of the aforementioned kind.
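Step e) leaves the actual computation of the registration open. For paired point correspondences (object features and optional artificial markers with both spatial and model coordinates), a common choice in the literature is the SVD-based least-squares estimation of a rigid transform. The sketch below is one possible implementation under that assumption; the function name is invented, and a purely rigid mapping (no scaling or deformation) is assumed.

```python
import numpy as np

def register_points(spatial_pts, model_pts):
    """Least-squares rigid transform (R, t) mapping spatial coordinates r
    to model coordinates r' = R r + t.

    spatial_pts, model_pts: (N, 3) arrays of corresponding points,
    e.g. anatomical landmarks and/or artificial markers (N >= 3).
    """
    P = np.asarray(spatial_pts, float)
    Q = np.asarray(model_pts, float)
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p_mean).T @ (Q - q_mean)                    # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])   # guard against reflection
    R = Vt.T @ D @ U.T
    t = q_mean - R @ p_mean
    return R, t
```

With the returned pair, any spatial coordinate (e.g. a tracked instrument tip) can be mapped into the model frame as `R @ r + t`.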
These and other aspects of the invention will be apparent from and elucidated with reference to the embodiment(s) described hereinafter. These embodiments will be described by way of example with the help of the accompanying single drawing which shows schematically an examination apparatus according to the present invention.
The Figure shows in particular a patient 2 lying on a table 1 during a minimally invasive intervention comprising the navigation of a catheter 6 through the vessel system of the patient 2. A movable ultrasonic scanner 3 is used to generate a two-dimensional sectional image S of the body, wherein the plane of said sectional image is indicated by a dotted line and wherein the data of the generated sectional image S are communicated to a data processing device 10 (e.g. a workstation).
The examination apparatus further comprises a localization device 4 which can determine the spatial coordinates - with respect to a given reference frame x, y, z - of various components:
The spatial coordinates r_I (typically including both position and orientation) of at least one marker I on the imaging device 3, from which the position and orientation of the imaging device, and thus of the generated sectional images, can be determined, provided that the imaging device 3 has been calibrated beforehand. If an ultrasonic scanner is used as imaging device 3, it is usually the head of this scanner that is tracked.
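The calibration mentioned here typically amounts to a fixed transform between the tracked marker on the probe and the image plane of the scanner, so that a pixel in the sectional image S can be mapped into the reference frame x, y, z. The following hedged sketch uses homogeneous 4x4 matrices; the function name, argument layout and the z = 0 image-plane convention are assumptions introduced for the example, not details disclosed in the application.

```python
import numpy as np

def image_to_world(px, py, T_probe, T_calib, scale):
    """Map a pixel (px, py) in the sectional image S to coordinates in the
    reference frame (x, y, z).

    T_probe: 4x4 pose of the tracked probe marker, from the localization device
    T_calib: 4x4 fixed image-plane-to-marker transform (probe calibration)
    scale:   (sx, sy) pixel size, e.g. in mm
    """
    # Pixel -> metric point on the image plane (z = 0), homogeneous coords.
    p_img = np.array([px * scale[0], py * scale[1], 0.0, 1.0])
    # Chain calibration and current probe pose, drop the homogeneous 1.
    return (T_probe @ T_calib @ p_img)[:3]
```

An anatomical landmark clicked in the live image can thus be expressed in tracking-system coordinates using only the current probe pose and the one-time calibration.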
The spatial coordinates r_C of a marker C fixed at the tip of the catheter 6, from which the position and orientation of said tip can be determined. The spatial coordinates of the tip of a pointer 5, wherein said pointer may particularly be moved to artificial (e.g. X-ray opaque) markers M attached to the skin of the patient 2 for determining their spatial coordinates r_M.
The spatial coordinates r_I, r_M, r_C measured by the localization device 4 are also transferred to the data processing device 10.
The data processing device 10 further comprises a storage 13 in which a 3D model V of the examined body region is stored. Said 3D model V may for example have been generated pre-operatively with a CT scanner 30 or an MRI device.
The scenario described up to now, and similar situations in surgery and minimally invasive therapy, require a reliable, precise navigation of the catheter or other instruments along predefined paths to predetermined target points. These interventions may therefore be guided based on the pre-operative high-resolution 3D models. The 3D models are however not real-time (at least when high resolution is required), and their use for imaging during an intervention is thus very limited. Proper spatial registration of pre-operative images in combination with a navigation system, i.e. the localization device 4, would make it possible to use the images much more effectively during an intervention.
A spatial registration that is solely based on the spatial coordinates of the artificial markers M (determined via the pointer 5 and the localization device 4) and the image of these markers in the 3D model typically suffers from inaccuracy caused by various kinds of motion changing the spatial relationship of the markers M to each other.
Furthermore, the localization device 4 itself has limited accuracy, and this accuracy may vary strongly within the region of interest. For these reasons the markers M should surround the region of interest, so that errors at the marker positions average out. Since markers are typically placed on the patient's skin only, this ideal situation can however hardly be achieved.
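The residual registration error discussed here is commonly quantified in the literature as the fiducial registration error (FRE): the RMS distance between the mapped marker positions and their model coordinates. This is a standard measure from the image-guidance literature, not a term used in this application; the sketch below assumes a rigid mapping r' = R r + t and an invented function name.

```python
import numpy as np

def fiducial_registration_error(spatial_pts, model_pts, R, t):
    """RMS distance between mapped marker positions R r + t and their
    model coordinates r' -- a standard proxy for registration quality."""
    mapped = np.asarray(spatial_pts, float) @ np.asarray(R, float).T + np.asarray(t, float)
    residuals = mapped - np.asarray(model_pts, float)
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))
```

A small FRE does not by itself guarantee a small error at the target; it only measures the fit at the marker positions, which is one reason the markers should surround the region of interest.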
In order to improve the accuracy of the registration between the reference frame x, y, z and the 3D model V (i.e. the model based reference frame x', y', z'), it is proposed here to make use both of artificial markers M attached to the patient and of natural or anatomical markers A that are present in the examined object and that can be identified both in the sectional images S and in the 3D model V. The Figure shows in this respect the bifurcation A of a vessel as an exemplary anatomical marker. The data processing device 10 is preferably coupled to input devices like a keyboard 21 and a mouse 22, via which a physician can select object features as anatomical markers in the sectional image S and/or the 3D model V. A landmark determination module 11 (which may be realized by dedicated hardware, by software, or by a mixture of both) can then calculate the spatial coordinates r_A of the anatomical marker(s) A with respect to the reference frame x, y, z by taking the actual coordinates r_I of the imaging device 3 into consideration.
As was described above, the spatial coordinates r_M of the artificial markers M can be measured directly by the localization device 4 (using the pointer 5). A registration module 12 can therefore determine the required registration between the reference frame x, y, z and the 3D model V, i.e. the model based coordinate frame x', y', z', or, in other words, the mapping between spatial coordinates r and model coordinates r'. Once this registration is known, the spatial coordinates r_C of the tip of the catheter, which are directly measured by the localization device 4, may be visualized in a representation of the 3D model V on a monitor 23.
In summary, it is proposed to identify anatomical landmarks A in the sectional live images S, identify the corresponding location in the pre-operative 3D model V, and use these points as additional markers for the registration. By doing so a marker set adapted to the region of interest can be obtained in successive steps leading to improved registration accuracy.
Using a spatially tracked live (real-time) imaging modality in addition to the pre-operative 3D image makes it possible to establish a registration, to improve the accuracy of a registration, and/or to verify a registration between physical space/localization system space and the pre-operative image. The live image is used to identify the current physical location of common anatomical landmarks visible in both (live and pre-operative) images. These landmarks can either be used by themselves to establish a registration, or can be combined with the positions of external fiducial markers M to improve the spatial registration accuracy of fiducial marker based registration procedures. The workstation 10 should offer the possibility to navigate in the captured images S and V, to identify points in both images, and to compute registration transformations mapping one set of coordinates onto another. The workstation 10 will then allow the computation of an initial registration based on coordinates r'_M of artificial markers M identified in the pre-operative 3D image V and corresponding coordinates r_M identified with the pointer device 5. The workstation 10 will further allow improvement or substitution of the initial registration by identifying landmarks A in the live image S, converting the live image coordinates of the identified landmarks to tracking system coordinates r_A, and mapping those to corresponding coordinates r'_A in the pre-operative 3D image V, in combination with or replacing the coordinates identified for the initial registration. For verification of the registration, the workstation 10 should offer a side-by-side or overlay display of the live image S and the corresponding section of the pre-operative image V.
The invention can be applied in a variety of clinical procedures comprising minimally invasive surgery, interventional radiology, and catheter examinations. In particular, needle-based procedures such as biopsies and ablations, laparoscopic procedures, EP procedures and stenting can be improved using the proposed registration method.
Finally it is pointed out that in the present application the term "comprising" does not exclude other elements or steps, that "a" or "an" does not exclude a plurality, and that a single processor or other unit may fulfill the functions of several means. The invention resides in each and every novel characteristic feature and each and every combination of characteristic features. Moreover, reference signs in the claims shall not be construed as limiting their scope.

Claims

1. Examination apparatus for examining an object (2), comprising a) an imaging device (3) for generating a sectional image (S) of the object (2); b) a localization device (4) for determining the spatial coordinates (r_I) of the imaging device (3) with respect to a reference frame (x, y, z); c) a data processing device (10) for registering the reference frame (x, y, z) with a 3D model (V) of the object (2) based on the spatial coordinates (r_A) and model coordinates (r'_A) of at least one object feature (A) that appears in the sectional image (S) and in the 3D model (V).
2. Examination apparatus according to claim 1, characterized in that it comprises at least one artificial marker (M) that can be attached to the object (2), and that the registering of the data processing device (10) is further based on the spatial coordinates (r_M) and model coordinates (r'_M) of said artificial marker (M).
3. The examination apparatus according to claim 1, characterized in that the data processing device (10) comprises a landmark determination module (11) for determining the spatial coordinates (r_A) of the at least one object feature (A) based on the sectional image (S) and on the spatial coordinates (r_I) of the imaging device (3).
4. The examination apparatus according to claim 1, characterized in that it comprises an input unit (21, 22) that is coupled to the data processing device (10) for interactively determining the at least one object feature (A) in the sectional image (S) and/or in the 3D model (V).
5. The examination apparatus according to claim 1, characterized in that it comprises an instrument (6) that can be moved within the object (2), wherein the spatial coordinates (r_C) of the instrument can be determined by the localization device (4).
6. The examination apparatus according to claim 1, characterized in that the localization device (4) operates based on magnetic, electromagnetic, optical or acoustical measurements.
7. The examination apparatus according to claim 1, characterized in that the imaging device comprises an ultrasonic scanner (3).
8. The examination apparatus according to claim 1, characterized in that the 3D model (V) is acquired by a further imaging device, preferably a CT scanner (30) or an MRI scanner.
9. The examination apparatus according to claim 1, characterized in that it comprises a display device (23) for displaying the sectional image (S), the 3D model (V) and/or images derived therefrom.
10. A method for registering a reference frame (x, y, z) with a 3D model (V) of an object (2), comprising a) generating a sectional image (S) of the object (2) with an imaging device (3); b) measuring the spatial coordinates (r_I) of the imaging device (3); c) determining the spatial coordinates (r_A) of at least one object feature (A) from the sectional image (S); d) identifying the model coordinates (r'_A) of the object feature (A) in the 3D model (V); e) calculating the registration between the reference frame (x, y, z) and the 3D model (V) based on the spatial coordinates (r_A) and the model coordinates (r'_A) of the object feature (A).
11. A record carrier on which a computer program for registering a reference frame (x, y, z) with a 3D model (V) of an object (2) is stored, said program being adapted to execute a method according to claim 10.
PCT/IB2007/053740 2006-09-20 2007-09-17 Device for registering a 3d model WO2008035271A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP06120972 2006-09-20
EP06120972.2 2006-09-20

Publications (2)

Publication Number Publication Date
WO2008035271A2 true WO2008035271A2 (en) 2008-03-27
WO2008035271A3 WO2008035271A3 (en) 2008-11-06

Family

ID=39200920

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2007/053740 WO2008035271A2 (en) 2006-09-20 2007-09-17 Device for registering a 3d model

Country Status (1)

Country Link
WO (1) WO2008035271A2 (en)


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6317621B1 (en) * 1999-04-30 2001-11-13 Siemens Aktiengesellschaft Method and device for catheter navigation in three-dimensional vascular tree exposures

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
ELA SJOLIE ET AL: "Minimal invasive abdominal surgery based on ultrasound vision, possible?", COMPUTER ASSISTED RADIOLOGY AND SURGERY, vol. 1230, June 2001, pages 38-43, XP002492904 *
KASPERSEN J. H. ET AL: "Three-dimensional ultrasound-based navigation combined with preoperative CT during abdominal interventions: A feasibility study", CARDIOVASCULAR AND INTERVENTIONAL RADIOLOGY, Springer, Berlin, Germany, vol. 26, no. 4, 25 June 2003, pages 347-356, XP008095609 *
MAGUIRE G Q ET AL: "GRAPHICS APPLIED TO MEDICAL IMAGE REGISTRATION", IEEE COMPUTER GRAPHICS AND APPLICATIONS, IEEE Service Center, New York, NY, US, vol. 11, no. 2, 1 March 1991, pages 20-28, XP001176846, ISSN: 0272-1716 *
RICHARD L. WAHL ET AL: ""Anatometabolic" Tumor Imaging: Fusion of FDG PET with CT or MRI to Localize Foci of Increased Activity", THE JOURNAL OF NUCLEAR MEDICINE, vol. 34, no. 7, July 1993, XP002492903 *
TAYLOR L M ET AL: "Design of a simple ultrasound/CT image fusion solution for the evaluation of prostate seed brachytherapy", PROCEEDINGS OF THE IEEE 27TH ANNUAL NORTHEAST BIOENGINEERING CONFERENCE, University of Connecticut, Storrs, CT, 31 March - 1 April 2001, pages 57-58, XP010543358, ISBN: 978-0-7803-6717-3 *

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009150647A3 (en) * 2008-06-11 2010-03-18 Dune Medical Devices Ltd. Double registration
US9008756B2 (en) 2008-06-11 2015-04-14 Dune Medical Devices Ltd. Mapping system and method for mapping a target containing tissue
WO2009150647A2 (en) * 2008-06-11 2009-12-17 Dune Medical Devices Ltd. Double registration
US9597008B2 (en) 2011-09-06 2017-03-21 Ezono Ag Imaging probe and method of obtaining position and/or orientation information
US10765343B2 (en) 2011-09-06 2020-09-08 Ezono Ag Imaging probe and method of obtaining position and/or orientation information
US10758155B2 (en) 2011-09-06 2020-09-01 Ezono Ag Imaging probe and method of obtaining position and/or orientation information
WO2013140315A1 (en) * 2012-03-23 2013-09-26 Koninklijke Philips N.V. Calibration of tracked interventional ultrasound
US10434278B2 (en) 2013-03-05 2019-10-08 Ezono Ag System for image guided procedure
US9257220B2 (en) 2013-03-05 2016-02-09 Ezono Ag Magnetization device and method
US9459087B2 (en) 2013-03-05 2016-10-04 Ezono Ag Magnetic position detection system
EP2996607B1 (en) * 2013-03-15 2021-06-16 The Cleveland Clinic Foundation System to facilitate intraoperative positioning and guidance
JP2017217510A (en) * 2013-03-15 2017-12-14 ザ クリーブランド クリニック ファウンデーションThe Cleveland ClinicFoundation Method and system to facilitate intraoperative positioning and guidance
US10799145B2 (en) 2013-03-15 2020-10-13 The Cleveland Clinic Foundation Method and system to facilitate intraoperative positioning and guidance
EP2996607A1 (en) * 2013-03-15 2016-03-23 The Cleveland Clinic Foundation Method and system to facilitate intraoperative positioning and guidance
JP7290763B2 (en) 2013-03-15 2023-06-13 ザ クリーブランド クリニック ファウンデーション A system that facilitates intraoperative positioning and guidance
KR101998396B1 (en) 2017-12-22 2019-07-09 한국기술교육대학교 산학협력단 Method for workspace modeling based on virtual wall using 3d scanner and system thereof
KR20190076806A (en) * 2017-12-22 2019-07-02 한국기술교육대학교 산학협력단 Method for workspace modeling based on virtual wall using 3d scanner and system thereof
US11132801B2 (en) 2018-02-02 2021-09-28 Centerline Biomedical, Inc. Segmentation of three-dimensional images containing anatomic structures
US11150776B2 (en) 2018-02-02 2021-10-19 Centerline Biomedical, Inc. Graphical user interface for marking anatomic structures
US11604556B2 (en) 2018-02-02 2023-03-14 Centerline Biomedical, Inc. Graphical user interface for marking anatomic structures
US11393110B2 (en) 2019-04-04 2022-07-19 Centerline Biomedical, Inc. Spatial registration of tracking system with an image using two-dimensional image projections
US11538574B2 (en) 2019-04-04 2022-12-27 Centerline Biomedical, Inc. Registration of spatial tracking system with augmented reality display



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 07826403; Country of ref document: EP; Kind code of ref document: A2)
NENP Non-entry into the national phase in: Ref country code: DE
122 Ep: pct application non-entry in european phase (Ref document number: 07826403; Country of ref document: EP; Kind code of ref document: A2)