WO2011031134A1 - Image processing method and system - Google Patents


Info

Publication number
WO2011031134A1
Authority
WO
WIPO (PCT)
Prior art keywords
parameters
model
time
images
models
Prior art date
Application number
PCT/NL2009/050549
Other languages
French (fr)
Inventor
Wiro Joep Niessen
Cornelis Timotheüs METZ
Theodores Van Walsum
Michiel Schaap
Original Assignee
Erasmus University Medical Center Rotterdam
Priority date
Filing date
Publication date
Application filed by Erasmus University Medical Center Rotterdam filed Critical Erasmus University Medical Center Rotterdam
Priority to PCT/NL2009/050549 priority Critical patent/WO2011031134A1/en
Publication of WO2011031134A1 publication Critical patent/WO2011031134A1/en

Classifications

    • G06T3/14
    • G06T7/344 Determination of transform parameters for the alignment of images, i.e. image registration, using feature-based methods involving models
    • G06T7/38 Registration of image sequences
    • G06T2207/10076 4D tomography; Time-sequential 3D tomography
    • G06T2207/10081 Computed x-ray tomography [CT]
    • G06T2207/10116 X-ray image
    • G06T2207/10124 Digitally reconstructed radiograph [DRR]
    • G06T2207/30048 Heart; Cardiac
    • G06T2207/30101 Blood vessel; Artery; Vein; Vascular

Definitions

  • the invention relates to an image processing method and system, for processing X-ray images for example, which provides for registration of 2D images and a reference 3D model.
  • Real time X-ray imaging can be applied to visualize the arteries and guide wires during therapies like coronary angioplasty. Contrast fluid can be injected, for example from a catheter, to make blood vessels visible in X-ray images at least downstream from the catheter. Unfortunately, due to its projective nature, real time X-ray imaging does not provide three dimensional information such as distance. It only visualizes the vessel lumen, which makes the visualization of total occlusions impossible. This limits the value of X-ray images for control of guidewire navigation.
  • non-real time techniques like coronary computed tomography angiography also allow the visualization of coronary plaque and total occlusions, but positions determined with non-real time techniques no longer apply after the patient has been moved prior to therapy.
  • More information can be obtained by fusing tomographic information with X-ray images. Fusion of tomographic information with X-ray images is described in an article titled "2D-3D registration of coronary angiograms for cardiac procedure planning and guidance", by Guy- Anne Turgeon et al., published in Med. Phys. 32,12, December 2005 pages 3737 to 3749.
  • This involves registration of 3D volume data with projection images.
  • registration involves the determination of parameters of a spatial transformation of image positions that provides for coincidence of the positions where corresponding points on imaged objects are imaged, in an integrated approach.
  • Use of feature matching and intensity matching are common techniques to find registration parameters. In simple cases, translation and rotation parameters suffice for 2D-2D registration.
  • the article provides for a 2D-3D registration method, to register a preoperative 3D mesh of the coronary arteries to intra-operative angiograms.
  • the article first converts the intra-operative angiograms into binary images using a multi- scale segmentation.
  • the transformation is described by skew parameters for the x- and y-coordinates of the normal from the imaging plane to the X-ray source, the magnification in x and y generated by the projection, the field of view of the image intensifier, the magnification factor and the isocenter distance relative to the focal spot (source) to object distance.
  • registration parameters are conventionally determined for pairs of images for corresponding time points.
  • the article notes that in the 2D-3D registration of a pre-operative 3D mesh of the coronary arteries to intra-operative angiograms heart motion creates difficulties for registration.
  • the article uses coronary artery registration at a selected time point within the cardiac cycle.
  • the registration is performed at end diastole, the most quiescent phase of the heart cycle.
  • Both the pre-operative and the intraoperative images are acquired at this same time point within the cardiac cycle.
  • Electrocardiogram (ECG) gating is used to ensure that the angiograms are acquired at the desired cardiac phase (end diastole).
  • intra-operative angiograms are captured at a frame rate that shows live motion (e.g. a 15 Hz frame rate).
  • the technique described in the article can only provide for registration at periodic time intervals.
  • the 3D pre-operative mesh lacks time dependence.
  • this can be overcome by capturing 3D pre-operative meshes for a plurality of cardiac phases and subsequent registration of 2D images for respective time points each with the 3D mesh for a corresponding cardiac phase.
  • a computer program product such as a memory disk or other computer readable medium may be provided with a program of instructions for a programmable computer that, when executed by the programmable computer, will cause the programmable computer to perform the steps of determining the parameters of the first and second spatial transformations according to the method of claim 1.
  • a method according to claim 1 is provided.
  • a reference 3D model is generated, e.g. of a set of coronary arteries, for a selected time value, such as an end-systole phase or end-diastole phase of the heart.
  • the 3D model may be obtained using computed tomography angiography for example.
  • 2D images such as X-ray images for example, are obtained for a series of time points, for example at a rate of at least five frames per second.
  • registration is performed using a combination of a first and second spatial transformation.
  • for a plurality of time points (cardiac phases), respective first parameters are determined of respective first spatial transformations that map the reference 3D model to mapped 3D models matching with 3D scan data for these time points.
  • Second parameters are determined for the second spatial transformation, which transforms the mapped 3D model for a time point to a registered 3D model that matches the 2D image for that time point, but also corresponds to matches for other time points.
  • the freedom to select registration parameters is distributed over the selection of parameters of the first and second spatial transformations, the former being independent of the 2D images.
  • the freedom to select the second spatial transformation is constrained by imposing a requirement that the second transformation provides for matches with the 2D images for a range of time points.
  • the determination of the transformation may comprise the application of search algorithms of the type that determine parameter values for which a given score function yields a larger value than for other parameter values.
  • a first spatial transformation may be used that has more degrees of freedom than the second spatial transformation.
  • alternatively, a second spatial transformation may be used that has more degrees of freedom than the first spatial transformation.
  • transformation of spatial positions according to the first transformation may allow for spatial deformation, described for example by a polynomial function that assigns transformed spatial positions to the spatial positions in the reference 3D model.
  • selection of the second transformation may be limited to a space of transformations that all preserve the shape of the 3D model at least locally, differing from each other only by overall translation and/or rotation.
  • Parameters of a second transformation are determined that map the mapped 3D model for the first one of the associated time values to a registered 3D model that defines derived 2D image data that match the 2D image for the first one of the time values, and that moreover maps the mapped 3D models for a plurality of time points to registered 3D models that define derived 2D image data that match the 2D images for the plurality of time points.
  • second transformations with parameters that are related to the selected second parameters map the mapped 3D models for a plurality of time points to registered 3D models that define derived 2D image data that match the 2D images for the plurality of time points. The relation between parameters may be imposed for example by weighting the similarity between the second parameters and the related parameters in the match.
  • the parameters of the second spatial transformation may be selected using a temporal window for example.
  • a search may be performed for second parameters of a second transformation that maximizes an aggregate of match scores for 2D images for a plurality of time points in the window.
  • these match scores are obtained by comparing information from captured 2D images for respective time points in the window, with information that is derived from the mapped 3D models according to the first transformations for these time points and the same second transformation for all these time points.
  • a window may be used that covers at least the duration of one cardiac cycle, or at least a time interval that includes parts of both the end-systole and end-diastole phase. This ensures that maximally different poses of blood vessels will be available for the determination of the second transformation.
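The windowed matching described above can be sketched in code. This is a minimal illustrative sketch, not the patent's implementation: `project`, `similarity` and the toy data structures are hypothetical placeholders, assuming only that one candidate second transformation (pose) is scored against every 2D frame in the window and the per-frame scores are summed.

```python
import numpy as np

def similarity(a, b):
    """Negative sum of squared differences: higher means a better match."""
    return -float(np.sum((a - b) ** 2))

def windowed_score(pose, mapped_models, xray_frames, project):
    """Score one candidate second transformation (pose): the same pose is
    applied to the mapped 3D model of every time point in the window and
    the per-frame match scores are summed."""
    total = 0.0
    for model, frame in zip(mapped_models, xray_frames):
        derived = project(pose, model)  # 2D image derived from the model
        total += similarity(derived, frame)
    return total
```

A search over candidate poses would then keep the pose with the largest `windowed_score`, which is what makes maximally different vessel poses within the window useful.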
  • Figure 1 shows a system for displaying combined scan data and X-ray images
  • Figure 2 shows a flow chart of operation
  • Figure 2a illustrates first and second transformations
  • Figure 3 shows graphs with results of a model consistency
  • Figure 6a,b illustrate registration results at two phases in the cardiac cycle.
  • FIG. 1 shows a system displaying combined scan data and X-ray images.
  • the system comprises a 3D scanner 10, a cardiac phase detector 12, a storage device 14, an X-ray device 16, a processor 18 and a display device 19.
  • 3D scanner 10 may be a tomographic X-ray scanner for example or an MRI scanner.
  • 3D scanner 10 has an input coupled to an output of cardiac phase detector 12 and an output coupled to storage device 14.
  • Storage device 14 may comprise a hard disc, or a set of hard discs for example, or a semi-conductor memory etc.
  • Processor 18 has inputs coupled to X-ray device 16 and storage device 14 and an output coupled to display device 19.
  • a network connection between storage device 14 and 3D scanner 10 and processor 18 may be used.
  • 3D scanner 10 is used to make 3D scans of a volume of tissue (e.g. the heart and surrounding tissue like blood vessels), to generate 3D models of the volume for a series of phase points during the cardiac cycle, identified by cardiac phase detector 12.
  • X-ray device 16 captures a stream of 2D X-ray image data of the same tissue for which the 3D models were stored.
  • the stream of 2D X-ray images may be captured during an intervention.
  • Image capture may be aided by providing for contrast fluid in blood vessels. Contrast fluid may be released from a catheter used in a percutaneous coronary intervention for example.
  • Processor 18 uses the 2D X-ray image data to search for registration parameters that describe a transformation to register 3D model data with the 2D X-ray image data.
  • the registration parameters may be applied to various purposes. For example, they may be used to provide 3D data for control of movement of a probe (e.g. a guide wire or catheter). They may be used to control 3D display of information from the 3D model, such as stereoscopic display, registered to the actual pose of the heart during an intervention, or processor 18 may use the registered 3D model data to generate 2D projection images from the 3D model data registered to the actual pose of the heart. Processor 18 may combine the resulting model- generated images and the corresponding X-ray images and supplies the result to display device 19 for controlling displayed images. In the displayed images information from both the 2D images generated from the 3D model and the captured stream of 2D images by device 16 may be integrated.
  • FIG. 2 shows a flow chart of operation.
  • 3D scanner 10 obtains measurements for a series of phase points during the cardiac cycle, as defined by cardiac phase detector 12.
  • the data captured by 3D scanner 10 may be divided into n different 3D images, denoted I(i), where i is an integer from 0..n-1 that indicates the phase value in the cardiac cycle.
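The division of capture times into n phase bins can be illustrated with a small helper. This is a sketch under assumptions not stated in the patent (a fixed, known cycle duration; in practice the phase comes from cardiac phase detector 12); the function name is mine.

```python
def phase_index(t, cycle_start, cycle_duration, n):
    """Map a sampling time t to one of n phase bins 0..n-1 of the cardiac
    cycle, assuming a fixed, known cycle duration."""
    phase = ((t - cycle_start) % cycle_duration) / cycle_duration
    return min(int(phase * n), n - 1)
```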
  • Image quality may vary over the different 3D images, e.g. as a consequence of varying radiation dose and/or cardiac motion over the image acquisition period.
  • Contrast fluid may be injected to enhance the contrast between blood vessels and other body parts.
  • 3D scanner 10 processes the measurements to construct a reference 3D model for a set of cardiac phase points.
  • the model may be a centerline model representing centerlines of coronaries for example, or a lumen model or a voxel model.
  • the 3D model may take the form of a 3D image.
  • transformations of the model may take the form of a transformed 3D image, wherein image values have been mapped to different positions according to a spatial mapping of these positions.
  • the 3D model may define features of a 3D image, such as the locations and/or orientations and/or diameters of blood vessels and other structures or their connections.
  • transformations of the model may take the form of features with transformed parameters, such as transformed positions and/or orientations.
  • 3D scanner 10 may generate a 3D centerline, lumen or voxel model of the coronary arteries for this phase point. This model may be constructed in a way that is known per se. Any known tomographic reconstruction algorithm may be used, or algorithms that detect blood vessels as high contrast structures and reconstruct their 3D shape from different images for the selected phase point.
  • 3D scanner 10 stores the resulting data representing the reference 3D model in storage device 14.
  • Figure 2a illustrates transformations involved in operation.
  • the figure shows the 3D models 30a-c for different time points symbolically by means of cubes.
  • 3D models for different time points t1, t2,...tn are shown from left to right; by way of example 3D models 30a-c are shown for three time points t1, t2,...tn, but it should be understood that 3D models for more time points will be used.
  • First spatial transformations Tl(t) from the reference models to respective mapped models 32a-c are shown for respective time points.
  • the figure shows derived 2D images 36a-c (simulated X-rays for example) derived from the registered mapped models 34a-c and a stream of captured 2D images 38a-c.
  • the captured 2D images 38a-c are shown for the same time points tl, t2,... tn as the 3D models 30a-c.
  • 2D images 38a-c will be captured at later times. They are shown for the same times to indicate that they occur at corresponding phase points in the cardiac cycle.
  • 2D images 38a-c and 3D models 30a-c may be captured for corresponding phase points, but this is not essential. If not, interpolated first spatial transformations Tl(t') may be used to obtain mapped 3D models for the phase points of the 2D images 38a-c.
  • the process serves to obtain the registered mapped models 34a-c.
  • These registered mapped models 34a-c may be used to generate images that provide real time 3D information during coronary angioplasty for example. They may be used to control navigation of tools.
  • the registered mapped models 34a-c are derived using a composition of first and second spatial transformations.
  • although the transformations are shown by unidirectional arrows, the transformations define two-way relations between models and transformed models.
  • the first spatial transformations allow for spatial deformation of the reference 3D model and the second transformations T2 may be rigid spatial transformations that provide for an overall translation and/or rotation only.
  • the second transformation may be represented by a translation vector or a translation vector and rotation angles for example.
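A rigid second transformation of the kind described (a translation vector plus rotation angles) can be sketched as follows; this is an illustrative reduction to a single rotation angle about one axis, not the patent's parameterization, and the function name is mine.

```python
import numpy as np

def rigid_transform(points, translation, yaw):
    """Apply a rotation about the z-axis by angle yaw, followed by a
    translation, to an (N, 3) array of model points. Shape-preserving:
    only overall pose changes."""
    c, s = np.cos(yaw), np.sin(yaw)
    rotation = np.array([[c, -s, 0.0],
                         [s,  c, 0.0],
                         [0.0, 0.0, 1.0]])
    return points @ rotation.T + translation
```

A full pose would use three rotation angles; the search described later then runs over these translation and rotation parameters.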
  • the first spatial transformations T1(t1),...T1(tn) may be represented by respective deformation functions that represent displacement of points in the 3D model as a function of position.
  • a B-spline polynomial function may be used as a deformation function.
  • the parameters of the polynomial function form the parameters of the first spatial transformation.
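A deformation function of this kind can be sketched with a low-order polynomial; this is a simplified stand-in for the B-spline deformation the text names (a quadratic polynomial per output axis), and the monomial basis and function name are my assumptions.

```python
import numpy as np

def polynomial_deformation(points, coeffs):
    """Displace (N, 3) points by a per-axis quadratic polynomial of
    position; coeffs is (3, 10), one row per output axis, over the
    monomials [1, x, y, z, x*x, y*y, z*z, x*y, x*z, y*z]. The coeffs
    play the role of the first-transformation parameters."""
    x, y, z = points.T
    basis = np.stack([np.ones_like(x), x, y, z,
                      x * x, y * y, z * z, x * y, x * z, y * z])
    return points + (coeffs @ basis).T
```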
  • the first spatial transformations map the reference 3D model 30b to mapped 3D models 32a,c that match with the measured 3D models 30a,c for the respective ones of the time values t1,...tn.
  • the second spatial transformations T2 map the mapped 3D models 32a-c to registered mapped 3D models 34a-c that have derived 2D images 36a-c that match captured X-ray images 38a-c.
  • 3D scanner 10 determines a deformation of the anatomy of interest, for example the coronary arteries, from the reference 3D model at other phase points "i", where i is not equal to f.
  • 3D scanner 10 executes a search algorithm that searches for a transformation that maps the reference 3D model, obtained for the selected phase f of the cardiac cycle, e.g. at end-systole or end-diastole, to the 3D models for other phase-points i by applying deformations that are derived by a non-rigid registration procedure.
  • each first transformation may be represented by a respective deformation field for a respective phase point, which represents displacements of points in the reference model as a continuous function of position in the reference model. Displacements described by polynomial functions of position may be used for example. In this case the coefficients of the polynomials form the parameters of the first transformations.
  • a B-spline function may be used for example.
  • the determination of the first transformation comprises selection of parameters of the transformation that best map the reference model onto the 3D models for other phase-points.
  • a registration procedure may be used that is similar to the one proposed by Wierzbicki et al. in an article titled "Validation of dynamic heart models obtained using non-linear registration for virtual reality training, planning, and guidance of minimally invasive cardiac surgeries", published in Med Image Anal. 8(3) (2004) 387-401.
  • the data captured by 3D scanner 10 may be divided into n different 3D images, denoted I(i), where i is an integer from 0..n-1 that indicates the phase in the cardiac cycle, further referred to as time-point.
  • the image at the time-point (labeled f) in which the reference 3D model is defined is chosen as reference image I(f).
  • a series of 3D images I(i) for time points "i" unequal to f are subsequently registered to this reference image for the fixed phase point f, by selecting first transformations that minimize the error of the registration to this time-point.
  • transformations T(f+j|s) defining displacements as a function of position may be determined for different scales of resolution s, the overall transformation being a composite of displacements at successive scales.
  • the transformations T(f+j|s) for different scales may be determined recursively, with one or more repeated scaling cycles wherein transformations T(f+j|s) may be determined that transform an image obtained by applying the transformations determined in the preceding scaling cycle or cycles to the reference 3D model.
  • prior to searching, specific structures such as coronaries may be detected in the 3D models for different time points, using conventional techniques, and a search algorithm may be executed to search for parameters of transformations that map the spatial positions of the detected structures for different time points to matching positions (i.e. transformations that result in an optimized value of a similarity measure between the detected transformed structures). Matching of detected structures may be used as a form of image matching that replaces direct matching of 3D image values.
  • a B- spline deformation function may be used to describe the assignment of spatial positions in the reference 3D model or detected structures to transformed spatial positions.
  • a search for an optimal B-spline deformation function may be performed, for example by means of a stochastic gradient descent optimization approach in which a sum of squared differences or mutual information is used as similarity measure.
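The optimization idea above can be sketched on a toy problem. This is an illustrative sketch, not the patent's optimizer: a 1D translation is registered by deterministic finite-difference gradient descent on a sum-of-squared-differences objective (the text names a stochastic gradient descent variant; `render` and the Gaussian "image" are my assumptions).

```python
import numpy as np

x = np.linspace(0.0, 10.0, 201)

def render(c):
    """Toy 1D stand-in for an image: a Gaussian blob centred at c."""
    return np.exp(-(x - c) ** 2)

def ssd(c, fixed):
    """Sum of squared differences between the rendered and fixed image."""
    return float(np.sum((render(c) - fixed) ** 2))

def register(fixed, c0, lr=0.002, iters=2000, eps=1e-4):
    """Minimise the SSD over the translation parameter c by
    finite-difference gradient descent, starting from c0."""
    c = c0
    for _ in range(iters):
        grad = (ssd(c + eps, fixed) - ssd(c - eps, fixed)) / (2 * eps)
        c -= lr * grad
    return c
```

The real search runs over many B-spline coefficients rather than one scalar, but the descent structure is the same.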
  • other transformations may be used, such as position dependent translations and/or rotations or even overall translations and/or rotations to be applied to the reference 3D model.
  • the determination of the first transformation comprises determining displacement of points on these larger structures at a cardiac phase.
  • the displacement of blood vessels at that cardiac phase is derived at least partly from the displacement of the larger structures that are adjacent to these blood vessels in the reference model.
  • the transformation of other structures such as the coronary arteries of the reference 3D model may be inferred from the transformation or transformations of the larger structures. This is especially advantageous in those phases of the cardiac cycle where the arteries are not well visible due to motion artifacts or due to lower signal to noise ratios due to dose reduction.
  • 3D scanner 10 stores information representing the reference 3D image and the first transformations in storage device 14.
  • although second and third steps 22, 23 are described as being performed by 3D scanner 10, it should be appreciated that in fact either step or both may be performed by processor 18, or another processor (not shown). For the sake of explanation of second and third steps 22, 23 this processor has been considered to be part of 3D scanner 10.
  • X-ray device 16 captures an X-ray image and generates X-ray data representing the captured image. Contrast fluid may be provided in the blood vessels during this step.
  • Processor 18 associates the sampling time point of the image with a phase position in the cardiac cycle at the time of capturing the image.
  • a further cardiac phase detector (not shown) coupled to X-ray device 16 or processor 18 may be used to determine this phase point for example.
  • the overall radiation dose used in the image capture of fourth step 24 may be less than the overall radiation dose used in the scanning of first step 21.
  • a (substantially) smaller dose than in first step 21 suffices in sixth step 26, as only a 2D image is needed.
  • Fourth step 24 is followed by a number of steps that are performed by processor 18 providing for a second transformation that aligns the 3D model data with the X-ray data.
  • a rigid intensity-based registration framework may be applied. Instead of matching the 3D model for an individual time point with an X-ray image for a corresponding phase point, the match is made dependent on X-ray images for a plurality of phase points, for example in a temporal window.
  • matching scores between the registered mapped 3D models for the phase points in a window are determined with respect to X-ray images for corresponding associated phase points, using the same second transformation for each phase point.
  • a second transformation may be selected that maximizes a combined match score for pairs of captured X-ray images and model-derived 2D images for these phase points, rather than the match score for only one pair of images for an individual phase point.
  • a second transformation may be selected that maximizes a function that is a combination of match scores for one or more pairs of images for one or more individual phase points, plus a measure of similarity between the parameters of the second transformations selected for different time points.
  • a match score or a measure may be derived from a measure of difference, such as a mean square difference.
  • the dependence of the selection of the second transformation on X-ray images for a plurality of time points is used to improve the robustness and accuracy of the registration procedure. Different projections of the same blood vessels will be visible at different time points, which makes it possible to resolve ambiguity.
  • the image data that is used for matching covers only a small spatial area, for example downstream from a catheter or near its tip. In this case the image data for a single time point on its own will often be insufficient to provide for reliable determination of the second transformation.
  • the approach provides for use of additional information from the 2D images for other time points in the window.
  • processor 18 defines the sampling time points of the X-ray images in the window of time points, for example including a predetermined number "w" (w being an integer of at least two) of time points for which X-ray images have been captured up to and including the time point for which the last X-ray image was captured.
  • a window is selected that covers at least a measured duration of a cardiac cycle. Thus a maximum of different views is obtained.
  • a window is selected that covers only part of the cardiac cycle.
  • a window covering part of the cardiac cycle is selected with a length that covers at least part of the end-systole phase and the end-diastole phase of the measured cardiac cycle.
  • the window extends backwards from the time-point for which a match for the X-ray image is sought plus the w - 1 previous time-point positions, which ensures that the most recent imaging data can be used.
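The backwards-extending window of the w most recent frames can be sketched with a bounded buffer; the class and method names are mine, not the patent's, and this assumes frames arrive in time order.

```python
from collections import deque

class FrameWindow:
    """Keep the w most recent (time point, image) pairs; the oldest
    frame drops out automatically as each new one is pushed, so the
    window always extends backwards from the latest captured image."""
    def __init__(self, w):
        self.frames = deque(maxlen=w)

    def push(self, time_point, image):
        self.frames.append((time_point, image))

    def time_points(self):
        return [t for t, _ in self.frames]
```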
  • processor 18 searches for a common pose of the 3D models obtained with the first transformations T from third step 23.
  • a pose is defined by a second spatial transformation that maps the 3D models, for example by means of translation and reorientation, to potentially registered mapped 3D models that correspond to the structures visible in the X-ray images.
  • a second transformation of this type is determined from a plurality of respective associated phase points in the window.
  • a search algorithm is executed, to search for the pose that maximizes a match score that combines match score contributions between X-ray data for respective time points in the window and data derived from 3D models for the associated phase points.
  • the pose may be represented by a set of spatial transformation parameters, such as parameters of an overall spatial translation and/or rotation of the 3D model, i.e. transformations that preserve the form and size of the imaged objects.
  • a search algorithm is executed to search in the parameter space of such transformations.
  • the transformations may have a predetermined deformation component, for example to compensate for known non-linearity of imaging.
  • digitally reconstructed radiographs (DRRs) are generated from the registered mapped 3D models.
  • a similarity metric may be computed by comparing the DRRs and the X-ray images, for example by computing a normalized cross correlation.
  • processor 18 removes static structures by subtracting from each pixel of the X-ray images the average intensity of that pixel over time, before computing the metric. Alternatively the metric may be computed without such a subtraction.
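The static-structure removal just described amounts to subtracting the per-pixel temporal mean; a minimal sketch (the function name is mine):

```python
import numpy as np

def remove_static(frames):
    """Suppress static structures by subtracting, for each pixel, its
    average intensity over all frames in the sequence. Moving structures
    (e.g. contrast-filled vessels) survive as deviations from the mean."""
    stack = np.stack(frames).astype(float)
    return stack - stack.mean(axis=0)
```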
  • the similarity metric defines a match score between an X-ray image and a DRR for a phase point.
  • a sum of the resulting metric values over phase points covered by the time window w may be determined to derive a combined match score for a specific pose.
  • a search for pose (a set of parameters of the second transformation) is performed that leads to a higher value, or at least a not lower value of this score than other poses. This search may be performed using a multi-resolution gradient ascent procedure in the space of the transformation parameters that represent the pose.
  • contrast fluid is injected before capturing the X-ray image in sixth step 26 (e.g. once for a series of repetitions of sixth step 26).
  • the digital reconstruction of the radiographs may be configured to simulate the effect of the contrast fluid in the generation of the digitally reconstructed radiographs.
  • the affected blood vessels may be identified manually for example.
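A crude sketch of DRR generation with simulated contrast fluid follows. This is not a realistic cone-beam DRR: it uses parallel-ray line integrals (a simple sum along one axis), and the `vessel_mask`/`boost` parameters are my assumptions for mimicking the contrast-enhancement step.

```python
import numpy as np

def drr(volume, vessel_mask=None, boost=0.0, axis=2):
    """Crude digitally reconstructed radiograph: parallel-ray line
    integrals through an attenuation volume, optionally after raising
    the attenuation of voxels marked as contrast-filled vessels."""
    vol = volume.astype(float)
    if vessel_mask is not None:
        vol = vol + boost * vessel_mask  # simulate injected contrast fluid
    return vol.sum(axis=axis)
```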
  • the first transformations T1(f+j) that are determined in third step 23 for respective phase positions in the cardiac cycle transform the reference 3D model, and the common pose estimation for the 3D models at the time points in the window defines a second transformation T2 to be applied in common to the results of applying the first transformations.
  • the 3D models for use in the correlation may be derived directly from the reference model by applying the composite transformations.
  • a further cardiac phase detector (not shown) coupled to X-ray device 16 or processor 18 may be used to determine the associated phase points in the cardiac cycle associated with respective ones of the X-ray images.
  • an additional search may be made for associated phase positions in the cardiac cycle that, when assigned to the X-ray images, result in the best match with the 3D models for those phase positions in the cardiac cycle.
  • a less constrained relation may be used, for example to allow for a predetermined phase error.
  • interpolation (e.g. spline interpolation) may be used to obtain transformations for intermediate phase points.
  • the interpolated 3D models for use in the correlation may be derived directly from the reference model by applying interpolated composite transformations.
  • processor 18 uses the registration. Different forms of use are possible.
  • processor 18 controls a display to display model data derived from the reference model according to the transformations.
  • Processor 18 may generate the derived model data from the reference 3D model obtained in second step 22 by applying the transformations T obtained in third step 23 and the common pose determined for the window in sixth step 26; the derived model data may be combined with the X-ray data for the cardiac phase that is associated with the model data.
  • the transformations and the model may be used to control movement of a probe.
  • the process repeats from fourth step 24 for a newly captured X-ray image for a next sampling time point.
  • fourth to seventh steps 24-27 are repeated real-time, at a predetermined rate corresponding to the temporal sampling rate of X-ray device 16.
  • a sampling rate of fifteen X-ray images per second may be used for example; preferably the sampling rate lies well above normal heart rates, for example at least five images per second.
  • First to third steps 21-23 may be executed at a different time. They need not be performed in real time.
  • 3D scanner 10 may provide for an input to receive annotations and it may be programmed to add the annotations to the reference 3D model, for rendering in combination with the X-ray images.
  • 3D scanner 10 may be programmed to support human controlled generation or adaptation of the model or the transformations in second and third steps 22, 23.
  • the entire selection of the transformations T(f+i) and/or segmentation or centre line selection of model generation may be input into 3D scanner 10 by the user.
  • one or more adjustment steps may be added.
  • Manual initialization may be performed by applying one 2D translation in the plane orthogonal to the projection line for the complete cardiac cycle.
  • Such adjustment steps may be performed for example in arbitrary repetitions of fourth to seventh steps 24-27, once for all subsequent repetitions until a next adjustment is made. Alternatively, or in addition, such adjustments may be made before the repetitions.
  • a dual-source CT system (Somatom Definition; Siemens Medical Solutions, Forchheim, Germany) was used. No beta-blockers were administered in preparation for the scan. The entire volume of the heart was scanned in one breath-hold, while simultaneously recording the ECG trace. An optimized heart-rate-dependent ECG pulsing protocol was applied. Outside the ECG pulsing window a tube current of 4% was used in order to reduce the effective dose of the CT examination. Multi-phase 4D datasets were reconstructed at 5% steps from 0% to 95% of the RR-interval, with a slice thickness of 3 mm and slice increment of 1.5 mm. Voxel sizes of the
  • X-ray data was acquired with an AXIOM-Artis C-arm system (Siemens Medical Solutions, Forchheim, Germany). The pixel size of these images is 0.22×0.22 mm².
  • the 4D coronary models were evaluated using manually annotated coronary centerlines at both systole and diastole of the cardiac cycle. In every dataset one of the following vessels was randomly selected for annotation: right coronary artery (RCA), left anterior descending artery (LAD) or left circumflex artery (LCX). In total 11 LADs, 11 LCXs and 9 RCAs were annotated.
  • Figure 4 represents model accuracy.
  • the figure displays a boxplot of the distance between the centerlines at systole and diastole before (I) and after (R) registration, and the difference between these two (D). Distances are averaged over systolic and diastolic measurements. Results are presented for all vessels combined and for the LAD, LCX and RCA separately. It can be noticed that the LAD shows the least deformation during the cardiac cycle and the RCA the most. The median accuracy is approximately 1 mm.
  • Some misalignment is still visible in Fig. 6, which may be caused by irregular heart deformation, inaccurate reporting of the projection geometry of the X-ray system in the DICOM headers, errors in the 4D coronary model, or non-rigid deformation caused by the breathing cycle. Improvements could also be applied to the registration procedure for derivation of the 4D coronary model, especially to be better able to handle the more excessive motion of the RCA. It can, for example, be advantageous to take the cyclic behavior of the heart motion into account or to impose a smoothness constraint on the registration in the time-dimension of the image. Furthermore, as we
  • a smoothness constraint may be imposed by performing a search for parameters of the transformation Tl that produce a match by minimizing a score function that is a combination (e.g. a sum) of contributions representing accuracy and smoothness in the determination of the transformation in third step 23.
  • the contribution representing accuracy may comprise a contribution which is a measure of mismatch between individual images and transformed images.
  • the contribution representing smoothness may be a measure of change of the parameters of the transformation, e.g. a possibly weighted sum of squares of differences between corresponding parameters for successive time points.
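The accuracy-plus-smoothness score might, by way of illustration, be computed as follows (a sketch, not the patented method itself; `mismatch` is a hypothetical per-frame mismatch function):

```python
import numpy as np

def combined_score(params, mismatch, weight=1.0):
    """Score for a sequence of per-time-point transformation parameters:
    an accuracy term (sum of per-frame mismatch values) plus a smoothness
    term (weighted sum of squared differences between parameters of
    successive time points). Lower is better; a search such as the one
    in third step 23 would minimize this."""
    params = np.asarray(params, dtype=float)   # shape: (time, n_params)
    accuracy = sum(mismatch(t, params[t]) for t in range(len(params)))
    smoothness = weight * np.sum(np.diff(params, axis=0) ** 2)
    return accuracy + smoothness
```

Raising `weight` trades registration accuracy for smoother parameter trajectories over time.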
  • Any known search method may be used, such as a simple method that solves for the parameters of the transformation from an equation implied by the minimum of the score function.
  • Other techniques may be used to affect selection of the transformation, such as 4D tracking.
  • cyclic behavior of the heart motion may be taken into account by using images I(f+i) for time points over a plurality of cardiac cycles in the determination of the transformation in third step 23.
  • This step may comprise determination of transformations for time points in a single cardiac cycle that produce a match by minimizing a score function that measures a combination of deviations between transformed model images and images for corresponding time points from the plurality of different cardiac cycles.
  • this may be combined with a smoothness constraint for changes of the transformation parameters in the cycle that includes a measure of changes of the transformation parameter from the end of the cycle to its beginning.
  • a 3D centerline or lumen model of the coronary arteries is derived from a phase in the cardiac cycle with minimal motion.
  • Deformation of the coronary arteries is derived from the 4D CTA data by means of a non-rigid registration procedure.
  • the coronary arteries move with the larger structures in the image to which they are attached, such as the heart chambers, which is especially advantageous in those phases of the cardiac cycle where the arteries are not well visible due to motion artifacts or dose reduction.
  • the resulting 4D coronary model is applied in a temporal window based registration procedure for the dynamic alignment of CTA with X-ray data.
  • a transformation of the reference 3D model is used to derive the image data for display, after registration according to the transformations that have been found in the process.
  • a 3D model measured at the phase position of the X-ray image and transformed according to the common transformation could be used for the combined display.
  • the reference 3D model provides for qualitatively better imaging data, because it is very difficult to extract 3D models at those phases in the cardiac cycle where dose reduction is applied in the acquisition process.
  • registration can be used for many purposes.
  • An image derived from the reference 3D model or the 3D images could be projected on its own, after transformation according to the registration, in order to show information about the current location of various elements of the model.
  • Registered images could be shown side by side, or in a mutually overlapping fashion.
  • the match score for determining the second transformation is obtained by applying a common second transformation that is the same for all time points in the window.
  • the resulting second transformation is used for registering the 3D model at least at one time point.
  • the procedure to determine the second transformation may be repeated for successive time points, using a respective different window for each time point, such as a window that extends backward from the time point over a detected cardiac cycle or a predetermined fraction of the cardiac cycle. This provides for a minimum of delay between capture of X-ray images and the availability of the second transformations for the time points corresponding to the X-ray images.
  • the resulting second transformation for one time point in the window may be used for registering the 3D model at a plurality of time points in the window.
  • a combination of potentially different parameters of second transformations for a plurality of time points may be determined together. This may be done by searching for a combination of parameters for different time points that maximizes a score that combines (a) measures of similarity between pairs of X-ray images and model-derived images for the time points in the window and (b) measures of similarity between the parameters of the second transformations for the time points in the window.
  • the match score for selecting the second transformation is determined from a sum of match scores for pairs of images for phase points in a window.
  • a constraint may be imposed that limits the change of the parameters of the second transformation between successive time points.
  • the transformation may be selected to maximize an amended match score for example, which is defined as a sum of the image based match score for a pair of an X-ray image and a model derived 2D image for the time point plus a measure of similarity between the parameters of the second transformation for the time point and an adjacent time point (e.g. a previous time point).
  • the measure of similarity may be a weighted sum of squares of changes in the parameters of the second transformation compared to the previous repetition.
  • a search algorithm may be executed in the space of parameters to optimize the combined similarity measure.
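One possible reading of the amended match score described above, as an illustrative sketch (expressing the similarity measure as a subtracted penalty is an assumption, not prescribed by the text):

```python
def amended_score(image_score, params, prev_params, weights):
    """Amended match score for one time point: the image-based match
    score for the current X-ray/model pair, penalized by a weighted sum
    of squared changes in the second-transformation parameters relative
    to the previous time point. Maximizing this favors poses that both
    match the image and change smoothly over time."""
    penalty = sum(w * (p - q) ** 2
                  for w, p, q in zip(weights, params, prev_params))
    return image_score - penalty
```

A search over `params` (e.g. the hill climbing sketched earlier, or any gradient method) would then maximize this amended score at each repetition.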
  • the transformation for a time point in sixth step 26 is made to depend on X-ray images for a range of time points, including said time point and at least one other time point at a different cardiac phase.
  • the match scores between images for individual time points may be defined by counting features detected in the X-ray image that match features present in the transformed 3D model, or by summing correlations between image values, and/or by subtracting decrements from the score for detected differences.
  • the aggregate match score may be a sum of match scores for different time points wherein equal weight is given to different time points, or time dependent weight may be given.
  • the sum over match scores for different time points may be computed by direct summing or by means of recursive filtering.
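Recursive filtering of the per-frame scores could, for instance, take the form of an exponentially weighted running sum (an illustrative variant, not prescribed by the text; it also realizes the time-dependent weighting mentioned above, since older frames decay geometrically):

```python
class RecursiveScoreFilter:
    """Recursive alternative to directly summing match scores over a
    sliding window: each new per-frame score is blended into a running
    aggregate, giving more recent frames a higher weight."""

    def __init__(self, decay=0.8):
        self.decay = decay      # weight retained by older scores
        self.aggregate = 0.0

    def update(self, frame_score):
        """Fold one new per-frame match score into the aggregate."""
        self.aggregate = self.decay * self.aggregate + frame_score
        return self.aggregate
```

Setting `decay=1.0` (with explicit removal of expired frames) would recover the equal-weight direct sum.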
  • the match score used in the search of sixth step 26 may relate only to a selected limited region of 3D space. For example, when the registration is used to control navigation of the tip of a catheter through blood vessels, registration in a region containing the position of the tip may be more important than elsewhere. In another example, when contrast fluid is fed to the blood vessels from a catheter, only a part of the blood vessels downstream from the catheter may show up clearly in the X-rays. In both cases a match score may be used that is based only on similarity between the 3D model and the X-ray image in said region and not in a larger region that contains other parts of the heart.
  • Such a limitation to a selected region may have the effect that only a small number of blood vessels is available for determining the second transformation. If only a single X-ray image would be used to determine the second transformation, this could lead to inaccuracy when relevant blood vessels are occluded, or when bends in the blood vessels are invisible due to projection. This is overcome by using a plurality of X-ray images for different time points, wherein blood vessels are visible under different orientations due to movement of the heart.
  • the selection of sixth step 26 is made in a more restricted way than the selections of the transformations from the reference 3D model to the 3D models in third step 23.
  • the freedom to select the composite transformations T2( Tl(f+j) ) from the reference 3D model to the match with the X-ray image is distributed over selection of transformations Tl(f+j) on the basis of 3D scan data and the selection of the second transformation T2 on the basis of the X-ray data.
  • the range of time points has a length that is sufficient to cover both the end-diastole and the end-systole phase, and more preferably an entire cardiac cycle.
  • 3D position dependent rotations and/or translations or B spline transformations may be used in third step 23 or seventh step 27 or both.
  • the transformations used in third step 23 and seventh step 27 may be overall translations; alternatively these transformations may be overall rotations and/or translations.
  • the existence of a match between objects typically means that a match score between the objects substantially has an optimal value, compared to match scores between objects that do not match.
  • the match scores may be determined from a comparison of image values or between features extracted from images for example. Determination of parameters that provide for a match typically involves executing a search algorithm that searches for parameters that result in matching objects, after a transformation according to the parameters. Such a search algorithm may be a simple algorithm that tries various parameter values, computes match scores obtained with these parameter values and selects those of the tried parameters that substantially resulted in the best match score. Alternatively the search algorithm may be an algorithm that solves an equation for the optimal parameter values, or a descent algorithm that steps along successive parameter values with successively better match scores until the substantially optimal parameter values have been found.
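The "simple algorithm" variant named above may be sketched as follows (illustrative; `match_score` is a hypothetical callable evaluating one candidate parameter set):

```python
def exhaustive_search(candidate_params, match_score):
    """Try each candidate parameter set, compute its match score, and
    keep the candidate with the best (highest) score."""
    best_params, best_score = None, float("-inf")
    for params in candidate_params:
        s = match_score(params)
        if s > best_score:
            best_params, best_score = params, s
    return best_params, best_score
```

The equation-solving and descent variants differ only in how candidates are generated, not in this accept-the-best logic.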
  • a 3D reference model is used for one time point in the cycle that is mapped to all time points in the cycle.
  • more than one 3D reference model may be used, such as 3D reference models for different time points in the cycle.
  • Reference models for the end systolic and end diastolic phase points may be used for example.
  • First transformations may be determined for each reference model, and mapping of the models to different time points may comprise combining information from the reference models and/or selection between the reference models, for example according to the time distance between the mapped time point and the time points corresponding to the reference models.
  • processing of the 3D scan data may comprise generation of the reference 3D model and determination of the first transformations Tl(f+j).
  • Combined processing of the 3D scan data and the 2D images may comprise determination of the second transformations T2 and generation of images from the reference 3D model using the first and second transformations.
  • processing of the 3D scan data is performed by a processor in 3D scanner 10 and the combined processing is performed by processor 18.
  • the processor of 3D scanner 10 and processor 18 form a processing system to perform the process.
  • a processing system with a different distribution of tasks may be used, such as a processing system wherein processor 18 performs all operations, or a system wherein processor 18 is part of 3D scanner 10, or a system with additional processors that perform parts of the tasks.
  • processors in the processing system such as processor 18 may be implemented as a programmable computer programmed with a computer program to make it perform the functions as described. Alternatively, part or all of these functions may be realized by electronic circuits that are specifically designed to perform these functions. Both possibilities will be indicated by stating that the processing system is configured to implement functions.

Abstract

Registration of two dimensional images with a time dependent three dimensional model of a moving object such as the heart is performed using two transformations. A first transformation, which may involve deformation, maps a three dimensional model of the object for one time point, such as the end-systolic heart phase, to mapped models that match 3D scans of the object for different time points. The first transformation is determined by matching 3D scans with mapped 3D models, typically using large scale structures of the object. Afterwards a stream of 2D images, such as X-ray images of the object, is captured for a series of time points. From this stream, second transformations are determined that map the mapped 3D models to the 2D images. The parameters of the second transformation for an individual time point are determined dependent on the 2D images for a plurality of the time points, to ensure that these parameters map the mapped 3D models for a plurality of time points to registered 3D models that define derived 2D image data that match the combination of 2D images for the plurality of time points. Thus, instead of using a single spatial transformation to register the 3D models with a 2D image, registration is performed using a reference 3D model and a combination of a first and second spatial transformation.

Description

Title: Image processing method and system
Field of the invention
The invention relates to an image processing method and system, for processing X-ray images for example, which provides for registration of 2D images and a reference 3D model.
Background
Real time X-ray imaging can be applied to visualize the arteries and guide wires during therapies like coronary angioplasty. Contrast fluid can be injected, for example from a catheter, to make blood vessels visible in X-ray images at least downstream from the catheter. Unfortunately, due to its projective nature, real time X-ray imaging does not provide three dimensional information such as distance. It only visualizes the vessel lumen, which makes the visualization of total occlusions impossible. This limits the value of X-ray images for control of guidewire navigation.
On the other hand, non-real-time techniques like coronary computed tomography angiography also allow the visualization of coronary plaque and total occlusions, but of course positions determined with non-real-time techniques no longer apply after the patient has been moved prior to therapy.
More information can be obtained by fusing tomographic information with X-ray images. Fusion of tomographic information with X-ray images is described in an article titled "2D-3D registration of coronary angiograms for cardiac procedure planning and guidance", by Guy- Anne Turgeon et al., published in Med. Phys. 32,12, December 2005 pages 3737 to 3749. This involves registration of 3D volume data with projection images. As is known per se, registration involves the determination of parameters of a spatial transformation of image positions to provide for coincidence of positions where corresponding points on imaged objects are imaged in an integrated approach. Use of feature matching and intensity matching are common techniques to find registration parameters. In simple cases, translation and rotation parameters suffice for 2D-2D registration.
The article provides for a 2D-3D registration method, to register a preoperative 3D mesh of the coronary arteries to intra-operative angiograms. The article first converts the intra-operative angiograms into binary images using a multi-scale segmentation. In the article the transformation is described by skew parameters for the x- and y-coordinates of the normal from the imaging plane to the x-ray source, the magnification in x and y generated by the projection, the field of view of the image intensifier, the magnification factor and the isocenter distance relative to the focal spot source to object distance.
In the case of registration of different temporal sequences of images, registration parameters are conventionally determined for pairs of images for corresponding time points. The article notes that in the 2D-3D registration of a pre-operative 3D mesh of the coronary arteries to intra-operative angiograms heart motion creates difficulties for registration.
The article uses coronary artery registration at a selected time point within the cardiac cycle. The registration is performed at end diastole, the most quiescent phase of the heart cycle. Both the pre-operative and the intraoperative images are acquired at this same time point within the cardiac cycle. Electrocardiogram (ECG) gating is used to ensure that the angiograms are acquired at the desired cardiac phase (end diastole).
It is desirable to use intra-operative angiograms at a frame rate that shows live motion (e.g. at 15 Hz frame rate). The technique described in the article can only provide for registration at periodic time intervals. Moreover, the 3D pre-operative mesh lacks time dependence. Of course this can be overcome by capturing 3D pre-operative meshes for a plurality of cardiac phases and subsequent registration of 2D images for respective time points each with the 3D mesh for a corresponding cardiac phase. However, it is difficult to obtain consistent high quality 3D meshes throughout the complete cardiac cycle. Due to heart motion the 3D images for other phases than end-diastole do not have the same quality as those for end-diastole. To overcome this problem, it may be necessary to perform the 3D preoperative imaging for prolonged periods and/or with increased effective patient radiation dose.
A computer program product such as a memory disk or other computer readable medium may be provided with a program of instructions for a programmable computer that, when executed by the programmable computer, will cause the programmable computer to perform the steps of determining the parameters of the first and second spatial transformations according to the method of claim 1.
Summary
Among others, it is an object to provide for an image processing method and system, for X-ray images for example, which provides for high quality registered images based on a 3D model.
A method according to claim 1 is provided. Herein a reference 3D model is generated, e.g. of a set of coronary arteries, for a selected time value, such as an end-systole phase or end-diastole phase of the heart. The 3D model may be obtained using computed tomography angiography for example. 2D images, such as X-ray images for example, are obtained for a series of time points, for example at a rate of at least five frames per second.
Instead of using a single spatial transformation to register the 3D models with a 2D image, registration is performed using a combination of a first and second spatial transformation. For a plurality of time points (cardiac phases) respective first parameters are determined of respective first spatial transformations that map the reference 3D model to mapped 3D models matching the 3D scan data for these time points. Second parameters of the second spatial transformation are determined that transform the mapped 3D model for a time point to a registered 3D model that matches the 2D image for that time point, but also corresponds to matches for other time points.
In this way, the freedom to select registration parameters is distributed over the selection of parameters of the first and second spatial transformations, the former being independent of the 2D images. The freedom to select the second spatial transformation is constrained by imposing a requirement that the second transformation provides for matches with the 2D images for a range of time points. The determination of the transformation may comprise the application of search algorithms of the type that determine parameter values for which a given score function yields a larger value than for other parameter values.
A first spatial transformation may be used that has more degrees of freedom than the second spatial transformation. For example the transformation of spatial positions according to the first transformation may allow for spatial deformation, described for example by a polynomial function that assigns transformed spatial positions to the spatial positions in the reference 3D model. In contrast, the selection of the second transformation may be limited to a space of transformations that all preserve the shape of the 3D model at least locally, differing from each other only by overall translation and/or rotation.
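The contrast between the two transformation classes can be illustrated as follows (a hypothetical sketch; a first-order polynomial, i.e. an affine map, stands in for the deforming first transformation, and higher-order terms could be added in the same fashion):

```python
import numpy as np

def polynomial_deformation(p, coeffs):
    """First-transformation sketch: a polynomial (here first-order)
    that assigns a transformed position to each position of the
    reference model, allowing spatial deformation. `coeffs` is a 3x4
    matrix [A | b] so the map is p -> A @ p + b."""
    coeffs = np.asarray(coeffs, dtype=float)
    A, b = coeffs[:, :3], coeffs[:, 3]
    return A @ np.asarray(p, dtype=float) + b

def rigid_transform(p, R, t):
    """Second-transformation sketch: rotation R plus translation t
    only, preserving the shape of the model."""
    return np.asarray(R) @ np.asarray(p, dtype=float) + np.asarray(t)
```

A non-orthogonal matrix A in the first map can stretch or shear the model, whereas R in the second map is restricted to a rotation, matching the shape-preserving constraint described above.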
Parameters of a second transformation are determined that map the mapped 3D model for the first one of the associated time values to a registered 3D model that defines derived 2D image data that match the 2D image for the first one of the time values, and that moreover map the mapped 3D models for a plurality of time points to registered 3D models that define derived 2D image data that match the 2D images for the plurality of time points. Instead, a requirement may be imposed that second transformations with parameters that are related to the selected second parameters map the mapped 3D models for a plurality of time points to registered 3D models that define derived 2D image data that match the 2D images for the plurality of time points. The relation between parameters may be imposed for example by weighing the similarity between the second parameters and the related parameters in the match.
The parameters of the second spatial transformation may be selected using a temporal window for example. When a window is used, a search may be performed for second parameters of a second transformation that maximizes an aggregate of match scores for 2D images for a plurality of time points in the window. In an embodiment these match scores are obtained by comparing information from captured 2D images for respective time points in the window, with information that is derived from the mapped 3D models according to the first transformations for these time points and the same second transformation for all these time points. For example, a window may be used that covers at least the duration of one cardiac cycle, or at least a time interval that includes parts of both the end-systole and end-diastole phase. This ensures that maximally different poses of blood vessels will be available for the determination of the second transformation.
Brief description of the drawing
These and other objects and advantageous aspects will become apparent from a description of exemplary embodiments, using the following figures.
Figure 1 shows a system for displaying combined scan data and X-ray images
Figure 2 shows a flow chart of operation
Figure 2a illustrates first and second transformations
Figure 3 shows graphs with results of a model consistency experiment
Figure 4 represents model accuracy
Figure 5 shows results of an experiment
Figure 6a,b illustrate registration results at two phases in the cardiac cycle.
Detailed description of exemplary embodiments
Figure 1 shows a system displaying combined scan data and X-ray images. The system comprises a 3D scanner 10, a cardiac phase detector 12, a storage device 14, an X-ray device 16, a processor 18 and a display device 19. 3D scanner 10 may be a tomographic X-ray scanner for example or an MRI scanner. 3D scanner 10 has an input coupled to an output of cardiac phase detector 12 and an output coupled to storage device 14. Storage device 14 may comprise a hard disc, or a set of hard discs for example, or a semi-conductor memory etc. Processor 18 has inputs coupled to X-ray device 16 and storage device 14 and an output coupled to display device 19. A network connection between storage device 14 and 3D scanner 10 and processor 18 may be used.
In operation 3D scanner 10 is used to make 3D scans of a volume of tissue (e.g. the heart and surrounding tissue like blood vessels), to generate 3D models of the volume for a series of phase points during the cardiac cycle, identified by cardiac phase detector 12. Subsequently, X-ray device 16 captures a stream of 2D X-ray image data of the same tissue for which the 3D models were stored. The stream of 2D X-ray images may be captured during percutaneous coronary interventions for example. Image capture may be aided by providing for contrast fluid in blood vessels. Contrast fluid may be released from a catheter used in a percutaneous coronary intervention for example.
Processor 18 uses the 2D X-ray image data to search for registration parameters that describe a transformation to register 3D model data with the 2D X-ray image data. The registration parameters may be applied to various purposes. For example, they may be used to provide 3D data for control of movement of a probe (e.g. a guide wire or catheter). They may be used to control 3D display of information from the 3D model, such as stereoscopic display, registered to the actual pose of the heart during an intervention, or processor 18 may use the registered 3D model data to generate 2D projection images from the 3D model data registered to the actual pose of the heart. Processor 18 may combine the resulting model-generated images and the corresponding X-ray images and supply the result to display device 19 for controlling displayed images. In the displayed images information from both the 2D images generated from the 3D model and the stream of 2D images captured by X-ray device 16 may be integrated.
Figure 2 shows a flow chart of operation. In a first step 21 of the flow chart of figure 2, 3D scanner 10 obtains measurements for a series of phase points during the cardiac cycle, as defined by cardiac phase detector 12. The data captured by 3D scanner 10 may be divided into n different 3D images, denoted I(i), where i is an integer from 0..n-1 that indicates the phase value in the cardiac cycle. Image quality may vary over the different 3D images, e.g. as a consequence of varying radiation dose and/or cardiac motion over the image acquisition period. Contrast fluid may be injected to enhance the contrast between blood vessels and other body parts.
In a second step 22 3D scanner 10 processes the measurements to construct a reference 3D model for a set of cardiac phase points. The model may be a centerline model representing centerlines of coronaries for example, or a lumen model or a voxel model. The 3D model may take the form of a 3D image. In this case transformations of the model may take the form of a transformed 3D image, wherein image values have been mapped to different positions according to a spatial mapping of these positions. Alternatively, the 3D model may define features of a 3D image, such as the locations and/or orientations and/or diameters of blood vessels and other structures or their connections. In this case transformations of the model may take the form of features with transformed parameters, such as transformed positions and/or orientations. Although an embodiment will be described wherein the reference 3D model is constructed by 3D scanner 10, it should be appreciated that in fact this may be done in processor 18, or another processor (not shown), but for the sake of explanation this processor will be considered to be part of 3D scanner 10.
3D scanner 10 (or a processor that is considered to be part of 3D scanner 10) may select a phase point i=f in the cardiac cycle that is associated with minimal motion, such as an end-systole phase or end-diastole phase where image quality is in general higher than in other parts of the cardiac cycle. 3D scanner 10 may generate a 3D centerline, lumen or voxel model of the coronary arteries for this phase point. This model may be constructed in a way that is known per se. Any known tomographic reconstruction algorithm may be used, or algorithms that detect blood vessels as high contrast structures and reconstruct their 3D shape from different images for the selected phase point. 3D scanner 10 stores the resulting data representing the reference 3D model in storage device 14.
Figure 2a illustrates transformations involved in operation. The figure shows the 3D models 30a-c for different time points symbolically by means of cubes. 3D models for different time points t1, t2,...tn are shown from left to right. By way of example 3D models 30a-c are shown for three time points t1, t2,...tn, but it should be understood that 3D models for more time points will be used. Furthermore by way of example, the 3D model for the time point t2 has been taken as the reference model (t2=tf). First spatial transformations T1(t) from the reference model to respective mapped models 32a-c are shown for respective time points. Second spatial transformations T2 are shown to registered mapped models 34a-c from the mapped models 32a-c for respective time points (and from the reference model in the case of t=t2). Finally, the figure shows derived 2D images 36a-c (simulated X-rays for example) derived from the registered mapped models 34a-c and a stream of captured 2D images 38a-c. For the sake of illustration the captured 2D images 38a-c are shown for the same time points t1, t2,... tn as the 3D models 30a-c. However, it should be appreciated that in fact 2D images 38a-c will be captured at later times. They are shown for the same times to indicate that they occur at corresponding phase points in the cardiac cycle. The 2D images 38a-c and 3D models 30a-c may be captured for corresponding phase points, but this is not essential. If not, interpolated first spatial transformations T1(t') may be used to obtain mapped 3D models for the phase points of the 2D images 38a-c.
The process serves to obtain the registered mapped models 34a-c. These registered mapped models 34a-c may be used to generate images that provide real time 3D information during coronary angioplasty for example. They may be used to control navigation of tools. As can be seen, the registered mapped models 34a-c are derived using a composition of first spatial transformations T1(t1),... T1(tn) and second spatial transformations T2 and not by applying the second transformations T2 to all measured 3D models 30a-c.
Preferably, reversible first and second transformations are used. Thus, although the transformations are shown by unidirectional arrows, the transformations define two-way relations between models and transformed models. In an embodiment the first spatial transformations allow for spatial deformation of the reference 3D model and the second transformations T2 may be rigid spatial transformations that provide for an overall translation and/or rotation only. The second transformation may be represented by a translation vector or a translation vector and rotation angles for example. The first spatial transformations T1(t1),...T1(tn) may be represented by respective deformation functions that represent displacement of points in the 3D model as a function of position. A B-spline polynomial function may be used as a deformation function. In this case the parameters of the polynomial function form the parameters of the first spatial transformation. Of course other types of parameters may be used, such as a grid of displacement values. The first spatial transformations map the reference 3D spatial model 30b to mapped 3D models 32a,c that match with the measured 3D models 30a,c for the respective ones of the time values t1,...,tn. The second spatial transformations T2 map the mapped 3D models 32a-c to registered mapped 3D models 34a-c that have derived 2D images 36a-c that match captured X-ray images 38a-c.
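The composition described above, a deformable first transformation followed by a rigid second transformation, can be sketched as follows. This is a toy illustration only: the sine-based displacement stands in for a B-spline deformation field, and the points and pose values are invented.

```python
import numpy as np

def first_transform(points, t):
    """Hypothetical deformable transformation T1(t): adds a smooth,
    position-dependent displacement (a stand-in for a B-spline field)."""
    return points + 0.1 * t * np.sin(points)

def second_transform(points, translation, angle):
    """Rigid transformation T2: rotation about the z-axis plus a translation."""
    c, s = np.cos(angle), np.sin(angle)
    rot = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    return points @ rot.T + translation

# Registered mapped model: T2 applied to the output of T1(t), i.e. the
# composition T2 o T1(t) applied to the reference model points.
reference = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
mapped = first_transform(reference, t=2)
registered = second_transform(mapped, translation=np.array([1.0, 0.0, 0.0]),
                              angle=0.0)
```

Because the rigid part is applied last, only its few pose parameters need to be re-estimated per X-ray frame, while the deformation field stays fixed per phase point.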
In a third step 23 3D scanner 10 (or a processor that is considered to be part of 3D scanner 10) determines a deformation of the anatomy of interest, for example the coronary arteries, from the reference 3D model at other phase points "i", where i is not equal to f. 3D scanner 10 executes a search algorithm that searches for a transformation that maps the reference 3D model, obtained for the selected phase f of the cardiac cycle, e.g. at end-systole or end-diastole, to the 3D models for other phase-points i by applying deformations that are derived by a non-rigid registration procedure.
The transformations that map the reference 3D model to the other phase-points will be called first transformations. In an embodiment each first transformation may be represented by a respective deformation field for a respective phase point, which represents displacements of points in the reference model as a continuous function of position in the reference model. Displacements described by polynomial functions of position may be used for example. In this case the coefficients of the polynomials form the parameters of the first transformations. A B-spline function may be used for example. The determination of the first transformation comprises selection of parameters of the transformation that best map the reference model onto the 3D models for other phase-points.
A registration procedure may be used that is similar to the one proposed by Wierzbicki et al. in an article titled "Validation of dynamic heart models obtained using non-linear registration for virtual reality training, planning, and guidance of minimally invasive cardiac surgeries", published in Med Image Anal. 8(3) (2004) 387-401. The data captured by 3D scanner 10 may be divided into n different 3D images, denoted I(i), where i is an integer from 0..n-1 that indicates the phase in the cardiac cycle, further referred to as time-point. The image at the time-point (labeled f) in which the reference 3D model is defined is chosen as reference image I(f). A series of 3D images I(i) for time points "i" unequal to f are subsequently registered to this reference image for the fixed phase point f, by selecting first transformations that minimize the error of the registration to this time-point.
In an embodiment registration starts using I(f+1) for a time point i=f+1 next to the time point of the reference model. From a comparison of I(f+1) and I(f) a spatial transformation T(f+1) is determined. The resulting transformation T(f+1) is used as the initial transformation for registration of the next time-point image I(f+2). For each time point a stochastic gradient descent optimization approach may be used to search for a first transformation that minimizes the sum of squared differences or mutual information as similarity measure, i.e. that results in matching models.
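The sequential scheme above, where each time-point's result initializes the next, can be sketched with a toy one-dimensional, translation-only example. An exhaustive local search stands in for the stochastic gradient descent optimization, and the signals and shifts are invented for illustration.

```python
import numpy as np

def ssd(a, b):
    """Sum of squared differences similarity measure."""
    return float(np.sum((a - b) ** 2))

def register_shift(fixed, moving, init=0, radius=3):
    """Search a small window of integer shifts around the initial estimate
    `init` for the shift that minimizes the SSD to the fixed image."""
    candidates = range(init - radius, init + radius + 1)
    return min(candidates, key=lambda s: ssd(fixed, np.roll(moving, -s)))

rng = np.random.default_rng(0)
reference = rng.normal(size=64)                  # stand-in for I(f)
true_shifts = [1, 2, 4]                          # slowly drifting motion
images = [np.roll(reference, s) for s in true_shifts]

# Each recovered shift initializes the registration of the next time-point,
# so the search window can stay small even as the total motion grows.
estimate, estimates = 0, []
for image in images:
    estimate = register_shift(reference, image, init=estimate)
    estimates.append(estimate)
```

The design point illustrated is that warm-starting from the previous time-point keeps each search local, which matters when the per-time-point optimization is expensive.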
In an embodiment registration is performed using a multi-resolution approach in which Gaussian filtering is applied to obtain 3D images I(f+j | s) at scales s at successively reduced resolution. Transformations T(f+j | s) defining displacements as a function of position may be determined for different scales of resolution, the overall transformation being a composite of displacements at successive scales. In this case transformations T(f+j | s) for different scales may be determined recursively, with one or more repeated scaling cycles wherein the transformations T(f+j | s) are determined for successively finer scales. In each but the first scaling cycle position dependent transformations T(f+j | s) may be determined that transform an image obtained by applying the transformations determined in the preceding scaling cycle or cycles to the reference 3D model. Transforms T(f+j | s) are selected that minimize the difference between the transformed image and the image I(f+j | s) at the scale of the cycle. Prior to searching, specific structures such as coronaries may be detected in the 3D models for different time points, using conventional techniques, and a search algorithm may be executed to search for parameters of transformations that map the spatial positions of the detected structures for different time points to matching positions (i.e. transformations that result in a minimized value of a similarity measure between the detected transformed structures). Matching of detected structures may be used as a form of image matching that replaces direct matching of 3D image values.
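The coarse-to-fine idea can be sketched as follows. Circular box filtering stands in for Gaussian filtering, a 1D shift stands in for the full position-dependent deformation, and all signals are synthetic.

```python
import numpy as np

def smooth(x, width):
    """Circular box filtering as a crude stand-in for Gaussian filtering."""
    kernel = np.zeros_like(x)
    kernel[:width] = 1.0 / width
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(kernel)))

def search_shift(fixed, moving, init, radius):
    """Exhaustive shift search minimizing the sum of squared differences."""
    candidates = range(init - radius, init + radius + 1)
    return min(candidates,
               key=lambda s: float(np.sum((fixed - np.roll(moving, -s)) ** 2)))

rng = np.random.default_rng(1)
fixed = rng.normal(size=128)
moving = np.roll(fixed, 17)                      # true displacement: 17 samples

# Coarse scale: heavily smoothed images, wide search range.
coarse = search_shift(smooth(fixed, 9), smooth(moving, 9), init=0, radius=30)
# Fine scale: full resolution, narrow search around the coarse estimate.
fine = search_shift(fixed, moving, init=coarse, radius=4)
```

The coarse pass tolerates a large initial displacement; the fine pass then only needs a narrow search, mirroring the recursive scaling cycles described above.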
A B-spline deformation function may be used to describe the assignment of spatial positions in the reference 3D model or detected structures to transformed spatial positions. A search for an optimal B-spline deformation function may be performed, for example by means of a stochastic gradient descent optimization approach in which a sum of squared differences or mutual information is used as similarity measure. Alternatively, other transformations may be used, such as position dependent translations and/or rotations or even overall translations and/or rotations to be applied to the reference 3D model.
It may be assumed that the coronary arteries move in the image together with the larger structures to which they are attached, such as the heart chambers. Because a continuous deformation is used, the determination of the first transformation comprises determining displacement of points on these larger structures at a cardiac phase. As a result the displacement of blood vessels at that cardiac phase is derived at least partly from the displacement of the larger structures that are adjacent to these blood vessels in the reference model. Thus the transformation of other structures such as the coronary arteries of the reference 3D model may be inferred from the transformation or transformations of the larger structures. This is especially advantageous in those phases of the cardiac cycle where the arteries are not well visible due to motion artifacts or due to lower signal to noise ratios due to dose reduction. The reference 3D image I(f), combined with a set of the first transformations T(f+j) that have been found in this way, is used as a 4D (3D+t) model. 3D scanner 10 stores information representing the reference 3D image and the first transformations in storage device 14. Although embodiments have been described wherein second and third steps 22, 23 are performed by 3D scanner 10, it should be appreciated that in fact either step or both may be performed by processor 18, or another processor (not shown). For the sake of explanation of second and third steps 22, 23 this processor has been considered to be part of 3D scanner 10.
In a fourth step 24, X-ray device 16 captures an X-ray image and generates X-ray data representing the captured image. Contrast fluid may be provided in the blood vessels during this step. Processor 18 associates the sampling time point of the image with a phase position in the cardiac cycle at the time of capturing the image. A further cardiac phase detector (not shown) coupled to X-ray device 16 or processor 18 may be used to determine this phase point for example. When both 3D scanner 10 and X-ray device 16 use X-rays, it may be noted that the overall radiation dose used in the image capture of fourth step 24 may be less than the overall radiation dose used in the scanning of first step 21. A (substantially) smaller dose than in first step 21 suffices in fourth step 24, as only a 2D image is needed.
Fourth step 24 is followed by a number of steps that are performed by processor 18 providing for a second transformation that aligns the 3D model data with the X-ray data. A rigid intensity-based registration framework may be applied. Instead of matching the 3D model for an individual time point with an X-ray image for a corresponding phase point, the match is made dependent on X-ray images for a plurality of phase points, for example in a temporal window.
In an embodiment, matching scores between the registered mapped 3D models for the phase points in a window are determined with respect to X-ray images for corresponding associated phase points, using the same second transformation for each phase point. A second transformation may be selected that maximizes a combined match score for pairs of captured X-ray images and model derived 2D images for these phase points, rather than the match score for only one pair of images for an individual phase point. Alternatively, a second transformation may be selected that maximizes a function that is a combination of match scores for one or more pairs of images for one or more individual phase points plus a measure of similarity between parameters of the second transformation for pairs of successive time points. Even if the maximized function comprises a match score for only one phase point, the measure of similarity between parameters of the second transformation for pairs of successive time points imports a dependence on X-ray images for other time points. As will be understood, a match score or a measure may be derived from a measure of difference, such as a mean square difference.
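Selecting one common second transformation from a combined score over a window can be sketched with toy data. Images are flattened to 1D arrays, the pose is reduced to a single shift, and the frames are synthetic; none of this reflects the actual imaging geometry.

```python
import numpy as np

def match_score(xray, drr):
    """Per-phase-point match score: negative sum of squared differences."""
    return -float(np.sum((xray - drr) ** 2))

rng = np.random.default_rng(2)
# Hypothetical model-derived 2D images (DRRs) for four phase points,
# flattened to 1D arrays to keep the sketch short.
drrs = [rng.normal(size=50) for _ in range(4)]
true_pose = 5                               # common shift applied to all frames
xrays = [np.roll(drr, true_pose) for drr in drrs]

# Select the single pose maximizing the combined score over the whole window,
# rather than the score for any individual phase point.
combined = {pose: sum(match_score(x, np.roll(d, pose))
                      for x, d in zip(xrays, drrs))
            for pose in range(-10, 11)}
best_pose = max(combined, key=combined.get)
```

Summing over the window is what lets views of the vessels from several cardiac phases jointly constrain the single pose.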
The dependence of the selection of the second transformation on X-ray images for a plurality of time points is used to improve the robustness and accuracy of the registration procedure. Different projections of the same blood vessels will be visible at different time points, which makes it possible to resolve ambiguity. In an embodiment, the image data that is used for matching covers only a small spatial area, for example downstream from a catheter or near its tip. In this case the image data for a single time point on its own will often be insufficient to provide for reliable determination of the second transformation. The approach provides for use of additional information in the time dimension of the data, which is especially useful in the case of mono-plane X-ray imaging, for which the transformation in the projection direction of the X-ray system is in general hard to find.
In a fifth step 25 processor 18 defines the sampling time points of the X-ray images in the window of time points, for example including a predetermined number "w" (w being an integer of at least two) of time points for which X-ray images have been captured up to and including the time point for which the last X-ray image was captured. In an embodiment a window is selected that covers at least a measured duration of a cardiac cycle. Thus a maximum of different views is obtained. In another embodiment a window is selected that covers only part of the cardiac cycle. In a further embodiment a window covering part of the cardiac cycle is selected with a length that covers at least part of the end-systole phase and the end-diastole phase of the measured cardiac cycle. Thus it is ensured that at least the main differences of the views of the blood vessels are covered. In an embodiment the window includes the time-point for which a match for the X-ray image is sought plus the w - 1 previous time-point positions, which ensures that the most recent imaging data can be used.
In a sixth step 26 processor 18 searches for a common pose of the 3D models obtained with the first transformations T1 from third step 23. A pose is defined by a second spatial transformation that maps the 3D models, for example by means of translation and reorientation, to potentially registered mapped 3D models that correspond to the structures visible in the X-ray images. A second transformation of this type is determined from a plurality of respective associated phase points in the window. A search algorithm is executed to search for the pose that maximizes a match score that combines match score contributions between X-ray data for respective time points in the window and data derived from 3D models for the associated phase points. The pose may be represented by a set of spatial transformation parameters, such as parameters of an overall spatial translation and/or rotation of the 3D model, i.e. transformations that preserve the form and size of the imaged objects. In sixth step 26 a search algorithm is executed to search in the parameter space of such transformations. In addition the transformations may have a predetermined deformation component, for example to compensate for known non-linearity of imaging.
For every pose estimation, digitally reconstructed radiographs (DRRs) of the coronary lumen model may be generated from the 3D models obtained by transforming the 3D models for the selected phase points according to the estimated pose. The similarity between the DRRs and the X-ray images may be computed using a similarity metric, for example a normalized cross correlation. In an embodiment processor 18 removes static structures from the X-ray images by subtracting from each pixel its average intensity over time, before computing the metric. Alternatively the metric may be computed without such a subtraction. The similarity metric defines a match score between an X-ray image and a DRR for a phase point. A sum of the resulting metric values over the phase points covered by the time window w may be determined to derive a combined match score for a specific pose. A search for a pose (a set of parameters of the second transformation) is performed that leads to a higher value, or at least a not lower value, of this score than other poses. This search may be performed using a multi-resolution gradient ascent procedure in the space of the transformation parameters that represent the pose.
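The static-structure removal by temporal-mean subtraction, followed by normalized cross correlation, can be sketched as follows with synthetic frames. The "static" array and noise model are invented; real X-ray frames would of course contain moving contrast-filled vessels as well.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross correlation between two images."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

rng = np.random.default_rng(3)
static = rng.normal(size=(8, 8))       # bones, table: constant over the frames
frames = [static + 0.3 * rng.normal(size=(8, 8)) for _ in range(10)]

# Remove static structures: subtract each pixel's average intensity over time.
temporal_mean = np.mean(frames, axis=0)
dynamic = [frame - temporal_mean for frame in frames]
```

After subtraction the static background no longer dominates the correlation, so the metric responds to the moving structures; the per-phase NCC values can then be summed over the window to score a pose.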
In an embodiment contrast fluid is injected before capturing the X-ray image in sixth step 26 (e.g. once for a series of repetitions of sixth step 26). In an embodiment the digital reconstruction of the radiographs may be configured to simulate the effect of the contrast fluid in the generation of the digitally reconstructed radiographs. Alternatively, the affected blood vessels may be identified manually for example.
It may be noted that the first transformations T1(f+j) that are determined in third step 23 for respective phase positions in the cardiac cycle define first transformations T1(f+j) of the reference 3D model and that the common pose estimation for the 3D models at the time points in the window defines a second transformation T2 to be applied in common to the results of applying the first transformations. Composite transformations T''(f+j) = T2(T1(f+j)) may be defined, obtained by composing the respective ones of the first transformations T1(f+j) with the second transformation T2. The 3D models for use in the correlation may be derived directly from the reference model by applying the composite transformations. A further cardiac phase detector (not shown) coupled to X-ray device 16 or processor 18 may be used to determine the associated phase points in the cardiac cycle associated with respective ones of the X-ray images. Instead of using a further cardiac phase detector, or in addition to it, an additional search may be made for associated phase positions in the cardiac cycle that, when assigned to the X-ray images, result in the best match with the 3D models for those phase positions in the cardiac cycle. This search may be constrained for example according to phase = offset + factor*time, to search for an optimal factor between the sampling time point scale and phase position scale and a common phase offset in the cardiac cycle. Alternatively a less constrained relation may be used, for example to allow for a predetermined phase error.
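The constrained search over phase = offset + factor*time can be sketched as a toy grid search. The agreement count below is a stand-in for an image-based match score, and the phase counts, times and ground-truth values are all invented.

```python
import numpy as np

n_phases = 10
times = np.arange(8)                    # X-ray sampling time indices
true_offset, true_factor = 3, 2         # invented ground truth
# Phases actually associated with the captured X-ray images (hypothetical):
observed = (true_offset + true_factor * times) % n_phases

def assignment_score(offset, factor):
    """Stand-in for an image-based match score: counts time points whose
    predicted phase assignment agrees with the observed one."""
    predicted = (offset + factor * times) % n_phases
    return int(np.sum(predicted == observed))

# Constrained grid search over phase = offset + factor * time.
best = max(((assignment_score(off, fac), off, fac)
            for off in range(n_phases) for fac in range(1, 5)),
           key=lambda candidate: candidate[0])
```

Constraining the assignment to a linear relation keeps the search two-dimensional regardless of how many frames are in the window.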
When the temporal sample rate and/or phase of the X-ray time-points does not match the frame rate and/or phase of the 3D models, interpolation (e.g. spline interpolation) may be applied to derive intermediate 3D models for selected time points. The interpolated 3D models for use in the correlation may be derived directly from the reference model by applying interpolated composite transformations.
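Interpolating transformation parameters between phase points can be sketched as follows. The parameter values are invented, and linear interpolation is shown as a simple stand-in for the spline interpolation mentioned above.

```python
import numpy as np

# Hypothetical first-transformation parameter values determined at the CT
# phase points (one scalar parameter per phase, purely for illustration).
ct_times = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
ct_params = np.array([0.0, 1.0, 4.0, 1.0, 0.0])

# X-ray frames arrive at time points between the CT phase points; the
# transformation parameters are interpolated rather than the images.
xray_times = np.array([0.125, 0.375, 0.625])
interp_params = np.interp(xray_times, ct_times, ct_params)
```

Interpolating in parameter space is cheap compared to interpolating whole 3D images, which is why the intermediate models can be derived directly from the reference model.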
In a seventh step 27, processor 18 uses the registration. Different forms of use are possible. In one embodiment processor 18 controls a display to display model data derived from the reference model according to the transformations. Processor 18 may generate the derived model data from the reference 3D model obtained in second step 22 by applying the transformations T1 obtained in third step 23 and by applying the common pose determined for the window in sixth step 26. Optionally display of the derived model data may be combined with the X-ray data for the cardiac phase that is associated with the model data. In another embodiment the transformations and the model may be used to control movement of a probe. After seventh step 27, the process repeats from fourth step 24 for a newly captured X-ray image for a next sampling time point. Preferably, fourth to seventh steps 24-27 are repeated in real time, at a predetermined rate corresponding to the temporal sampling rate of X-ray device 16, preferably a rate that lies well above normal heart rates: a sampling rate of fifteen X-ray images per second may be used for example, and at least five images per second is preferred. First to third steps 21-23 may be executed at a different time. They need not be performed in real time.
Although in one embodiment all steps of the process may be executed automatically, without human intervention, it should be noted that the system may provide for human intervention at various stages, preferably as far as this does not affect real time performance. Thus for example, 3D scanner 10 may provide for an input to receive annotations and it may be programmed to add the annotations to the reference 3D model, for rendering in combination with the X-ray images. Similarly, 3D scanner 10 may be programmed to support human controlled generation or adaptation of the model or the transformations in second and third steps 22, 23. In one embodiment, the entire selection of the transformations T(f+i) and/or segmentation or centre line selection of model generation may be input into 3D scanner 10 by the user.
In an embodiment one or more adjustment steps may be added. Manual initialization may be performed by applying one 2D translation in the plane orthogonal to the projection line for the complete cardiac cycle. Such adjustment steps may be performed for example in arbitrary repetitions of fourth to seventh steps 24-27, once for all subsequent repetitions until a next adjustment is made. Alternatively, or in addition, such adjustments may be made before the repetitions.
In an embodiment a dual-source CT system (Somatom Definition; Siemens Medical Solutions, Forchheim, Germany) was used. No beta-blockers were administered in preparation for the scan. The entire volume of the heart was scanned in one breath-hold, while simultaneously recording the ECG trace. An optimized heart-rate-dependent ECG pulsing protocol was applied. Outside the ECG pulsing window a tube current of 4% was used in order to reduce the effective dose of the CT examination. Multi-phase 4D datasets were reconstructed at 5% steps from 0% to 95% of the RR-interval, with a slice thickness of 3 mm and slice increment of 1.5 mm. Voxel sizes of the reconstructed volume are approximately 0.7x0.7x1.5 mm3.
X-ray data was acquired with an AXIOM-Artis C-arm system (Siemens Medical Solutions, Forchheim, Germany). The pixel size of these images is 0.22x0.22 mm2. The 4D coronary models were evaluated using manually annotated coronary centerlines at both systole and diastole of the cardiac cycle. In every dataset one of the following vessels was randomly selected for annotation: right coronary artery (RCA), left anterior descending artery (LAD) or left circumflex artery (LCX). In total 11 LADs, 11 LCXs and 9 RCAs were annotated.
Graphs showing the results for the model consistency experiment can be found in Figure 3. Results are presented using the image at systole and diastole respectively, to select the phase point for the reference 3D model in the registration procedure. The average distance between the centerlines is plotted against the time-point index number. Each dotted line in the plot represents a dataset. The continuous line represents the median distance over all datasets, which is below 1 mm for both the systolic and diastolic
experiment.
Figure 4 represents model accuracy. The figure displays a boxplot of the distance between the centerlines at systole and diastole before (I) and after (R) registration, and the difference between these two (D). Distances are averaged over systolic and diastolic measurements. Results are presented for all vessels combined and for the LAD, LCX and RCA separately. It can be noticed that the LAD shows the least deformation during the cardiac cycle and the RCA the most. The median accuracy is approximately 1 mm.
Figure 5 shows results of the experiment. X-ray time-point positions are plotted against the best matching CT time-point. The dashed line shows the results for w = 1 and the solid line for w = 4. It can be noticed that using a larger window in the registration procedure results in a much better time-point position derivation than the plain 2D/3D procedure with w = 1. For some time-phases at the end of the RR-interval, registration ended up in a local optimum, caused by using the same manual initialization for the complete cardiac cycle, which is for some phases outside the capture range of the registration procedure. This may be solved by using the resulting optimal pose for the previous time-point as initial pose for the time-point under consideration.
Illustrations with registration results at two phases in the cardiac cycle are shown in Figure 6. Although this experiment was limited to one dataset, it demonstrates the feasibility of this approach. Moreover, it indicates that registration of CTA and mono-plane X-ray images, which is an ill-posed problem, can benefit from the information in the time dimension of the imaging data.
Some misalignment is still visible in Fig. 6, which may be caused by irregular heart deformation, inaccurate reporting of the projection geometry of the X-ray system in the DICOM headers, errors in the 4D coronary model, or non-rigid deformation caused by the breathing cycle. Improvements could also be applied to the registration procedure for derivation of the 4D coronary model, especially to be better able to handle the more excessive motion of the RCA. It can, for example, be advantageous to take the cyclic behavior of the heart motion into account or to impose a smoothness constraint on the registration in the time dimension of the image. Furthermore, as we discretized the 4D coronary model in the time dimension, there may still exist a small time mismatch between the X-ray time-point and the coronary model time-point.
In an embodiment a smoothness constraint may be imposed by performing a search for parameters of the transformation T1 that produce a match by minimizing a score function that is a combination (e.g. a sum) of contributions representing accuracy and smoothness in the determination of the transformation in third step 23. The contribution representing accuracy may comprise a measure of mismatch between individual images and transformed images. The contribution representing smoothness may be a measure of change of the parameters of the transformation (e.g. a possibly weighted sum of squares of differences between corresponding parameters for successive time points). Any known search method may be used, such as a simple method that solves for the parameters of the transformation from the equation implied by minimizing the score function. Other techniques may be used to affect selection of the transformation, such as 4D tracking.
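For a toy case with one scalar parameter per time point, the accuracy-plus-smoothness minimization can be solved in closed form by setting the gradient to zero, exactly the "simple method" mentioned above. The measured values and the weight lam are invented.

```python
import numpy as np

def smooth_fit(measured, lam):
    """Minimize sum_t (p_t - m_t)^2 + lam * sum_t (p_{t+1} - p_t)^2 by
    solving the linear system obtained by setting the gradient to zero."""
    n = len(measured)
    system = np.eye(n)                  # data term contributes the identity
    for t in range(n - 1):              # smoothness term: a graph Laplacian
        system[t, t] += lam
        system[t + 1, t + 1] += lam
        system[t, t + 1] -= lam
        system[t + 1, t] -= lam
    return np.linalg.solve(system, measured)

# One transformation parameter per time point; the value at t=2 is an
# invented outlier that the smoothness term should damp.
measured = np.array([0.0, 1.0, 10.0, 1.0, 0.0])
fitted = smooth_fit(measured, lam=2.0)
```

The outlier is pulled down and its neighbors up, while the total is preserved (the system matrix has unit row sums), which is the intended effect of penalizing parameter changes between successive time points.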
In an embodiment, cyclic behavior of the heart motion may be taken into account by using images I(f+i) for time points over a plurality of cardiac cycles in the determination of the transformation in third step 23. This step may comprise determination of transformations for time points in a single cardiac cycle that produce a match by minimizing a score function that measures a combination of deviations between transformed model images and images for corresponding time points from the plurality of different cardiac cycles. In an embodiment, this may be combined with a smoothness constraint for changes of the transformation parameters in the cycle that includes a measure of changes of the transformation parameters from the end of the cycle to its beginning.
As described, a 3D centerline or lumen model of the coronary arteries is derived from a phase in the cardiac cycle with minimal motion. Deformation of the coronary arteries is derived from the 4D CTA data by means of a non-rigid registration procedure. We assume that the coronary arteries move with the larger structures in the image to which they are attached, such as the heart chambers, which is especially advantageous in those phases of the cardiac cycle where the arteries are not well visible due to motion artifacts or dose reduction. The resulting 4D coronary model is applied in a temporal window based registration procedure for the dynamic alignment of CTA with X-ray data.
An embodiment has been shown wherein the 3D models for respective phase positions in the cardiac cycle, for use in the selection of the transformations for registering with the X-ray images, are derived using the transformations that were determined in third step 23. Alternatively 3D models obtained from measurements for these respective phase positions could be matched directly to the X-ray images. However, the indirect approach, using transformations from the reference 3D model may yield more reliable results, because it reduces the effects of inaccuracy of the 3D images and low image quality that occur at many phase positions.
In an embodiment a transformation of the reference 3D model is used to derive the image data for display, after registration according to the transformations that have been found in the process. Alternatively a 3D model measured at the phase position of the X-ray image and transformed according to the common transformation could be used for the combined display.
However, the reference 3D model provides for qualitatively better imaging data, because it is very difficult to extract 3D models at those phases in the cardiac cycle where dose reduction is applied in the acquisition process.
Although an embodiment has been shown wherein the results of registration are used for combined display of data derived from the reference 3D model or the 3D images and the X-ray images, it should be realized that registration can be used for many purposes. An image derived from the reference 3D model or the 3D images could be projected on its own, after transformation according to the registration, in order to show information about the current location of various elements of the model. Registered images could be shown side by side, or in a mutually overlapping fashion.
An embodiment has been shown wherein the match score for determining the second transformation is obtained by applying a common second transformation that is the same for all time points in the window. As described, the resulting second transformation is used for registering the 3D model at least at one time point. In an embodiment, the procedure to determine the second transformation may be repeated for successive time points, using a respective different window for each time point, such as a window that extends backward from the time point over a detected cardiac cycle or a predetermined fraction of the cardiac cycle. This provides for a minimum of delay between capture of X-ray images and the availability of the second transformations for the time points corresponding to the X-ray images.
In an embodiment the resulting second transformation for one time point in the window may be used for registering the 3D model at a plurality of time points in the window. In another embodiment a combination of potentially different parameters of second transformations for a plurality of time points may be determined together. This may be done by searching for a combination of parameters for different time points that maximizes a score that combines (a) measures of similarity between pairs of X-ray images and model derived images for the time points in the window and (b) measures of similarity between the parameters of the second transformations for the time points in the window.
Although an example has been shown wherein the match score for selecting the second transformation is determined from a sum of match scores for pairs of images for phase points in a window, it should be appreciated that there are other ways to make the selection of the second transformation depend on X-ray images for a plurality of time points. For example, in one embodiment of sixth step 26 a constraint may be imposed that limits the change of the parameters of the second transformation between successive time points. The transformation may be selected to maximize an amended match score, for example, which is defined as the sum of the image-based match score for a pair of an X-ray image and a model-derived 2D image for the time point plus a measure of similarity between the parameters of the second transformation for the time point and an adjacent time point (e.g. a previous time point). The measure of similarity may be a weighted sum of squares of changes in the parameters of the second transformation compared to the previous repetition.
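The amended match score for one time point could look like the sketch below, with the weighted sum of squared parameter changes serving as the (negated) similarity measure; the names and the per-parameter weighting are illustrative assumptions:

```python
def amended_match_score(image_score, params, prev_params, weights):
    """Image-based match score for one time point, penalized by a
    weighted sum of squared changes of the second-transformation
    parameters relative to an adjacent (previous) time point."""
    penalty = sum(w * (p - q) ** 2
                  for w, p, q in zip(weights, params, prev_params))
    return image_score - penalty
```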
It should be appreciated that criteria with similar effect can be defined in many ways. In another example an optimization criterion is used that is a weighted sum A+B of an image similarity measure A and a transformation parameter similarity measure B for a set of time points, summing similarity measures A = sum A(t) between the X-ray images for the time points t and images reconstructed using the transformations, and similarity measures B = sum B(t) between the transformation parameters for a time point and one or more adjacent time points. A search algorithm may be executed in the space of parameters to optimize the combined similarity measure.
In each case the selection of the X-ray image dependent transformation for a time point in sixth step 26 is made to depend on X-ray images for a range of time points, including said time point and at least one other time point at a different cardiac phase.
The match scores between images for individual time points may be defined by counting features detected in the X-ray image that match features present in the transformed 3D model, or by summing correlations between image values, and/or by subtracting decrements from the score for detected differences. The aggregate match score may be a sum of match scores for different time points wherein equal weight is given to different time points, or time-dependent weights may be given. The sum over match scores for different time points may be computed by direct summing or by means of recursive filtering.
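The recursive-filtering alternative to direct summation may be sketched as a first-order recursive filter; the decay factor `alpha` is an assumed parameter:

```python
def aggregate_score_recursive(scores, alpha=0.5):
    """Aggregate per-time-point match scores by recursive filtering
    instead of direct summation: each step decays the running aggregate
    by alpha and adds the newest score, giving recent time points more
    weight and O(1) work per newly captured frame."""
    aggregate = 0.0
    for s in scores:  # scores in temporal order
        aggregate = alpha * aggregate + s
    return aggregate
```

With `alpha = 1.0` this reduces to the direct sum with equal weights; smaller values implement a time-dependent weighting.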
In an embodiment, the match score used in the search of sixth step 26 may relate only to a selected limited region of 3D space. For example, when the registration is used to control navigation of the tip of a catheter through blood vessels, registration in a region containing the position of the tip may be more important than elsewhere. In another example, when contrast fluid is fed to the blood vessels from a catheter, only a part of the blood vessels downstream from the catheter may show up clearly in the X-rays. In both cases a match score may be used that is based only on similarity between the 3D model and the X-ray image in said region and not in a larger region that contains other parts of the heart.
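Restricting the match score to a selected region could be implemented with a simple mask over the projected images; the flat pixel-list representation and the difference-based score are assumptions for illustration:

```python
def masked_match_score(xray_pixels, model_pixels, region_mask):
    """Match score restricted to a selected region: negated sum of
    absolute differences between X-ray pixel values and model-derived
    pixel values, counted only where the region mask is set (e.g. a
    neighbourhood of the catheter tip)."""
    return -sum(abs(a - b)
                for a, b, m in zip(xray_pixels, model_pixels, region_mask)
                if m)
```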
Such a limitation to a selected region may have the effect that only a small number of blood vessels is available for determining the second transformation. If only a single X-ray image would be used to determine the second transformation, this could lead to inaccuracy when relevant blood vessels are occluded, or when bends in the blood vessels are invisible due to projection. This is overcome by using a plurality of X-ray images for different time points, wherein blood vessels are visible under different orientations due to movement of the heart.
Because the second transformation is selected dependent on X-ray images for a range of time points, the selection of sixth step 26 is made in a more restricted way than the selections of the transformations from the reference 3D model to the 3D models in third step 23. In this way, the freedom to select the composite transformations T2(T1(f+j)) from the reference 3D model to the match with the X-ray image is distributed over selection of the transformations T1(f+j) on the basis of the 3D scan data and selection of the second transformation T2 on the basis of the X-ray data. Most of this freedom of choice is assigned to the selection of the transformations based on the 3D scan data, and the freedom of selection on the basis of the X-ray data is constrained by imposing a requirement of temporal coherence. Preferably, the range of time points has a length that is sufficient to cover both the end-diastole and the end-systole phase, and more preferably an entire cardiac cycle.
In an embodiment 3D position dependent rotations and/or translations or B-spline transformations may be used in third step 23 or seventh step 27 or both. In another embodiment, the transformations used in third step 23 and seventh step 27 may be overall translations. In another embodiment these transformations may be overall rotations and/or combinations of overall rotations and overall translations.
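A form-preserving overall transformation of the kind mentioned here (an overall rotation combined with an overall translation) can be sketched as follows; restricting the rotation to a single axis is a simplifying assumption for illustration:

```python
import math

def overall_rigid_transform(points, angle_z, translation):
    """Apply an overall (form-preserving) transformation to 3D model
    points: rotate every point about the z axis by angle_z, then shift
    all points by one common translation vector."""
    c, s = math.cos(angle_z), math.sin(angle_z)
    tx, ty, tz = translation
    return [(c * x - s * y + tx, s * x + c * y + ty, z + tz)
            for x, y, z in points]
```

Because every point undergoes the same rotation and translation, distances between model points are preserved, unlike the position-dependent B-spline deformations.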
As used herein, the existence of a match between objects typically means that a match score between the objects substantially has an optimal value, compared to match scores between objects that do not match. The match scores may be determined from a comparison of image values or between features extracted from images, for example. Determination of parameters that provide for a match typically involves executing a search algorithm that searches for parameters that result in matching objects, after a transformation according to the parameters. Such a search algorithm may be a simple algorithm that tries various parameter values, computes the match scores obtained with these parameter values and selects those of the tried parameters that substantially resulted in the best match score. Alternatively the search algorithm may be an algorithm that solves an equation for the optimal parameter values, or a descent algorithm that steps along successive parameter values with successively better match scores until substantially optimal parameter values have been found.
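The simple try-and-select search described here can be sketched as below; the candidate generation and the score function are left as assumptions supplied by the caller:

```python
def simple_search(candidate_params, match_score):
    """Try various parameter values, compute the match score obtained
    with each, and return the tried parameters that gave the best
    (maximal) match score, together with that score."""
    best_params, best_score = None, float("-inf")
    for params in candidate_params:
        score = match_score(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score
```

A descent algorithm would replace the exhaustive loop with steps toward successively better-scoring parameter values, but the selection rule is the same.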
Although an example of application to a cardiac cycle has been given it should be appreciated that alternatively the described techniques may be applied to other cycles, such as the respiratory cycle, a reference 3D model being obtained for a selected phase point in the respiratory cycle, transformed to other phase points in that cycle and used to determine a match score over a temporal window with 2D images.
Although an example has been given wherein a single 3D reference model is used for one time point in the cycle that is mapped to all time points in the cycle, it should be appreciated that more than one 3D reference model may be used, such as 3D reference models for different time points in the cycle. Reference models for the end-systolic and end-diastolic phase points may be used for example. First transformations may be determined for each reference model, and mapping of the models to different time points may comprise combining information from the reference models and/or selection between the reference models, for example according to the time distance between the mapped time point and the time points corresponding to the reference models.
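Combining information from several reference models according to time distance might be sketched as an inverse-distance weighting of corresponding model points; the point-list representation and the particular weighting rule are assumptions, not taken from the embodiment:

```python
def blend_reference_models(models, model_times, t):
    """Combine reference 3D models (lists of corresponding 3D points)
    for a mapped time point t, weighting each model by the inverse of
    its time distance to t; a model exactly at t gets all the weight."""
    dists = [abs(t - mt) for mt in model_times]
    if 0.0 in dists:  # exact match: select that reference model
        return [tuple(p) for p in models[dists.index(0.0)]]
    weights = [1.0 / d for d in dists]
    total = sum(weights)
    return [tuple(sum((w / total) * m[i][k] for w, m in zip(weights, models))
                  for k in range(3))
            for i in range(len(models[0]))]
```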
As described, the process involves processing of the 3D scan data and combined processing of the 3D scan data and the 2D images. Processing of the 3D scan data may comprise generation of the reference 3D model and determination of the first transformations T1(f+j). Combined processing of the 3D scan data and the 2D images may comprise determination of the second transformations T2 and generation of images from the reference 3D model using the first and second transformations. In an embodiment, processing of the 3D scan data is performed by a processor in 3D scanner 10 and the combined processing is performed by processor 18. Thus, the processor of 3D scanner 10 and processor 18 form a processing system to perform the process. However, it should be appreciated that a processing system with a different distribution of tasks may be used, such as a processing system wherein processor 18 performs all operations, or a system wherein processor 18 is part of 3D scanner 10, or a system with additional processors that perform parts of the tasks.
The processors in the processing system such as processor 18 may be implemented as a programmable computer programmed with a computer program to make it perform the functions as described. Alternatively, part or all of these functions may be realized by electronic circuits that are specifically designed to perform these functions. Both possibilities will be indicated by stating that the processing system is configured to implement functions.

Claims

1. An image processing method, comprising the steps of
- performing 3D scans of a moving object for respective time values in a time interval;
- constructing a reference 3D model of the object from data from the 3D scans for a selected time position from the time interval;
- determining first parameters of first spatial transformations mapping the reference 3D spatial model to mapped 3D models that match data from the 3D scans for the respective ones of the time values, the first parameters being determined from the reference 3D model and the 3D scans for respective ones of the time values;
- capturing a stream of 2D images of the object for a series of time points;
- associating the time points to time values in said time interval;
- determining second parameters of a second spatial transformation applicable to a first one of the mapped 3D models for a first one of the associated time values, the second parameters being determined dependent on the 2D images for a plurality of the time points, the second spatial transformation with said second parameters mapping the mapped 3D model for the first one of the associated time values to a registered 3D model that defines derived 2D image data that match the 2D image for the first one of the time values, and the second spatial transformation with said second parameters, or further second parameters with values that are related to the second parameters, mapping the mapped 3D models for the plurality of time points to registered 3D models that define derived 2D image data that match a combination of the 2D images for the plurality of time points.
2. An image processing method according to claim 1, wherein the determining of the second parameters comprises executing a search algorithm in a parameter space for the second spatial transformation, using an aggregate of match scores between the 2D images from said stream for a window of the time points in said series and data obtained from the reference 3D model by applying the first spatial transformations for time values associated to the time points in said window, the search operation searching for parameter values at which a maximum of aggregate of match scores arises in said parameter space.
3. An image processing method according to claim 1 or 2, wherein the object comprises a beating heart, the method comprising detecting an end-systole phase or end-diastole phase of the heart and selecting the selected time position equal to an end-systole phase or end-diastole phase time position.
4. An image processing method according to claim 3, wherein said determining of the first parameters comprises determining the first
parameters matching detected movement of larger structures than coronary arteries in the data from the 3D scans.
5. An image processing method according to any one of the preceding claims, wherein the performing of the 3D scans comprises computed tomography.
6. An image processing method according to any one of the preceding claims, wherein the first spatial transformations correspond to a first transformation function that relates spatial positions in the reference 3D model to transformed positions, allowing for deformation of the reference 3D model dependent on the first parameters, whereas the second spatial transformations correspond to a second transformation function that preserves form independent of the second parameters.
7. An image processing method according to claim 6, wherein the first transformation function is a B-spline function of the spatial positions.
8. An image processing method according to claim 6 or 7, wherein the first parameters are determined recursively from successively higher resolution versions of the 3D scans.
9. An image processing method according to claim 1, wherein the determining of the first parameters comprises performing search operations in a parameter space for the first spatial transformations, using a match score between the 3D scans and mapped 3D model data obtained from the reference 3D model by applying the first spatial transformations, the search operation searching for parameter values at which a maximum of the match score arises in said parameter space.
10. An imaging system, comprising a 3D scanner for scanning an object for respective time values in a time interval, an image stream capturing device for capturing a stream of 2D images of the object for a series of time points and a processing system configured to
- construct a reference 3D spatial model of the object from data from the 3D scans for a selected time position from the time interval;
- determine first parameters of first spatial transformations of the reference 3D model, the first parameters being determined from the reference 3D model and the 3D scans for respective ones of the time values, the first spatial transformations mapping the reference 3D spatial model to mapped 3D models that match data from the 3D scans for the respective ones of the time values;
- associate the time points of images in said stream to time values in said time interval;
- determine second parameters of a second spatial transformation applicable to a first one of the mapped 3D models for a first one of the associated time values, the second parameters being determined dependent on the 2D images for a plurality of the time points, the second spatial transformation with said second parameters mapping the mapped 3D model for the first one of the associated time values to a registered 3D model that defines derived 2D image data that match the 2D image for the first one of the time values, and the second spatial transformation with said second parameters, or further second parameters with values that are related to the second parameters, mapping the mapped 3D models for the plurality of time points to registered 3D models that define derived 2D image data that match a combination of the 2D images for the plurality of time points.
11. An imaging system according to claim 10 wherein the image stream capturing device comprises an X-ray image capture device.
12. An imaging system according to claim 11, wherein the X-ray image capture device has an image capture rate of at least five X-ray images per second.
13. An imaging system according to any one of claims 10 to 12 wherein the 3D scanner comprises a computed tomographic scanner.
14. An imaging system according to any one of claims 10 to 13, comprising a cardiac phase detector, coupled to the 3D scanner, the 3D scanner being configured to determine 3D models for respective cardiac phase values signalled by the cardiac phase detector.
15. An imaging system according to any one of claims 10 to 14, comprising a cardiac phase detector, coupled to the processing system, the processing system being configured to associate the time values to the time points in the series according to the detected cardiac phase at the time points.
PCT/NL2009/050549 2009-09-14 2009-09-14 Image processing method and system WO2011031134A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/NL2009/050549 WO2011031134A1 (en) 2009-09-14 2009-09-14 Image processing method and system


Publications (1)

Publication Number Publication Date
WO2011031134A1 (en) 2011-03-17

Family

ID=42115395

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/NL2009/050549 WO2011031134A1 (en) 2009-09-14 2009-09-14 Image processing method and system

Country Status (1)

Country Link
WO (1) WO2011031134A1 (en)


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006119623A1 (en) * 2005-05-09 2006-11-16 Robarts Research Institute System and method for generating dynamic ct images of a target within a subject that experiences motion


Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
BEYAR R ET AL: "Prospective Motion Correction of X-Ray Images for Coronary Interventions", IEEE TRANSACTIONS ON MEDICAL IMAGING, IEEE SERVICE CENTER, PISCATAWAY, NJ, US LNKD- DOI:10.1109/TMI.2004.839679, vol. 24, no. 4, 1 April 2005 (2005-04-01), pages 441 - 450, XP011129442, ISSN: 0278-0062 *
FIGL M ET AL: "Photo-consistency registration of a 4D cardiac motion model to endoscopic video for image guidance of robotic coronary artery bypass", PROCEEDINGS OF THE SPIE - THE INTERNATIONAL SOCIETY FOR OPTICAL ENGINEERING SPIE - THE INTERNATIONAL SOCIETY FOR OPTICAL ENGINEERING USA LNKD- DOI:10.1117/12.813905, vol. 7261, 8 February 2009 (2009-02-08) - 10 February 2009 (2009-02-10), XP002584879, ISSN: 0277-786X *
METZ C T ET AL: "GPU accelerated alignment of 3-D CTA with 2-D X-ray data for improved guidance in coronary interventions", BIOMEDICAL IMAGING: FROM NANO TO MACRO, 2009. ISBI '09. IEEE INTERNATIONAL SYMPOSIUM ON, IEEE, PISCATAWAY, NJ, USA, 28 June 2009 (2009-06-28), pages 959 - 962, XP031502205 *
MICHAEL FIGL ET AL: "Coronary Motion Modelling for Augmented Reality Guidance of Endoscopic Coronary Artery Bypass", 7 July 2008, BIOMEDICAL SIMULATION; [LECTURE NOTES IN COMPUTER SCIENCE], SPRINGER BERLIN HEIDELBERG, BERLIN, HEIDELBERG, PAGE(S) 197 - 202, XP019092021 *
MICHIEL SCHAAP ET AL: "Coronary Lumen Segmentation Using Graph Cuts and Robust Kernel Regression", 5 July 2009, INFORMATION PROCESSING IN MEDICAL IMAGING, SPRINGER BERLIN HEIDELBERG, BERLIN, HEIDELBERG, PAGE(S) 528 - 539, XP019121692 *
RUECKERT D ET AL: "Nonrigid Registration Using Free-Form Deformations: Application to Breast MR Images", IEEE TRANSACTIONS ON MEDICAL IMAGING, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 18, no. 8, 1 August 1999 (1999-08-01), XP011035886, ISSN: 0278-0062 *
WIERZBICKI M ET AL: "Validation of dynamic heart models obtained using non-linear registration for virtual reality training, planning, and guidance of minimally invasive cardiac surgeries", MEDICAL IMAGE ANALYSIS, OXFORD UNIVERSITY PRESS, OXFORD, GB LNKD- DOI:10.1016/J.MEDIA.2004.06.014, vol. 8, no. 3, 1 September 2004 (2004-09-01), pages 387 - 401, XP004533595, ISSN: 1361-8415 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016131955A1 (en) * 2015-02-20 2016-08-25 Koninklijke Philips N.V. Automatic 3d model based tracking of deformable medical devices with variable appearance
CN108475436A (en) * 2015-12-29 2018-08-31 皇家飞利浦有限公司 Use the registration of the surgical operation image capture device of profile mark
US11564748B2 (en) 2015-12-29 2023-01-31 Koninklijke Philips N.V. Registration of a surgical image acquisition device using contour signatures
CN108053477A (en) * 2017-12-20 2018-05-18 北京华航无线电测量研究所 The Numerical Methods of deformation in a kind of pipeline
CN108053477B (en) * 2017-12-20 2021-07-02 北京华航无线电测量研究所 Numerical processing method for deformation in pipeline
CN111311737A (en) * 2020-03-04 2020-06-19 中南民族大学 Three-dimensional modeling method, device and equipment for heart image and storage medium
CN111311737B (en) * 2020-03-04 2023-03-10 中南民族大学 Three-dimensional modeling method, device and equipment for heart image and storage medium
CN115553818A (en) * 2022-12-05 2023-01-03 湖南省人民医院(湖南师范大学附属第一医院) Myocardial biopsy system based on fusion positioning
CN115553818B (en) * 2022-12-05 2023-03-28 湖南省人民医院(湖南师范大学附属第一医院) Myocardial biopsy system based on fusion positioning


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 09788318; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 09788318; Country of ref document: EP; Kind code of ref document: A1)