WO2005024721A2 - 2d/3d image registration in image-guided radiosurgery - Google Patents

2d/3d image registration in image-guided radiosurgery Download PDF

Info

Publication number
WO2005024721A2
Authority
WO
WIPO (PCT)
Prior art keywords
image
plane
target
accordance
ray
Prior art date
Application number
PCT/US2004/027158
Other languages
French (fr)
Other versions
WO2005024721A3 (en)
Inventor
Gopinath Kuduvalli
Dongshan Fu
Shehrzad Qureshi
Original Assignee
Accuray, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Accuray, Inc. filed Critical Accuray, Inc.
Priority to EP04781775A priority Critical patent/EP1667580A4/en
Publication of WO2005024721A2 publication Critical patent/WO2005024721A2/en
Publication of WO2005024721A3 publication Critical patent/WO2005024721A3/en

Links

Classifications

    • A61N5/1049: Radiation therapy; monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam
    • A61N2005/1061: Verifying the position of the patient with respect to the radiation beam using an x-ray imaging system having a separate imaging source
    • A61N2005/1062: Verifying the position of the patient with respect to the radiation beam using virtual X-ray images, e.g. digitally reconstructed radiographs [DRR]
    • A61N5/1067: Beam adjustment in real time, i.e. during treatment
    • A61N5/107: Target adjustment, e.g. moving the patient support, in real time, i.e. during treatment
    • G01N23/04: Investigating or analysing materials by transmitting wave or particle radiation (e.g. X-rays or neutrons) through the material and forming images of the material
    • G06T7/254: Image analysis; analysis of motion involving subtraction of images
    • G06T7/32: Determination of transform parameters for the alignment of images (image registration) using correlation-based methods
    • G06T7/38: Registration of image sequences
    • G06T2207/10116: Image acquisition modality: X-ray image
    • G06T2207/10124: Image acquisition modality: digitally reconstructed radiograph [DRR]
    • G06T2207/30004: Subject of image: biomedical image processing

Definitions

  • Radiosurgery can be used to treat tumors and other lesions by delivering a prescribed dose of high-energy radiation to a target area while minimizing radiation exposure to the surrounding tissue.
  • In radiosurgery, precisely focused beams of radiation (e.g. very intense x-ray beams) are delivered to a target region, in order to destroy tumors or to perform other types of treatment.
  • One goal is to apply a lethal amount of radiation to one or more tumors, without damaging the surrounding healthy tissue.
  • Radiosurgery therefore calls for an ability to accurately target a tumor, so as to deliver high doses of radiation in such a way as to cause only the tumor to receive the desired dose, while avoiding critical structures such as the spinal cord.
  • Conventional radiosurgery may use a rigid and invasive stereotactic frame to immobilize the patient prior to diagnostic CT or MRI scanning. Treatment planning may then be conducted from the diagnostic images.
  • the treatment planning software determines the number, intensity, and direction of the radiosurgical beams that should be cross-fired at the target, in order to ensure that a sufficient dose is administered throughout the tumor so as to destroy it, while minimizing damage to adjacent healthy tissue. Radiation treatment may typically be accomplished on the same day treatment planning takes place. Immobilization of the patient may be necessary in order to maintain the spatial relationship between the target and the radiation source to ensure accurate dose delivery.
  • the frame may be fixed on the patient during the whole treatment process.
  • Image-guided radiosurgery can eliminate the use of invasive frame fixation during treatment, by frequently and quasi-continuously correcting the patient position or aligning the radiation beam with the patient target.
  • To correct the patient position or align the radiation beam, the change in target position at the time of treatment, as compared to the position at the time of treatment planning, is detected. This can be accomplished by registering the x-ray image acquired at the treatment time with the diagnostic 3D scan data obtained pre-operatively at the time of treatment planning.
  • Medical image registration can be useful in many areas of medicine, including but not limited to radiosurgery and radiotherapy.
  • the positions of the target can be defined by physicians at the time of treatment planning, using the diagnostic 3D scans.
  • 3D imaging modalities such as computed tomography (CT), magnetic resonance imaging (MRI), or positron emission tomography (PET) can be used to generate diagnostic 3D images of the anatomical region containing the targeted area, for treatment planning purposes.
  • CT scans allow an image of the internal structure of a target object to be generated, one cross-sectional slice at a time.
  • the 3D scan data (e.g., CT, MRI, or PET scan data) may be used as a reference, in order to determine the patient position change during treatment.
  • synthesized 2D images such as digitally reconstructed radiographs (DRRs) may be generated from the 3D scan data, and may be used as 2D reference images.
  • this problem is categorized as a 2D/3D registration.
  • similarity measures can be useful for comparing the image intensities in the x-ray images and the DRR images, so that the change in patient position (and thus in target region position) that has occurred between the diagnostic scanning and the taking of real-time images can be accurately detected.
  • Image-guided radiosurgery typically requires precise and fast positioning of the target at the treatment time. In practice, it is desirable that the accuracy be below 1 mm, and the computation time be on the order of a few seconds. Unfortunately, it can be difficult to meet both requirements simultaneously.
  • An accurate and rapid method and system are presented for tracking target position in image guided radiosurgery.
  • a hierarchical and iterative framework is used to register 2D x-ray images with images that have been reconstructed from 3D scan data.
  • the hierarchical and iterative 2D/3D registration algorithm allows for an accurate and rapid correction of target position, and an accurate and rapid alignment of radiosurgical beams, throughout the treatment procedure.
  • High accuracy can be achieved in both the translational and rotational adjustments of the target position.
  • the total computation time is about an order of magnitude faster than other techniques known in the art.
  • An improved method and system are used to compute a measure of similarity between two digital images. The similarity measure is based on pattern intensity, and provides a robust, accurate, and efficient solution to the 2D/3D medical image registration problem in image-guided radiosurgery.
  • A method in image-guided surgery for aligning the position of a treatment target relative to a radiosurgical beam generator during treatment includes performing 2D/3D image registration between one or more near real-time 2D x-ray images of a treatment target, and one or more 2D reconstructed images of the target based on pre-treatment 3D scan data.
  • a pre-treatment 3D scan of the target is performed, treating the target as a rigid body, and describing its position with six degrees of freedom.
  • the 3D scan (for example a CT scan, an MRI scan, or a PET scan) shows the position of the target at treatment planning time.
  • One or more 2D x-ray images of the target are generated during treatment, in near real time.
  • the x-ray images show the position of the target at a current time during treatment.
  • Two orthogonal x-ray projection images are generated, using imaging beams having a known position, angle, and intensity.
  • a set of 2D reconstructed images are generated offline, based on the 3D scan data.
  • the 2D reconstructed images are DRRs that are generated using the same positions and angles of the imaging beams that are used for the x-ray images, i.e. using the known positions, angles, and intensities of the imaging beams that are used to generate the near real-time x-ray images.
  • the DRRs are registered with the x-ray images, to generate the 3D rigid body transformation parameters that represent the change in position of the target between the 3D scan and the x-ray images.
  • the registration is performed for each orthogonal projection individually, and the results are subsequently combined.
  • in-plane rotations of the DRRs are performed within the image plane of the x-ray images, thereby generating reference DRRs.
  • the x-ray images are processed so that the orientation, image size, and bit depth of the x-ray images match the orientation, image size, and bit depth of the reference DRRs.
  • In-plane and out-of-plane transformation parameters are estimated using different search methods, including 3-D multi-level matching and 1-D searching, then iteratively refined until a desired accuracy is reached.
  • The relative position of the radiosurgical beams and the target is continuously adjusted in near real time, throughout the treatment, in accordance with the 3D transformation parameters obtained via the 2D/3D registration process.
  • The 2D/3D registration process involves determining the values of the parameters (x, y, θ) and (r, φ) that are required in order to register the x-ray image of the target with the reference DRRs of the target. (x, y, θ) represent the in-plane translational and rotational parameters within the image plane of the x-ray images: (x, y) indicate the requisite amount of translation within the image plane in the directions of the x- and y-axes, respectively, and θ indicates the requisite amount of rotation within the image plane. (r, φ) represent the out-of-plane rotational parameters, and indicate the requisite amounts of out-of-plane rotation about mutually orthogonal axes that are defined in a 3D coordinate system and that are orthogonal to the image plane.
  • A 3D multi-level matching is first performed, in order to determine an initial estimate for the in-plane transformation parameters (x, y, θ). Based on these parameters (x, y, θ) obtained by 3D multi-level matching, an initial 1D search is performed for each of the pair of out-of-plane rotation parameters (r, φ). The in-plane translation parameters (x, y) are then refined, using 2D sub-pixel matching, to increase the accuracy of these parameters.
  • The in-plane rotation parameter (θ) is then refined, based on the out-of-plane rotation parameters (r, φ) obtained from the initial 1D search, and on the updated in-plane transformation parameters (x, y), in order to increase the accuracy of the in-plane rotation parameter θ.
  • 1D interpolation is used in this step.
  • Each of the out-of-plane rotation parameters (r, φ) is refined separately, based on the refined in-plane translation and rotation parameters. The refining steps are iteratively repeated until a predetermined accuracy is reached.
  • The out-of-plane rotation parameters (r, φ) are refined using 1D interpolation, in order to achieve the desired resolution.
  • the 2D/3D image registration algorithm also features a method for determining a measure of similarity between a first image and a second image of an object.
  • the first image is a real-time 2D x-ray image of the object
  • the second image is an artificially synthesized DRR, constructed from pre-treatment 3D scan data of the object.
  • the similarity measure method includes forming a difference image by subtracting the corresponding pixel values of the second (DRR) image from each pixel value of the first image, i.e. the "live" or near real-time x-ray image.
  • the method further includes applying upon each pixel of the difference image a pattern intensity function, where the pattern intensity function is an asymptotic function of the gradients of the difference image.
  • An image-guided radiosurgical system includes means for generating pre- treatment 3D scan data of the target, for example a CT scanner or an MRI scanner.
  • the system includes a radiosurgical beam generator for generating at least one radiosurgical beam for treating the target.
  • Imaging means are provided for generating 2D x-ray images of the target in near real time.
  • the imaging means include one or more (preferably two) imaging x-ray beam sources for generating at least one imaging beam having a known intensity, position and angle.
  • the imaging means direct the imaging beams toward and through the target, so that the beams can be detected by corresponding image receivers (e.g. cameras) after the beams have traversed the target.
  • the detection signals are processed by an image processor, which generates the x- ray images.
  • a pair of x-ray sources and a corresponding pair of x-ray cameras are provided, so that two orthogonal x-ray projection images are generated.
  • Means are provided for generating a set of 2D DRRs for each x-ray projection image.
  • the DRRs are generated using the known intensity, location, and angle of the imaging beams.
  • Image registration means are provided for registering the DRRs with the x-ray images.
  • the image registration means include a processor for computing a set of 3D transformation parameters that represent the change in position of the target between the 3D scan and the near real time x-ray images.
  • the processor contains software for estimating in-plane and out-of-plane transformation parameters for each projection, using a number of search methods including 3D multi-level matching, 2D sub-pixel matching, and 1D searching, and using two different similarity methods (sum-of-square differences and pattern intensity) at different phases of the registration process.
  • the image registration means includes another processor for determining the measure of similarity of a 2D x-ray image of an object and a 2D DRR of the object generated from previously obtained 3D scan data.
  • the processor for determining the measure of similarity between the 2D x-ray image and the 2D DRR contains software for subtracting each pixel value of the second image from a corresponding pixel value of the first image to form a difference image, and then applying a gradient operator upon each pixel of the difference image to form a pattern intensity function.
  • the pattern intensity function is an asymptotic function of the gradients of the difference image, and permits the pixel values within a neighborhood R defined about each pixel in the difference image to be considered.
  • the gradients are defined over at least four directions.
  • the image-guided radiosurgical system also includes positioning means, responsive to a controller, for adjusting in near real time the relative position of the radiosurgical beams and the target, in accordance with the 3D transformation parameters obtained by the 2D/3D registration process.
  • FIG. 1 illustrates the geometric relations between a 3D treatment target and two orthogonal x-ray projections, including the in-plane translational and rotational parameters (x, y, θ) and the out-of-plane rotational parameters (r, φ), for registering a 2D radiographic image with previously generated 3D scan data.
  • FIG. 2 is a schematic diagram of a methodology for tracking a treatment target during image-guided radiosurgery, in accordance with one embodiment.
  • FIG. 3 illustrates a flowchart of a registration algorithm used in a 2D/3D registration method, in accordance with one embodiment.
  • FIG. 4 illustrates the generation of 2D DRRs from 3D CT scan data of a treatment target within an anatomical region of a patient.
  • FIG. 5 illustrates a multi-resolution image representation for a multi-level matching process used to estimate the in-plane transformation parameters.
  • FIG. 6 schematically illustrates a neighborhood for calculating pattern intensity, in one embodiment.
  • FIG. 7 schematically illustrates an image-guided radiosurgery system, constructed in accordance with one embodiment.
  • the tracking method and system allow for patient position correction and radiation beam alignment during radiosurgery/radiotherapy of a treatment target, for example a tumor within the brain or skull.
  • a fully automatic tracking process is made possible, with no need for user interaction.
  • the tracking method and system includes an improved 2D/3D image registration algorithm.
  • The 2D/3D registration algorithm can also be used in applications other than radiosurgery and radiotherapy, i.e. in any application in which there is a need to track a rigid object by registering 2D radiographic images onto 3D scan data.
  • a similarity measure, based on pattern intensity, is used during the 2D/3D medical image registration. The similarity measure allows for selected phases of the 2D/3D registration process in image-guided radiosurgery to be carried out in a robust, efficient, and powerful manner.
  • the radiosurgery target is treated as a rigid body.
  • a rigid body is defined as an object whose internal geometric relationships remain static or unchanged over time. Because no external forces are imposed on a radiosurgical target during radiation treatment, it is reasonable to treat the target as a rigid body, and to use a 3D rigid transformation for the registration process.
  • the 3D rigid transformation is described using six degrees of freedom: three translations along three mutually orthogonal axes in a 3D scan coordinate system (conventionally labeled using the x-, y-, and z- axes), and three rotations (roll, pitch, yaw) about these three axes.
  • the six degrees of freedom are thus represented by six 3D transformation parameters: (x, y, z, r, p, w), where r represents the rotation about the x-axis, p represents the rotation about the y-axis, and w represents the rotation about the z-axis.
  • FIG. 1 illustrates the geometric relations between a three- dimensional treatment target and two orthogonal 2D x-ray projections (labeled A and B in FIG. 1 ), in an image-guided radiosurgery method and system in accordance with one embodiment.
  • Cameras (or image receivers) A and B receive their x-ray projections from respective x-ray sources (not shown).
  • the 2D x-ray projection images of the target are formed by transmitting imaging beams (having a known intensity, and having known positions and angles with respect to the target), generated from a respective pair of x-ray sources, through the target and onto cameras A and B.
  • a 3D CT coordinate system i.e. a coordinate system for the target as viewed in the frame of the CT scan study (taken at the time of treatment planning), can be defined.
  • the patient assumes a position within the real-time camera coordinate frames (defined by the two x-ray cameras A and B, respectively), that does not necessarily match the position of the patient as seen within the 3D CT coordinate system.
  • The difference in the position and orientation of the target within the respective radiographs corresponds to the difference in the three-dimensional position and orientation of the target between the camera and CT coordinate frames, and is found by solving for the parameters (x, y, z, r, p, w).
  • the x-axis in the 3D CT coordinate system is directed inward into the paper, and is not referenced.
  • The orthogonal 2D projections A and B are viewed from the directions O_A S_A and O_B S_B, respectively.
  • FIG. 1 illustrates respective 2D planar coordinate systems that are fixed with respect to the image plane that characterizes each projection.
  • The image planes A and B for the projections A and B are thus defined by mutually orthogonal axes within the respective coordinate systems. These axes are shown in FIG. 1 as (x_A, y_A) for projection A, and (x_B, y_B) for projection B.
  • Each x-ray image for each projection is characterized by a respective image plane, defined by mutually orthogonal x- and y-axes in a coordinate frame defined by the two x-ray cameras A and B: x_A and y_A for projection A, and x_B and y_B for projection B.
  • The direction of the axis x_A in the 2D coordinate system for projection A and the direction of the x-axis in the 3D scan coordinate system are opposite with respect to each other.
  • The direction of the axis x_B in the coordinate system for projection B and the direction of the axis x in the 3D scan coordinate system are the same.
  • Each projection is characterized by a respective set of transformation parameters, namely (x_A, y_A, θ_A, r_A, φ_A) for projection A, and (x_B, y_B, θ_B, r_B, φ_B) for projection B.
  • The two out-of-plane rotations (with respect to the image plane) in projections A and B are denoted by (r_A, φ_A) and (r_B, φ_B), respectively, where r denotes the amount of rotation about the x-axis (in the 3D scan coordinate system), and φ denotes the amount of rotation about the O_A S_A axis (for projection B) or the O_B S_B axis (for projection A).
  • The in-plane translations and rotation in projections A and B are denoted by (x_A, y_A, θ_A) and (x_B, y_B, θ_B), respectively.
  • (x_A, y_A) and (x_B, y_B) denote the amount of translation within the image planes for each projection (A and B) in the directions of the x- and y-axes that define each image plane (x_A and y_A for projection A, and x_B and y_B for projection B), while θ_A and θ_B denote the amount of rotation within each image plane about an axis (not shown) that is perpendicular to both the x_A (or x_B) and y_A (or y_B) axes.
  • The out-of-plane rotation φ_A in projection A is the same as the in-plane rotation θ_B in projection B.
  • The out-of-plane rotation φ_B in projection B is the same as the in-plane rotation θ_A in projection A.
  • the use of the two projections A and B thus over-constrains the problem of solving for the six degrees of freedom.
  • FIG. 2 is a schematic diagram of a methodology for tracking a treatment target during image-guided radiosurgery, in accordance with one embodiment.
  • two sets of DRRs (or other 2D reconstructed images, for example) are generated as a first step, one set for each of the projections A and B.
  • the process of generating DRRs is carried out after the radiation treatment planning is completed, but before treatment delivery.
  • DRR initialization is performed on the initial DRRs, to create a set of in-plane rotated reference DRR images.
  • the real time x- ray projection images are acquired and pre-processed.
  • the processed x-ray images for each projection are registered with the corresponding set of reference DRR images.
  • the step of generating DRRs is performed offline, and involves specifying a set of rotation angles for each of the out-of-plane rotations r and ⁇ , for each projection A and B.
  • Each set of DRRs for each projection includes DRRs that correspond to different combinations of these out-of-plane rotation angles. The total number of DRR images is therefore N_r * N_φ, where N_r and N_φ respectively denote the number of rotation angles for the two out-of-plane rotations r and φ. Because the out-of-plane rotations are expected to approach zero after patient alignment, the angles are sampled more densely in the range close to zero, and more sparsely in the range of larger rotation angles.
  • DRR initialization is performed by computing a set of in-plane rotated reference DRR images from the nominal (0-degree) reference DRRs.
  • the most intensive part of computation in registration is the in-plane rotation computation.
  • All the reference DRR images are stored in a memory unit in the radiosurgical system, and are used for registration in each x-ray image acquisition, during target alignment and treatment.
  • the 2D x-ray images of the target that represent the two orthogonal projections A and B onto the respective image planes (shown in FIG.1 ) are acquired in near real time, by transmitting respective imaging beams through the target and onto the cameras A and B.
  • the imaging beams for each projection are detected by the respective cameras, after passing through the target.
  • The detection signals are sent to an image processor, so that the 2D x-ray projection images are generated.
  • The 2D in-plane transformation (x_A, y_A, θ_A) can be estimated by the 2D image comparison. Determining the two out-of-plane rotations (r_A, φ_A) relies on which reference DRR is used for the optimal similarity match. Similarly, the 2D in-plane transformation (x_B, y_B, θ_B) and the out-of-plane rotations (r_B, φ_B) can be estimated for projection B.
  • FIG. 3 illustrates a flowchart of an algorithm used in a 2D/3D registration method, in accordance with one embodiment.
  • the change in the position of the target (or other rigid object) in the radiographic image, as compared to the position of the target in the 3D scan data (as indicated in the reconstructed 2D image) is described using 3D rigid body transformations.
  • Registration is performed by determining the value of the 3D rigid body transformation parameters that represent the difference in the position of the target as shown in the x-ray images, as compared to the position of the target as shown by the 2D images reconstructed from pre-treatment 3D scan data.
  • In a preliminary step (step 110 in FIG. 3), the raw x-ray images are pre-processed before beginning the 2D/3D registration process. Pre-processing the raw x-ray images is necessary in order to make the x-ray and DRR images have the same orientation, image size, and bit depth.
  • The out-of-plane rotations can be detected with good accuracy only after the in-plane parameters have already been well estimated. It has also been found that the out-of-plane rotations are able to safely converge to the correct values when starting from the nominal position of the DRRs. Accordingly, a separate computation is carried out for the out-of-plane versus the in-plane transformation parameters during the registration process: the two out-of-plane rotations (r, φ) are estimated from the exact reference DRR images, while the in-plane transformation parameters (x, y, θ) are computed directly from the 2D images. The in-plane parameters (x, y, θ) are first computed using the nominal reference DRRs.
  • An initial estimate for the out-of-plane rotations (r, φ) is then carried out, based on the previously obtained values of the in-plane transformation parameters (x, y, θ).
  • different search strategies are used for estimating the out-of-plane transformations and the in-plane transformations, respectively.
  • multiple similarity measure criteria are used that have been optimized at the different phases during the registration.
  • the registration process is described in terms of six distinct phases (illustrated in FIG. 3 as steps 120, 130, 140, 150, 160, and 170).
  • In phase 1 (step 120 in FIG. 3), the in-plane transformation parameters (x, y, θ) are initially estimated using a set of in-plane rotated DRR images, which are generated offline from the nominal (0-degree) reference DRR.
  • the most intensive computation in the registration process is the computation of the in-plane rotation. To achieve a rapid computation, it is desirable to compute as many in-plane rotations as possible for the reference DRRs, before starting the registration process.
  • the process of generating in- plane rotated DRRs is thus carried out offline, after the reference DRRs for out- of-plane rotations are generated. All the reference DRR images are stored in memory, and used for registering each real-time x-ray image that is acquired during patient alignment and treatment.
  • In step 120, the three parameters are rapidly searched using a 3D multi-level matching method (described in connection with FIG. 5 below).
  • a sum of absolute differences method (“SAD") is used as the similarity measure.
  • In this step, there is no floating-point computation.
  • Pixel-level accuracy for the translations (x, y) and half-degree accuracy for the in-plane rotation (θ) are achieved.
  • In step 130 (phase 2 of the registration process), the two out-of-plane rotations (r, φ) are separately searched in one dimension, based on the values of the in-plane parameters (x, y, θ) determined in the previous step 120.
  • A more complicated similarity measure, based on pattern intensity, is used to detect the reference DRR image that corresponds to a combination of the two out-of-plane rotations (r, φ).
  • the search space for the possible rotation angles is the full search range of out-of-plane rotation angles. For an initial estimate, the full search range is sampled at every one-degree interval.
  • In step 140, the in-plane translation parameters (x, y) are refined using 2D sub-pixel matching.
  • 2D sub-pixel matching is a full range search method.
  • A set of DRR images (3 x 3 or 5 x 5) is generated by translating the unknown reference DRR, one sub-pixel at a time.
  • The in-plane translations (x, y) are refined to sub-pixel accuracy by finding the best match between the x-ray image and the DRR images.
  • In step 150, the in-plane rotation parameter θ is refined using 1D interpolation, based on the updated values for the in-plane translation parameters (x, y) from step 140, and the updated values of the out-of-plane rotation parameters (r, φ) from step 130.
  • In step 160, the out-of-plane rotations are separately refined to a better accuracy using a 1D search, based on the updated values for the in-plane transformation parameters (x, y, θ) from steps 140 and 150.
  • In steps 140, 150, and 160 (phases 3, 4, and 5), a similarity measure method based on pattern intensity, described in more detail in later paragraphs, is used to ensure higher accuracy.
  • Steps 140, 150, and 160 are iteratively repeated until a sufficient accuracy is obtained. Once the desired accuracy is reached, the final out-of-plane rotations are 1D interpolated in the final step 170 (the sixth and last phase) of the registration process.
  • FIG. 4 illustrates the generation of a 2D DRR from 3D CT scan data of a treatment target within an anatomical region of a patient.
  • the volumetric 3D CT image of the target is referred to with the aid of reference numeral 260.
  • the DRRs 265A and 265B, shown in FIG. 4 are artificial, synthesized 2D images that represent the radiographic image of the target that would be obtained, if imaging beams were used having the same intensity, position and angle as the beams used to generate the real time x-ray projection images, and if the target were positioned in accordance with the 3D CT scan data.
  • the DRRs are calculated from prior 3D CT data, in an exact emulation of the real-time camera perspectives.
  • the reference numerals 250A and 250B illustrate the hypothetical positions and angles from which the imaging beams would be directed through a target positioned in accordance with the CT volumetric image 260 of the target.
  • DRRs are generated by casting hypothetical beams or rays through the CT volumetric image of the target. Each ray goes through a number of voxels of the 3D CT image 260. By integrating the CT numbers for these voxels along each ray, and projecting onto an imaging plane (shown as 270A and 270B, respectively, in FIG. 4), the resultant image would emulate the radiograph that would be obtained by passing rays from hypothetical camera locations and angles (shown schematically as 250A and 250B, respectively) through a target positioned in accordance with the volumetric 3D image 260. Ray tracing algorithms, known in the art, are generally used to generate the DRRs.
  • FIG. 5 illustrates a multi-resolution image representation for the multi-level matching process, used in the first phase (step 120 in FIG. 3) to initially estimate the in-plane transformation parameters.
  • the full-size image is at the bottom (Level 1).
  • the upper images (Level 2, Level 3 and Level 4) have lower spatial resolution.
  • The lower resolution images are obtained by low-pass filtering and sub-sampling of the full-size images.
  • Multi-level matching is used to obtain an initial estimate of the in-plane transformation parameters.
  • the basic idea of multi-level matching is to match the images at each level successively, starting with the lowest image resolution level (Level 4).
  • the results at the lower resolution level serve to provide rough estimates for the in-plane transformation parameters (x, y, ⁇ ).
  • the output at a lower level is then passed to the subsequent level characterized by a higher resolution.
  • the parameters (x, y, ⁇ ) are refined, using the higher resolution images.
  • the accuracy of the translations depends on the spatial resolution of the image having the highest resolution (Level 1).
  • the accuracy of the rotations depends on the sampling intervals of the in-plane rotations, during the DRR initialization process described above.
  • The search range at the lowest resolution level is the full search range, which is calculated from the difference between the DRR and x-ray image sizes. Because the image size at the lowest level is smallest (W/8 x H/8, where W and H are the width and height of the full-size image), the full-range search can be completed in a very short time. For the remaining resolution levels, the same small search range of (-2, +2) pixels is used. Because of the small search range, the search can be completed quickly, even at large image sizes.
  • the search range in the lowest resolution level is a full search range, at a denser sampling rate.
  • partial search ranges are used, at a less dense sampling rate.
  • an optimal similarity measure for a 2D/3D registration process should allow for an accurate registration to be achieved, despite such differences.
  • DRR generation relies on a proper attenuation model. Because attenuation is proportional to the mass intensity of the target volume through which the beam passes, the exact relationship between the traversed mass intensity and the CT image intensity needs to be known, in order to obtain an accurate modeling. Establishing this relationship is difficult, however, so the linear attenuation model is often used. As is known, the linear attenuation coefficient of a material is dependent on x-ray energy. CT machines and x-ray machines work at different effective energies, however. As a result, the attenuation coefficients measured by a CT scanner are different from the attenuation of a beam of x-rays passing through the target.
  • The skeletal structures in DRR images therefore cannot be reconstructed very well using the linear attenuation model, the DRRs being only synthetic x-ray projection images.
  • At CT scan x-ray energies, the ratio of bone-to-soft-tissue attenuation is much lower than at x-ray radiographic energies.
  • As a result, the image contrast from soft tissue is comparable with the image contrast from bone, reducing the clarity of bone details, for example.
  • x-ray images usually have a large image size (512 x 512). For better registration accuracy, it is desirable to use the full resolution image. Full resolution images are rarely used, in practice, however, because the resulting increase in computation time is excessive, and is incompatible with the requirements of image-guided radiosurgery.
  • similarity measure methods used in 2D/3D registration can be divided into two categories.
  • the first method is based on image features.
  • the image features could be anatomical edges or segmented objects.
  • the registration accuracy depends on the accuracy of edge detection or object segmentation.
  • the main advantage of this method is its fast computation.
  • Feature-based similarity methods register on salient features that have been segmented from each image. They use a reduced amount of data, which makes the algorithms fast, once the segmentation has been undertaken. Because the full information content of the image is not used, however, the accuracy is sacrificed. Errors in the segmentation stage can lead to an error in the final registration.
  • The second method is based on image intensity content. Intensity-based methods compare the voxel and pixel values directly, using measures based on image statistics. The original images are used for registration, and a good accuracy can usually be achieved. Although these methods require little or no segmentation, intensity-based methods are typically much slower. Because a long computation time is required, it has been hard to apply intensity-based similarity measures in clinical practice.
  • a similarity measure method is used that is designed to optimize the 2D/3D image registration procedure described above.
  • This similarity measure method is based on pattern intensity, and provides a powerful and efficient way to solve the 2D/3D image registration procedure, as described above.
  • The pattern intensity based method and system described below are designed for the 1D search phase (for the out-of-plane parameters) and the iterative refining phases of the 2D/3D image registration procedure described above.
  • the SAD measure is widely used in medical image processing and video processing, in cases where the two images to be matched have high image quality.
  • the main advantage of using SAD is its fast computation and its easy optimization in parallel computation. Its main disadvantage is that the solution is sensitive to image noise, artifacts and intensity difference between the live and DRR images. As a result, SAD is only used in the first search phase to get approximate results.
  • SAD can be expressed as SAD = Σ_(i,j) | I_Live(i, j) - I_DRR(i, j) |, where I_Live(i, j) represents the intensity of the "live" real-time x-ray image, and I_DRR(i, j) represents the intensity of the reconstructed DRR image.
  • the pattern intensity similarity measure is more accurate, and less sensitive to image noise, artifacts, and to the intensity difference between the images being compared, compared to other similarity measures known in the art.
  • two images are compared, the first image being a 2D x-ray image of a radiosurgical treatment target, and the second image being a 2D DRR that is reconstructed from 3D CT scan data generated at the time of treatment planning.
  • the two images are discretized, digital images, characterized by first and second 2D arrays of pixel values.
  • the pixel arrays are equi-dimensional, i.e. the number of rows and columns of the first array is equal to the number of rows and columns of the second array.
  • each pixel value of an image is a number representative of the intensity of the image at a unique corresponding 2D area element forming the image.
  • a difference image is formed from the real-time x-ray image and the DRR image, by subtracting the corresponding pixel values of the second image (the DRR image) from each pixel value of the first image (the real-time):
  • I_diff(i, j) = I_Live(i, j) - I_DRR(i, j)
  • I_diff(i, j) represents the intensity or pixel value of the (i, j)-th pixel of the difference image.
  • I_Live(i, j) represents the intensity or pixel value of the (i, j)-th pixel of the live x-ray image.
  • I_DRR(i, j) represents the intensity or pixel value of the (i, j)-th pixel of the artificial DRR image.
  • a pattern intensity function is defined, which operates on the difference image.
  • The pattern intensity function is expressed as an asymptotic function of the gradients of the difference image: P = Σ_(i,j) Σ_((k,l) in R) σ² / (σ² + (I_diff(i, j) - I_diff(i + k, j + l))²)  (1), where σ is a weighting constant and R is a neighborhood that is defined using the pixel (i, j) as the center point.
  • The mathematical formulation in equation (1) results in the similarity measure tending to a maximum value as the number of structures tends to zero, and asymptotically tending to zero as the number of structures increases. Because of the asymptotic nature of the pattern intensity measure, large differences in intensity have the same effect on the measure, regardless of their magnitude. This makes the measure robust to large differences in pixel intensity.
  • The function is weighted by the weighting constant σ.
  • The constant σ is used to weight the function, so that small deviations in intensity (caused by noise, for example) result in the measure remaining proximate to its maximum value.
  • The sensitivity of the solution to variations in the x-ray image can be minimized by careful selection of this constant.
  • The larger the weighting constant, the more stable the results become.
  • The choice of the weighting constant is a tradeoff between stability and accuracy. If the value of the weighting constant is too large, the smaller details in the images cannot be reflected in the similarity measure. Based on experimentation, the empirical value of σ is determined to be in the range from about 4 to about 16, although other values of σ are also within the scope of the method and system described above.
  • the pattern intensity function considers a selected neighborhood for each pixel.
  • Fig. 6 schematically illustrates a neighborhood for calculating pattern intensity, in one embodiment.
  • the neighborhood R is defined such that the gradients in four directions are considered: horizontal, vertical, 45° diagonal and -45° diagonal.
  • For the horizontal direction, the (i, j-1)-th pixel is considered.
  • For the vertical direction, the (i-1, j)-th pixel is considered.
  • For the 45° diagonal direction, the (i-1, j+1)-th pixel is considered.
  • For the -45° diagonal direction, the (i-1, j-1)-th pixel is considered.
  • The pattern intensity expression is given as the sum over the four neighborhood directions: P = Σ_(i,j) [ σ² / (σ² + (I_diff(i, j) - I_diff(i, j-1))²) + σ² / (σ² + (I_diff(i, j) - I_diff(i-1, j))²) + σ² / (σ² + (I_diff(i, j) - I_diff(i-1, j+1))²) + σ² / (σ² + (I_diff(i, j) - I_diff(i-1, j-1))²) ]. (A minimal numerical sketch of this measure appears after this list.)
  • the formulation of the pattern intensity function provides a number of advantages over other known similarity measures.
  • The difference image filters out the low-frequency part, which is basically the soft tissue, and keeps the high-frequency part, which is mostly the skeletal structures. This feature makes the algorithm robust to some brightness difference between the live and DRR images.
  • The similarity measure is less affected by pixels whose intensity values deviate only slightly from their neighboring pixels. These kinds of pixels are thought to contain random noise, and are hence undesirable.
  • Because the asymptotic function quickly approaches zero as its argument increases, large intensity differences such as image artifacts have the same effect on the similarity measure, regardless of their magnitude. Accordingly, the pattern intensity measure is less sensitive to image artifacts.
  • FIG. 7 schematically illustrates an image-guided radiosurgery system, constructed in accordance with one embodiment.
  • the image guided radiosurgery system 300 includes a means 301 for generating pre-treatment 3D scan data of the target; radiosurgical beam generator 302; a positioning system 304; imaging means 306; and a controller 308.
  • the system 300 may also include an operator control console and display 340.
  • the means 301 may be a CT scanner, for example, or an MRI system or a PET system.
  • The radiosurgical beam generator 302 generates, when activated, a plurality of collimated radiosurgical beams (e.g. x-ray beams).
  • the cumulative effect of the radiosurgical beams, when properly directed to and focused onto the target, is to necrotize or perform other treatments in a target within the patient's anatomy.
  • the positioning system 304 may be an industrial robot, by way of example, and the beam generator 302 may be a small x-ray linac mounted to an arm of the industrial robot 304.
  • the imaging means 306 is preferably an x-ray imaging system for generating a pair of orthogonal x-ray projection images of the target.
  • the imaging means 306 preferably has a pair of x-ray sources for generating diagnostic imaging beams (having known positions, angles, and intensities), and a corresponding pair of x-ray image detectors which detect the beams after the beams have passed through the target.
  • the controller 308 includes software for generating a set of reconstructed 2D images (preferably DRRs) of the target, based on the 3D scan data from the 3D scanner 301, and the known intensity, location, and angle of the imaging beams.
  • the controller 308 includes software for registering the DRRs with the real time x-ray images.
  • the registration software is able to compute a set of 3D transformation parameters that represent the change in position of the target between the 3D scan and the near real-time x-ray images.
  • the positioning system 304 is responsive to commands from the controller 308, to continuously adjust, in near real time, the relative position of the radiosurgical beam generator and the target by the amount prescribed by the 3D transformation parameters obtained through the registration process.
  • the controller 308 includes a processor 408 for performing 2D/3D registration.
  • the 2D/3D registration processor 408 includes software for determining a set of in-plane transformation parameters (x, y, ⁇ ) and out-of-plane rotational parameters (r, ⁇ ), the parameters representing the difference in the position of the target as shown in the x-ray image, as compared to the position of the target as shown by the 2D reconstructed images.
  • The processor 408 further includes: 1) software for performing a 3D multi-level matching to determine an estimate for the in-plane transformation parameters (x, y, θ); 2) software for performing a 1D search for each of the pair of out-of-plane rotation parameters (r, φ), based on the estimated in-plane parameters (x, y, θ); and 3) software for iteratively refining the in-plane parameters (x, y, θ) and the out-of-plane parameters (r, φ), until a desired accuracy is reached.
  • the controller 308 also includes a processor 508 for determining a measure of similarity between two images.
  • the similarity measure processor 508 is equipped with software for determining the measure of similarity between a first image and a second image, by subtracting each pixel value of the second image from a corresponding pixel value of the first image to form a difference image, and then applying a gradient operator upon each pixel of the difference image to form a pattern intensity function.
  • the pattern intensity function is an asymptotic function of the gradients of the difference image, and permits the pixel values within a neighborhood R defined about each pixel in the difference image to be considered.
  • the gradients are defined over at least four directions.
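
The following Python/NumPy fragment is a minimal numerical sketch of the pattern intensity similarity measure described above. It is an illustration only, not part of the patent disclosure: the function and variable names, the default σ of 9 (inside the 4 to 16 range quoted above), and the skipping of border pixels are assumptions, while the four gradient directions follow the neighborhood R described above.

```python
import numpy as np

def pattern_intensity(live_xray: np.ndarray, drr: np.ndarray, sigma: float = 9.0) -> float:
    """Pattern-intensity similarity between a live x-ray image and a DRR.

    Forms the difference image I_diff = I_Live - I_DRR, then sums the
    asymptotic term sigma^2 / (sigma^2 + g^2) over the gradients g of the
    difference image taken in four directions (horizontal, vertical, and the
    45-degree and -45-degree diagonals).  Larger values indicate a better
    match; border pixels without a neighbor in a given direction are skipped.
    """
    diff = live_xray.astype(np.float64) - drr.astype(np.float64)
    s2 = sigma * sigma
    # (center, neighbor) view pairs implementing the neighborhood R:
    # neighbors (i, j-1), (i-1, j), (i-1, j+1) and (i-1, j-1) of pixel (i, j).
    pairs = [
        (diff[:, 1:], diff[:, :-1]),     # horizontal
        (diff[1:, :], diff[:-1, :]),     # vertical
        (diff[1:, :-1], diff[:-1, 1:]),  # 45-degree diagonal
        (diff[1:, 1:], diff[:-1, :-1]),  # -45-degree diagonal
    ]
    total = 0.0
    for center, neighbor in pairs:
        grad = center - neighbor
        total += float(np.sum(s2 / (s2 + grad * grad)))
    return total
```

In the 1D search and refinement phases described above, the candidate reference DRR (or sub-pixel shift, or rotation angle) that maximizes this score would be retained.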

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • Multimedia (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • Immunology (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Image Processing (AREA)

Abstract

An image-guided radiosurgery method and system are presented that use 2D/3D image registration to keep the radiosurgical beams properly focused onto a treatment target. A pre-treatment 3D scan of the target is generated at or near treatment planning time (CT DATA). A set of 2D DRRs is generated (DRR GENERATION A and DRR GENERATION B), based on the pre-treatment 3D scan (CT DATA). At least one 2D x-ray image of the target is generated in near real time during treatment (X-RAY IMAGE A). The DRRs are registered with the x-ray images, by computing a set of 3D transformation parameters that represent the change in target position between the 3D scan and the x-ray images (IMAGE REGISTRATION). The relative position of the radiosurgical beams and the target is continuously adjusted in near real time in accordance with the 3D transformation parameters (GEOMETRIC TRANSFORMATION). A hierarchical and iterative 2D/3D registration algorithm is used, in which the transformation parameters that are in-plane with respect to the image plane of the x-ray images are computed separately from the out-of-plane transformation parameters.

Description

2D/3D IMAGE REGISTRATION IN IMAGE-GUIDED RADIOSURGERY
BACKGROUND
[0001] Radiosurgery can be used to treat tumors and other lesions by delivering a prescribed dose of high-energy radiation to a target area while minimizing radiation exposure to the surrounding tissue. In radiosurgery, precisely focused beams of radiation (e.g. very intense x-ray beams) are delivered to a target region, in order to destroy tumors or to perform other types of treatment. One goal is to apply a lethal amount of radiation to one or more tumors, without damaging the surrounding healthy tissue. Radiosurgery therefore calls for an ability to accurately target a tumor, so as to deliver high doses of radiation in such a way as to cause only the tumor to receive the desired dose, while avoiding critical structures such as the spinal cord.
[0002] Conventional radiosurgery may use a rigid and invasive stereotactic frame to immobilize the patient prior to diagnostic CT or MRI scanning. Treatment planning may then be conducted from the diagnostic images. The treatment planning software determines the number, intensity, and direction of the radiosurgical beams that should be cross-fired at the target, in order to ensure that a sufficient dose is administered throughout the tumor so as to destroy it, while minimizing damage to adjacent healthy tissue. Radiation treatment may typically be accomplished on the same day treatment planning takes place. Immobilization of the patient may be necessary in order to maintain the spatial relationship between the target and the radiation source to ensure accurate dose delivery. The frame may be fixed on the patient during the whole treatment process.
[0003] Image-guided radiosurgery can eliminate the use of invasive frame fixation during treatment, by frequently and quasi-continuously correcting the patient position or aligning the radiation beam with the patient target. To correct the patient position or align the radiation beam, the change in target position at the time of treatment, as compared to the position at the time of the diagnostic treatment planning, is detected. This can be accomplished by registering the x-ray image acquired at the treatment time with the diagnostic 3D scan data obtained pre-operatively at the time of treatment planning. Medical image registration can be useful in many areas of medicine, including but not limited to radiosurgery and radiotherapy. The positions of the target can be defined by physicians at the time of treatment planning, using the diagnostic 3D scans.
[0004] Typically, 3D imaging modalities, such as computed tomography (CT), magnetic resonance imaging (MRI), or positron emission tomography (PET), can be used to generate diagnostic 3D images of the anatomical region containing the targeted area, for treatment planning purposes. These tools enable practitioners to identify the anatomical organs of a patient, and to precisely locate any abnormalities such as tumors. For example, CT scans allow an image of the internal structure of a target object to be generated, one cross-sectional slice at a time. The 3D scan data (e.g., CT, MRI, or PET scan data) may be used as a reference, in order to determine the patient position change during treatment. Typically, synthesized 2D images such as digitally reconstructed radiographs (DRRs) may be generated from the 3D scan data, and may be used as 2D reference images. In the field of medical image registration, this problem is categorized as a 2D/3D registration. In the 2D/3D registration process, similarity measures can be useful for comparing the image intensities in the x-ray images and the DRR images, so that the change in patient position (and thus in target region position) that has occurred between the diagnostic scanning and the taking of real-time images can be accurately detected.
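By way of illustration only (this is not part of the patent text), the following Python/NumPy sketch shows one simple way such an intensity comparison could be written, using a sum of absolute differences between a live x-ray image and a candidate DRR; the function and variable names are hypothetical.

```python
import numpy as np

def sum_absolute_differences(live_xray: np.ndarray, drr: np.ndarray) -> float:
    """Sum of absolute intensity differences between two equal-size 2D images.

    A lower score indicates a better match between the "live" x-ray image
    and the candidate DRR (both assumed to be 2D arrays of the same shape).
    """
    if live_xray.shape != drr.shape:
        raise ValueError("images must have the same size")
    diff = live_xray.astype(np.float64) - drr.astype(np.float64)
    return float(np.abs(diff).sum())

# Hypothetical usage: pick the reference DRR with the lowest score.
# best_drr = min(reference_drrs, key=lambda d: sum_absolute_differences(live_image, d))
```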
[0005] Image-guided radiosurgery typically requires precise and fast positioning of the target at the treatment time. In practice, it is desirable that the accuracy be below 1 mm, and the computation time be on the order of a few seconds. Unfortunately, it can be difficult to meet both requirements simultaneously.
[0006] It is desirable that a method and system be provided in image-guided radiosurgery for tracking the position of the treatment target throughout treatment, that allow for a fast computation time and at the same time maintain sufficient accuracy and stability. In particular, there is a need for a method and system for performing 2D/3D medical image registration using as little computing time as possible, while at the same time meeting the requisite accuracy for radiosurgical applications. In order to optimize the 2D/3D registration process in image-guided radiosurgery, it is necessary to use an accurate, robust, and efficient similarity measure method and system.
SUMMARY
[0007] An accurate and rapid method and system are presented for tracking target position in image guided radiosurgery. A hierarchical and iterative framework is used to register 2D x-ray images with images that have been reconstructed from 3D scan data. The hierarchical and iterative 2D/3D registration algorithm allows for an accurate and rapid correction of target position, and an accurate and rapid alignment of radiosurgical beams, throughout the treatment procedure. High accuracy can be achieved in both the translational and rotational adjustments of the target position. The total computation time is about an order of magnitude faster than other techniques known in the art. An improved method and system are used to determine the measure of similarity between two digital images. The similarity measure is based on pattern intensity, and provides a robust, accurate, and efficient solution to the 2D/3D medical image registration problem in image guided radiosurgery.
[0008] A method in image guided radiosurgery for aligning the position of a treatment target relative to a radiosurgical beam generator during treatment includes performing 2D/3D image registration between one or more near real-time 2D x-ray images of a treatment target, and one or more 2D reconstructed images of the target based on pre-treatment 3D scan data. A pre-treatment 3D scan of the target is performed, treating the target as a rigid body, and describing its position with six degrees of freedom. The 3D scan (for example a CT scan, an MRI scan, or a PET scan) shows the position of the target at treatment planning time. One or more 2D x-ray images of the target are generated during treatment, in near real time. The x-ray images show the position of the target at a current time during treatment. Preferably, two orthogonal x-ray projection images are generated, using imaging beams having a known position, angle, and intensity.
[0009] A set of 2D reconstructed images, preferably DRRs (digitally reconstructed radiographs), are generated offline, based on the 3D scan data. Preferably, the 2D reconstructed images are DRRs that are generated using the same positions and angles of the imaging beams that are used for the x-ray images, i.e. using the known positions, angles, and intensities of the imaging beams that are used to generate the near real-time x-ray images.
[0010] The DRRs are registered with the x-ray images, to generate the 3D rigid body transformation parameters that represent the change in position of the target between the 3D scan and the x-ray images. The registration is performed for each orthogonal projection individually, and the results are subsequently combined.
[0011] During the 2D/3D registration, in-plane rotations of the DRRs are performed within the image plane of the x-ray images, thereby generating reference DRRs. The x-ray images are processed so that the orientation, image size, and bit depth of the x-ray images match the orientation, image size, and bit depth of the reference DRRs.
[0012] In-plane and out-of-plane transformation parameters are estimated using different search methods, including 3-D multi-level matching and 1-D searching, then iteratively refined until a desired accuracy is reached. The relative position of the radiosurgical beams and the target is continuously adjusted in near real time, throughout the treatment, in accordance with the 3D transformation parameters obtained via the 2D/3D registration process.
[0013] The 2D/3D registration process involves determining the value of the parameters (x, y, θ) and (r, φ) that are required in order to register the x-ray image of the target with the reference DRRs of the target. (x, y, θ) represent the in-plane translational and rotational parameters within the image plane of the x-ray images, (x, y) indicating the requisite amount of translation within the image plane in the directions of the x- and y- axes, respectively, and θ indicating the requisite amount of rotation within the image plane. (r, φ) represent the out-of-plane rotational parameters, and indicate the requisite amount of out-of-plane rotations about mutually orthogonal axes that are defined in a 3D coordinate system, and that are orthogonal to the image plane.
[0014] In order to determine these parameters, a 3D multi-level matching is first performed, in order to determine an initial estimate for the in-plane transformation parameters (x, y, θ). Based on these parameters (x, y, θ) obtained by 3D multi-level matching, an initial 1-D search is performed for each of the pair of out-of-plane rotation parameters (r, φ). The in-plane translation parameters (x, y) are then refined, using 2D sub-pixel matching, to increase the accuracy of these parameters.
[0015] The in-plane rotation parameter (θ) is then refined, based on the out-of-plane rotation parameters (r, φ) obtained from the initial 1D search, and on the updated in-plane translation parameters (x, y), in order to increase the accuracy of the in-plane rotation parameter θ. 1D interpolation is used in this step. Next, each of the out-of-plane rotation parameters (r, φ) is refined separately, based on the refined in-plane translation and rotation parameters. The refining steps are iteratively repeated, until a predetermined accuracy is reached. Finally, the out-of-plane rotation parameters (r, φ) are refined, using 1D interpolation, in order to achieve the desired resolution.
[0016] The 2D/3D image registration algorithm also features a method for determining a measure of similarity between a first image and a second image of an object. In an exemplary embodiment, the first image is a real-time 2D x-ray image of the object, and the second image is an artificially synthesized DRR, constructed from pre-treatment 3D scan data of the object. The similarity measure method includes forming a difference image by subtracting the corresponding pixel values of the second (DRR) image from each pixel value of the first image, i.e. the "live" or near real-time x-ray image. The method further includes applying upon each pixel of the difference image a pattern intensity function, where the pattern intensity function is an asymptotic function of the gradients of the difference image.
[0017] An image-guided radiosurgical system includes means for generating pre-treatment 3D scan data of the target, for example a CT scanner or an MRI scanner. The system includes a radiosurgical beam generator for generating at least one radiosurgical beam for treating the target. Imaging means are provided for generating 2D x-ray images of the target in near real time. The imaging means include one or more (preferably two) imaging x-ray beam sources for generating at least one imaging beam having a known intensity, position and angle. The imaging means direct the imaging beams toward and through the target, so that the beams can be detected by corresponding image receivers (e.g. cameras) after the beams have traversed the target. The detection signals are processed by an image processor, which generates the x-ray images. Preferably, a pair of x-ray sources and a corresponding pair of x-ray cameras are provided, so that two orthogonal x-ray projection images are generated.
[0018] Means are provided for generating a set of 2D DRRs for each x-ray projection image. The DRRs are generated using the known intensity, location, and angle of the imaging beams. Image registration means are provided for registering the DRRs with the x-ray images. The image registration means include a processor for computing a set of 3D transformation parameters that represent the change in position of the target between the 3D scan and the near real time x-ray images. The processor contains software for estimating in-plane and out-of-plane transformation parameters for each projection, using a number of search methods including 3D multi-level matching, 2D sub-pixel matching, and 1D searching, and using two different similarity methods (sum-of-square differences and pattern intensity) at different phases of the registration process.
[0019] The image registration means includes another processor for determining the measure of similarity of a 2D x-ray image of an object and a 2D DRR of the object generated from previously obtained 3D scan data. The processor for determining the measure of similarity between the 2D x-ray image and the 2D DRR contains software for subtracting each pixel value of the second image from a corresponding pixel value of the first image to form a difference image, and then applying a gradient operator upon each pixel of the difference image to form a pattern intensity function. The pattern intensity function is an asymptotic function of the gradients of the difference image, and permits the pixel values within a neighborhood R defined about each pixel in the difference image to be considered. The gradients are defined over at least four directions.
[0020] The image-guided radiosurgical system also includes positioning means, responsive to a controller, for adjusting in near real time the relative position of the radiosurgical beams and the target, in accordance with the 3D transformation parameters obtained by the 2D/3D registration process.
BRIEF DESCRIPTION OF THE DRAWINGS
[0021] FIG. 1 illustrates the geometric relations between a 3D treatment target and two orthogonal x-ray projections, including the in-plane translational and rotational parameters (x, y, θ), and the out-of-plane rotational parameters (r, φ), for registering a 2D radiographic image with previously generated 3D scan data.
[0022] FIG. 2 is a schematic diagram of a methodology for tracking a treatment target during image-guided radiosurgery, in accordance with one embodiment.
[0023] FIG. 3 illustrates a flowchart of a registration algorithm used in a 2D/3D registration method, in accordance with one embodiment.
[0024] FIG. 4 illustrates the generation of 2D DRRs from 3D CT scan data of a treatment target within an anatomical region of a patient.
[0025] FIG. 5 illustrates a multi-resolution image representation for a multi-level matching process used to estimate the in-plane transformation parameters.
[0026] FIG. 6 schematically illustrates a neighborhood for calculating pattern intensity, in one embodiment.
[0027] FIG. 7 schematically illustrates an image-guided radiosurgery system, constructed in accordance with one embodiment.
DETAILED DESCRIPTION
[0028] An accurate and rapid method and system in image-guided radiosurgery are presented for tracking the position of a treatment target in six degrees of freedom. The tracking method and system allow for patient position correction and radiation beam alignment during radiosurgery/radiotherapy of a treatment target, for example a tumor within the brain or skull. A fully automatic tracking process is made possible, with no need for user interaction. The tracking method and system includes an improved 2D/3D image registration algorithm. The 2D/3D registration algorithm can also be used in applications other than radiosurgery and radiotherapy, i.e. in any application in which there is a need to track a rigid object by registering 2D radiographic images onto 3D scan data. A similarity measure, based on pattern intensity, is used during the 2D/3D medical image registration. The similarity measure allows for selected phases of the 2D/3D registration process in image-guided radiosurgery to be carried out in a robust, efficient, and powerful manner.
[0029] In one embodiment, the radiosurgery target is treated as a rigid body. As is well known, a rigid body is defined as an object whose internal geometric relationships remain static or unchanged over time. Because no external forces are imposed on a radiosurgical target during radiation treatment, it is reasonable to treat the target as a rigid body, and to use a 3D rigid transformation for the registration process. The 3D rigid transformation is described using six degrees of freedom: three translations along three mutually orthogonal axes in a 3D scan coordinate system (conventionally labeled using the x-, y-, and z- axes), and three rotations (roll, pitch, yaw) about these three axes. The six degrees of freedom are thus represented by six 3D transformation parameters: (x, y, z, r, p, w), where r represents the rotation about the x-axis, p represents the rotation about the y-axis, and w represents the rotation about the z-axis.
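The six-parameter description above corresponds to a standard homogeneous rigid-body transform. The following Python/NumPy sketch is illustrative only; the rotation composition order and the use of radians are assumptions, since the text does not fix these conventions:

    import numpy as np

    def rigid_transform(x, y, z, r, p, w):
        # Compose a 4x4 homogeneous transform from (x, y, z, r, p, w).
        # Angles in radians; roll about x, then pitch about y, then yaw
        # about z -- an illustrative convention, not one fixed by the text.
        cr, sr = np.cos(r), np.sin(r)
        cp, sp = np.cos(p), np.sin(p)
        cw, sw = np.cos(w), np.sin(w)
        Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
        Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        Rz = np.array([[cw, -sw, 0], [sw, cw, 0], [0, 0, 1]])
        T = np.eye(4)
        T[:3, :3] = Rz @ Ry @ Rx   # combined rotation
        T[:3, 3] = [x, y, z]       # translation
        return T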
[0030] In one embodiment, two orthogonal x-ray projections are used to solve for these six parameters. FIG. 1 illustrates the geometric relations between a three- dimensional treatment target and two orthogonal 2D x-ray projections (labeled A and B in FIG. 1 ), in an image-guided radiosurgery method and system in accordance with one embodiment. Cameras (or image receivers) A and B receive their x-ray projections from respective x-ray sources (not shown). The 2D x-ray projection images of the target are formed by transmitting imaging beams (having a known intensity, and having known positions and angles with respect to the target), generated from a respective pair of x-ray sources, through the target and onto cameras A and B.
[0031] As illustrated in FIG. 1, a 3D CT coordinate system, i.e. a coordinate system for the target as viewed in the frame of the CT scan study (taken at the time of treatment planning), can be defined. During treatment, the patient assumes a position within the real-time camera coordinate frames (defined by the two x-ray cameras A and B, respectively) that does not necessarily match the position of the patient as seen within the 3D CT coordinate system. The difference in the position and orientation of the target within the respective radiographs (real-time x-ray versus DRR) corresponds to the difference in the three-dimensional position and orientation of the target between the camera and CT coordinate frames, and is found by solving for the parameters (x, y, z, r, p, w).
[0032] In the embodiment illustrated in FIG. 1, the x-axis in the 3D CT coordinate system is directed inward into the paper, and is not referenced. In FIG. 1, the orthogonal 2D projections A and B are viewed from the directions oAsA and oBsB, respectively. For each of the projections A and B, FIG. 1 illustrates respective 2D planar coordinate systems that are fixed with respect to the image plane that characterizes each projection. The image planes A and B for the projections A and B are thus defined by mutually orthogonal axes within the respective coordinate systems. These axes are shown in FIG. 1 as (xA, yA) for projection A, and (xB, yB) for projection B. In other words, each x-ray image for each projection is characterized by a respective image plane, defined by mutually orthogonal x- and y- axes in a coordinate frame defined by the two x-ray cameras A and B: xA and yA for projection A, and xB and yB for projection B. The direction of the axis xA in the 2D coordinate system for projection A, and the direction of the x-axis in the 3D scan coordinate system, are opposite with respect to each other. The direction of the axis xB in the coordinate system for projection B, and the direction of the x-axis in the 3D scan coordinate system, are the same.
[0033] As shown in FIG. 1, each projection is characterized by a respective set of transformation parameters, namely (xA, yA, θA, rA, φA) for projection A, and (xB, yB, θB, rB, φB) for projection B. The two out-of-plane rotations (with respect to the image plane) in projections A and B are denoted by (rA, φA) and (rB, φB) respectively, where r denotes the amount of rotation about the x-axis (in the 3D scan coordinate system), and φ denotes the amount of rotation about the oAsA axis (for projection B) or the oBsB axis (for projection A). The in-plane translations and rotations in projections A and B are denoted (xA, yA, θA) and (xB, yB, θB), respectively. As easily seen, (xA, yA) and (xB, yB) denote the amount of translations within the image planes for each projection (A and B) in the directions of the x- and y- axes that define each image plane (xA- and yA- for projection A, and xB- and yB- for projection B), while θA and θB denote the amount of rotation within each image plane about an axis (not shown) that is perpendicular to both the xA- (or xB-) and yA- (or yB-) axes.
[0034] As can be seen from FIG. 1, the out-of-plane rotation φA in projection A is the same as the in-plane rotation θB in projection B, and the out-of-plane rotation φB in projection B is the same as the in-plane rotation θA in projection A. The use of the two projections A and B thus over-constrains the problem of solving for the six degrees of freedom. As seen from FIG. 1: xA = xB, rA = rB, θA = φB, and θB = φA.
[0035] For projection A, given a set of reference DRR images which correspond to different combinations of the two out-of-plane rotations (rA, φA), the 2D in-plane transformation (xA, yA, θA) can be estimated by the 2D image comparison. Determining the two out-of-plane rotations (rA, φA) relies on which reference DRR is used for the best similarity match. Similarly, the 2D in-plane transformation (xB, yB, θB) and the out-of-plane rotations (rB, φB) can be estimated for projection B.
[0036] FIG. 2 is a schematic diagram of a methodology for tracking a treatment target during image-guided radiosurgery, in accordance with one embodiment. In overview, two sets of DRRs (or other 2D reconstructed images, for example) are generated as a first step, one set for each of the projections A and B. The process of generating DRRs is carried out after the radiation treatment planning is completed, but before treatment delivery. Before patient treatment, DRR initialization is performed on the initial DRRs, to create a set of in-plane rotated reference DRR images. In the course of radiosurgical treatment, the real-time x-ray projection images are acquired and pre-processed. The processed x-ray images for each projection are registered with the corresponding set of reference DRR images. The results of the registration, (xA, yA, θA, rA, φA) and (xB, yB, θB, rB, φB) for projections A and B, are combined to produce the final six rigid transformation parameters (x, y, z, r, p, w).
[0037] The step of generating DRRs is performed offline, and involves specifying a set of rotation angles for each of the out-of-plane rotations r and φ, for each projection A and B. Each set of DRRs for each projection includes DRRs that correspond to different combinations of these out-of-plane rotation angles. The total number of DRR images is therefore Nr * Nφ, where Nr and Nφ respectively denote the number of rotation angles for the two out-of-plane rotations r and φ. Because the out-of-plane rotations are expected to approach zero after patient alignment, the angles are more densely sampled in the range close to zero, and more sparsely sampled in the range of larger rotation angles.
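A minimal sketch of such a non-uniform angle sampling follows; the specific ranges and step sizes are illustrative assumptions, since the text only states that the sampling is denser near zero:

    import numpy as np

    def out_of_plane_angles(max_deg=5.0):
        # Dense 0.5-degree sampling near zero, sparse 1-degree sampling
        # farther out (ranges and steps assumed for illustration only).
        dense = np.arange(-1.0, 1.0 + 1e-9, 0.5)
        sparse = np.concatenate([np.arange(-max_deg, -1.0, 1.0),
                                 np.arange(2.0, max_deg + 1e-9, 1.0)])
        return np.sort(np.concatenate([dense, sparse]))

    # one reference DRR per (r, phi) combination: Nr * Nphi images per projection
    angles_r = out_of_plane_angles()
    angles_phi = out_of_plane_angles()
    drr_set = [(r, phi) for r in angles_r for phi in angles_phi]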
[0038] Once the DRRs for out-of-plane rotations are generated, DRR initialization is performed, by computing a set of in-plane rotated reference DRR images (from the nominal 0-degree DRRs). The most intensive part of the computation in registration is the in-plane rotation computation. To achieve a fast computation, therefore, it is desirable to compute as many in-plane rotations as possible for the reference DRRs, before the registration process. All the reference DRR images are stored in a memory unit in the radiosurgical system, and are used for registration in each x-ray image acquisition, during target alignment and treatment.
[0039] The 2D x-ray images of the target that represent the two orthogonal projections A and B onto the respective image planes (shown in FIG. 1) are acquired in near real time, by transmitting respective imaging beams through the target and onto the cameras A and B. The imaging beams for each projection (attenuated by virtue of having passed through the target) are detected by the respective cameras, after passing through the target. The detection signals are sent to an image processor, so that the 2D x-ray projection images are generated.
[0040] For projection A, given a set of reference DRR images which correspond to different combinations of the two out-of-plane rotations (rA, φA), the 2D in-plane transformation (xA, yA, θA) can be estimated by the 2D image comparison. Determining the two out-of-plane rotations (rA, φA) relies on which reference DRR is used for an optimal similarity match. Similarly, the 2D in-plane transformation (xB, yB, θB) and the out-of-plane rotations (rB, φB) can be estimated for projection B. Given the results (xA, yA, θA, rA, φA) for projection A and (xB, yB, θB, rB, φB) for projection B, the 3D transformation can be obtained by the following expressions:
x = (xA + xB)/2, y = (yA − yB)/√2, z = (yA + yB)/√2,
r = (rA + rB)/2, p = (θB − θA)/2, w = (θB + θA)/2
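A direct transcription of these expressions is sketched below; the square-root-of-two denominators for y and z follow the projection geometry as reconstructed here, and should be verified against the actual camera arrangement before use:

    import math

    def combine_projections(xA, yA, thetaA, rA, phiA,
                            xB, yB, thetaB, rB, phiB):
        # Combine the per-projection registration results into the six
        # 3D rigid-body parameters (x, y, z, r, p, w).
        x = (xA + xB) / 2.0
        y = (yA - yB) / math.sqrt(2.0)
        z = (yA + yB) / math.sqrt(2.0)
        r = (rA + rB) / 2.0
        p = (thetaB - thetaA) / 2.0
        w = (thetaB + thetaA) / 2.0
        return x, y, z, r, p, w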
[0041] In one embodiment, 2D/3D registration is performed using an algorithm that is designed in a unique hierarchical and iterative framework. FIG. 3 illustrates a flowchart of an algorithm used in a 2D/3D registration method, in accordance with one embodiment. The change in the position of the target (or other rigid object) in the radiographic image, as compared to the position of the target in the 3D scan data (as indicated in the reconstructed 2D image) is described using 3D rigid body transformations. Registration is performed by determining the value of the 3D rigid body transformation parameters that represent the difference in the position of the target as shown in the x-ray images, as compared to the position of the target as shown by the 2D images reconstructed from pre-treatment 3D scan data. By using different search methods for different transformation parameters, and by optimizing similarity measures for different phases of the registration procedure, an increased accuracy is achieved in the target tracking process, with a significantly reduced computing time.
[0042] In a preliminary step (step 110 in FIG. 3) the raw x-ray images are pre-processed, before beginning the 2D/3D registration process. Pre-processing the raw x-ray images is necessary, in order to make the x-ray and DRR images have the same orientation, same image size, and same bit depth.
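A minimal sketch of such pre-processing is shown below; the particular operations (an optional left-right flip, a 90-degree rotation, nearest-neighbour resampling, and an 8-bit rescale) are assumptions chosen only to illustrate matching the orientation, image size, and bit depth:

    import numpy as np

    def preprocess_xray(xray, drr_shape, flip_lr=False, rotate_k=0):
        # Match the x-ray image to the reference DRRs in orientation,
        # image size, and bit depth (8-bit assumed here).
        img = np.asarray(xray, dtype=np.float32)
        if flip_lr:
            img = np.fliplr(img)
        if rotate_k:
            img = np.rot90(img, k=rotate_k)
        # resample onto the DRR grid (nearest neighbour, for brevity)
        rows = np.linspace(0, img.shape[0] - 1, drr_shape[0]).astype(int)
        cols = np.linspace(0, img.shape[1] - 1, drr_shape[1]).astype(int)
        img = img[np.ix_(rows, cols)]
        # rescale to 8-bit
        span = max(float(img.max() - img.min()), 1e-6)
        img = (img - img.min()) / span * 255.0
        return img.astype(np.uint8)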
[0043] It has been observed that the out-of-plane rotations can be detected with a good accuracy only after the in-plane parameters have already been well estimated. It has also been found that the out-of-plane rotations are able to safely converge to the correct values when starting from the nominal position of the DRRs. Accordingly, a separate computation is carried out for the out-of-plane versus in-plane transformation parameters, during the registration process: the two out-of-plane rotations (r, φ) are estimated from the exact reference DRR images, while the in-plane transformation parameters (x, y, θ) are computed directly from the 2D images. The in-plane parameters (x, y, θ) are first computed using the nominal reference DRRs. An initial estimate for the out-of-plane rotations (r, φ) is then carried out, based on the previously obtained values of the in-plane transformation parameters (x, y, θ). In order to maximize efficiency and accuracy, different search strategies are used for estimating the out-of-plane transformations and the in-plane transformations, respectively. Also, multiple similarity measure criteria are used that have been optimized at the different phases during the registration.
[0044] In the embodiment illustrated in FIG. 3, the registration process is described in terms of six distinct phases (illustrated in FIG. 3 as steps 120, 130, 140, 150, 160, and 170). In phase 1 (step 120 in FIG. 3), the in-plane transformation parameters (x, y, θ) are initially estimated using a set of in-plane rotated DRR images, which are generated offline from the nominal (0-degree) reference DRR. The most intensive computation in the registration process is the computation of the in-plane rotation. To achieve a rapid computation, it is desirable to compute as many in-plane rotations as possible for the reference DRRs, before starting the registration process. The process of generating in-plane rotated DRRs is thus carried out offline, after the reference DRRs for out-of-plane rotations are generated. All the reference DRR images are stored in memory, and used for registering each real-time x-ray image that is acquired during patient alignment and treatment.
[0045] In step 120, the three parameters are rapidly searched using a 3D multi-level matching method (described in connection with FIG. 5 below). A sum of absolute differences method ("SAD") is used as the similarity measure. In this step, there is no floating-point computation. Pixel accuracy is achieved for the translations (x, y), and half-degree accuracy for the in-plane rotation (θ).
[0046] In the next step, i.e. step 130 (phase 2 of the registration process), the two out-of-plane rotations (r, φ) are separately searched in one dimension, based on the values of the in-plane parameters (x, y, θ) determined in the previous step 120. A more complicated similarity measure, based on pattern intensity and described in later paragraphs below, is used to detect the reference DRR image that corresponds to a combination of the two out-of-plane rotations (r, φ). The search space for the possible rotation angles is the full search range of out-of-plane rotation angles. For an initial estimate, the full search range is sampled at every one-degree interval. In step 140 (phase 3), the in-plane translation parameters (x, y) are refined using 2D sub-pixel matching. 2D sub-pixel matching is a full-range search method. Based on the updated transformation parameters (x, y, θ, r, φ) obtained from the previous step in the registration, a set of DRR images (3 x 3 or 5 x 5) is generated by translating the reference DRR, one sub-pixel at a time. The in-plane translations (x, y) are refined to sub-pixel accuracy by finding the best match between the x-ray image and these DRR images.
[0047] In step 150 (phase 4), the in-plane rotation parameter θ is refined using 1D interpolation, based on the updated values for the in-plane translation parameters (x, y) from step 140, and the updated values of the out-of-plane rotation parameters (r, φ) from step 130. In step 160 (phase 5), the out-of-plane rotations are separately refined to a higher accuracy using a 1D search, based on the updated values for the in-plane transformation parameters (x, y, θ) from steps 140 and 150. In steps 140, 150, and 160 (phases 3, 4, and 5), a similarity measure method based on pattern intensity, described in more detail in later paragraphs, is used to ensure higher accuracy.
[0048] Steps 140, 150, and 160 are iteratively repeated until a sufficient accuracy is obtained. Once the desired accuracy is reached, the final out-of-plane rotations are refined by 1D interpolation, in the final step 170 (the sixth and last phase) of the registration process.
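The six phases can be summarized in a high-level sketch such as the one below. The helper functions (multi_level_match, search_1d, subpixel_match, refine_theta_1d, interp_1d) and the ref_drrs container with its nominal() and at() accessors are hypothetical placeholders for the search steps described above, not part of any disclosed implementation:

    def register_projection(xray, ref_drrs, tol=0.1, max_iters=10):
        # Phase 1: initial in-plane estimate by 3D multi-level matching (SAD).
        x, y, theta = multi_level_match(xray, ref_drrs.nominal())
        # Phase 2: initial 1D search for each out-of-plane rotation
        # (pattern intensity similarity).
        r = search_1d(xray, ref_drrs, axis="r", fixed=(x, y, theta))
        phi = search_1d(xray, ref_drrs, axis="phi", fixed=(x, y, theta))
        for _ in range(max_iters):
            # Phase 3: refine in-plane translations by 2D sub-pixel matching.
            x, y = subpixel_match(xray, ref_drrs.at(r, phi), theta)
            # Phase 4: refine the in-plane rotation by 1D interpolation.
            theta = refine_theta_1d(xray, ref_drrs.at(r, phi), x, y)
            # Phase 5: refine each out-of-plane rotation by a 1D search.
            r_new = search_1d(xray, ref_drrs, axis="r", fixed=(x, y, theta))
            phi_new = search_1d(xray, ref_drrs, axis="phi", fixed=(x, y, theta))
            converged = abs(r_new - r) < tol and abs(phi_new - phi) < tol
            r, phi = r_new, phi_new
            if converged:
                break
        # Phase 6: final 1D interpolation of the out-of-plane rotations.
        r, phi = interp_1d(xray, ref_drrs, r, phi)
        return x, y, theta, r, phi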
[0049] FIG. 4 illustrates the generation of a 2D DRR from 3D CT scan data of a treatment target within an anatomical region of a patient. In FIG. 4, the volumetric 3D CT image of the target is referred to with the aid of reference numeral 260. The DRRs 265A and 265B, shown in FIG. 4, are artificial, synthesized 2D images that represent the radiographic image of the target that would be obtained, if imaging beams were used having the same intensity, position and angle as the beams used to generate the real-time x-ray projection images, and if the target were positioned in accordance with the 3D CT scan data. In other words, the DRRs are calculated from prior 3D CT data, in an exact emulation of the real-time camera perspectives. The reference numerals 250A and 250B illustrate the hypothetical positions and angles from which the imaging beams would be directed through a target positioned in accordance with the CT volumetric image 260 of the target.
[0050] Typically, DRRs are generated by casting hypothetical beams or rays through the CT volumetric image of the target. Each ray goes through a number of voxels of the 3D CT image 260. By integrating the CT numbers for these voxels along each ray, and projecting onto an imaging plane (shown as 270A and 270B, respectively, in FIG. 4), the resultant image emulates the radiograph that would be obtained by passing rays from hypothetical camera locations and angles (shown schematically as 250A and 250B, respectively) through a target positioned in accordance with the volumetric 3D image 260. Ray tracing algorithms, known in the art, are generally used to generate the DRRs.
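The integration idea can be illustrated with a toy parallel-projection DRR; a real implementation casts divergent rays from the known source positions and uses a proper attenuation model, so the following is only a sketch of the principle:

    import numpy as np

    def simple_drr(ct_volume, axis=2):
        # Toy DRR: sum CT numbers along parallel rays down one axis of the
        # volume, then rescale the line integrals to an 8-bit radiograph.
        line_integrals = ct_volume.sum(axis=axis).astype(np.float32)
        span = max(float(line_integrals.max() - line_integrals.min()), 1e-6)
        drr = (line_integrals - line_integrals.min()) / span
        return (drr * 255).astype(np.uint8)

    ct = np.random.rand(64, 64, 64).astype(np.float32)  # stand-in CT volume
    drr_image = simple_drr(ct, axis=2)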
[0051] FIG. 5 illustrates a multi-resolution image representation for the multi-level matching process, used in the first phase (step 120 in FIG. 3) to initially estimate the in-plane transformation parameters. The full-size image is at the bottom (Level 1). The upper images (Level 2, Level 3 and Level 4) have lower spatial resolution. The lower resolution images are obtained by low-pass filtering, and by sub-sampling, of the full-size image.
[0052] As a fast search method, multi-level matching is used for an initial estimate of the in-plane transformation parameters. The basic idea of multi-level matching is to match the images at each level successively, starting with the lowest image resolution level (Level 4). The results at the lower resolution level serve to provide rough estimates for the in-plane transformation parameters (x, y, θ). The output at a lower level is then passed to the subsequent level characterized by a higher resolution. The parameters (x, y, θ) are refined, using the higher resolution images. In the final results obtained through multi-level matching, the accuracy of the translations depends on the spatial resolution of the image having the highest resolution (Level 1). The accuracy of the rotations depends on the sampling intervals of the in-plane rotations, during the DRR initialization process described above.
[0053] There may be some risks inherent in multi-level matching. The estimates at lower levels may fall within local minima, and far away from global minima. In this case, further matching at subsequent levels (at higher resolutions) may not converge to the global minima. To overcome this risk, multiple candidates of estimates are used. Many candidates for an optimal matching at a lower level are passed on to the higher resolution level. The higher the number of candidates used, the more reliable are the estimates. The best candidates are ranked by the SAD values.
[0054] In FIG. 5, denoting the full image size in Level 1 by W x H, the image sizes are W/2 x H/2, W/4 x H/4 and W/8 x H/8 in Level 2, Level 3 and Level 4, respectively.
For translations, the search range in the lowest resolution level is the full search range that is calculated from the difference between the DRR and x-ray image sizes. Because of the small image size W/8 x H/8 at the lowest level, the full range search can be completed in a very short time. The same small search range of (-2, +2) pixels is used for the remaining resolution levels. Because of the small search range, the search can be completed quickly, even at large image sizes.
For the rotations, the search range in the lowest resolution level is a full search range, at a denser sampling rate. In the higher resolution levels, partial search ranges are used, at a less dense sampling rate.
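A simplified sketch of this coarse-to-fine search is given below; it handles translations only and omits the multiple-candidate strategy and the rotation search, so it is intended purely to illustrate the multi-level idea:

    import numpy as np

    def downsample(img):
        # Crude low-pass + sub-sample: 2x2 block averaging.
        h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
        img = img[:h, :w]
        return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

    def build_pyramid(img, levels=4):
        # Level 1 (index 0) is full size; Levels 2-4 halve the resolution.
        pyr = [np.asarray(img, dtype=np.float32)]
        for _ in range(levels - 1):
            pyr.append(downsample(pyr[-1]))
        return pyr

    def sad(a, b):
        return float(np.abs(a - b).sum())

    def match_level(xray, drr, center, search):
        # Exhaustive SAD search over integer shifts around 'center'.
        best, best_shift = None, center
        for dx in range(center[0] - search, center[0] + search + 1):
            for dy in range(center[1] - search, center[1] + search + 1):
                s = sad(xray, np.roll(drr, (dx, dy), axis=(0, 1)))
                if best is None or s < best:
                    best, best_shift = s, (dx, dy)
        return best_shift

    def multi_level_translation(xray, drr, levels=4, full_range=8):
        # Full-range search at the lowest resolution, then a (-2, +2)
        # window at each finer level, doubling the estimate on the way up.
        px, pd = build_pyramid(xray, levels), build_pyramid(drr, levels)
        shift = match_level(px[-1], pd[-1], (0, 0), full_range)
        for level in range(levels - 2, -1, -1):
            shift = (shift[0] * 2, shift[1] * 2)
            shift = match_level(px[level], pd[level], shift, 2)
        return shift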
[0055] Applications such as image-guided radiosurgery require that the comparison between the DRRs (that contain the 3D CT scan information) and the real-time x-ray images, and the consequent adjustment of the position of the x-ray source, be made very rapidly and accurately. In practice, the accuracy should be below 1 mm, and the computation time should be on the order of a few seconds. Unfortunately, it is difficult to meet both requirements simultaneously, because of several reasons. First, the two different modality images, i.e. CT scan images and x-ray images, have different spatial resolution and image quality. Generally, x-ray image resolution and quality are superior to the resolution and quality of DRR images, which are only synthesized images. Typically, some structures in the DRR may appear more blurred (especially normal to the CT slice plane), compared to the x-ray image. Ideally, an optimal similarity measure for a 2D/3D registration process should allow for an accurate registration to be achieved, despite such differences.
[0056] Second, DRR generation relies on a proper attenuation model. Because attenuation is proportional to the mass intensity of the target volume through which the beam passes, the exact relationship between the traversed mass intensity and the CT image intensity needs to be known, in order to obtain an accurate modeling. Establishing this relationship is difficult, however, so the linear attenuation model is often used. As is known, the linear attenuation coefficient of a material is dependent on x-ray energy. CT machines and x-ray machines work at different effective energies, however. As a result, the attenuation coefficients measured by a CT scanner are different from the attenuation of a beam of x-rays passing through the target. The skeletal structures in DRR images cannot be reconstructed very well using the linear model, the DRRs being only synthetic x-ray projection images. At CT energies, the ratio of bone-to-soft-tissue attenuation is much lower than at x-ray radiographic energies. Thus, in a DRR produced from a 3D CT volume, the image contrast from soft tissue will be comparable with the image contrast from bone, reducing the clarity of bone details, for example.
[0057] Finally, x-ray images usually have a large image size (512 x 512). For better registration accuracy, it is desirable to use the full resolution image. Full resolution images are rarely used, in practice, however, because the resulting increase in computation time is excessive, and is incompatible with the requirements of image-guided radiosurgery.
[0058] Generally, similarity measure methods used in 2D/3D registration can be divided into two categories. The first method is based on image features. The image features could be anatomical edges or segmented objects. The registration accuracy depends on the accuracy of edge detection or object segmentation. The main advantage of this method is its fast computation. Feature-based similarity methods register on salient features that have been segmented from each image. They use a reduced amount of data, which makes the algorithms fast, once the segmentation has been undertaken. Because the full information content of the image is not used, however, the accuracy is sacrificed. Errors in the segmentation stage can lead to an error in the final registration.
[0059] The second method is based on image intensity content. Intensity-based methods compare the voxel and pixel values directly, using measures based on image statistics. The original images are used for registration. Usually, a good accuracy can be achieved. Although these methods require little or no segmentation, intensity-based methods are typically much slower. Because a long computation time is required, it is hard to apply intensity-based similarity measures in clinical practice.
[0060] In one embodiment, a similarity measure method is used that is designed to optimize the 2D/3D image registration procedure described above. This similarity measure method is based on pattern intensity, and provides a powerful and efficient way to solve the 2D/3D image registration problem, as described above. In particular, the pattern intensity based method and system described below is designed for the 1D search phase (for the out-of-plane parameters), and the iterative refining phases, of the 2D/3D image registration procedure described above.
[0061] For the 3D multi-level search phase, the "sum of absolute differences" (SAD) measure is used, which is a known, simple similarity measure. The SAD measure is widely used in medical image processing and video processing, in cases where the two images to be matched have high image quality. The main advantage of using SAD is its fast computation and its easy optimization in parallel computation. Its main disadvantage is that the solution is sensitive to image noise, artifacts and intensity difference between the live and DRR images. As a result, SAD is only used in the first search phase to get approximate results. SAD can be expressed as
SAD = Σi,j | ILive(i,j) − IDRR(i,j) |
where ILive(i,j) represents the intensity of the "live" real-time x-ray image, and IDRR(i,j) represents the intensity of the reconstructed DRR image.
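In array form, the SAD measure reduces to a single expression; a minimal sketch, assuming the live image and the DRR are equally sized 2D arrays:

    import numpy as np

    def sad(live, drr):
        # Sum of absolute differences between the live x-ray image and the DRR.
        return float(np.abs(live.astype(np.float32) - drr.astype(np.float32)).sum())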
[0062] The pattern intensity similarity measure is more accurate, and less sensitive to image noise, artifacts, and to the intensity difference between the images being compared, compared to other similarity measures known in the art. In the exemplary embodiment described in the following paragraphs, two images are compared, the first image being a 2D x-ray image of a radiosurgical treatment target, and the second image being a 2D DRR that is reconstructed from 3D CT scan data generated at the time of treatment planning. In one embodiment, the two images are discretized, digital images, characterized by first and second 2D arrays of pixel values. The pixel arrays are equi-dimensional, i.e. the number of rows and columns of the first array is equal to the number of rows and columns of the second array. As well known, each pixel value of an image is a number representative of the intensity of the image at a unique corresponding 2D area element forming the image.
[0063] A difference image is formed from the real-time x-ray image and the DRR image, by subtracting the corresponding pixel values of the second image (the DRR image) from each pixel value of the first image (the real-time):
Idiff ( j) = I Live & J) ~ IDRR ( j) . where lcm(ij) represents the intensity or pixel value of the ij-th pixel of the difference image, ive(U) represents the intensity or pixel value of the ij-th pixel of the live x-ray image; and
IDRROJ) represents the intensity or pixel value of the ij-th pixel of the artificial DRR image.
[0064] A pattern intensity function is defined, which operates on the difference image. The pattern intensity function is expressed as an asymptotic function of the gradients of the difference image:
P = Σi,j Σ(k,l)∈R σ² / ( σ² + ( Idiff(i,j) − Idiff(i+k, j+l) )² )     (1)
where σ is a weighting constant and R is a neighborhood that is defined using the pixel (i, j) as the center point. The mathematical formulation in equation (1) results in the similarity measure tending to a maximum value as the number of structures tends to zero, and asymptotically tending to zero as the number of structures increases. Because of the asymptotic nature of the pattern intensity measure, large differences in intensity have the same effect on the measure, regardless of their magnitude. This makes the measure robust to large differences in pixel intensity.
[0065] The function is weighted by the weighting constant σ. The constant σ is used to weight the function, so that small deviations in intensity (caused by noise, by way of example) result in the measure remaining proximate to its maximum value. The sensitivity of the solution to variations in the x-ray image can be minimized by careful selection of this constant. The larger the weighting constant, the more stable the results become. However, the choice of the weighting constant is a tradeoff between stability and accuracy. If the value of the weighting constant is too large, the smaller details in the images cannot be reflected in the similarity measure. Based on experimentation, the empirical value of σ is determined to be in the range from about 4 to about 16, although other values of σ are also within the scope of the method and system described above.
[0066] The pattern intensity function considers a selected neighborhood for each pixel. FIG. 6 schematically illustrates a neighborhood for calculating pattern intensity, in one embodiment. In the embodiment illustrated in FIG. 6, the neighborhood R is defined such that the gradients in four directions are considered: horizontal, vertical, 45° diagonal and -45° diagonal. As shown in FIG. 6, in the horizontal direction, the (i, j-1)-th pixel is considered. In the vertical direction, the (i-1, j)-th pixel is considered. In the 45° diagonal direction, the (i-1, j+1)-th pixel is considered. In the -45° direction, the (i-1, j-1)-th pixel is considered.
[0067] Based on the definition of the neighborhood R as shown in FIG. 6, the pattern intensity expression is given as the sum below:
P = Σi,j [ σ² / ( σ² + ( Idiff(i,j) − Idiff(i,j-1) )² )
+ σ² / ( σ² + ( Idiff(i,j) − Idiff(i-1,j) )² )
+ σ² / ( σ² + ( Idiff(i,j) − Idiff(i-1,j+1) )² )
+ σ² / ( σ² + ( Idiff(i,j) − Idiff(i-1,j-1) )² ) ]     (2)
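A direct, if unoptimized, implementation of equation (2) follows; the default sigma of 8 is simply a value inside the 4-to-16 range quoted above, and border pixels are skipped for brevity:

    import numpy as np

    def pattern_intensity(live, drr, sigma=8.0):
        # Pattern intensity over the four-direction neighborhood of FIG. 6:
        # horizontal, vertical, 45-degree and -45-degree diagonals.
        diff = live.astype(np.float32) - drr.astype(np.float32)
        s2 = sigma * sigma
        total = 0.0
        h, w = diff.shape
        for i in range(1, h):
            for j in range(1, w - 1):
                c = diff[i, j]
                neighbors = (diff[i, j - 1],      # horizontal
                             diff[i - 1, j],      # vertical
                             diff[i - 1, j + 1],  # 45-degree diagonal
                             diff[i - 1, j - 1])  # -45-degree diagonal
                for n in neighbors:
                    total += s2 / (s2 + (c - n) ** 2)
        return total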
[0068] The formulation of the pattern intensity function, given in equation (2) above, provides a number of advantages over other known similarity measures. First, the difference image filters out the low frequency part, which is basically the soft tissue, and keeps the high frequency part, which is mostly the skeletal structures. This feature makes the algorithm robust to some brightness intensity difference between the live and DRR images. Second, because of the asymptotic nature of the pattern intensity function, the similarity measure is less affected by pixels whose intensity values deviate only slightly from their neighboring pixels. These kinds of pixels are thought to contain random noise, and hence are undesirable. Third, because the asymptotic function quickly approaches zero as its variable increases, large intensity differences such as image artifacts have the same effect on the similarity measure, regardless of their magnitude. Accordingly, the pattern intensity measure is less sensitive to image artifacts.
[0069] FIG. 7 schematically illustrates an image-guided radiosurgery system, constructed in accordance with one embodiment. In overview, the image guided radiosurgery system 300 includes a means 301 for generating pre-treatment 3D scan data of the target; radiosurgical beam generator 302; a positioning system 304; imaging means 306; and a controller 308. The system 300 may also include an operator control console and display 340. The means 301 may be a CT scanner, for example, or an MRI system or a PET system.
[0070] The radiosurgical beam generator 302 generates, when activated, a plurality of collimated radiosurgical beams (e.g. x-ray beams). The cumulative effect of the radiosurgical beams, when properly directed to and focused onto the target, is to necrotize or perform other treatments in a target within the patient's anatomy. The positioning system 304 may be an industrial robot, by way of example, and the beam generator 302 may be a small x-ray linac mounted to an arm of the industrial robot 304.
[0071] The imaging means 306 is preferably an x-ray imaging system for generating a pair of orthogonal x-ray projection images of the target. The imaging means 306 preferably has a pair of x-ray sources for generating diagnostic imaging beams (having known positions, angles, and intensities), and a corresponding pair of x-ray image detectors which detect the beams after the beams have passed through the target.
[0072] The controller 308 includes software for generating a set of reconstructed 2D images (preferably DRRs) of the target, based on the 3D scan data from the 3D scanner 301, and the known intensity, location, and angle of the imaging beams. The controller 308 includes software for registering the DRRs with the real time x-ray images. The registration software is able to compute a set of 3D transformation parameters that represent the change in position of the target between the 3D scan and the near real-time x-ray images.
[0073] The positioning system 304 is responsive to commands from the controller 308, to continuously adjust, in near real time, the relative position of the radiosurgical beam generator and the target by the amount prescribed by the 3D transformation parameters obtained through the registration process.
[0074] In one embodiment, the controller 308 includes a processor 408 for performing 2D/3D registration. The 2D/3D registration processor 408 includes software for determining a set of in-plane transformation parameters (x, y, θ) and out-of-plane rotational parameters (r, φ), the parameters representing the difference in the position of the target as shown in the x-ray image, as compared to the position of the target as shown by the 2D reconstructed images.
[0075] The processor 408 further includes 1) software for performing a 3D multi-level matching to determine an estimate for the in-plane transformation parameters (x, y, θ); 2) software for performing a 1-D search for each of the pair of out-of-plane rotation parameters (r, φ), based on the estimated in-plane parameters (x, y, θ); and 3) software for iteratively refining the in-plane parameters (x, y, θ) and the out-of-plane parameters (r, φ), until a desired accuracy is reached.
[0076] In practice, a high accuracy is obtained for both translations and rotations after just a few iterations, using the method and system described above. For translations, an accuracy of 0.5mm or better is reached, and for rotations, an accuracy of 0.5 degrees or better is reached. The total computing time is a few seconds, which is an order of magnitude faster than other methods in the prior art.
[0077] In one embodiment, the controller 308 also includes a processor 508 for determining a measure of similarity between two images. The similarity measure processor 508 is equipped with software for determining the measure of similarity between a first image and a second image, by subtracting each pixel value of the second image from a corresponding pixel value of the first image to form a difference image, and then applying a gradient operator upon each pixel of the difference image to form a pattern intensity function. The pattern intensity function is an asymptotic function of the gradients of the difference image, and permits the pixel values within a neighborhood R defined about each pixel in the difference image to be considered. The gradients are defined over at least four directions.
[0078] The method and system described above provide numerous advantages over the prior art. For example, a fully automatic tracking process is achieved, and no user intervention is necessary. Also, the registration process is optimized to allow the use of images having a full resolution. In this way, the full informational content of each image can be utilized, without having to leave out any image features. These results are achieved while reducing the total computation time for the tracking process to a few seconds, which is an order of magnitude faster than the existing prior art methods. At the same time, a high accuracy is achieved for both the translation parameters (below about 0.5mm) and the rotation parameters (below about 0.5 degrees).
[0079] While the invention has been particularly shown and described with reference to specific preferred embodiments, it should be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims

What is claimed is:
1. A method in image guided radiosurgery for aligning the position of a treatment target relative to a radiosurgical beam generator during treatment, the method comprising: a) generating a pre-treatment 3D scan showing the position of said target at treatment planning time; b) generating a set of 2D reconstructed images from said 3D scan; c) generating in near real time one or more 2D x-ray images of said target, wherein said x-ray images show the position of said target at a current time during treatment; d) registering said reconstructed images with said x-ray images by computing a set of 3D transformation parameters that represent the change in position of target between said 3D scan and said x-ray images; and e) in near real time, adjusting the relative position of said radiosurgical beam generator and said target by the amount prescribed by said 3D transformation parameters computed in step d; wherein said target is allowed six degrees of freedom of position.
2. A method in accordance with claim 1 , wherein said 3D transformation parameters represent the difference between the position of the target at said treatment planning time, and the position of the target at said current time.
3. A method in accordance with claim 1 , further comprising repeating steps c) through e) quasi-continuously during treatment, whereby one or more radiosurgical beams generated by said beam generator remain properly focused onto said target throughout said radiosurgical treatment.
4. A method in accordance with claim 1 , further comprising the step of creating a treatment plan after step a) and before step b).
5. A method in accordance with claim 4, wherein said treatment plan specifies the number, intensity, and direction of said one or more radiosurgical beams that are required in order to administer a sufficient radiation dose to said target while minimizing the radiation to adjacent tissue.
6. A method in accordance with claim 1 , further comprising the step of processing said x-ray images, after step c and before step d, so as to match the orientation, image size, and bit depth of said x-ray images with the orientation, image size, and bit depth of said reconstructed 2D images.
7. A method in accordance with claim 1 , wherein said x-ray images are generated by transmitting one or more x-ray imaging beams through said target, said imaging beams having a known intensity, position, and angle; and wherein said 2D reconstructed images are DRRs (digitally reconstructed radiographs) that represent the synthesized radiographic image of said target that would be obtained with said imaging beams at said known intensity and from said known positions and angles, if said target were positioned in accordance with said pre-treatment 3D scan.
8. A method in accordance with claim 1 , wherein said 3D transformation parameters are 3D rigid body transformation parameters, and wherein said 3D transformation parameters are represented by three translations and three rotations (x, y, z, r, p, w); wherein x, y, z represent the translations of said target in the directions of three mutually orthogonal axes, respectively, and wherein r, p, w represent three rotations (roll, pitch, yaw) about said three orthogonal axes.
9. A method in accordance with claim 1 , wherein said x-ray images generated in step c comprises x-ray projection images that represent at least two orthogonal projections A and B of said target onto respective projection image planes, said x-ray projection images being formed by transmitting at least two x-ray imaging beams through said target and onto said respective image planes, wherein each imaging beam is received by a respective x-ray camera after passing through said target.
10. A method in accordance with claim 9, wherein step b of generating reconstructed images comprises: generating two sets of reconstructed images, one set for each of said projections A and B.
11. A method in accordance with claim 10, wherein step d of registering said reconstructed images with said x-ray images comprises:
A) individually registering each x-ray projection image A and B with their respective set of reconstructed images, by determining a separate set of transformation parameters for each projection x-ray image; and
B) combining the resulting parameters for each projection to obtain said 3D transformation parameters.
12. A method in accordance with claim 11 , wherein said transformation parameters for each of said projections A and B are described by two out-of-plane rotational parameters (rA, φA) and (rB, φB), respectively, and by three in-plane transformation parameters (xA, yA, θA) and (xB, yB, θB), respectively.
13. A method in accordance with claim 12, wherein said 2D reconstructed images are DRRs, and wherein step b of generating said 2D reconstructed images comprises: i) for each projection, specifying a set of rotation angles for each of said out-of-plane rotation parameters r and φ, Nr being the number of rotation angles for rotation parameter r, and Nφ being the number of rotation angles for rotation parameter φ; and ii) generating two sets of DRRs, one set for each of said projections A and B; wherein each set includes DRRs that correspond to different combinations of said out-of-plane rotation angles, so that the number of DRRs in each set is Nr*Nφ.
14. A method in accordance with claim 13, wherein the step of generating 2D reconstructed images further comprises the step of computing a set of in-plane rotated DRR images by performing a plurality of in-plane rotations on said DRRs, thereby creating a set of in-plane rotated reference DRRs for each projection.
15. A method in accordance with claim 14, wherein said step of creating reference DRRs is performed offline.
16. A method in accordance with claim 9, wherein the step of computing said 3D transformation parameters comprises: i) individually computing the transformation parameters (xA, yA, θA) and (xB, yB, θB) for each projection image A and B; and ii) combining the transformation parameters for projection A with the transformation parameters for projection B so as to obtain said 3D transformation parameters; and wherein said 3D transformation parameters are represented by three translations and three rotations (x, y, z, r, p, w).
17. A method in accordance with claim 16, wherein said 3D transformation parameters are related to the transformation parameters for projections A and B by the following relationship:
x = (xA + xB)/2, y = (yA − yB)/√2, z = (yA + yB)/√2,
r = (rA + rB)/2, p = (θB − θA)/2, w = (θB + θA)/2
18. A method in accordance with claim 16, wherein the step of computing the transformation parameters for each projection comprises: i) computing the in-plane transformation parameters using said in-plane rotated reference DRRs; and thereafter ii) estimating the out-of-plane rotation parameters using the in-plane transformation parameters computed in step i) above; and thereafter iii) iteratively refining said in-plane and out-of-plane transformation parameters, until said parameters converge to a sufficient accuracy.
19. A method according to claim 18, wherein step i) is performed using 3D multi-level matching, and a sum of absolute difference similarity measure.
20. A method according to claim 18, wherein step ii) is performed using a 1D search and a pattern intensity similarity measure.
21. A method according to claim 20, wherein step iii) comprises: a) refining the in-plane translation parameters x and y using 2-D sub-pixel matching; and thereafter b) refining the in-plane rotation parameter using 1-D interpolation.
22. A method in accordance with claim 1 , wherein said 3D scan comprises at least one of: a CT scan, an MRI scan, an ultrasound scan, and a PET scan.
23. An image guided radiosurgical system for radiosurgical treatment of a target, the system comprising: a. means for providing pre-treatment 3D scan data of said target; b. radiosurgical beam generator for generating at least one radiosurgical beam; c. imaging means for generating one or more 2D x-ray images of said target in near real time, said imaging means including: i) an imaging beam source for generating at least one imaging beam having a known intensity, and having a known position and angle relative to said target; and ii) means for directing said imaging beam towards and through said target from said known location and angle, and at said known intensity; iii) at least one image receiver for detecting the attenuated imaging beam after said beam has passed through said target; and iv) an image processor for processing data from said image receiver to generate said x-ray image; d. a controller, including: i) means for generating at least one reconstructed 2D image of said target, based on said 3D scan data, and using said known intensity, location, and angle of said imaging beam; ii) registration means for registering said reconstructed 2D image with said near real time x-ray image, said registration means including means for computing a set of 3D transformation parameters that represent the change in position of said target between said 3D scan and said near real time x-ray image; and e. positioning means, responsive to said controller, for adjusting in near real time the relative position of said radiosurgical beam generator and said target by the amount prescribed by said 3D transformation parameters.
24. A system in accordance with claim 23, wherein said 2D reconstructed images comprise DRRs.
25. A system in accordance with claim 23, wherein said 3D scan data comprise at least one of CT scan data, MRI scan data, and PET scan data.
26. A system in accordance with claim 23, wherein said one or more 2D x-ray images of said target comprise x-ray projection images that represent at least two orthogonal projections A and B of said target onto respective projection image planes, and wherein said x-ray projection images are formed by transmitting at least two x-ray imaging beams through said target and onto said respective image planes, wherein each imaging beam is received by a respective x-ray camera after passing through said target.
27. A system in accordance with claim 23, wherein said means for generating at least one reconstructed 2D image comprises means for generating two sets of reconstructed images, one set for each of said projections A and B.
28. A system in accordance with claim 23, wherein said registration means comprises:
A) means for individually registering each x-ray projection image A and B with their respective set of reconstructed images by determining a separate set of transformation parameters for each projection x-ray image; and
B) means for combining the resulting parameters for each projection to obtain said 3D transformation parameters.
29. A system in accordance with claim 28, wherein said transformation parameters for each of said projections A and B are described by two out-of-plane rotational parameters (rA, øA) and (rB, øB), respectively, and by three in-plane transformation parameters (xA, yA, θA) and (xB, yB, θB), respectively.
30. A system in accordance with claim 29, wherein said means for generating at least one reconstructed 2D image of said target comprises: i) means for specifying, for each projection A and B, a set of rotation angles for each of said out-of-plane rotation parameters r and ø, wherein the number of rotation angles for rotation parameter r is Nr, and the number of rotation angles for rotation parameter ø is Nø; and ii) means for generating two sets of DRRs, one set for each of said projections A and B; wherein each set includes DRRs that correspond to different combinations of said out-of-plane rotation angles, so that the number of DRRs in each set is Nr*Nø.
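As a concrete illustration of the Nr*Nø reference set of claim 30 (and of claim 34 below), the sketch enumerates every combination of the two out-of-plane angles. The helper render_drr, the angle ranges, and the name "phi" for the parameter written ø in the claims are hypothetical placeholders for whatever DRR synthesis the system provides.

```python
import itertools

def generate_reference_drrs(render_drr, r_angles, phi_angles):
    """Build one reference DRR per (r, phi) combination, so the set holds
    len(r_angles) * len(phi_angles) images, as in claims 30 and 34."""
    return {(r, phi): render_drr(r, phi)
            for r, phi in itertools.product(r_angles, phi_angles)}

# Illustrative one-degree sampling of a small out-of-plane range (values assumed):
# reference_set = generate_reference_drrs(render_drr, range(-5, 6), range(-5, 6))
```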
31. A method of registering a 2D (two-dimensional) x-ray image of a target with previously generated 3D scan data of said target, said x-ray image being characterized by an image plane defined by mutually orthogonal x- and y- coordinates, the method comprising:
A) generating at least one reconstructed image from said 3D scan data; and
B) determining the value of in-plane transformation parameters (x, y, θ) and out-of-plane rotational parameters (r, ø) for registering said reconstructed image onto said x-ray image, said parameters representing the difference in the position of the target as shown in said x-ray image as compared to the position of the target as shown by said image reconstructed from said 3D scan data; wherein r and ø represent the rotations of said target about first and second mutually orthogonal axes, said rotations being out-of-plane with respect to said image plane, said out-of-plane rotations representing the projection of said target onto said image plane; wherein x and y represent the amount of translation of said target within said image plane in the directions of said x- and y- axes, respectively, and θ represents the amount of rotation of said target within said image plane about an axis perpendicular to both said x- and said y- axes; and wherein step B comprises: a) obtaining an initial estimate for said in-plane transformation parameters (x, y, θ) by multi-level matching in 3D (three dimensions), between said x-ray image and said reconstructed image; b) based on said parameters (x, y, θ) estimated in step a, performing an initial search in one dimension for each of said pair of out-of-plane rotation parameters (r, ø); and c) iteratively refining said in-plane parameters (x, y, θ) and said out-of-plane parameters (r, ø), until said parameters converge to a desired accuracy.
32. A method in accordance with claim 31, wherein said 3D multi-level matching is performed sequentially in each of a succession of a plurality of image resolution levels, starting at the lowest resolution level and ending at the highest resolution level.
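A toy coarse-to-fine translation search in the spirit of claims 19, 32, and 38: an image pyramid is built by 2x2 averaging, an exhaustive sum-of-absolute-differences search runs at the lowest resolution, and each finer level only refines the previous estimate. The pyramid depth, search radius, and wrap-around shift are simplifications of this sketch, not details from the claims (which also fold the in-plane rotation into the search via pre-rotated reference images).

```python
import numpy as np

def downsample(img):
    """Halve the resolution by 2x2 block averaging (cropping odd edges)."""
    h, w = img.shape
    img = img[: h - h % 2, : w - w % 2]
    return img.reshape(img.shape[0] // 2, 2, img.shape[1] // 2, 2).mean(axis=(1, 3))

def sad(a, b):
    """Sum of absolute differences between two equally sized images."""
    return float(np.abs(a - b).sum())

def shift(img, dx, dy):
    """Integer shift using wrap-around; a crude stand-in for proper resampling."""
    return np.roll(np.roll(img, dy, axis=0), dx, axis=1)

def multilevel_match(fixed, moving, levels=3, radius=4):
    """Coarse-to-fine translation search: exhaustive SAD search at the lowest
    resolution, then a small refinement window at each finer level."""
    pyr_f, pyr_m = [fixed.astype(np.float64)], [moving.astype(np.float64)]
    for _ in range(levels - 1):
        pyr_f.append(downsample(pyr_f[-1]))
        pyr_m.append(downsample(pyr_m[-1]))
    dx = dy = 0
    for level in reversed(range(levels)):            # start at the lowest resolution
        f, m = pyr_f[level], pyr_m[level]
        best = (np.inf, dx, dy)
        for ddx in range(-radius, radius + 1):
            for ddy in range(-radius, radius + 1):
                score = sad(f, shift(m, dx + ddx, dy + ddy))
                if score < best[0]:
                    best = (score, dx + ddx, dy + ddy)
        _, dx, dy = best
        if level > 0:
            dx, dy = 2 * dx, 2 * dy                   # express the shift at the next finer level
    return dx, dy
```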
33. A method in accordance with claim 31, further wherein said 2D x-ray image of said target is obtained by transmitting through said target an imaging beam having a known position and angle relative to said target, and wherein said reconstructed image is a 2D synthesized DRR (digitally reconstructed radiograph) representing the radiographic image of said target that would be obtained with said imaging beam at said known position and angle, if said target were positioned in accordance with said 3D scan data.
34. A method in accordance with claim 31, further comprising the steps of:
A) determining a plurality Nr and Nφ of out-of-plane rotation angles, respectively, for said rotational parameters (r, ø);
B) generating a plurality Nr*Nφ of 2D reference images, one reference image for each of said plurality Nr and Nφ of said out-of-plane rotation angles.
35. A method in accordance with claim 1, further comprising the step of generating offline, before step a, a plurality of in-plane rotated 2D reference images, by performing a series of in-plane rotations on said reconstructed image.
36. A method in accordance with claim 35, wherein said 3D matching process in step a is performed upon said in-plane rotated 2D reference images.
37. A method in accordance with claim 31, wherein said 3D matching process in step a is performed using a similarity measure method.
38. A method in accordance with claim 37, wherein said similarity measure method is based on a sum of absolute differences.
39. A method in accordance with claim 31 , wherein step c of iteratively refining said in-plane and out-of-plane parameters comprises: d. refining the in-plane translation parameters (x, y), to increase the accuracy of said parameters; e. refining the in-plane rotation parameter (θ) based on said out-of-plane rotation parameters (r, ø) searched in step b, and on said refined in-plane transformation parameters (x, y) from step d; f. separately refining each of the out-of-plane rotation parameters (r, ø), based on said refined in-plane translation parameters from step d, and said refined rotation parameter from step e; g. iteratively and sequentially repeating steps d, e, and f, until a predetermined accuracy is reached; and h. refining once more said out-of-plane rotation parameters (r, ø).
40. A method in accordance with claim 39, wherein step d of initially refining the in-plane translation parameters is performed by sub-pixel matching in two dimensions.
41. A method in accordance with claim 39, wherein step e of refining the in-plane rotation parameters is performed by 1D (one dimensional) interpolation.
42. A method in accordance with claim 39, wherein step f of separately refining said out-of-plane rotation parameters is performed through a 1D (one dimensional) search.
43. A method in accordance with claim 39, wherein step h of refining said out-of-plane rotation parameters (r, ø) is performed by 1D interpolation.
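Claims 41 and 43 refine a rotation by 1D interpolation of a sampled similarity profile. One common way to do this, shown below purely as an assumption about the implementation rather than the patent's stated method, is to fit a parabola through the best sample and its two neighbours and take the vertex as the refined angle.

```python
def refine_angle_1d(angles, costs):
    """Sub-sample refinement of a 1D cost profile by parabolic interpolation.

    angles: evenly spaced sample positions (e.g. degrees); costs: similarity
    cost at each angle, lower being better. Returns the interpolated minimum.
    """
    i = min(range(len(costs)), key=costs.__getitem__)
    if i == 0 or i == len(costs) - 1:
        return angles[i]                      # minimum at the border: no interpolation
    left, mid, right = costs[i - 1], costs[i], costs[i + 1]
    denom = left - 2.0 * mid + right
    if denom == 0.0:
        return angles[i]                      # flat profile: keep the sampled angle
    offset = 0.5 * (left - right) / denom     # vertex of the fitted parabola, in samples
    return angles[i] + offset * (angles[i + 1] - angles[i])
```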
44. A method in accordance with claim 31, wherein said predetermined accuracy is sufficient to achieve a resolution of less than about 1 mm.
45. A method in accordance with claim 31, wherein said 3D scan data comprise at least one of CT scan data, MRI scan data, and PET (positron emission tomography) data.
46. A method in accordance with claim 31, wherein said 1D search for said out-of-plane rotation parameters in step b is performed using a similarity measure.
47. A method in accordance with claim 46, wherein said similarity measure is based on pattern intensity.
48. A method in accordance with claim 31, wherein the search space for said 1D search in step B is the full search range of out-of-plane rotation angles, and said full search range is sampled by one degree increments.
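Claim 48 describes the initial 1D search as an exhaustive sweep of the full out-of-plane range in one-degree steps. A minimal sketch, with an assumed +/-10 degree range and a hypothetical score_at cost callback (lower is better), could look like this:

```python
def coarse_angle_search(score_at, lo_deg=-10, hi_deg=10):
    """Return the out-of-plane angle with the lowest cost over a one-degree grid."""
    return min(range(lo_deg, hi_deg + 1), key=score_at)
```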
49. A method in accordance with claim 39, wherein steps d, e, and f are performed using a similarity measure based on pattern intensity.
50. A method in accordance with claim 31, further comprising the step of processing said 2D x-ray image, after step A and before step B, so as to match the orientation, image size, and bit depth of said x-ray image with the orientation, image size, and bit depth of said reconstructed 2D image.
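The preprocessing step of claim 50, matching orientation, image size, and bit depth of the live x-ray image to the DRR, could look roughly like the sketch below. The flips, nearest-neighbour resize, and min-max intensity rescale are all assumptions standing in for the system's actual calibration and resampling.

```python
import numpy as np

def match_xray_to_drr(xray, drr_shape, drr_dtype=np.uint16, flip_ud=False, flip_lr=False):
    """Bring a live x-ray image to the DRR's orientation, size, and bit depth."""
    img = xray.astype(np.float64)
    if flip_ud:
        img = np.flipud(img)
    if flip_lr:
        img = np.fliplr(img)
    # Nearest-neighbour resize to the DRR grid (simple stand-in for proper resampling).
    rows = np.linspace(0, img.shape[0] - 1, drr_shape[0]).astype(int)
    cols = np.linspace(0, img.shape[1] - 1, drr_shape[1]).astype(int)
    img = img[np.ix_(rows, cols)]
    # Rescale intensities into the DRR bit depth.
    lo, hi = img.min(), img.max()
    span = np.iinfo(drr_dtype).max
    img = (img - lo) / (hi - lo + 1e-12) * span
    return img.astype(drr_dtype)
```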
51. A system for registering at least one 2D radiographic image of a target with at least one image reconstructed from previously generated 3D scan data of said target, said radiographic image being characterized by an image plane defined by mutually orthogonal x- and y- axes, the system comprising: a. means for providing said 3D scan data of said target; b. a radiation source for generating at least one radiographic imaging beam having a known intensity, and having a known position and angle relative to said target; c. an imaging system for generating a 2D radiographic image of said target in near real time; and d. a controller, including: i) means for generating said at least one reconstructed 2D image of said target, using said 3D scan data, and using said known location, angle, and intensity of said imaging beam; and ii) software for determining a set of in-plane transformation parameters (x, y, θ) and out-of-plane rotational parameters (r, ø), said parameters representing the difference in the position of the target as shown in said x-ray image as compared to the position of the target as shown by said 2D reconstructed images; wherein r and ø represent the rotations of said target about first and second mutually orthogonal axes, said rotations being out-of-plane with respect to said image plane, said out-of-plane rotations representing the projection of said target onto said image plane; and wherein x and y represent the amount of translation of said target within said image plane in the directions of said x- and y- axes, respectively, and θ represents the amount of rotation of said target within said image plane about an axis perpendicular to both said x- and said y- axes.
52. A system in accordance with claim 51, wherein said software for determining said in-plane and out-of-plane rotational parameters comprises: means for performing a 3D multi-level matching to determine an initial estimate for said in-plane transformation parameters (x, y, θ); means for performing a 1D search for each of said pair of out-of-plane rotation parameters (r, ø) based on said initially estimated in-plane parameters (x, y, θ); and means for iteratively refining said in-plane parameters (x, y, θ) and said out-of-plane parameters (r, ø), until a desired accuracy is reached.
53. A system in accordance with claim 51, wherein said radiation source comprises an x-ray source, said 2D radiographic image comprises a 2D x-ray image, and said reconstructed image comprises a 2D DRR.
54. A system in accordance with claim 51, wherein said controller further comprises:
A. means for determining a plurality Nr and Nø of out-of-plane rotation angles, respectively, for said rotational parameters (r, ø); and
B. means for generating a plurality Nr * Nø of 2D reference images, one reference image for each of said plurality Nr and Nø of said out-of-plane rotation angles.
55. A system in accordance with claim 51, wherein said controller further comprises means for generating offline a plurality of in-plane rotated 2D reference images by performing a series of in-plane rotations on said reconstructed image.
56. A system in accordance with claim 52, wherein said 3D multi-level matching means performs sequentially in each of a succession of a plurality of resolution levels, starting at the lowest resolution level and ending at the highest resolution level.
57. A system in accordance with claim 52, wherein said 3D multi-level matching means comprises similarity measure means based on a sum of absolute differences.
58. A system in accordance with claim 52, wherein said means for iteratively refining said in-plane and out-of-plane parameters comprises: d. means for refining the in-plane translation parameters (x, y), to increase the accuracy of said parameters; e. means for refining the in-plane rotation parameter (θ) based on said out-of-plane rotation parameters (r, ø) searched in step b, and on said refined in-plane transformation parameters (x, y) from step d; f. means for separately refining each of the out-of-plane rotation parameters (r, ø), based on said refined in-plane translation parameters from step d, and said refined rotation parameter from step e; and g. means for iteratively and sequentially repeating steps d, e, and f, until a predetermined accuracy is reached, and for refining once more said out-of-plane rotation parameters (r, ø).
59. A system in accordance with claim 52, wherein said means for refining the in-plane translation parameters comprises 2D sub-pixel matching means.
60. A system in accordance with claim 52, wherein said means for refining the in-plane rotation parameters comprises 1D (one dimensional) interpolation means.
61. A system in accordance with claim 52, wherein said means for separately refining said out-of-plane rotation parameters comprises means for performing one or more 1D searches.
62. A system in accordance with claim 52, wherein said means for refining said out-of-plane rotation parameters (r, ø) comprises 1D interpolation means.
63. A system in accordance with claim 52, wherein said desired accuracy is sufficient to achieve a resolution of less than about 1 mm.
64. A system in accordance with claim 51, wherein said 3D scan data comprise at least one of CT scan data, MRI scan data, and PET (positron emission tomography) data.
65. A system in accordance with claim 51, wherein said means for performing a 1D search for said out-of-plane rotation parameters comprises means for performing a similarity measure based on pattern intensity.
66. A system in accordance with claim 52, wherein said means for refining the in-plane translation parameters (x, y), said means for refining the in-plane rotation parameter (θ), and said means for separately refining said out-of-plane rotation parameters (r, ø) comprise means for performing one or more similarity measures based on pattern intensity.
67. A system in accordance with claim 52, further comprising means for processing said 2D x-ray image so as to match the orientation, image size, and bit depth of said x-ray image with the orientation, image size, and bit depth of said reconstructed image.
68. A method in image-guided surgery for determining the measure of similarity of a first image of an object and a second image of said object, the method comprising: a. forming a difference image by subtracting the corresponding pixel values of the second image from each pixel value of the first image; wherein said first image is an x-ray image of said object generated in near real time, and said second image is a DRR (digitally reconstructed radiograph) synthesized from previously generated 3D scan data of said object; and b. forming a pattern intensity function by summing asymptotic functions of the gradients of said difference image over all the pixels within a neighborhood R; wherein said neighborhood R is defined so that said gradients of said difference image can be considered in at least four directions.
69. A method in accordance with claim 68, wherein said pattern intensity function is an asymptotic function of the gradients of said difference image.
70. A method in accordance with claim 68, wherein said first and second images, and said difference image, are digital images.
71. A method in accordance with claim 68, wherein said first and second images are discretized images respectively characterized by a first and a second 2D (two-dimensional) array of pixel values; and wherein said difference image is a discretized image characterized by a third 2D array of pixel values.
72. A method in accordance with claim 71, wherein each pixel value of an image is a number representative of the intensity of said image at a corresponding 2D array element.
73. A method in accordance with claim 71, wherein the number of rows of said first array is equal to the number of rows of said second array and said third array, and the number of columns of said first array is equal to the number of columns of said second array and said third array.
74. A method in accordance with claim 71, wherein the number of rows and columns of said first and second arrays is about 512.
75. A method in accordance with claim 68, wherein said x-ray image of said target is obtained by transmitting through said target an imaging beam having a known intensity and a known position and angle relative to said target, and wherein said 2D DRR (digitally reconstructed radiograph) represents the radiographic image of said target that would be obtained with said imaging beam at said known intensity, position and angle, if said target were positioned in accordance with said 3D scan data.
76. A method in accordance with claim 71, wherein the pixel value for each image represents the intensity of said image, and wherein the pixel value at the i-th row and j-th column of said third array of pixel values for said difference image is given by: Idif(i,j) = ILive(i,j) - IDRR(i,j), wherein ILive(i,j) represents the (i,j)-th pixel value of a real-time x-ray image of said object, and IDRR(i,j) represents the (i,j)-th pixel value of a digitally reconstructed image of said object synthesized from previously generated 3D scan data of said object.
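The difference image of claim 76 is a plain per-pixel subtraction; a one-line sketch in Python (array names assumed):

```python
import numpy as np

def difference_image(live_xray, drr):
    """Idif(i, j) = ILive(i, j) - IDRR(i, j), computed in floating point so that
    negative differences are preserved (claim 76)."""
    return live_xray.astype(np.float64) - drr.astype(np.float64)
```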
77. A method in accordance with claim 68, wherein said pattern intensity function is characterized by a mathematical formulation given by:
E = Σ_{i,j} Σ_{(k,l)∈R} σ^2 / [σ^2 + (Idif(i,j) - Idif(i+k, j+l))^2]
where Idif(i,j) represents the array of pixel values for said difference image, where σ is a weighting constant for weighting said function, and where R is a neighborhood defined around the pixel (i,j) as a center point.
78. A method in accordance with claim 77, wherein σ is from about 4 to about 16.
79. A method in accordance with claim 68, wherein said at least four directions comprise: a. a substantially horizontal direction; b. a substantially vertical direction; c. a diagonal direction of about 45 degrees; and d. a diagonal direction of about -45 degrees.
80. A method in accordance with claim 68, wherein, as a result of the sampling of said pattern intensity in said neighborhood R, the pattern intensity function is given by:
E = Σ_{i,j} σ^2 / [σ^2 + (Idif(i,j) - Idif(i,j-1))^2]
  + Σ_{i,j} σ^2 / [σ^2 + (Idif(i,j) - Idif(i-1,j))^2]
  + Σ_{i,j} σ^2 / [σ^2 + (Idif(i,j) - Idif(i-1,j-1))^2]
  + Σ_{i,j} σ^2 / [σ^2 + (Idif(i,j) - Idif(i-1,j+1))^2]
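The four-direction pattern intensity of claims 77-80 can be sketched with array slicing. sigma=4 is taken from the low end of the range in claim 78; skipping the one-pixel border is a simplification of this sketch rather than something the claims specify.

```python
import numpy as np

def pattern_intensity(idif, sigma=4.0):
    """Sum sigma^2 / (sigma^2 + d^2) over every interior pixel and the four
    neighbour directions of claim 79 (horizontal, vertical, two diagonals)."""
    idif = idif.astype(np.float64)
    centre = idif[1:-1, 1:-1]
    neighbours = (
        idif[1:-1, 0:-2],   # (i, j-1): horizontal
        idif[0:-2, 1:-1],   # (i-1, j): vertical
        idif[0:-2, 0:-2],   # (i-1, j-1): one diagonal
        idif[0:-2, 2:],     # (i-1, j+1): the other diagonal
    )
    s2 = sigma * sigma
    return float(sum((s2 / (s2 + (centre - n) ** 2)).sum() for n in neighbours))
```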
81. A method in accordance with claim 68, wherein said 3D scan data comprise at least one of CT scan data, MRI scan data, ultrasound scan data, and PET (positron emission tomography) data.
82. A system for determining the measure of similarity of a 2D x-ray image of an object and a 2D DRR of said object generated from previously obtained 3D scan data, said x-ray image and said DRR being discretized images characterized by a first and second 2D array of pixel values, the system comprising: a. means for generating 3D scan data of said object; b. an x-ray source for generating at least one imaging beam; c. imaging means for generating a 2D radiographic image of said object in near real time, by directing said imaging beam towards and through said object from a known location and angle and at a known intensity, and detecting said imaging beam after said beam has passed through said object; d. a controller, including: i) software for generating a set of 2D DRR images of said object, using said 3D scan data and said known location, angle, and intensity of said imaging beam; and ii) software for determining the measure of similarity between said 2D x-ray image and said 2D DRR, by subtracting each pixel value of said second image from a corresponding pixel value of said first image to form a difference image, by adding the asymptotic functions of the gradients of the difference image over all the pixels within a neighborhood R; wherein said pattern intensity function is an asymptotic function of the gradients of said difference image; and wherein said neighborhood R is defined so that said gradients of said difference image can be considered in at least four directions.
83. A system in accordance with claim 82, wherein said at least four directions comprise: a) a substantially horizontal direction; b) a substantially vertical direction; c) a diagonal direction of about 45 degrees; and d) a diagonal direction of about -45 degrees.
84. A system in accordance with claim 82, wherein said pattern intensity function is given by:
E = Σ_{i,j} Σ_{(k,l)∈R} σ^2 / [σ^2 + (Idif(i,j) - Idif(i+k, j+l))^2]
where Idif(i,j) represents the array of pixel values for said difference image, where σ is a weighting constant, and wherein said neighborhood R uses the pixel (i,j) as a center point.
85. An apparatus for aligning the position of a treatment target relative to a radiosurgical beam generator during image guided radiosurgery, the apparatus comprising: a 3D scanner configured to generate a pre-treatment 3D scan that shows the position of the target at treatment planning time; an image reconstructor configured to generate a set of 2D reconstructed images from the pre-treatment 3D scan; an x-ray imaging system for generating in near real time one or more 2D x-ray images of the target, wherein the x-ray images show the position of the target at a current time during treatment; an image registration system configured to register the 2D reconstructed images with the near real time x-ray images, by computing a set of 3D transformation parameters that represent the change in position of the target between the pre-treatment 3D scan and the near real time x-ray images; and a position adjustor configured to adjust, in near real time, the relative position of the radiosurgical beam generator and the target by the amount prescribed by the 3D transformation parameters; wherein the target is allowed six degrees of freedom in position.
86. An apparatus in accordance with claim 85, further comprising an x-ray image processor configured to process the near real time x-ray images so that the orientation, image size, and bit depth of the x-ray images match the orientation, image size, and bit depth of the reconstructed 2D images.
87. An apparatus in accordance with claim 85, wherein the 3D scanner comprises at least one of: a CT scanner; an MRI scanner; and a PET scanner.
88. An apparatus in accordance with claim 85, wherein the x-ray imaging system includes a) an x-ray imaging beam generator configured to generate one or more x-ray imaging beams; and b) an x-ray beam receiver configured to receive each x-ray imaging beam.
89. An apparatus in accordance with claim 88, wherein the x-ray beam receiver comprises one or more x-ray cameras.
90. An apparatus in accordance with claim 89, wherein the x-ray imaging system is configured to generate x-ray projection images that represent at least two orthogonal projections A and B of the target onto respective projection image planes, the x-ray projection images being formed by transmitting at least two x-ray imaging beams through the target and onto the respective image planes, and each x-ray imaging beam being received by a respective x-ray camera after passing through the target.
PCT/US2004/027158 2003-08-29 2004-08-20 2d/3d image registration in image-guided radiosurgery WO2005024721A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP04781775A EP1667580A4 (en) 2003-08-29 2004-08-20 2d/3d image registration in image-guided radiosurgery

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US10/652,786 US7204640B2 (en) 2003-08-29 2003-08-29 Apparatus and method for registering 2D radiographic images with images reconstructed from 3D scan data
US10/652,717 US7187792B2 (en) 2003-08-29 2003-08-29 Apparatus and method for determining measure of similarity between images
US10/652,785 US7756567B2 (en) 2003-08-29 2003-08-29 Image guided radiosurgery method and apparatus using registration of 2D radiographic images with digitally reconstructed radiographs of 3D scan data
US10/652,717 2003-08-29
US10/652,785 2003-08-29
US10/652,786 2003-08-29

Publications (2)

Publication Number Publication Date
WO2005024721A2 true WO2005024721A2 (en) 2005-03-17
WO2005024721A3 WO2005024721A3 (en) 2005-11-17

Family

ID=34279865

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2004/027158 WO2005024721A2 (en) 2003-08-29 2004-08-20 2d/3d image registration in image-guided radiosurgery

Country Status (3)

Country Link
US (5) US7756567B2 (en)
EP (1) EP1667580A4 (en)
WO (1) WO2005024721A2 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005030646A1 (en) * 2005-06-30 2007-01-11 Siemens Ag Method for contour visualization of regions of interest in 2D fluoroscopic images
WO2007005445A2 (en) 2005-06-29 2007-01-11 Accuray Incorporated Precision registration of x-ray images to cone-beam ct scan for image-guided radiation treatment
WO2008128551A1 (en) * 2007-04-18 2008-10-30 Elekta Ab (Publ) Radiotherapeutic apparatus and methods
EP2032039A2 (en) * 2006-06-28 2009-03-11 Accuray Incorporated Parallel stereovision geometry in image-guided radiosurgery
WO2012017427A1 (en) * 2010-08-04 2012-02-09 P-Cure Ltd. Teletherapy control system and method
US8755489B2 (en) 2010-11-11 2014-06-17 P-Cure, Ltd. Teletherapy location and dose distribution control system and method
US9165362B2 (en) 2013-05-07 2015-10-20 The Johns Hopkins University 3D-2D image registration for medical imaging
JP2016059606A (en) * 2014-09-18 2016-04-25 株式会社島津製作所 Positioning device and positioning method
JP2016059612A (en) * 2014-09-18 2016-04-25 株式会社島津製作所 Drr image creation method and drr image creation device
WO2016128014A1 (en) * 2015-02-09 2016-08-18 Brainlab Ag X-ray patient position monitoring
US9427286B2 (en) 2013-09-24 2016-08-30 The Johns Hopkins University Method of image registration in a multi-source/single detector radiographic imaging system, and image acquisition apparatus
EP3121789A1 (en) * 2015-07-22 2017-01-25 Siemens Medical Solutions USA, Inc. Method and system for convolutional neural network regression based 2d/3d image registration
US10582972B2 (en) 2011-04-07 2020-03-10 3Shape A/S 3D system and method for guiding objects

Families Citing this family (171)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004023150A1 (en) * 2002-09-03 2004-03-18 Loughborough University Enterprises Limited Marking of objects for speed and spin measurements
WO2004069040A2 (en) * 2003-02-04 2004-08-19 Z-Kat, Inc. Method and apparatus for computer assistance with intramedullary nail procedure
EP1605810A2 (en) * 2003-02-04 2005-12-21 Z-Kat, Inc. Computer-assisted knee replacement apparatus and method
US7756567B2 (en) * 2003-08-29 2010-07-13 Accuray Incorporated Image guided radiosurgery method and apparatus using registration of 2D radiographic images with digitally reconstructed radiographs of 3D scan data
JP3787783B2 (en) * 2003-09-02 2006-06-21 松下電器産業株式会社 Object traveling direction detection method
US7362920B2 (en) * 2003-09-22 2008-04-22 Siemens Medical Solutions Usa, Inc. Method and system for hybrid rigid registration based on joint correspondences between scale-invariant salient region features
US20050089205A1 (en) * 2003-10-23 2005-04-28 Ajay Kapur Systems and methods for viewing an abnormality in different kinds of images
KR100995398B1 (en) * 2004-01-20 2010-11-19 삼성전자주식회사 Global motion compensated deinterlaing method considering horizontal and vertical patterns
US20050267353A1 (en) * 2004-02-04 2005-12-01 Joel Marquart Computer-assisted knee replacement apparatus and method
JP5110881B2 (en) 2004-02-20 2012-12-26 ユニバーシティ オブ フロリダ リサーチ ファウンデーション,インコーポレイティド System for delivering conformal radiation therapy while simultaneously imaging soft tissue
US7596283B2 (en) * 2004-04-12 2009-09-29 Siemens Medical Solutions Usa, Inc. Fast parametric non-rigid image registration based on feature correspondences
US7653226B2 (en) * 2004-04-21 2010-01-26 Siemens Medical Solutions Usa, Inc. Flexible generation of digitally reconstructed radiographs
US7724943B2 (en) * 2004-04-21 2010-05-25 Siemens Medical Solutions Usa, Inc. Rapid and robust 3D/3D registration technique
US7522779B2 (en) * 2004-06-30 2009-04-21 Accuray, Inc. Image enhancement method and system for fiducial-less tracking of treatment targets
US7366278B2 (en) * 2004-06-30 2008-04-29 Accuray, Inc. DRR generation using a non-linear attenuation model
US8306185B2 (en) * 2004-08-13 2012-11-06 Koninklijke Philips Electronics N.V. Radiotherapeutic treatment plan adaptation
US8989349B2 (en) * 2004-09-30 2015-03-24 Accuray, Inc. Dynamic tracking of moving targets
WO2006095324A1 (en) * 2005-03-10 2006-09-14 Koninklijke Philips Electronics N.V. Image processing system and method for registration of two-dimensional with three-dimensional volume data during interventional procedures
WO2006117737A2 (en) * 2005-05-04 2006-11-09 Koninklijke Philips Electronics N. V. X-ray imaging apparatus and method
DE102005021068B4 (en) * 2005-05-06 2010-09-16 Siemens Ag Method for presetting the acquisition parameters when creating two-dimensional transmitted X-ray images
WO2006130659A2 (en) * 2005-05-31 2006-12-07 Board Of Regents, The University Of Texas System Methods, program product and system for enhanced image guided stereotactic radiotherapy
US20070016008A1 (en) * 2005-06-23 2007-01-18 Ryan Schoenefeld Selective gesturing input to a surgical navigation system
US7840256B2 (en) 2005-06-27 2010-11-23 Biomet Manufacturing Corporation Image guided tracking array and method
US20070014448A1 (en) * 2005-06-30 2007-01-18 Wheeler Frederick W Method and system for lateral comparative image analysis and diagnosis
US20070038059A1 (en) * 2005-07-07 2007-02-15 Garrett Sheffer Implant and instrument morphing
US7869663B2 (en) * 2005-08-01 2011-01-11 Bioptigen, Inc. Methods, systems and computer program products for analyzing three dimensional data sets obtained from a sample
US20070073133A1 (en) * 2005-09-15 2007-03-29 Schoenefeld Ryan J Virtual mouse for use in surgical navigation
US7643862B2 (en) * 2005-09-15 2010-01-05 Biomet Manufacturing Corporation Virtual mouse for use in surgical navigation
DE102005044652B4 (en) * 2005-09-19 2009-12-10 Siemens Ag Method for generating 2D reconstruction images from a 3D image data set of an examination object, in particular taken by means of a magnetic resonance device, in the context of image post-processing
US20070100223A1 (en) * 2005-10-14 2007-05-03 Rui Liao Method and system for cardiac imaging and catheter guidance for radio frequency (RF) ablation
US7656998B2 (en) * 2005-11-14 2010-02-02 Accuray Incorporated Unified quality assurance for a radiation treatment delivery system
US7835500B2 (en) * 2005-11-16 2010-11-16 Accuray Incorporated Multi-phase registration of 2-D X-ray images to 3-D volume studies
US7684647B2 (en) * 2005-11-16 2010-03-23 Accuray Incorporated Rigid body tracking for radiosurgery
EP1960966B1 (en) * 2005-12-08 2009-05-27 Koninklijke Philips Electronics N.V. System and method for enabling selection of an image registration transformation
DE102005059210B4 (en) * 2005-12-12 2008-03-20 Siemens Ag Radiotherapeutic device
US7453984B2 (en) * 2006-01-19 2008-11-18 Carestream Health, Inc. Real-time target confirmation for radiation therapy
JP4310319B2 (en) * 2006-03-10 2009-08-05 三菱重工業株式会社 Radiotherapy apparatus control apparatus and radiation irradiation method
US8165659B2 (en) 2006-03-22 2012-04-24 Garrett Sheffer Modeling method and apparatus for use in surgical navigation
SE529451C2 (en) * 2006-05-22 2007-08-14 Xcounter Ab Tomosynthesis imaging apparatus, e.g. used in mammography, general body examinations, material testing, or baggage checking, includes X-ray apparatus, reconstruction device, and a projection image construction device
US20080021300A1 (en) * 2006-06-29 2008-01-24 Allison John W Four-dimensional target modeling and radiation treatment
US7620147B2 (en) * 2006-12-13 2009-11-17 Oraya Therapeutics, Inc. Orthovoltage radiotherapy
US7535991B2 (en) * 2006-10-16 2009-05-19 Oraya Therapeutics, Inc. Portable orthovoltage radiotherapy
US20080144903A1 (en) * 2006-10-25 2008-06-19 Bai Wang Real-time hardware accelerated contour generation based on VOI mask
US8788012B2 (en) * 2006-11-21 2014-07-22 General Electric Company Methods and apparatus for automatically registering lesions between examinations
EP2100267A4 (en) * 2006-11-28 2012-05-16 Calgary Scient Inc Texture-based multi-dimensional medical image registration
US20080177280A1 (en) * 2007-01-09 2008-07-24 Cyberheart, Inc. Method for Depositing Radiation in Heart Muscle
WO2008086434A2 (en) * 2007-01-09 2008-07-17 Cyberheart, Inc. Depositing radiation in heart muscle under ultrasound guidance
US7594753B2 (en) * 2007-03-29 2009-09-29 Accuray Incorporated Phantom insert for quality assurance
US20090014015A1 (en) * 2007-04-17 2009-01-15 University Of Washington Intraoperative dosimetry for prostate brachytherapy using transrectal ultrasound and x-ray fluoroscopy
US8934961B2 (en) 2007-05-18 2015-01-13 Biomet Manufacturing, Llc Trackable diagnostic scope apparatus and methods of use
US8363783B2 (en) 2007-06-04 2013-01-29 Oraya Therapeutics, Inc. Method and device for ocular alignment and coupling of ocular structures
US8512236B2 (en) * 2008-01-11 2013-08-20 Oraya Therapeutics, Inc. System and method for positioning and stabilizing an eye
US20080319491A1 (en) 2007-06-19 2008-12-25 Ryan Schoenefeld Patient-matched surgical component and methods of use
SE0702061L (en) * 2007-09-17 2009-03-18 Xcounter Ab Method for creating, displaying and analyzing X-rays and device for implementing the method
US20090088625A1 (en) * 2007-10-01 2009-04-02 Kenneth Oosting Photonic Based Non-Invasive Surgery System That Includes Automated Cell Control and Eradication Via Pre-Calculated Feed-Forward Control Plus Image Feedback Control For Targeted Energy Delivery
SE531416C2 (en) * 2007-10-09 2009-03-31 Xcounter Ab Device and method for recording radiation image data of an object
EP2070478B1 (en) 2007-12-13 2011-11-23 BrainLAB AG Detection of the position of a moving object and treatment method
US7801271B2 (en) 2007-12-23 2010-09-21 Oraya Therapeutics, Inc. Methods and devices for orthovoltage ocular radiotherapy and treatment planning
US7792249B2 (en) 2007-12-23 2010-09-07 Oraya Therapeutics, Inc. Methods and devices for detecting, controlling, and predicting radiation delivery
US7720196B2 (en) * 2008-01-07 2010-05-18 Accuray Incorporated Target tracking using surface scanner and four-dimensional diagnostic imaging data
US8571637B2 (en) * 2008-01-21 2013-10-29 Biomet Manufacturing, Llc Patella tracking method and apparatus for use in surgical navigation
US8825136B2 (en) * 2008-03-14 2014-09-02 Baylor Research Institute System and method for pre-planning a radiation treatment
US8848974B2 (en) * 2008-09-29 2014-09-30 Restoration Robotics, Inc. Object-tracking systems and methods
US8457372B2 (en) * 2008-09-30 2013-06-04 Accuray Incorporated Subtraction of a segmented anatomical feature from an acquired image
JP5729907B2 (en) * 2009-02-23 2015-06-03 株式会社東芝 X-ray diagnostic equipment
US20100220910A1 (en) * 2009-03-02 2010-09-02 General Electric Company Method and system for automated x-ray inspection of objects
JP2010246883A (en) * 2009-03-27 2010-11-04 Mitsubishi Electric Corp Patient positioning system
EP2414042A4 (en) 2009-03-31 2013-01-30 Matthew R Witten System and method for radiation therapy treatment planning using a memetic optimization algorithm
US7934869B2 (en) * 2009-06-30 2011-05-03 Mitsubishi Electric Research Labs, Inc. Positioning an object based on aligned images of the object
US9205279B2 (en) 2009-07-17 2015-12-08 Cyberheart, Inc. Heart tissue surface contour-based radiosurgical treatment planning
CN102510735A (en) 2009-07-17 2012-06-20 计算机心脏股份有限公司 Heart treatment kit, system, and method for radiosurgically alleviating arrhythmia
JP5507181B2 (en) * 2009-09-29 2014-05-28 富士フイルム株式会社 Radiographic imaging apparatus and method of operating radiographic imaging apparatus
JP5580030B2 (en) * 2009-12-16 2014-08-27 株式会社日立製作所 Image processing apparatus and image alignment method
US9687200B2 (en) 2010-06-08 2017-06-27 Accuray Incorporated Radiation treatment delivery system with translatable ring gantry
WO2011106433A1 (en) 2010-02-24 2011-09-01 Accuray Incorporated Gantry image guided radiotherapy system and related treatment delivery methods
US8744159B2 (en) * 2010-03-05 2014-06-03 Bioptigen, Inc. Methods, systems and computer program products for collapsing volume data to lower dimensional representations thereof using histogram projection
US8693634B2 (en) 2010-03-19 2014-04-08 Hologic Inc System and method for generating enhanced density distribution in a three dimensional model of a structure for use in skeletal assessment using a limited number of two-dimensional views
CN101843500B (en) * 2010-04-07 2012-05-23 重庆伟渡医疗设备股份有限公司 Method for judging accuracy of positioning result given by image guidance system
GB201008281D0 (en) * 2010-05-19 2010-06-30 Nikonovas Arkadijus Indirect analysis and manipulation of objects
WO2011156526A2 (en) 2010-06-08 2011-12-15 Accuray, Inc. Imaging methods and target tracking for image-guided radiation treatment
US20130094742A1 (en) * 2010-07-14 2013-04-18 Thomas Feilkas Method and system for determining an imaging direction and calibration of an imaging apparatus
WO2012019162A1 (en) 2010-08-06 2012-02-09 Accuray, Inc. Systems and methods for real-time tumor tracking during radiation treatment using ultrasound imaging
JP2012070880A (en) * 2010-09-28 2012-04-12 Mitsubishi Heavy Ind Ltd Radiation therapy system control device and radiation therapy system control method
US11231787B2 (en) 2010-10-06 2022-01-25 Nuvasive, Inc. Imaging system and method for use in surgical and interventional medical procedures
US8526700B2 (en) 2010-10-06 2013-09-03 Robert E. Isaacs Imaging system and method for surgical and interventional medical procedures
US9785246B2 (en) 2010-10-06 2017-10-10 Nuvasive, Inc. Imaging system and method for use in surgical and interventional medical procedures
JP5859958B2 (en) * 2010-11-11 2016-02-16 パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America Image processing apparatus, image processing method, and program
EP2656307B1 (en) * 2010-12-20 2017-07-05 Koninklijke Philips N.V. System and method for automatic generation of initial radiation treatment plans
US8911453B2 (en) 2010-12-21 2014-12-16 Restoration Robotics, Inc. Methods and systems for directing movement of a tool in hair transplantation procedures
US9498289B2 (en) 2010-12-21 2016-11-22 Restoration Robotics, Inc. Methods and systems for directing movement of a tool in hair transplantation procedures
US8536547B2 (en) 2011-01-20 2013-09-17 Accuray Incorporated Ring gantry radiation treatment delivery system with dynamically controllable inward extension of treatment head
US8712177B2 (en) * 2011-01-25 2014-04-29 Siemens Aktiengesellschaft Motion compensated overlay
US9262830B2 (en) * 2011-03-04 2016-02-16 Koninklijke Philips N.V. 2D/3D image registration
JP2012249960A (en) * 2011-06-06 2012-12-20 Toshiba Corp Medical image processor
TWI446897B (en) * 2011-08-19 2014-08-01 Ind Tech Res Inst Ultrasound image registration apparatus and method thereof
US9662064B2 (en) * 2011-10-01 2017-05-30 Brainlab Ag Automatic treatment planning method using retrospective patient data
US10734116B2 (en) 2011-10-04 2020-08-04 Quantant Technology, Inc. Remote cloud based medical image sharing and rendering semi-automated or fully automated network and/or web-based, 3D and/or 4D imaging of anatomy for training, rehearsing and/or conducting medical procedures, using multiple standard X-ray and/or other imaging projections, without a need for special hardware and/or systems and/or pre-processing/analysis of a captured image data
US9105200B2 (en) 2011-10-04 2015-08-11 Quantant Technology, Inc. Semi-automated or fully automated, network and/or web-based, 3D and/or 4D imaging of anatomy for training, rehearsing and/or conducting medical procedures, using multiple standard X-ray and/or other imaging projections, without a need for special hardware and/or systems and/or pre-processing/analysis of a captured image data
US9510771B1 (en) 2011-10-28 2016-12-06 Nuvasive, Inc. Systems and methods for performing spine surgery
US9314160B2 (en) * 2011-12-01 2016-04-19 Varian Medical Systems, Inc. Systems and methods for real-time target validation for image-guided radiation therapy
US9700276B2 (en) * 2012-02-28 2017-07-11 Siemens Healthcare Gmbh Robust multi-object tracking using sparse appearance representation and online sparse appearance dictionary update
US10561861B2 (en) 2012-05-02 2020-02-18 Viewray Technologies, Inc. Videographic display of real-time medical treatment
CN102670237B (en) * 2012-05-17 2014-12-10 西安一体医疗科技有限公司 Gamma radiation positioning system
CN102697560A (en) * 2012-05-17 2012-10-03 深圳市一体医疗科技股份有限公司 Non-invasive tumor locating system and method
WO2014010073A1 (en) * 2012-07-13 2014-01-16 三菱電機株式会社 X-ray positioning apparatus, x-ray positioning method, and image-of-interest imaging method
CA2888993A1 (en) 2012-10-26 2014-05-01 Viewray Incorporated Assessment and improvement of treatment using imaging of physiological responses to radiation therapy
US9091628B2 (en) 2012-12-21 2015-07-28 L-3 Communications Security And Detection Systems, Inc. 3D mapping with two orthogonal imaging views
CN103065322B (en) * 2013-01-10 2015-03-25 合肥超安医疗科技有限公司 Two dimensional (2D) and three dimensional (3D) medical image registration method based on double-X-ray imaging
DE102013200337B4 (en) * 2013-01-11 2021-11-11 Siemens Healthcare Gmbh Method, computer tomograph and computer program product for determining intensity values of an X-ray radiation for dose modulation
US10070828B2 (en) 2013-03-05 2018-09-11 Nview Medical Inc. Imaging systems and related apparatus and methods
US10846860B2 (en) 2013-03-05 2020-11-24 Nview Medical Inc. Systems and methods for x-ray tomosynthesis image reconstruction
US9039706B2 (en) 2013-03-13 2015-05-26 DePuy Synthes Products, Inc. External bone fixation device
RU2015143523A (en) 2013-03-13 2017-04-19 Депуи Синтез Продактс, Инк. DEVICE FOR EXTERNAL BONE FIXING
US8864763B2 (en) 2013-03-13 2014-10-21 DePuy Synthes Products, LLC External bone fixation device
US9446263B2 (en) 2013-03-15 2016-09-20 Viewray Technologies, Inc. Systems and methods for linear accelerator radiotherapy with magnetic resonance imaging
KR20150027881A (en) * 2013-08-29 2015-03-13 삼성전자주식회사 X-ray imaging apparatus and control method thereof
US9848922B2 (en) 2013-10-09 2017-12-26 Nuvasive, Inc. Systems and methods for performing spine surgery
US9377291B2 (en) 2013-12-05 2016-06-28 Bioptigen, Inc. Image registration, averaging, and compounding for high speed extended depth optical coherence tomography
US10292772B2 (en) * 2014-01-31 2019-05-21 Edda Technology, Inc. Method and system for determining optimal timing for surgical instrument insertion in image-guided surgical procedures
US10083278B2 (en) * 2014-02-12 2018-09-25 Edda Technology, Inc. Method and system for displaying a timing signal for surgical instrument insertion in surgical procedures
JP6400307B2 (en) * 2014-03-10 2018-10-03 キヤノンメディカルシステムズ株式会社 X-ray diagnostic imaging equipment
US9230322B2 (en) 2014-04-04 2016-01-05 Kabushiki Kaisha Toshiba Image processor, treatment system, and image processing method
US10470732B2 (en) * 2014-09-30 2019-11-12 Siemens Healthcare Gmbh System and method for generating a time-encoded blood flow image from an arbitrary projection
JP6732340B2 (en) 2014-12-10 2020-07-29 エレクタ、インク.Elekta, Inc. Magnetic resonance projection for constructing four-dimensional image information
CN104574420A (en) * 2015-01-29 2015-04-29 中国石油大学(华东) Nanoscale shale digital core building method
GB2536650A (en) 2015-03-24 2016-09-28 Augmedics Ltd Method and system for combining video-based and optic-based augmented reality in a near eye display
US11372889B2 (en) * 2015-04-22 2022-06-28 The Bank Of New York Mellon Multi-modal-based generation of data synchronization instructions
WO2017046796A1 (en) * 2015-09-14 2017-03-23 Real Imaging Ltd. Image data correction based on different viewpoints
US10702708B2 (en) * 2015-09-25 2020-07-07 Varian Medical Systems, Inc. Accounting for imaging-based radiation doses
KR20180087310A (en) 2015-11-24 2018-08-01 뷰레이 테크놀로지스 인크. Radiation beam collimation system and method
US10413751B2 (en) 2016-03-02 2019-09-17 Viewray Technologies, Inc. Particle therapy with magnetic resonance imaging
JP6668902B2 (en) * 2016-04-12 2020-03-18 株式会社島津製作所 Positioning device and method of operating positioning device
CN115407252A (en) 2016-06-22 2022-11-29 优瑞技术公司 Low field strength magnetic resonance imaging
US10835318B2 (en) 2016-08-25 2020-11-17 DePuy Synthes Products, Inc. Orthopedic fixation control and manipulation
US10342996B2 (en) 2016-08-29 2019-07-09 Accuray Incorporated Online angle selection in rotational imaging and tracking systems
JP6849966B2 (en) * 2016-11-21 2021-03-31 東芝エネルギーシステムズ株式会社 Medical image processing equipment, medical image processing methods, medical image processing programs, motion tracking equipment and radiation therapy systems
CN118141398A (en) 2016-12-13 2024-06-07 优瑞技术公司 Radiation therapy system and method
DE102017205113A1 (en) * 2017-03-27 2018-09-27 Siemens Aktiengesellschaft Determining the pose of an X-ray unit relative to an object based on a digital model of the object
US10434335B2 (en) * 2017-03-30 2019-10-08 Shimadzu Corporation Positioning apparatus and method of positioning by generation of DRR image from X-ray CT image data
US11058892B2 (en) 2017-05-05 2021-07-13 Zap Surgical Systems, Inc. Revolving radiation collimator
DE102017211081A1 (en) * 2017-06-29 2019-01-03 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Apparatus and method for finding a change in a scene
JP6461257B2 (en) * 2017-07-24 2019-01-30 キヤノン株式会社 Image processing apparatus and method
US10818019B2 (en) * 2017-08-14 2020-10-27 Siemens Healthcare Gmbh Dilated fully convolutional network for multi-agent 2D/3D medical image registration
CN108401421B (en) 2017-09-06 2022-12-20 睿谱外科系统股份有限公司 Self-shielding integrated control radiosurgery system
US11610346B2 (en) 2017-09-22 2023-03-21 Nview Medical Inc. Image reconstruction using machine learning regularizers
CN111247424A (en) 2017-09-28 2020-06-05 株式会社先机 Inspection position specifying method, three-dimensional image generating method, and inspection device
DE102017125671B4 (en) 2017-11-03 2021-07-01 Sicat Gmbh & Co. Kg Holding device for X-ray films
CN116036499A (en) 2017-12-06 2023-05-02 优瑞技术公司 Optimization of multi-modality radiation therapy
US10893842B2 (en) 2018-02-08 2021-01-19 Covidien Lp System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
US11980507B2 (en) 2018-05-02 2024-05-14 Augmedics Ltd. Registration of a fiducial marker for an augmented reality system
US11209509B2 (en) 2018-05-16 2021-12-28 Viewray Technologies, Inc. Resistive electromagnet systems and methods
US10832422B2 (en) 2018-07-02 2020-11-10 Sony Corporation Alignment system for liver surgery
EP3824815A4 (en) * 2018-07-19 2022-03-09 Our United Corporation Tumor positioning method and device
US11766296B2 (en) 2018-11-26 2023-09-26 Augmedics Ltd. Tracking system for image-guided surgery
US11684446B2 (en) 2019-02-27 2023-06-27 Zap Surgical Systems, Inc. Device for radiosurgical treatment of uterine fibroids
US11439436B2 (en) 2019-03-18 2022-09-13 Synthes Gmbh Orthopedic fixation strut swapping
US11304757B2 (en) 2019-03-28 2022-04-19 Synthes Gmbh Orthopedic fixation control and visualization
FR3095508B1 (en) * 2019-04-26 2021-05-14 Tiama PROCESS AND INSTALLATION OF ONLINE DIMENSIONAL CONTROL OF MANUFACTURED OBJECTS
US11850051B2 (en) 2019-04-30 2023-12-26 Biosense Webster (Israel) Ltd. Mapping grid with high density electrode array
US11980506B2 (en) 2019-07-29 2024-05-14 Augmedics Ltd. Fiducial marker
US11103729B2 (en) 2019-08-13 2021-08-31 Elekta ltd Automatic gating with an MR linac
US11382712B2 (en) 2019-12-22 2022-07-12 Augmedics Ltd. Mirroring in image guided surgery
CN111275813B (en) * 2020-01-20 2021-09-17 北京字节跳动网络技术有限公司 Data processing method and device and electronic equipment
US11334997B2 (en) 2020-04-03 2022-05-17 Synthes Gmbh Hinge detection for orthopedic fixation
EP3915477A1 (en) 2020-05-29 2021-12-01 Biosense Webster (Israel) Ltd Electrode apparatus for diagnosis of arrhythmias
CN111659031A (en) * 2020-06-24 2020-09-15 刘希军 Radiotherapy planning system with two-dimensional image fusion function
CN115485017A (en) * 2020-08-12 2022-12-16 西安大医集团股份有限公司 Image display control method, image display control device, electronic device, and computer storage medium
US11896445B2 (en) 2021-07-07 2024-02-13 Augmedics Ltd. Iliac pin and adapter
US20230149069A1 (en) 2021-11-16 2023-05-18 Biosense Webster (Israel) Ltd. Planar catheter with overlapping electrode pairs
US20230210433A1 (en) 2021-12-31 2023-07-06 Biosense Webster (Israel) Ltd. Reconfigurable electrode apparatus for diagnosis of arrhythmias
WO2024057210A1 (en) 2022-09-13 2024-03-21 Augmedics Ltd. Augmented reality eyewear for image-guided medical intervention

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5117829A (en) * 1989-03-31 1992-06-02 Loma Linda University Medical Center Patient alignment system and procedure for radiation treatment
US5588430A (en) * 1995-02-14 1996-12-31 University Of Florida Research Foundation, Inc. Repeat fixation for frameless stereotactic procedure
AU3880397A (en) * 1996-07-11 1998-02-09 Board Of Trustees Of The Leland Stanford Junior University High-speed inter-modality image registration via iterative feature matching
US6597818B2 (en) * 1997-05-09 2003-07-22 Sarnoff Corporation Method and apparatus for performing geo-spatial registration of imagery
JP3053389B1 (en) 1998-12-03 2000-06-19 三菱電機株式会社 Moving object tracking irradiation device
US6501981B1 (en) * 1999-03-16 2002-12-31 Accuray, Inc. Apparatus and method for compensating for respiratory and patient motions during treatment
US6470207B1 (en) * 1999-03-23 2002-10-22 Surgical Navigation Technologies, Inc. Navigational guidance via computer-assisted fluoroscopic imaging
US6235038B1 (en) * 1999-10-28 2001-05-22 Medtronic Surgical Navigation Technologies System for translation of electromagnetic and optical localization systems
US7024237B1 (en) * 1999-10-29 2006-04-04 University Of Florida Research Foundation, Inc. Mask system and method for stereotactic radiotherapy and image guided procedures
DE19953177A1 (en) * 1999-11-04 2001-06-21 Brainlab Ag Method to position patient exactly for radiation therapy or surgery; involves comparing positions in landmarks in X-ray image and reconstructed image date, to determine positioning errors
US6665555B2 (en) * 2000-04-05 2003-12-16 Georgetown University School Of Medicine Radiosurgery methods that utilize stereotactic methods to precisely deliver high dosages of radiation especially to the spine
US6782287B2 (en) * 2000-06-27 2004-08-24 The Board Of Trustees Of The Leland Stanford Junior University Method and apparatus for tracking a medical instrument based on image registration
US6907281B2 (en) 2000-09-07 2005-06-14 Ge Medical Systems Fast mapping of volumetric density data onto a two-dimensional screen
US7260426B2 (en) 2002-11-12 2007-08-21 Accuray Incorporated Method and apparatus for tracking an internal target region without an implanted fiducial
US6889695B2 (en) * 2003-01-08 2005-05-10 Cyberheart, Inc. Method for non-invasive heart treatment
US7171257B2 (en) 2003-06-11 2007-01-30 Accuray Incorporated Apparatus and method for radiosurgery
US7756567B2 (en) * 2003-08-29 2010-07-13 Accuray Incorporated Image guided radiosurgery method and apparatus using registration of 2D radiographic images with digitally reconstructed radiographs of 3D scan data
US7327865B2 (en) 2004-06-30 2008-02-05 Accuray, Inc. Fiducial-less tracking with non-rigid image registration
US7522779B2 (en) * 2004-06-30 2009-04-21 Accuray, Inc. Image enhancement method and system for fiducial-less tracking of treatment targets

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
MURPHY MARTIN J: "Medical Physics", vol. 24, 1 June 1997, AIP, article "An automatic six-degree-of-freedom image registration algorithm for image-guided frameless stereotaxic radiosurgery", pages: 857
ZOLLEI L: "Thesis", 1 August 2008, THE MASSACHUSETTS INSTITUTE OF TECHNOLOGY, article "2D-3D Rigid-Body Registration of X-ray Fluoroscopy and CT Images", pages: 1 - 113

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1899894A4 (en) * 2005-06-29 2010-03-17 Accuray Inc Precision registration of x-ray images to cone-beam ct scan for image-guided radiation treatment
WO2007005445A2 (en) 2005-06-29 2007-01-11 Accuray Incorporated Precision registration of x-ray images to cone-beam ct scan for image-guided radiation treatment
EP1899894A2 (en) * 2005-06-29 2008-03-19 Accuray, Inc. Precision registration of x-ray images to cone-beam ct scan for image-guided radiation treatment
US7831073B2 (en) 2005-06-29 2010-11-09 Accuray Incorporated Precision registration of X-ray images to cone-beam CT scan for image-guided radiation treatment
US8306297B2 (en) 2005-06-29 2012-11-06 Accuray Incorporated Precision registration of X-ray images to cone-beam CT scan for image-guided radiation treatment
DE102005030646B4 (en) * 2005-06-30 2008-02-07 Siemens Ag A method of contour visualization of at least one region of interest in 2D fluoroscopic images
US7689042B2 (en) 2005-06-30 2010-03-30 Siemens Aktiengesellschaft Method for contour visualization of regions of interest in 2D fluoroscopy images
DE102005030646A1 (en) * 2005-06-30 2007-01-11 Siemens Ag Method for contour visualization of regions of interest in 2D fluoroscopic images
EP2032039A2 (en) * 2006-06-28 2009-03-11 Accuray Incorporated Parallel stereovision geometry in image-guided radiosurgery
EP2032039A4 (en) * 2006-06-28 2012-11-28 Accuray Inc Parallel stereovision geometry in image-guided radiosurgery
WO2008128551A1 (en) * 2007-04-18 2008-10-30 Elekta Ab (Publ) Radiotherapeutic apparatus and methods
US9227087B2 (en) 2010-08-04 2016-01-05 P-Cure Ltd. Teletherapy control system and method
WO2012017427A1 (en) * 2010-08-04 2012-02-09 P-Cure Ltd. Teletherapy control system and method
US8755489B2 (en) 2010-11-11 2014-06-17 P-Cure, Ltd. Teletherapy location and dose distribution control system and method
US10716634B2 (en) 2011-04-07 2020-07-21 3Shape A/S 3D system and method for guiding objects
US10582972B2 (en) 2011-04-07 2020-03-10 3Shape A/S 3D system and method for guiding objects
US9165362B2 (en) 2013-05-07 2015-10-20 The Johns Hopkins University 3D-2D image registration for medical imaging
US9427286B2 (en) 2013-09-24 2016-08-30 The Johns Hopkins University Method of image registration in a multi-source/single detector radiographic imaging system, and image acquisition apparatus
JP2016059606A (en) * 2014-09-18 2016-04-25 株式会社島津製作所 Positioning device and positioning method
JP2016059612A (en) * 2014-09-18 2016-04-25 株式会社島津製作所 Drr image creation method and drr image creation device
US10413752B2 (en) 2015-02-09 2019-09-17 Brainlab Ag X-ray patient position monitoring
US20180021597A1 (en) * 2015-02-09 2018-01-25 Brainlab Ag X-Ray Patient Position Monitoring
WO2016128014A1 (en) * 2015-02-09 2016-08-18 Brainlab Ag X-ray patient position monitoring
US10235606B2 (en) 2015-07-22 2019-03-19 Siemens Healthcare Gmbh Method and system for convolutional neural network regression based 2D/3D image registration
CN106651750A (en) * 2015-07-22 2017-05-10 美国西门子医疗解决公司 Method and system used for 2D/3D image registration based on convolutional neural network regression
EP3121789A1 (en) * 2015-07-22 2017-01-25 Siemens Medical Solutions USA, Inc. Method and system for convolutional neural network regression based 2d/3d image registration
CN106651750B (en) * 2015-07-22 2020-05-19 西门子保健有限责任公司 Method and system for 2D/3D image registration based on convolutional neural network regression

Also Published As

Publication number Publication date
US20050049478A1 (en) 2005-03-03
US20050049477A1 (en) 2005-03-03
US20050047544A1 (en) 2005-03-03
US20070116341A1 (en) 2007-05-24
WO2005024721A3 (en) 2005-11-17
US20100239153A1 (en) 2010-09-23
US7480399B2 (en) 2009-01-20
US7756567B2 (en) 2010-07-13
US7204640B2 (en) 2007-04-17
EP1667580A2 (en) 2006-06-14
EP1667580A4 (en) 2010-09-01
US7187792B2 (en) 2007-03-06
US8280491B2 (en) 2012-10-02

Similar Documents

Publication Publication Date Title
WO2005024721A2 (en) 2d/3d image registration in image-guided radiosurgery
US7835500B2 (en) Multi-phase registration of 2-D X-ray images to 3-D volume studies
US7684647B2 (en) Rigid body tracking for radiosurgery
US7831073B2 (en) Precision registration of X-ray images to cone-beam CT scan for image-guided radiation treatment
US7327865B2 (en) Fiducial-less tracking with non-rigid image registration
EP2032039B1 (en) Parallel stereovision geometry in image-guided radiosurgery
US7426318B2 (en) Motion field generation for non-rigid image registration
US7453983B2 (en) Radiation therapy method with target detection
US20080037843A1 (en) Image segmentation for DRR generation and image registration
US20060002631A1 (en) ROI selection in image registration
US20060002615A1 (en) Image enhancement method and system for fiducial-less tracking of treatment targets
US20060002601A1 (en) DRR generation using a non-linear attenuation model
Leszczynski et al. An image registration scheme applied to verification of radiation therapy.
Davis et al. Collision-avoiding imaging trajectories for linac mounted cone-beam CT

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DPEN Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2004781775

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2004781775

Country of ref document: EP