WO2004044612A2 - A method and apparatus for tracking an internal target region without an implanted fiducial - Google Patents
- Publication number: WO2004044612A2 (PCT/US2003/035801)
- Authority: WO (WIPO, PCT)
- Prior art keywords: image, target region, images, generating, time
- Legal status: Ceased (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Classifications
- A—HUMAN NECESSITIES; A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61N5/1049—Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B6/12—Arrangements for detecting or locating foreign bodies
- A61B6/469—Arrangements for interfacing with the operator or the patient characterised by special input means for selecting a region of interest [ROI]
- A61B2034/2055—Optical tracking systems
- A61B2034/2072—Reference field transducer attached to an instrument or patient
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/376—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
- A61B2090/3762—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy, using computed tomography systems [CT]
- A61B5/7285—Specific aspects of physiological measurement analysis for synchronizing or triggering a physiological measurement or image acquisition with a physiological event or waveform, e.g. an ECG signal
- A61B6/541—Control of apparatus or devices for radiation diagnosis involving acquisition triggered by a physiological signal
- A61B8/543—Control of the diagnostic device involving acquisition triggered by a physiological signal
- A61B90/361—Image-producing devices, e.g. surgical cameras
- A61N2005/1061—Verifying the position of the patient with respect to the radiation beam using an x-ray imaging system having a separate imaging source
- A61N2005/1062—Verifying the position of the patient with respect to the radiation beam using virtual X-ray images, e.g. digitally reconstructed radiographs [DRR]
- A61N5/1064—Monitoring, verifying, controlling systems and methods for adjusting radiation treatment in response to monitoring
- A61N5/1067—Beam adjustment in real time, i.e. during treatment
Definitions
- This invention relates generally to an apparatus for improving the accuracy and efficacy of surgical treatments and more particularly to tracking the location of the target region that is displaced during the treatment due to respiratory and other patient motions.
- a tumor may be destroyed by a beam of ionizing radiation that kills the cells in the tumor.
- accurate aiming of the beam at the tumor is of paramount importance in radiation treatments.
- the goal is to focus a high dose of radiation to the tumor while minimizing the exposure of the surrounding healthy tissue to radiation.
- the direction of the radiation beam is typically adjusted during the treatment to track the tumor.
- the most advanced modern radiosurgery systems such as the Cyberknife System of Accuray, Inc., use stereo online x-ray imaging during treatment to enhance the accuracy of radiation treatment.
- the position of the patient's bony landmarks, e.g., a skull, can be determined from these online x-ray images.
- this highly accurate x-ray camera system can be used to treat a target region if the position of the target region relative to a bony landmark remains constant.
- this x-ray camera system cannot be used to determine the position of a target region if the position of the target region relative to a bony landmark changes because the target, e.g., a tumor, is generally not visible in x-ray images. For example, a target region in a patient's abdomen or chest cannot be treated with this method alone.
- While accurate aiming of the beam is not difficult when the tumor is in a body part that does not move, such as the brain, aiming becomes a challenge when the tumor is in or near a body part that moves, such as the lungs.
- a tumor located near the lungs moves as the patient inhales and exhales, necessitating continuous adjustment of the radiation beam direction.
- a tumor cannot be easily tracked based on external measurements alone. For example, placing external sensors on a patient's chest and tracking the movement of the sensors does not provide accurate information about the position of the tumor inside the chest cavity because a certain soft tissue structure may move in one direction while bones move in another direction.
- nor can the tumor be easily located with x-ray systems, because in most cases, neither the target region nor the surrounding soft tissues are visible in the x-ray images.
- the two-dimensional nature of x-ray images compromises the accuracy with which the radiation is applied to the target region.
- real-time tracking of the target region cannot be performed with x-ray imaging alone because of the excess radiation to which the patient would be exposed.
- U.S. Patent No. 6,144,875 discloses a method for tracking a target region by implanting small gold markers that are visible in x-ray images into a patient's abdomen prior to radiation treatment. Once the internal fiducials are implanted, they are periodically imaged with a stereo x-ray camera system so that their positions are accurately determined. Based on the position of the markers, the position of the tumor can be accurately determined.
- the x-ray imaging process is too slow and too invasive to track the respiration motion in real time.
- the x-ray system allows the location of a tumor to be determined only at certain time intervals, e.g., every 10 seconds, and not continuously.
- the implanting of the fiducials is an invasive and expensive procedure because the procedure usually takes place under the surveillance of a computer tomography (CT) device and in the presence of a surgeon.
- the required presence of the surgeon not only drives up the cost of the procedure for the patient but also exposes the surgeon to ionizing radiation.
- a method and apparatus for locating an internal target region during treatment without using implanted fiducials comprises producing a plurality of first images that each shows an internal volume including the internal target region, then producing a live image of the internal volume during treatment and matching this live image to one of the plurality of first images. Since the first images show the internal target region, matching the live image to one of the first images identifies the position of the target region regardless of whether the live image itself shows the position of the target region.
- the first images may be any three-dimensional images such as CT scans, magnetic resonance imaging, and ultrasound.
- the live image may be, for example, an x-ray image.
- the invention may be used in conjunction with one or more real-time sensors to track the position of the target region on a real-time basis.
- the signal from the real-time sensor is correlated with the position of the target region.
- the correlation model is produced by simultaneously taking an x-ray and reading the signal from the real-time sensor, then using the x-ray to identify the best-matching three-dimensional image that shows the target position. Once the correlation is established, the position of the target region can be tracked in real time during treatment by reading the signal from the real-time sensor almost continuously.
- FIG. 1 depicts an example of a radiosurgical treatment system that may be used with the present invention
- FIG. 2 is a block diagram depicting the treatment system of FIG. 1;
- FIG. 3 depicts a respiration pattern including a plurality of respiration cycles
- FIG. 4A is a flow chart of a pre-treatment procedure that is executed before real-time tracking of a target region in accordance with the invention
- FIG. 4B is a flow chart of an alternative pre-treatment procedure that is executed before real-time tracking of a target region in accordance with the invention
- FIG. 5 schematically depicts a position of a target region at point A in the respiration pattern;
- FIG. 6 schematically depicts a position of a target region at point B in the respiration pattern;
- FIG. 7 schematically depicts formation of intermediate three-dimensional images by deformation of a first three-dimensional image into a second three-dimensional image
- FIG. 8 is a flow chart depicting a fiducial-less target tracking procedure for determining a target position during treatment without an implanted fiducial;
- FIG. 9 is a flow chart depicting a correlation process for correlating real-time sensor readings to target positions;
- FIG. 10 is a flow chart depicting a real-time tracking procedure for tracking the target region in real time during treatment.
- the invention is particularly applicable to an apparatus and method for directing a radiation beam towards an internal target region without implanting an internal fiducial in or near the target region to determine its location, and it is in this context that the invention will be described. It will be appreciated, however, that the apparatus and method in accordance with the invention have greater utility, such as in other types of medical procedures with other types of medical instruments, for example positioning biopsy needles, ablative, ultrasound or other focused-energy treatments, or positioning a laser beam for laser beam treatment.
- a typical radiosurgery device will be described to provide a better understanding of the invention.
- a target region is the region to which a treatment (e.g., radiation) is to be directed.
- a target region is located in a "relevant volume,” which refers to an internal region surrounding the target region, and which may include bones and soft tissues around the target region.
- the method according to the invention includes steps that may be executed prior to treatment and steps that are executed during treatment.
- a series of at least two CT scans are taken or computed. Each CT scan corresponds to a particular point in the respiratory cycle of the patient.
- Each of the series of CT scans may be an actual CT scan or a combination of actual CT scans and computer-generated intermediate CT images.
- In order to generate intermediate CT images, at least two CT scans are first taken.
- each CT scan and computer-generated CT image shows the target region. This target region may be marked prior to treatment.
- a set of digitally reconstructed radiographs (DRRs) is computed for each CT scan and/or computer-generated CT image. Each DRR shows the target region from a particular angle.
- live stereo x-ray images are taken periodically, e.g., once every 10 seconds.
- the target region may not be clearly visible in the x-ray images.
- the exact location of the target region can be determined by comparing each live x-ray image with the set of DRRs, finding the DRR that best matches the live x-ray image, and identifying the CT scan or CT image from which the DRR was generated.
- the CT scan or CT image shows the target region and hence the position of the target region. Based on the viewing angle associated with the best-matching DRR, the exact angle or translational shift the patient was in when the live x-ray image was taken is also determined.
- both a translational/rotational shift of the patient's body and the current respiratory state of the patient may be inferred from the live x-ray images.
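The matching step described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the data layout (a dict of per-image DRR sets keyed by viewing angle) and the sum-of-squared-differences similarity score are assumptions, since the patent does not prescribe a particular comparison metric.

```python
import numpy as np

def match_live_xray(live_xray, drr_sets):
    """Find the (image index, viewing angle) whose DRR best matches a
    live x-ray. drr_sets maps each pre-treatment 3-D image index to a
    dict of {angle: DRR array}. The best-matching image identifies the
    respiratory state (and hence the target position); the angle gives
    the patient's rotational/translational shift."""
    live = np.asarray(live_xray, dtype=float)
    best, best_score = (None, None), np.inf
    for img_idx, drrs_by_angle in drr_sets.items():
        for angle, drr in drrs_by_angle.items():
            # sum-of-squared-differences as an illustrative similarity score
            score = float(np.sum((live - np.asarray(drr, dtype=float)) ** 2))
            if score < best_score:
                best, best_score = (img_idx, angle), score
    return best
```

In practice a radiograph-to-DRR comparison would use a more robust similarity measure (e.g., normalized cross-correlation), but the control flow is the same: score every DRR, keep the best, and read off the associated three-dimensional image and viewing angle.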
- the fiducial-less target tracking method according to the invention may be combined with real-time sensors in order to track the target region in real time.
- Real-time sensors may be external markers that are coupled to a patient's body part that moves when the target region moves but not necessarily in the same direction or to the same degree.
- a correlation model correlating signals from the real-time sensors with the position of the target region is generated, preferably prior to treatment.
- the correlation model is generated by taking an x-ray image of the target region and reading the signal from the real-time sensor simultaneously, then using the x-ray image to identify the best-matching DRR and the associated CT scan or CT image.
- the CT scan or the CT image identifies the position of the target region at the time the signal was read from the real-time sensor. After a set of data points is taken, the position of the target region can be correlated with the signal reading from the real-time sensor.
- This pre-treatment correlation procedure is similar to the live fiducial-less target tracking method that is executed during treatment in that an x-ray image is matched up with a DRR to identify the position of the target region.
- This correlation procedure differs from the live target region tracking that is done during treatment in that the signal from the real-time sensor is read at the same time an x-ray image is taken.
- the position of the target region can be inferred from the real-time sensor signals.
- Because real-time sensor signals are easily obtained and can be read more frequently than x-ray images can be processed, use of a real-time sensor allows the target region to be tracked almost continuously, or in real time. Further details of the correlation procedure are provided in U.S. Patent No. 6,144,875, which is incorporated herein by reference in its entirety.
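The correlation model can be sketched as a least-squares fit from sensor readings to target positions. The linear model form is an assumption for illustration; the patent leaves the form of the correlation unspecified.

```python
import numpy as np

def fit_correlation(sensor_readings, target_positions):
    """Fit a linear map from a scalar real-time sensor reading to a
    3-D target position, using data points gathered by reading the
    sensor at the same instant each x-ray is taken."""
    s = np.asarray(sensor_readings, dtype=float)        # (N,)
    p = np.asarray(target_positions, dtype=float)       # (N, 3)
    A = np.column_stack([s, np.ones_like(s)])           # [signal, 1]
    coef, *_ = np.linalg.lstsq(A, p, rcond=None)        # (2, 3)
    return coef

def infer_position(coef, sensor_value):
    """Track the target between x-ray acquisitions from the sensor
    signal alone, once the correlation is established."""
    return np.array([sensor_value, 1.0]) @ coef
```

During treatment, `infer_position` would be called at the sensor's sampling rate (well under the 250 ms response time mentioned below), with occasional x-ray acquisitions used to re-validate or refresh the fit.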
- FIG. 1 depicts an example of a stereotaxic radiation treatment device 10.
- the radiation treatment device 10 may include a data processor 12, such as a microprocessor, and a memory unit 13 which may store image data and mathematical data pertaining to a target region inside a patient 14. The image data may be loaded into the data processor either prior to treatment or during the surgical procedure.
- the radiation treatment device 10 may also include a beaming apparatus 20 which, when activated, emits a collimated surgical ionizing beam directed at a target region inside patient 14.
- the collimated surgical ionizing beam may have sufficient strength to cause the target region to become necrotic.
- a variety of beaming apparatuses that generate an ionizing radiation or heavy-particle beam may be used, such as a linear accelerator, preferably an x-ray linear accelerator. Such an x-ray beaming apparatus is commercially available.
- the beaming apparatus may be activated by the operator using a switch (not shown) that is connected to the beaming apparatus 20.
- the radiation treatment device 10 may also include a stereo x-ray imaging apparatus for passing a first diagnostic beam 26 and a second diagnostic beam 28 through an internal target region 30.
- the diagnostic beams may be positioned at a predetermined non-zero angle with respect to each other.
- the diagnostic beams may be generated by a first x-ray generator 32a and a second x-ray generator 32b, respectively.
- An image receiver 34 may receive the diagnostic beams 26, 28 to generate an image from the diagnostic beams which is fed into the microprocessor 12 so that the diagnostic images may be compared to the three-dimensional image. In some embodiments, two separate receivers may each receive one of the diagnostic beams 26 and 28.
- the radiation treatment device 10 may also include a device for adjusting the relative positions of the beaming apparatus 20 and the patient 14 so that the ionizing beam is continuously focused on the target region.
- the positioning of the beaming apparatus relative to the patient may be altered by a processor-controllable robotic arm mechanism 40 and/or a moveable operating table with a tilting top 38.
- the robotic arm mechanism permits the beaming apparatus to be moved freely about the patient's body including up, down, longitudinally along or laterally along the body of the patient.
- the radiation treatment device 10 may also include a real-time sensing system for monitoring an external movement of the patient 14.
- the real-time sensing system includes one or more real-time sensors 42 that are coupled to an external body part of the patient 14 and a sensor reader 44 that takes a reading from the real-time sensors 42 periodically. Readings of the real-time sensors 42 indicate the movement of an external body part of the patient 14.
- This real-time sensing system may be any system that can be used for correlating the real-time sensors 42 to the respiration pattern with a response/reactivation time of less than 250 ms.
- Some commercially available sensors that may be used as the real-time sensor 42 include infrared tracking systems made by Northern Digital, Inc.
- the current state of respiration may be measured by viewing video images of the chest and/or abdomen movement, or by sensing the flow of air or temperature emanating from the mouth and/or nose of the patient 14.
- the real-time sensing system is coupled to the processor 12 so that the processor 12 can use the readings of the real-time sensors 42 to establish a correlation.
- FIG. 2 is a block diagram of the radiation treatment device 10 including the microprocessor 12, the memory unit 13 (e.g., a tape drive), the beaming apparatus 20, the robotic arm 40, the x-ray imaging apparatus (x-ray generators 32a, 32b and image receiver 34), and the operator control console 24 as described above.
- the device 10 may include a real-time sensor tracking system 50 to track the position of a real-time sensor attached to the skin of patient 14.
- the device 10 may also include an operator display 48 for tracking the progress of the treatment and controlling the treatment. Further details of the radiosurgery device may be found in U.S. Patent No. 5,207,223, which is owned by the assignee of this application and which is incorporated herein by reference.
- FIG. 3 depicts a plot 70 of a patient's respiration pattern.
- the depicted portion of the respiration pattern includes two respiratory cycles, namely cycle 72 and cycle 74.
- a single respiratory cycle includes the entire range of diaphragm and chest wall movement in the respiration pattern.
- from point A to point A is one respiration cycle, as is from point B to point B.
- point B in respiration cycle 72 is associated with substantially similar internal anatomy as point B in respiration cycle 74.
- the target region is in a substantially same location at each point A and in a substantially same location at each point B.
- the target region shifts from the position that it is in at point A to the position that it is in at point B in substantially the same manner in every cycle.
- a set of images of the relevant volume is produced such that there is an image that represents the actual position of the relevant volume at almost every point in a patient's respiratory pattern.
- Such a respiratory pattern represents the individual anatomy of each patient.
- the processor 12 may be programmed to issue a particular command at one or more preselected points in the respiratory pattern.
- the processor 12 may be programmed so that CT scans are taken at certain points in the respiratory pattern.
- This programmed CT scanning may include two general steps: determining the respiratory pattern of a patient and programming the processor 12 to trigger the CT scanner at certain points in this respiratory pattern.
- the respiratory pattern of a patient may be established by using any of the well-known methods or tools, such as the real-time sensing system shown in FIG. 1.
- Real-time sensors 42 (see FIG. 1), which emit signals indicating the movements of a body part (e.g., the chest wall), are coupled to the patient's body. After a critical number of signals are received, the signals are processed to reveal a respiratory pattern.
- certain points on the pattern are selected based on the number of CT scans that is desired, and the processor 12 is programmed to trigger the CT scanner at the selected points.
- a first CT scan may be taken at point A
- a second CT scan may be taken at point B
- three CT scans may be taken at an equal time interval between point A and point B, resulting in a total of five CT scans that are taken at different points in the respiratory cycle.
- the three CT scans between point A and point B may be taken at points C, D, and E shown in FIG. 3.
- the points on the pattern may be selected based on the position of a body part that is being tracked.
- a first CT scan may be taken at a point when the chest wall is at its highest level
- a second CT scan may be taken when the chest wall is a distance Δd below the highest level
- a third CT scan may be taken when the chest wall is a distance 2Δd below the highest level
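The level-based triggering described above can be sketched as threshold detection on the chest-wall displacement signal. This is an illustrative sketch only: the sampled-signal representation and the specific threshold values are assumptions, not details from the patent.

```python
import numpy as np

def trigger_points(displacement, thresholds):
    """Return, for each threshold, the index of the first sample at
    which the chest-wall displacement has fallen at least that far
    below its peak -- the points at which a CT scan (or any other
    device command) would be triggered. Returns None for thresholds
    never reached."""
    d = np.asarray(displacement, dtype=float)
    peak = d.max()
    triggers = []
    for t in thresholds:
        below = np.nonzero(d <= peak - t)[0]
        triggers.append(int(below[0]) if below.size else None)
    return triggers
```

A processor programmed this way fires the scanner at reproducible anatomical states (fixed chest-wall levels) rather than at fixed time intervals, which is the point of the pattern-based triggering method.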
- this respiratory pattern-based triggering method is not limited to being used with a CT scanner, and that the processor 12 may issue a command to a different device or execute a set of instructions itself.
- the processor 12 may trigger the CT scanner at point A and point B, and generate synthetic images of the scanned relevant volume for other points in the respiratory cycle between point A and point B.
- This method of selecting certain points on the respiratory pattern and programming processor 12 to trigger the CT scanner at the selected points is not limited to being used in the context of radiosurgery.
- this method can be used to improve conventional CT scans, the quality of which are known to be adversely affected by patient movement (e.g., movement due to respiration).
- This method allows one to obtain improved three-dimensional images that are free of artifacts of movement, for example for the purpose of diagnosis.
- a CT scan is taken using this method, it is clearly known at what point in the respiratory cycle the CT scan is taken.
- the internal location determining process in accordance with the invention includes a pre-treatment procedure 100a and an alternative pre-treatment procedure 100b, one of which is preferably performed prior to the radiation treatment.
- FIG. 4A and FIG. 4B depict the pre-treatment procedure 100a and the alternative pre-treatment procedure 100b, respectively.
- FIG. 5 and FIG. 6 schematically illustrate exemplary three-dimensional images obtained during the pretreatment procedure 100a, 100b.
- FIG. 7 schematically illustrates a deformation process that may be used to generate some of the three-dimensional images in the alternative pre-treatment procedure 100b.
- FIG. 4A is a flowchart of a pre-treatment procedure 100a that is executed before a treatment for determining the location of a target region in accordance with the invention.
- the pre-treatment procedure 100a begins when a patient undergoes a plurality of CT scans to produce three-dimensional images that show the patient's relevant volume (i.e., bones, soft tissue, and the target region) at different points in his respiratory cycle (stage 102).
- the CT scans, each of which shows the target region, may be taken in the manner described above in reference to FIG. 3.
- the target region position is determined for each of these CT scans (stage 110) and stored (stage 112).
- a set of DRRs is generated for each CT scan, each DRR representing the way the relevant volume looks from a particular angle.
- each CT scan taken in stage 102 represents the patient's internal volume at a specific point in his respiratory cycle.
- the first CT scan may be taken at point A in the respiratory cycle and the second CT scan may be taken at point B.
- point A and point B may be, for example, the points of maximum and minimum inhalation in the patient's respiratory cycle.
- the position of the target region is identified in each of these CT scans (stage 110), for example as coordinates in the treatment room. Then, the identified position is stored (stage 112) in the memory 13 of the radiation treatment device 10 (see FIG. 1).
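The pre-treatment stages above (scan, mark the target, generate DRRs, store) can be sketched as assembling a small lookup library. The data structure and the axis-sum projections standing in for real DRR ray casting are assumptions for illustration.

```python
import numpy as np

def build_pretreatment_library(ct_scans, target_positions, axes=(0, 1, 2)):
    """For each CT scan, store the marked target position (stages 110
    and 112) alongside a set of projections standing in for the DRRs
    generated in stage 108. The projections here are simple parallel
    axis sums, a deliberate simplification of clinical DRR computation."""
    library = []
    for scan, pos in zip(ct_scans, target_positions):
        vol = np.asarray(scan, dtype=float)
        drrs = {axis: vol.sum(axis=axis) for axis in axes}
        library.append({"target": np.asarray(pos, dtype=float),
                        "drrs": drrs})
    return library
```

During treatment, matching a live x-ray against the stored DRRs selects one library entry, and that entry's `"target"` field gives the target position for that respiratory state.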
- Although the pre-treatment procedure 100a is described as including CT scans, the CT scans may be replaced by other three-dimensional images that show the bones and tissues of the relevant volume, such as magnetic resonance (MR) images or ultrasound images.
- each of the CT scans is used to generate a set of digitally reconstructed radiographs (DRRs) (stage 108).
- Each DRR is an image obtained by computing a two-dimensional projection through a three-dimensional image.
- a DRR is a synthetic image obtained by computation.
- a two-dimensional projection through the three-dimensional CT scan resembles the physical process of taking an x-ray image.
- a DRR looks similar to an x-ray image.
- each DRR is a synthetic two-dimensional image that shows what the three-dimensional images prepared in stages 102 and 104 look like from a particular angle.
- a set of DRRs all computed from one three-dimensional image but from different angles, resemble a set of x-ray images taken from these angles.
- the DRRs show the target region from a set of angles from which the x-ray generators 32a, 32b (see FIG. 1) view the target region.
- DRRs are needed to show what the relevant volume looks like from different angles.
- there are enough DRRs in a set such that there is a DRR that corresponds to almost every position that the patient 14 can shift into during treatment, and a set of DRRs may include as many DRRs as a person of ordinary skill in the art deems adequate.
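The projection described in these bullets can be illustrated with a minimal sketch. The following Python fragment (hypothetical values and function name; real DRR generation casts diverging rays from the x-ray source position rather than parallel rays) sums voxel attenuation along one axis and applies the Beer-Lambert law:

```python
import numpy as np

def drr_parallel(volume, axis=0):
    """Crude DRR: integrate attenuation along one axis (parallel rays)
    and convert the line integrals to intensities (Beer-Lambert law)."""
    line_integrals = volume.sum(axis=axis)   # one ray sum per pixel
    return np.exp(-line_integrals)           # bright where little tissue

# Toy "CT scan": a dense target region inside an otherwise empty volume.
vol = np.zeros((16, 16, 16))
vol[6:10, 6:10, 6:10] = 0.5
drr = drr_parallel(vol)                      # a 16 x 16 synthetic radiograph
print(drr.shape)                             # (16, 16)
```

Rotating the volume (or the ray direction) before projecting yields the set of DRRs from different viewing angles described above.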
- FIG. 4B depicts an alternative pre-treatment procedure 100b in accordance with the invention.
- This alternative pre-treatment procedure 100b is similar to the pre-treatment procedure 100a of FIG. 4A except that the three-dimensional images are a combination of actual CT scans and computer-generated intermediate three-dimensional images.
- in stage 102, a plurality of CT scans (e.g., two) are taken of the relevant volume at point A and point B of the respiratory cycle. These CT scans are then used to compute a series of intermediate three-dimensional images (stage 104) by computing synthetic deformation images from the actual CT scans taken during stage 102.
- Each of the intermediate three-dimensional images shows the position of the target region at a respiratory state between point A and point B.
- the computation for producing these intermediate three-dimensional images may be performed offline, by any of the well-known methods such as thin-plate splines, warping, interpolation, or extrapolation.
- the position of the target region is marked in each of these CT scans and intermediate three-dimensional images (stage 106). This marking may also be done offline.
- Both the CT scans taken in stage 102 and the intermediate three-dimensional images generated in stage 104 are herein referred to as "three-dimensional images."
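As a rough sketch of the intermediate-image computation, assuming voxel-wise linear blending in place of the thin-plate-spline or warping methods named above (all arrays and names here are hypothetical):

```python
import numpy as np

def intermediate_volumes(scan_a, scan_b, n):
    """Generate n intermediate three-dimensional images between scan_a
    (point A, e.g. maximum inhalation) and scan_b (point B, e.g. maximum
    exhalation) by voxel-wise linear blending."""
    vols = []
    for k in range(1, n + 1):
        w = k / (n + 1)                       # fraction of the way A -> B
        vols.append((1.0 - w) * scan_a + w * scan_b)
    return vols

# Toy scans: the "target" slab sits at different depths at points A and B.
scan_a = np.zeros((8, 8, 8)); scan_a[1:3] = 1.0
scan_b = np.zeros((8, 8, 8)); scan_b[5:7] = 1.0
mids = intermediate_volumes(scan_a, scan_b, 2)
print(len(mids))    # 2
```

A real implementation would deform the anatomy rather than blend intensities, but the bookkeeping of producing a sequence of respiratory states between A and B is the same.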
- each three-dimensional image representing a point in the respiratory cycle is viewed from 40 different angles.
- the DRRs are used to match up a live x-ray image with a three-dimensional image, which is in turn used to determine the position of the target region.
- FIG. 5 schematically depicts a CT scan 210a that is taken at point A of the respiration cycle in stage 102 of the pre-treatment procedure 100.
- FIG. 6 depicts a CT scan 210b taken at point B of the respiration cycle, also in stage 102 of the pre-treatment procedure 100.
- FIGs. 5 and 6 show a cross-sectional view of a relevant volume 200, soft tissues 202a and 202b, and a target region 204 located between soft tissues 202a and 202b.
- the soft tissues 202a and 202b are aligned in the x-direction according to a coordinate system 214.
- the target region 204 is located closer to soft tissue 202a than to soft tissue 202b, as shown in FIG. 5.
- the target region 204 is closer to soft tissue 202b than to soft tissue 202a, as shown in FIG. 6.
- the target region 204 was displaced along the x-direction between point A and point B of the respiration cycle.
- the shape of the relevant volume 200 is different in FIG. 6 than in FIG. 5. More specifically, the relevant volume 200 and the soft tissues 202a and 202b shrank along the y-direction in going from point A (depicted in FIG. 5) to point B (depicted in FIG. 6) in the respiration cycle.
- the relevant volume 200 shrinks along the y-direction and the target region 204 is displaced along the x-direction.
- FIG. 7 schematically depicts the generation of intermediate three-dimensional images in stage 104 of FIG. 4B.
- Two exemplary intermediate three-dimensional images 220 and 222 are generated based on two CT scans 210a and 210b.
- the three-dimensional images 220 and 222 are formed by continuously deforming the CT scan 210a taken at point A of the respiration cycle into the CT scan 210b taken at point B.
- the three-dimensional images 220 and 222 depict the intermediate stages the relevant volume 200 and the target region 204 go through while transitioning from point A to point B in the respiration cycle.
- point A is the point of maximum inhalation
- point B is the point of maximum exhalation for the purpose of illustration herein.
- the relevant volume 200 becomes progressively smaller along the y-direction and the target region 204 becomes progressively closer to soft tissue 202b as the patient 14 exhales.
- FIG. 8 is a flow chart depicting a fiducial-less target tracking procedure 130, which allows the position of the target region to be determined without implanted fiducials in accordance with the invention.
- live stereo x-ray images are taken periodically, at a time interval Δt (stage 131). Since the x-ray images themselves do not show the target region, each live x-ray has to be associated with the proper three-dimensional image in order for the position of the target region to be determined.
- Each of these x-ray images is compared to the DRRs prepared during the pre-treatment procedure 100a, 100b (stage 132).
- This comparison may be made by processor 12 for each live x-ray image, using any well-known image comparison technique, including but not limited to mutual information, cross correlation, and image subtraction. With one of these techniques, each DRR and the live x-ray image are compared pixel by pixel. This comparison may entail subtracting the gray-level pixel values of the two images at each pixel location. The accumulated differences in gray levels give an error signal characterizing the distance between the DRR and the x-ray image. A person of ordinary skill in the art would understand how to implement a suitable comparison technique.
- a DRR that best matches the x-ray image(s) is selected (stage 134). Since every DRR is associated with a three-dimensional image, the associated three-dimensional image is identified (stage 136). In addition, the viewing angle associated with the best-matching DRR is identified (stage 138). Based on the identified three-dimensional image and viewing angle, the target region position is determined, and the viewing-angle information is combined with this target position (stage 140). Once the position of the target region as seen from the angle of the x-ray imaging devices is known, the position of the target region can be determined accurately. The location of the target region is then inferred and determined with respect to the treatment room coordinates (stage 128). Since the respiratory pattern is substantially cyclical, the location of the target region can even be predicted after a critical number of data points have been taken.
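The comparison and selection of stages 132 through 136 might be sketched as follows, using the image-subtraction metric described above; the toy arrays and function names are hypothetical:

```python
import numpy as np

def subtraction_error(drr, xray):
    """Accumulated gray-level difference: subtract the two images pixel
    by pixel and sum the absolute differences into one error signal."""
    return np.abs(drr.astype(float) - xray.astype(float)).sum()

def best_match(drrs, xray):
    """Index of the DRR with the smallest error signal; in the procedure,
    this index identifies both the source three-dimensional image and
    the viewing angle from which that DRR was computed."""
    return int(np.argmin([subtraction_error(d, xray) for d in drrs]))

# Toy DRR set: three flat images with different mean gray levels.
drrs = [np.full((4, 4), g) for g in (10.0, 50.0, 90.0)]
live_xray = np.full((4, 4), 55.0)
print(best_match(drrs, live_xray))   # 1
```

Mutual information or cross correlation would slot into `subtraction_error` unchanged; only the metric differs.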
- although the fiducial-less procedure 130 of FIG. 8 affords the significant advantage of locating a target region without implanted fiducials, it does not allow real-time tracking, because x-ray imaging alone may be too slow to detect fast respiration.
- the time interval Δt at which x-ray images are taken may be as long as 10 seconds, since overly frequent x-ray imaging could expose the patient to excess radiation. Locating the target region only every 10 seconds does not provide accurate beam directing because the target region can move out of the beam radius within that time.
- the treatment beam needs to be adjusted between the x-ray images, at a time interval that is shorter than Δt.
- easily trackable real-time sensors may be implemented to provide measurement data in real time, i.e., with negligible lag time between a movement of the patient's body part and the "reporting" of the movement. Determining the location of the target region based on the position of the real-time sensors allows a real-time determination of the location of the target region. Since the use of real-time sensors is non-invasive, cheaper, and overall much less complicated than the use of internal fiducials, the ability to determine the position of the tumor based on real-time sensors without using internal fiducials is desirable.
- real-time sensors may be used in conjunction with the fiducial-less target tracking procedure 130.
- in order for the real-time sensors to be used with the fiducial-less target tracking procedure 130 to locate the target region in real time, a correlation has to be established between the real-time sensor readings and the position of the target region.
- FIG. 9 is a flow chart depicting a sensor-target region correlation procedure 120, which establishes a correlation model between real-time sensor readings and the position of the target region.
- a real-time sensor may be coupled to an external body part (e.g., the skin) of the patient, or activated (stage 122), after the patient 14 is placed in the treatment room.
- this real-time sensor may be any sensor showing a correlation to respiration with a response or reactivation time of less than 250 ms.
- the real-time sensor should emit a new signal at least ten times per second.
- the signal from the real-time sensor can be read at a time interval Δt_sensor that is shorter than Δt.
- a stereo x-ray image is taken (stage 124).
- the signal from the real-time sensor is read, and the reading may be time-stamped.
- Stage 124 is repeated at a time interval ⁇ t.
- The sensor reading interval Δt_sensor does not have to be absolutely constant, as long as consecutive sensor readings are taken sufficiently close in time (i.e., Δt_sensor is small enough). The same holds for the x-ray imaging interval Δt.
- the stereo x-ray image is then compared with the DRRs that were obtained during the pre-treatment procedure 100a, 100b (stage 126). The comparison identifies the best-matching DRR, which points to the three-dimensional image from which that DRR was generated. Since the position of the target region was marked in each three-dimensional image during the pre-treatment procedure, the position of the target region is determined from the identified three-dimensional image (stage 128). Using the determined target region positions, the data points collected in stage 124 can be converted into data points of the target region position and corresponding real-time sensor readings, producing a point cloud.
- the processor 12 (see FIG. 1) in the radiosurgery device may fit a first curve to the points generated by the real-time sensors and a second curve to the points generated for the target position. These curves permit the real-time sensor readings and the target position to be correlated to each other (stage 130) to produce a correlation model that is eventually used during treatment to track the target region.
- Another way to correlate the target region position with the real-time sensor reading(s) is to use a neural network trained to perform interpolation, or other known mathematical interpolation methods, to establish the correspondence between the two sets of data after the point clouds have been computed.
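A minimal sketch of building such a correlation model, assuming hypothetical point-cloud values and a least-squares polynomial fit standing in for the curve pair or neural network described above:

```python
import numpy as np

# Hypothetical point cloud from the correlation procedure 120: sensor
# readings (e.g., chest-wall displacement) paired with target positions
# (e.g., mm along x) determined from the matched DRRs.
sensor_readings = np.array([0.0, 0.2, 0.5, 0.8, 1.0])
target_positions = np.array([12.0, 12.6, 13.5, 14.4, 15.0])

# Fit a low-order polynomial mapping sensor signal -> target position.
coeffs = np.polyfit(sensor_readings, target_positions, deg=1)

def predict_target(s):
    """Infer the target position from a real-time sensor signal s."""
    return float(np.polyval(coeffs, s))

print(predict_target(0.4))   # ~13.2 mm, with no x-ray at this instant
```

A higher polynomial degree, spline, or trained network could replace `np.polyfit` without changing the surrounding procedure.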
- FIG. 10 depicts the real-time tracking procedure 150 for tracking the target region in real time during treatment.
- the real-time tracking procedure 150 begins with taking a reading from a real-time sensor at times t_sensor1, t_sensor2, t_sensor3, ..., each separated by a time interval that is not necessarily constant (stage 152).
- the time interval herein is expressed as Δt_sensor, wherein Δt_sensor denotes a range of time rather than an exact and constant value.
- Δt_sensor may be, for example, 50 ms, while Δt may be 10 seconds.
- the system reads a signal s from the real-time sensor at time t_sensor1. No x-ray image is taken at time t_sensor1. Then, the position of the target region at time t_sensor1 is obtained from this real-time sensor reading s (stage 156), based on the correlation between the real-time sensor reading and the position of the target region that was established during the sensor-target region correlation procedure 120. More specifically, the signal s is fit to the previously generated first curve of the real-time sensor readings.
- a position y of the target region that corresponds to the sensor signal s is determined by identifying the point on the second curve that corresponds to the position of s on the first curve, or by a well-known interpolation method. If there are multiple real-time sensors, this process may be performed for each real-time sensor. This way, the position of each real-time sensor that was obtained in stage 152 is matched to one of the real-time sensor readings in the correlation model, and the position of the target region is inferred from the real-time sensor reading. As previously mentioned, the real-time sensor signals are read frequently (e.g., every 50 ms). Based on the sensor-target position correlation, the position of the target region can be determined as frequently as the sensor signals are read. Thus, with the present invention, it is not necessary to actually image the internal target region on a real-time basis in order to track the target region almost continuously.
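The cadence described above, frequent sensor reads with only occasional x-ray images, can be sketched as follows; the sampled curves and timing constants are hypothetical stand-ins for the fitted first and second curves:

```python
import numpy as np

# Sampled "first curve" (sensor readings) and "second curve" (target
# positions) from the correlation procedure; all values hypothetical.
sensor_curve = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
target_curve = np.array([10.0, 11.0, 12.0, 13.0, 14.0])  # mm

DT_SENSOR = 0.05   # 50 ms between sensor readings
DT_XRAY = 10.0     # 10 s between x-ray images (a multiple of DT_SENSOR)

def infer_positions(signals):
    """Map each sensor signal to a target position by interpolating
    from the first curve to the second; no x-ray is needed for these
    estimates between imaging instants."""
    return np.interp(signals, sensor_curve, target_curve)

positions = infer_positions([0.1, 0.5, 0.9])
reads_per_xray = int(round(DT_XRAY / DT_SENSOR))  # 200 sensor reads per x-ray
print(reads_per_xray)   # 200
```

Only one in every `reads_per_xray` readings coincides with an x-ray image, which is then used to verify or update the model as described below.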
- a live x-ray image may be taken at time t0 (stage 154), when the signal s is read from the real-time sensors.
- the x-ray time interval Δt is a multiple of the sensor reading time interval Δt_sensor, so that after a certain number of sensor readings, the x-ray imaging and the sensor reading occur simultaneously.
- This new x-ray image, which provides an additional data point, may be added to the point cloud and used to modify or update the correlation model. This constant updating prevents any change in the respiratory pattern during treatment from compromising the accuracy of the treatment.
- the invention is not limited to a specific number of sensors or sensors that track respiratory motion.
- the number of sensors to be used may be determined based on the degree of accuracy or certainty that is desired. Multiple sensors can lead to multilevel correlation, enhancing the reliability and accuracy of tracking.
- three sensors may be used: a first sensor that is read at a time interval of Δt_sensor, a second sensor that is read at a time interval Δt, and a third sensor that is read at another time interval between Δt_sensor and Δt.
- the extra sensor(s) may track a patient motion that is caused by something other than respiration and that also moves the target region.
- aortic pulsation or heart cycles may be tracked to take into account the movement of the target region due to the motion of the heart. If the motion being tracked by the extra sensor(s) is cyclical, it can be handled in a manner similar to the manner in which motion due to respiration is handled. If desired, sensor(s) may be used to track only motions other than respiratory motion.
- This invention allows real-time tracking of the target region during treatment based on the location of the real-time sensors.
- this invention allows the target region to be tracked during treatment based only on x-ray images.
- the x-ray image is obtained when the position of the real-time sensors is determined.
- each DRR is associated with an intermediate three-dimensional image prior to the treatment.
- the only task processor 12 needs to perform during treatment is finding the DRR that best matches the x-ray image that was just taken.
- the simplicity of the task and the short processing time minimize the time lag between when processor 12 determines the location of the target region and when the beaming apparatus 20 (see FIG. 1) physically adjusts its beam direction according to the new location of the target region.
- the invention provides a noninvasive and inexpensive means of improving the accuracy of radiation treatments.
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Molecular Biology (AREA)
- Heart & Thoracic Surgery (AREA)
- Radiology & Medical Imaging (AREA)
- Pathology (AREA)
- Physics & Mathematics (AREA)
- Biophysics (AREA)
- High Energy & Nuclear Physics (AREA)
- Optics & Photonics (AREA)
- Human Computer Interaction (AREA)
- Robotics (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Ultra Sonic Diagnosis Equipment (AREA)
- Magnetic Resonance Imaging Apparatus (AREA)
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP03783278A EP1565106A4 (en) | 2002-11-12 | 2003-11-12 | METHOD AND DEVICE FOR OBSERVING AN INTERNAL TARGET REGION WITHOUT IMPLANTED REFERENCE MARK |
| AU2003290696A AU2003290696A1 (en) | 2002-11-12 | 2003-11-12 | A method and apparatus for tracking an internal target region without an implanted fiducial |
| JP2004551996A JP2006515187A (ja) | 2002-11-12 | 2003-11-12 | Apparatus and method for tracking an internal target region without using an implanted fiducial |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US10/293,216 US7260426B2 (en) | 2002-11-12 | 2002-11-12 | Method and apparatus for tracking an internal target region without an implanted fiducial |
| US10/293,216 | 2002-11-12 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| WO2004044612A2 true WO2004044612A2 (en) | 2004-05-27 |
| WO2004044612A3 WO2004044612A3 (en) | 2004-09-23 |
Family
ID=32229626
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2003/035801 Ceased WO2004044612A2 (en) | 2002-11-12 | 2003-11-12 | A method and apparatus for tracking an internal target region without an implanted fiducial |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US7260426B2 (en) |
| EP (1) | EP1565106A4 (en) |
| JP (1) | JP2006515187A (en) |
| AU (1) | AU2003290696A1 (en) |
| WO (1) | WO2004044612A2 (en) |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2006137294A1 (ja) * | 2005-06-21 | 2006-12-28 | National University Corporation Kanazawa University | X-ray diagnosis support apparatus, program, and recording medium |
| JP2007330302A (ja) * | 2006-06-12 | 2007-12-27 | Hitachi Medical Corp | X-ray imaging apparatus |
| JP2008080131A (ja) * | 2006-09-28 | 2008-04-10 | Accuray Inc | Radiation treatment planning using four-dimensional imaging data |
| JP2008514352A (ja) * | 2004-09-30 | 2008-05-08 | Accuray Incorporated | Dynamic tracking of a moving target |
| JP2009517113A (ja) * | 2005-11-24 | 2009-04-30 | Koninklijke Philips Electronics N.V. | Motion-compensated CT reconstruction of high-contrast objects |
| JP2009526620A (ja) * | 2006-02-14 | 2009-07-23 | Accuray Incorporated | Adaptive x-ray control |
| US9248312B2 (en) | 2007-10-26 | 2016-02-02 | Accuray Incorporated | Automatic correlation modeling of an internal target |
Families Citing this family (175)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7158610B2 (en) * | 2003-09-05 | 2007-01-02 | Varian Medical Systems Technologies, Inc. | Systems and methods for processing x-ray images |
| US6937696B1 (en) | 1998-10-23 | 2005-08-30 | Varian Medical Systems Technologies, Inc. | Method and system for predictive physiological gating |
| US20020193685A1 (en) | 2001-06-08 | 2002-12-19 | Calypso Medical, Inc. | Guided Radiation Therapy System |
| US7620444B2 (en) | 2002-10-05 | 2009-11-17 | General Electric Company | Systems and methods for improving usability of images for medical applications |
| US7660623B2 (en) * | 2003-01-30 | 2010-02-09 | Medtronic Navigation, Inc. | Six degree of freedom alignment display for medical procedures |
| US20040199072A1 (en) * | 2003-04-01 | 2004-10-07 | Stacy Sprouse | Integrated electromagnetic navigation and patient positioning device |
| US7398116B2 (en) | 2003-08-11 | 2008-07-08 | Veran Medical Technologies, Inc. | Methods, apparatuses, and systems useful in conducting image guided interventions |
| US8150495B2 (en) | 2003-08-11 | 2012-04-03 | Veran Medical Technologies, Inc. | Bodily sealants and methods and apparatus for image-guided delivery of same |
| US7756567B2 (en) * | 2003-08-29 | 2010-07-13 | Accuray Incorporated | Image guided radiosurgery method and apparatus using registration of 2D radiographic images with digitally reconstructed radiographs of 3D scan data |
| US8571639B2 (en) * | 2003-09-05 | 2013-10-29 | Varian Medical Systems, Inc. | Systems and methods for gating medical procedures |
| US20050053267A1 (en) * | 2003-09-05 | 2005-03-10 | Varian Medical Systems Technologies, Inc. | Systems and methods for tracking moving targets and monitoring object positions |
| US10342558B2 (en) * | 2003-09-30 | 2019-07-09 | Koninklijke Philips N.V. | Target tracking method and apparatus for radiation treatment planning and delivery |
| DE50304977D1 (de) * | 2003-12-05 | 2006-10-19 | Moeller Wedel Gmbh | Method and device for observing objects with a microscope |
| US7853308B2 (en) * | 2004-02-17 | 2010-12-14 | Siemens Medical Solutions Usa, Inc. | System and method for patient positioning for radiotherapy in the presence of respiratory motion |
| JP5110881B2 (ja) | 2004-02-20 | 2012-12-26 | University of Florida Research Foundation, Incorporated | System for delivering conformal radiation therapy while simultaneously imaging soft tissue |
| DE102004011156A1 (de) * | 2004-03-08 | 2005-10-06 | Siemens Ag | Method for endoluminal imaging with motion correction |
| DE102004025685A1 (de) * | 2004-05-26 | 2005-12-22 | Siemens Ag | Method for image generation with an imaging modality |
| JP2006006604A (ja) * | 2004-06-25 | 2006-01-12 | Ge Medical Systems Global Technology Co Llc | Surgery support system |
| US7522779B2 (en) * | 2004-06-30 | 2009-04-21 | Accuray, Inc. | Image enhancement method and system for fiducial-less tracking of treatment targets |
| US7231076B2 (en) * | 2004-06-30 | 2007-06-12 | Accuray, Inc. | ROI selection in image registration |
| US7366278B2 (en) * | 2004-06-30 | 2008-04-29 | Accuray, Inc. | DRR generation using a non-linear attenuation model |
| US7426318B2 (en) * | 2004-06-30 | 2008-09-16 | Accuray, Inc. | Motion field generation for non-rigid image registration |
| US7327865B2 (en) * | 2004-06-30 | 2008-02-05 | Accuray, Inc. | Fiducial-less tracking with non-rigid image registration |
| JP2008507367A (ja) * | 2004-07-23 | 2008-03-13 | Calypso Medical Technologies, Incorporated | Integrated radiation therapy system and method for treating a target in a patient |
| WO2006017593A2 (en) * | 2004-08-03 | 2006-02-16 | The University Of Vermont And State Agricultural College | Noninvasive pulmonary performance measurement method and system |
| EP1778353B1 (en) * | 2004-08-13 | 2012-09-12 | Koninklijke Philips Electronics N.V. | Radiotherapeutic treatment plan adaptation |
| US8027715B2 (en) * | 2004-10-02 | 2011-09-27 | Accuray Incorporated | Non-linear correlation models for internal target movement |
| FR2876896B1 (fr) * | 2004-10-21 | 2007-10-26 | Gen Electric | Method of using a tomography device for obtaining radioscopic images and device for implementing said method |
| US20060089626A1 (en) * | 2004-10-22 | 2006-04-27 | Vlegele James W | Surgical device guide for use with an imaging system |
| US7452357B2 (en) * | 2004-10-22 | 2008-11-18 | Ethicon Endo-Surgery, Inc. | System and method for planning treatment of tissue |
| US7833221B2 (en) * | 2004-10-22 | 2010-11-16 | Ethicon Endo-Surgery, Inc. | System and method for treatment of tissue using the tissue as a fiducial |
| JP4679567B2 (ja) * | 2005-02-04 | 2011-04-27 | Mitsubishi Electric Corporation | Particle beam irradiation apparatus |
| US7330578B2 (en) * | 2005-06-23 | 2008-02-12 | Accuray Inc. | DRR generation and enhancement using a dedicated graphics device |
| US20070053491A1 (en) * | 2005-09-07 | 2007-03-08 | Eastman Kodak Company | Adaptive radiation therapy method with target detection |
| EP1924198B1 (en) | 2005-09-13 | 2019-04-03 | Veran Medical Technologies, Inc. | Apparatus for image guided accuracy verification |
| US20070066881A1 (en) | 2005-09-13 | 2007-03-22 | Edwards Jerome R | Apparatus and method for image guided accuracy verification |
| US9498647B2 (en) * | 2005-09-23 | 2016-11-22 | Allen B. Kantrowitz | Fiducial marker system for subject movement compensation during medical treatment |
| US7697969B2 (en) * | 2005-10-11 | 2010-04-13 | University Of Florida Research Foundation, Inc. | Preplanning of guided medical procedures |
| US7518619B2 (en) * | 2005-11-07 | 2009-04-14 | General Electric Company | Method and apparatus for integrating three-dimensional and two-dimensional monitors with medical diagnostic imaging workstations |
| US7684647B2 (en) * | 2005-11-16 | 2010-03-23 | Accuray Incorporated | Rigid body tracking for radiosurgery |
| US7835500B2 (en) * | 2005-11-16 | 2010-11-16 | Accuray Incorporated | Multi-phase registration of 2-D X-ray images to 3-D volume studies |
| WO2007100262A1 (en) * | 2006-03-03 | 2007-09-07 | Sinvent As | Method for integration of additional data for increasing the available information during medical imaging |
| US20070249928A1 (en) * | 2006-04-19 | 2007-10-25 | General Electric Company | Method and system for precise repositioning of regions of interest in longitudinal magnetic resonance imaging and spectroscopy exams |
| US20070247454A1 (en) * | 2006-04-19 | 2007-10-25 | Norbert Rahn | 3D visualization with synchronous X-ray image display |
| WO2007136745A2 (en) | 2006-05-19 | 2007-11-29 | University Of Hawaii | Motion tracking system for real time adaptive imaging and spectroscopy |
| US20080021300A1 (en) * | 2006-06-29 | 2008-01-24 | Allison John W | Four-dimensional target modeling and radiation treatment |
| US7570738B2 (en) * | 2006-08-04 | 2009-08-04 | Siemens Medical Solutions Usa, Inc. | Four-dimensional (4D) image verification in respiratory gated radiation therapy |
| DE102006038927B4 (de) * | 2006-08-18 | 2010-03-25 | Universität Zu Lübeck | Method for determining the cutting guidance in liver tumor resection |
| US9451928B2 (en) | 2006-09-13 | 2016-09-27 | Elekta Ltd. | Incorporating internal anatomy in clinical radiotherapy setups |
| US8400312B2 (en) * | 2006-10-10 | 2013-03-19 | Saga University | Operation assisting system |
| US7620147B2 (en) | 2006-12-13 | 2009-11-17 | Oraya Therapeutics, Inc. | Orthovoltage radiotherapy |
| US7496174B2 (en) | 2006-10-16 | 2009-02-24 | Oraya Therapeutics, Inc. | Portable orthovoltage radiotherapy |
| US7894649B2 (en) * | 2006-11-02 | 2011-02-22 | Accuray Incorporated | Target tracking using direct target registration |
| US8831706B2 (en) * | 2006-11-03 | 2014-09-09 | Accuray Incorporated | Fiducial-less tracking of a volume of interest |
| US8108025B2 (en) * | 2007-04-24 | 2012-01-31 | Medtronic, Inc. | Flexible array for use in navigated surgery |
| US8301226B2 (en) * | 2007-04-24 | 2012-10-30 | Medtronic, Inc. | Method and apparatus for performing a navigated procedure |
| US8311611B2 (en) * | 2007-04-24 | 2012-11-13 | Medtronic, Inc. | Method for performing multiple registrations in a navigated procedure |
| US9289270B2 (en) * | 2007-04-24 | 2016-03-22 | Medtronic, Inc. | Method and apparatus for performing a navigated procedure |
| US20090012509A1 (en) * | 2007-04-24 | 2009-01-08 | Medtronic, Inc. | Navigated Soft Tissue Penetrating Laser System |
| US8734466B2 (en) * | 2007-04-25 | 2014-05-27 | Medtronic, Inc. | Method and apparatus for controlled insertion and withdrawal of electrodes |
| US8849373B2 (en) * | 2007-05-11 | 2014-09-30 | Stanford University | Method and apparatus for real-time 3D target position estimation by combining single x-ray imaging and external respiratory signals |
| US8363783B2 (en) | 2007-06-04 | 2013-01-29 | Oraya Therapeutics, Inc. | Method and device for ocular alignment and coupling of ocular structures |
| US8920406B2 (en) | 2008-01-11 | 2014-12-30 | Oraya Therapeutics, Inc. | Device and assembly for positioning and stabilizing an eye |
| WO2009012576A1 (en) | 2007-07-20 | 2009-01-29 | Resonant Medical Inc. | Methods and systems for guiding the acquisition of ultrasound images |
| US8249317B2 (en) | 2007-07-20 | 2012-08-21 | Eleckta Ltd. | Methods and systems for compensating for changes in anatomy of radiotherapy patients |
| US8135198B2 (en) | 2007-08-08 | 2012-03-13 | Resonant Medical, Inc. | Systems and methods for constructing images |
| WO2009051847A1 (en) * | 2007-10-19 | 2009-04-23 | Calin Caluser | Three dimensional mapping display system for diagnostic ultrasound machines and method |
| EP2191775A3 (en) | 2007-12-13 | 2010-07-28 | BrainLAB AG | Detection of the position of a moving object |
| WO2009085204A2 (en) | 2007-12-23 | 2009-07-09 | Oraya Therapeutics, Inc. | Methods and devices for detecting, controlling, and predicting radiation delivery |
| US7801271B2 (en) | 2007-12-23 | 2010-09-21 | Oraya Therapeutics, Inc. | Methods and devices for orthovoltage ocular radiotherapy and treatment planning |
| US7720196B2 (en) * | 2008-01-07 | 2010-05-18 | Accuray Incorporated | Target tracking using surface scanner and four-dimensional diagnostic imaging data |
| US8064642B2 (en) * | 2008-01-10 | 2011-11-22 | Accuray Incorporated | Constrained-curve correlation model |
| US8825136B2 (en) * | 2008-03-14 | 2014-09-02 | Baylor Research Institute | System and method for pre-planning a radiation treatment |
| US8189738B2 (en) | 2008-06-02 | 2012-05-29 | Elekta Ltd. | Methods and systems for guiding clinical radiotherapy setups |
| US9237860B2 (en) | 2008-06-05 | 2016-01-19 | Varian Medical Systems, Inc. | Motion compensation for medical imaging and associated systems and methods |
| US10667727B2 (en) | 2008-09-05 | 2020-06-02 | Varian Medical Systems, Inc. | Systems and methods for determining a state of a patient |
| US8130907B2 (en) * | 2008-09-12 | 2012-03-06 | Accuray Incorporated | Controlling X-ray imaging based on target motion |
| US8396248B2 (en) * | 2008-09-16 | 2013-03-12 | Varian Medical Systems, Inc. | Sequential stereo imaging for estimating trajectory and monitoring target position |
| US8457372B2 (en) | 2008-09-30 | 2013-06-04 | Accuray Incorporated | Subtraction of a segmented anatomical feature from an acquired image |
| EP2189943B1 (de) * | 2008-11-19 | 2014-09-24 | Brainlab AG | Determination of vitally moving regions of an analysis image |
| JP2010154874A (ja) * | 2008-12-26 | 2010-07-15 | Hitachi Ltd | Radiation therapy system |
| JP5729907B2 (ja) * | 2009-02-23 | 2015-06-03 | Toshiba Corporation | X-ray diagnostic apparatus |
| ES2641598T3 (es) * | 2009-03-24 | 2017-11-10 | Masmec S.P.A. | Computer-assisted system for guiding a surgical instrument during percutaneous diagnostic or therapeutic operations |
| US9196046B2 (en) * | 2009-03-27 | 2015-11-24 | Koninklijke Philips N.V. | Medical imaging |
| US10542962B2 (en) | 2009-07-10 | 2020-01-28 | Elekta, LTD | Adaptive radiotherapy treatment using ultrasound |
| WO2011025943A2 (en) * | 2009-08-28 | 2011-03-03 | Dartmouth College | System and method for providing patient registration without fiducials |
| JP5538862B2 (ja) * | 2009-12-18 | 2014-07-02 | Canon Inc. | Image processing apparatus, image processing system, image processing method, and program |
| US20110172526A1 (en) * | 2010-01-12 | 2011-07-14 | Martin Lachaine | Feature Tracking Using Ultrasound |
| US9248316B2 (en) | 2010-01-12 | 2016-02-02 | Elekta Ltd. | Feature tracking using ultrasound |
| US8331532B2 (en) | 2010-02-18 | 2012-12-11 | Varian Medical Systems International Ag | Method and system for treating moving target |
| US9014424B2 (en) * | 2010-03-02 | 2015-04-21 | Brainlab Ag | Tracking representations of indicator body parts |
| WO2012169990A2 (en) | 2010-05-04 | 2012-12-13 | Pathfinder Therapeutics, Inc. | System and method for abdominal surface matching using pseudo-features |
| EP2605693B1 (en) | 2010-08-20 | 2019-11-06 | Veran Medical Technologies, Inc. | Apparatus for four dimensional soft tissue navigation |
| US9098904B2 (en) | 2010-11-15 | 2015-08-04 | Dartmouth College | System and method for registering ultrasound and magnetic resonance images |
| US8768019B2 (en) * | 2011-02-03 | 2014-07-01 | Medtronic, Inc. | Display of an acquired cine loop for procedure navigation |
| US20120226152A1 (en) * | 2011-03-03 | 2012-09-06 | Porikli Fatih M | Tumor Tracking System and Method for Radiotherapy |
| US9014454B2 (en) * | 2011-05-20 | 2015-04-21 | Varian Medical Systems, Inc. | Method and apparatus pertaining to images used for radiation-treatment planning |
| US9433389B2 (en) * | 2011-07-12 | 2016-09-06 | University Of Maryland, Baltimore | Method for monitoring the accuracy of tissue motion prediction from surrogates |
| WO2013032933A2 (en) | 2011-08-26 | 2013-03-07 | Kinecticor, Inc. | Methods, systems, and devices for intra-scan motion correction |
| US11109835B2 (en) | 2011-12-18 | 2021-09-07 | Metritrack Llc | Three dimensional mapping display system for diagnostic ultrasound machines |
| WO2013107472A1 (en) * | 2012-01-20 | 2013-07-25 | Elekta Ab (Publ) | Radiotherapeutic apparatus |
| EP4056111A3 (en) | 2012-02-22 | 2022-12-07 | Veran Medical Technologies, Inc. | Systems, methods, and devices for four dimensional soft tissue navigation |
| US10561861B2 (en) | 2012-05-02 | 2020-02-18 | Viewray Technologies, Inc. | Videographic display of real-time medical treatment |
| CN102697560A (zh) * | 2012-05-17 | 2012-10-03 | 深圳市一体医疗科技股份有限公司 | Noninvasive tumor locating system and locating method |
| CN102670234B (zh) * | 2012-05-17 | 2013-11-20 | 西安一体医疗科技有限公司 | Gamma radiation patient-setup verification device and method |
| CN102670237B (zh) * | 2012-05-17 | 2014-12-10 | 西安一体医疗科技有限公司 | Gamma radiation locating system |
| WO2014048490A1 (en) | 2012-09-28 | 2014-04-03 | Brainlab Ag | Isocentric Patient Rotation for Detection of the Position of a Moving Object |
| WO2014066853A1 (en) | 2012-10-26 | 2014-05-01 | Viewray Incorporated | Assessment and improvement of treatment using imaging of physiological responses to radiation therapy |
| US20140123388A1 (en) * | 2012-11-05 | 2014-05-08 | Reto W. Filiberti | Automated initial setup positioning for speeding patient throughput |
| CN103099630B (zh) * | 2012-11-18 | 2015-03-11 | 吴大可 | Locating device for target organs in tumor radiotherapy |
| US9305365B2 (en) | 2013-01-24 | 2016-04-05 | Kineticor, Inc. | Systems, devices, and methods for tracking moving targets |
| US9717461B2 (en) | 2013-01-24 | 2017-08-01 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
| US10327708B2 (en) | 2013-01-24 | 2019-06-25 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
| EP2950714A4 (en) | 2013-02-01 | 2017-08-16 | Kineticor, Inc. | Motion tracking system for real time adaptive motion compensation in biomedical imaging |
| US9446263B2 (en) | 2013-03-15 | 2016-09-20 | Viewray Technologies, Inc. | Systems and methods for linear accelerator radiotherapy with magnetic resonance imaging |
| JP5785214B2 (ja) * | 2013-05-08 | 2015-09-24 | 富士フイルム株式会社 | Template, surgery support set, surgery support apparatus, surgery support method, and surgery support program |
| US20150015582A1 (en) * | 2013-07-15 | 2015-01-15 | Markus Kaiser | Method and system for 2d-3d image registration |
| WO2015021779A1 (zh) | 2013-08-12 | 2015-02-19 | 深圳迈瑞生物医疗电子股份有限公司 | Ultrasound scanning apparatus, ventilator, medical system, and related method |
| US20170065832A1 (en) | 2014-02-26 | 2017-03-09 | Brainlab Ag | Tracking Soft Tissue in Medical Images |
| WO2015148391A1 (en) | 2014-03-24 | 2015-10-01 | Thomas Michael Ernst | Systems, methods, and devices for removing prospective motion correction from medical imaging scans |
| US20150305650A1 (en) | 2014-04-23 | 2015-10-29 | Mark Hunter | Apparatuses and methods for endobronchial navigation to and confirmation of the location of a target tissue and percutaneous interception of the target tissue |
| US20150305612A1 (en) | 2014-04-23 | 2015-10-29 | Mark Hunter | Apparatuses and methods for registering a real-time image feed from an imaging device to a steerable catheter |
| EP3145413A1 (en) * | 2014-05-23 | 2017-03-29 | Koninklijke Philips N.V. | Motion gated-ultrasound thermometry using adaptive frame selection. |
| CN106714681A (zh) | 2014-07-23 | 2017-05-24 | 凯内蒂科尔股份有限公司 | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
| US9616251B2 (en) | 2014-07-25 | 2017-04-11 | Varian Medical Systems, Inc. | Imaging based calibration systems, devices, and methods |
| US20170245830A1 (en) * | 2014-09-19 | 2017-08-31 | Think Surgical, Inc. | System and process for ultrasonic determination of long bone orientation |
| EP3256213B1 (en) * | 2015-02-09 | 2019-08-21 | Brainlab AG | X-ray patient position monitoring |
| GB2536650A (en) | 2015-03-24 | 2016-09-28 | Augmedics Ltd | Method and system for combining video-based and optic-based augmented reality in a near eye display |
| US9943247B2 (en) | 2015-07-28 | 2018-04-17 | The University Of Hawai'i | Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan |
| EP3380007A4 (en) | 2015-11-23 | 2019-09-04 | Kineticor, Inc. | Systems, apparatus, and methods for tracking and compensating for patient movement during a medical imaging scan |
| WO2017091621A1 (en) | 2015-11-24 | 2017-06-01 | Viewray Technologies, Inc. | Radiation beam collimating systems and methods |
| US9990711B2 (en) * | 2016-01-25 | 2018-06-05 | Accuray Incorporated | Manipulation of a respiratory model via adjustment of parameters associated with model images |
| EP3416561B1 (en) | 2016-02-16 | 2020-05-13 | Brainlab AG | Determination of dynamic drrs |
| JP6533991B2 (ja) * | 2016-02-16 | 2019-06-26 | 東芝エネルギーシステムズ株式会社 | Medical image processing apparatus, method, program, and radiotherapy apparatus |
| US10413751B2 (en) | 2016-03-02 | 2019-09-17 | Viewray Technologies, Inc. | Particle therapy with magnetic resonance imaging |
| JP6813592B2 (ja) * | 2016-04-12 | 2021-01-13 | Canon U.S.A., Inc. | Organ motion compensation |
| JP6668902B2 (ja) * | 2016-04-12 | 2020-03-18 | 株式会社島津製作所 | Positioning apparatus and method of operating the positioning apparatus |
| CA3028716C (en) | 2016-06-22 | 2024-02-13 | Viewray Technologies, Inc. | Magnetic resonance imaging at low field strength |
| JP6811960B2 (ja) * | 2016-11-15 | 2021-01-13 | 株式会社島津製作所 | X-ray fluoroscopy method and X-ray fluoroscopy apparatus |
| US10102640B2 (en) | 2016-11-29 | 2018-10-16 | Optinav Sp. Z O.O. | Registering three-dimensional image data of an imaged object with a set of two-dimensional projection images of the object |
| RU2019121943A (ru) | 2016-12-13 | 2021-01-15 | Вьюрэй Текнолоджиз, Инк. | Системы и способы лучевой терапии |
| US10434335B2 (en) * | 2017-03-30 | 2019-10-08 | Shimadzu Corporation | Positioning apparatus and method of positioning by generation of DRR image from X-ray CT image data |
| JP2021503364A (ja) | 2017-11-16 | 2021-02-12 | エバメッド・エセアー | Apparatus and method for noninvasive treatment of cardiac arrhythmia |
| US11033758B2 (en) | 2017-12-06 | 2021-06-15 | Viewray Technologies, Inc. | Radiotherapy systems, methods and software |
| US12458411B2 (en) | 2017-12-07 | 2025-11-04 | Augmedics Ltd. | Spinous process clamp |
| EP3787543A4 (en) | 2018-05-02 | 2022-01-19 | Augmedics Ltd. | REGISTRATION OF A FIDUCIAL MARKER FOR AN AUGMENTED REALITY SYSTEM |
| US11209509B2 (en) | 2018-05-16 | 2021-12-28 | Viewray Technologies, Inc. | Resistive electromagnet systems and methods |
| US10835761B2 (en) | 2018-10-25 | 2020-11-17 | Elekta, Inc. | Real-time patient motion monitoring using a magnetic resonance linear accelerator (MR-LINAC) |
| US11083913B2 (en) | 2018-10-25 | 2021-08-10 | Elekta, Inc. | Machine learning approach to real-time patient motion monitoring |
| US11766296B2 (en) | 2018-11-26 | 2023-09-26 | Augmedics Ltd. | Tracking system for image-guided surgery |
| US11980506B2 (en) | 2019-07-29 | 2024-05-14 | Augmedics Ltd. | Fiducial marker |
| US12178666B2 (en) | 2019-07-29 | 2024-12-31 | Augmedics Ltd. | Fiducial marker |
| JP2021019967A (ja) * | 2019-07-30 | 2021-02-18 | 春仁 上園 | Augmented reality information display method, surgery support apparatus, and program |
| US11040221B2 (en) * | 2019-08-13 | 2021-06-22 | Elekta Ltd. | Adaptive radiation therapy using composite imaging slices |
| US10989671B2 (en) * | 2019-09-16 | 2021-04-27 | The Boeing Company | Method and system for determining alignment and orientation of a target |
| CN110681074B (zh) * | 2019-10-29 | 2021-06-15 | 苏州大学 | Tumor respiratory motion prediction method based on a bidirectional GRU network |
| US12156760B2 (en) | 2019-11-14 | 2024-12-03 | Ebamed Sa | Cardiac phase gating system for radiation therapy |
| US11382712B2 (en) | 2019-12-22 | 2022-07-12 | Augmedics Ltd. | Mirroring in image guided surgery |
| WO2021252751A1 (en) * | 2020-06-10 | 2021-12-16 | GE Precision Healthcare LLC | Systems and methods for generating synthetic baseline x-ray images from computed tomography for longitudinal analysis |
| US11389252B2 (en) | 2020-06-15 | 2022-07-19 | Augmedics Ltd. | Rotating marker for image guided surgery |
| US12239385B2 (en) | 2020-09-09 | 2025-03-04 | Augmedics Ltd. | Universal tool adapter |
| EP4267243A1 (en) | 2020-12-23 | 2023-11-01 | Ebamed SA | A multiplanar motion management system |
| EP4312773A1 (en) * | 2021-03-23 | 2024-02-07 | Carestream Health, Inc. | Physiological analysis from video x-ray imaging |
| US11896445B2 (en) | 2021-07-07 | 2024-02-13 | Augmedics Ltd. | Iliac pin and adapter |
| US12150821B2 (en) | 2021-07-29 | 2024-11-26 | Augmedics Ltd. | Rotating marker and adapter for image-guided surgery |
| EP4387552A4 (en) | 2021-08-18 | 2025-04-30 | Augmedics Ltd. | AUGMENTED REALITY SURGICAL SYSTEM USING DEPTH SENSING |
| WO2023023956A1 (en) * | 2021-08-24 | 2023-03-02 | Siemens Shanghai Medical Equipment Ltd. | Method and apparatus for visualization of touch panel to object distance in x-ray imaging |
| US11992705B2 (en) * | 2021-09-29 | 2024-05-28 | Siemens Healthineers International Ag | On-line adaptive deep inspiration breath-hold treatment |
| CN115399840B (zh) * | 2021-12-24 | 2025-03-07 | 深圳惟德精准医疗科技有限公司 | Information processing method and related apparatus |
| CN114177545B (zh) * | 2022-01-17 | 2023-11-07 | 中国科学院合肥物质科学研究院 | Contactless respiratory rhythm monitoring apparatus and method for use in radiotherapy |
| JP7791002B2 (ja) * | 2022-02-18 | 2025-12-23 | 株式会社日立ハイテク | Positioning apparatus, radiotherapy apparatus, and positioning method |
| WO2023203521A1 (en) | 2022-04-21 | 2023-10-26 | Augmedics Ltd. | Systems and methods for medical image visualization |
| US11712584B1 (en) * | 2022-05-24 | 2023-08-01 | Accuray Incorporated | Prospective and retrospective on-line adaptive radiotherapy |
| IL319523A (en) | 2022-09-13 | 2025-05-01 | Augmedics Ltd | Augmented reality glasses for image-guided medical intervention |
| WO2024089564A1 (en) * | 2022-10-28 | 2024-05-02 | Covidien Lp | Sensor-guided robotic surgery |
Family Cites Families (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP0377764B1 (de) * | 1989-01-12 | 1994-09-28 | Siemens Aktiengesellschaft | Medical apparatus for diagnosis and/or therapy |
| US5207223A (en) * | 1990-10-19 | 1993-05-04 | Accuray, Inc. | Apparatus for and method of performing stereotaxic surgery |
| JP3052681B2 (ja) * | 1993-08-06 | 2000-06-19 | 松下電器産業株式会社 | Three-dimensional moving image generation apparatus |
| US5850229A (en) * | 1995-12-15 | 1998-12-15 | Raindrop Geomagic, Inc. | Apparatus and method for geometric morphing |
| US6370419B2 (en) * | 1998-02-20 | 2002-04-09 | University Of Florida | Method and apparatus for triggering an event at a desired point in the breathing cycle |
| US6633775B1 (en) * | 1998-02-27 | 2003-10-14 | The Uab Research Foundation | System for synchronizing activation of an imaging device with patient respiration |
| ATE265253T1 (de) | 1998-10-23 | 2004-05-15 | Varian Med Sys Inc | Method and system for physiological gating of radiotherapy |
| US6144875A (en) * | 1999-03-16 | 2000-11-07 | Accuray Incorporated | Apparatus and method for compensating for respiratory and patient motion during treatment |
| US6501981B1 (en) | 1999-03-16 | 2002-12-31 | Accuray, Inc. | Apparatus and method for compensating for respiratory and patient motions during treatment |
-
2002
- 2002-11-12 US US10/293,216 patent/US7260426B2/en not_active Expired - Lifetime
-
2003
- 2003-11-12 EP EP03783278A patent/EP1565106A4/en not_active Withdrawn
- 2003-11-12 JP JP2004551996A patent/JP2006515187A/ja active Pending
- 2003-11-12 WO PCT/US2003/035801 patent/WO2004044612A2/en not_active Ceased
- 2003-11-12 AU AU2003290696A patent/AU2003290696A1/en not_active Abandoned
Cited By (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8874187B2 (en) | 2004-09-30 | 2014-10-28 | Accuray Inc. | Dynamic tracking of moving targets |
| US9474914B2 (en) | 2004-09-30 | 2016-10-25 | Accuray Incorporated | Tracking of moving targets |
| US8989349B2 (en) | 2004-09-30 | 2015-03-24 | Accuray, Inc. | Dynamic tracking of moving targets |
| JP2008514352A (ja) * | 2004-09-30 | 2008-05-08 | アキュレイ インコーポレイテッド | Dynamic tracking of moving targets |
| JP4797173B2 (ja) * | 2005-06-21 | 2011-10-19 | 国立大学法人金沢大学 | X-ray diagnosis support apparatus, program, and recording medium |
| WO2006137294A1 (ja) * | 2005-06-21 | 2006-12-28 | National University Corporation Kanazawa University | X-ray diagnosis support apparatus, program, and recording medium |
| JP2009517113A (ja) * | 2005-11-24 | 2009-04-30 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Motion-compensated CT reconstruction of high-contrast objects |
| JP2009526620A (ja) * | 2006-02-14 | 2009-07-23 | アキュレイ・インコーポレーテッド | Adaptive X-ray control |
| JP2007330302A (ja) * | 2006-06-12 | 2007-12-27 | Hitachi Medical Corp | X-ray imaging apparatus |
| JP2008080131A (ja) * | 2006-09-28 | 2008-04-10 | Accuray Inc | Radiotherapy planning using four-dimensional imaging data |
| US9248312B2 (en) | 2007-10-26 | 2016-02-02 | Accuray Incorporated | Automatic correlation modeling of an internal target |
| US10046178B2 (en) | 2007-10-26 | 2018-08-14 | Accuray Incorporated | Automatic correlation modeling of an internal target |
| US11235175B2 (en) | 2007-10-26 | 2022-02-01 | Accuray Incorporated | Automatic correlation modeling of an internal target |
Also Published As
| Publication number | Publication date |
|---|---|
| EP1565106A4 (en) | 2010-09-08 |
| US7260426B2 (en) | 2007-08-21 |
| EP1565106A2 (en) | 2005-08-24 |
| JP2006515187A (ja) | 2006-05-25 |
| WO2004044612A3 (en) | 2004-09-23 |
| AU2003290696A8 (en) | 2004-06-03 |
| US20040092815A1 (en) | 2004-05-13 |
| AU2003290696A1 (en) | 2004-06-03 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US7260426B2 (en) | Method and apparatus for tracking an internal target region without an implanted fiducial | |
| JP2006515187A5 (ja) | | |
| US7318805B2 (en) | Apparatus and method for compensating for respiratory and patient motion during treatment | |
| EP1176919B1 (en) | Apparatus and method for compensating for respiratory and patient motion during treatment | |
| US9232928B2 (en) | Method and system for predictive physiological gating | |
| US8634898B2 (en) | Frameless radiosurgery treatment system and method | |
| US10646188B2 (en) | Method and system for radiation application | |
| US6731970B2 (en) | Method for breath compensation in radiation therapy | |
| US9498647B2 (en) | Fiducial marker system for subject movement compensation during medical treatment | |
| CA2347944A1 (en) | Method and system for physiological gating of radiation therapy | |
| WO2009088407A1 (en) | Target tracking using surface scanner and four-dimensional diagnostic imaging data | |
| JP7362130B2 (ja) | Radiation therapy apparatus |
| KR101654263B1 (ko) | Real-time control system of a stereotactic radiotherapy apparatus and control method thereof |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AK | Designated states |
Kind code of ref document: A2 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG UZ VC VN YU ZA ZM ZW |
|
| AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
| DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
| WWE | Wipo information: entry into national phase |
Ref document number: 2003783278 Country of ref document: EP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2004551996 Country of ref document: JP |
|
| WWP | Wipo information: published in national office |
Ref document number: 2003783278 Country of ref document: EP |
|
| DPE2 | Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101) |